WorldWideScience

Sample records for marker-based position verification

  1. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    International Nuclear Information System (INIS)

    Lips, Irene M; Dehnad, Homan; Gils, Carla H van; Boeken Kruger, Arto E; Heide, Uulke A van der; Vulpen, Marco van

    2008-01-01

    We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity, and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk are used.

  2. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    Directory of Open Access Journals (Sweden)

    Boeken Kruger Arto E

    2008-05-01

    Full Text Available Abstract We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity, and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk are used.

  3. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer

    NARCIS (Netherlands)

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E.; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C. C. M.; Alderliesten, Tanja

    2015-01-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up

  4. On marker-based parentage verification via non-linear optimization.

    Science.gov (United States)

    Boerner, Vinzent

    2017-06-15

    Parentage verification by molecular markers is mainly based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs) as bi-allelic markers have become the markers of choice for genotyping projects. Thus, the subsequent step is to use SNP genotypes for parentage verification as well. Recent developments of algorithms such as evaluating opposing homozygous SNP genotypes have drawbacks, for example the inability of rejecting all animals of a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment (Pa) and the power of exclusion (Pe). If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, Pa and Pe were both higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, Pa was higher than 0.99 regardless of the number of SNPs, but Pe decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a Pe of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs). The algorithm described here is easy to implement.
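    The idea behind the constrained-regression assignment can be sketched as follows. This is a minimal illustration, not the author's exact formulation: offspring SNP gene contents are regressed on the candidates' gene contents under a non-negativity constraint (solved here by projected gradient), and the candidate with the largest coefficient is assigned as parent. All data are simulated.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_trio(n_snps, n_candidates):
    """Simulate SNP gene contents (0/1/2) for a dam, a true sire,
    unrelated candidates, and an offspring of the dam and sire."""
    freq = rng.uniform(0.1, 0.9, n_snps)        # per-SNP allele frequencies
    def animal():
        return rng.binomial(2, freq)            # gene content per SNP
    dam, sire = animal(), animal()
    # Offspring receives one allele from each parent (Mendelian sampling).
    off = rng.binomial(1, dam / 2) + rng.binomial(1, sire / 2)
    candidates = [animal() for _ in range(n_candidates - 1)] + [sire]
    return off, np.column_stack(candidates)     # true sire is the last column

def nnls_projected_gradient(X, y, iters=2000):
    """Minimise ||y - X b|| subject to b >= 0 by projected gradient descent."""
    step = 1.0 / np.linalg.norm(X.T @ X, 2)     # 1 / largest eigenvalue
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        b = np.maximum(0.0, b - step * (X.T @ (X @ b - y)))
    return b

off, X = simulate_trio(n_snps=500, n_candidates=5)
coef = nnls_projected_gradient(X.astype(float), off.astype(float))
assigned = int(np.argmax(coef))                 # rank candidates by coefficient
```

    With 500 SNPs the true sire's coefficient (expected near 0.5) clearly dominates those of unrelated candidates, which is why ranking works even for modest SNP panels.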

  5. Accuracy of an infrared marker-based patient positioning system (ExacTrac®) for stereotactic body radiotherapy in localizing the planned isocenter using fiducial markers

    Science.gov (United States)

    Montes-Rodríguez, María de los Ángeles; Hernández-Bojórquez, Mariana; Martínez-Gómez, Alma Angélica; Contreras-Pérez, Agustín; Negrete-Hernández, Ingrid Mireya; Hernández-Oviedo, Jorge Omar; Mitsoura, Eleni; Santiago-Concha, Bernardino Gabriel

    2014-11-01

    Stereotactic Body Radiation Therapy (SBRT) requires controlled immobilization and position monitoring of the patient and target. The purpose of this work is to analyze the performance of the ExacTrac® (ETX) imaging system using infrared and fiducial markers. Materials and methods: To assure the accuracy of isocenter localization, a quality assurance procedure was applied using an infrared marker-based positioning system. Scans were acquired of an in-house agar-gel and solid-water phantom with infrared spheres. In the inner part of the phantom, three markers were delineated as references and one pellet was placed internally, which was assigned as the isocenter. Planning was performed in the iPlan® RT Dose treatment planning system and the images were exported to the ETX console. Images were then acquired with the ETX to check the correctness of the isocenter placement. Adjustments were made in 6D, and the reference markers were used to fuse the images. Couch shifts were registered, and the procedure was repeated for verification purposes. Results: The recorded verification data for translational and rotational movements showed averaged 3D spatial uncertainties of 0.31 ± 0.42 mm and 0.82° ± 0.46° in the phantom; the first correction of these uncertainties was 1.51 ± 1.14 mm and 1.37° ± 0.61°. Conclusions: This study shows high accuracy and repeatability in positioning the selected isocenter. The ETX system for verifying the treatment isocenter position is able to monitor the position of interest, making it possible to use it for SBRT positioning within an uncertainty of ≤ 1 mm.

  6. Accuracy of an infrared marker-based patient positioning system (ExacTrac®) for stereotactic body radiotherapy in localizing the planned isocenter using fiducial markers

    Energy Technology Data Exchange (ETDEWEB)

    Montes-Rodríguez, María de los Ángeles, E-mail: angy24538@yahoo.com; Mitsoura, Eleni [Medical Physics Graduate Programme, Universidad Autónoma del Estado de México, Facultad de Medicina, Paseo Tollocan esquina Jesús Carranza Colonia Moderna de la Cruz, C.P. 50180, Toluca, Estado de México (Mexico); Hernández-Bojórquez, Mariana; Martínez-Gómez, Alma Angélica; Contreras-Pérez, Agustín; Negrete-Hernández, Ingrid Mireya; Hernández-Oviedo, Jorge Omar; Santiago-Concha, Bernardino Gabriel [Centro de Cáncer ABC, The American British Cowdray Medical Center I.A.P. (CMABC), Calle Sur 136, no. 116, Colonia las Américas, C.P. 01120, México, D.F. (Mexico)

    2014-11-07

    Stereotactic Body Radiation Therapy (SBRT) requires controlled immobilization and position monitoring of the patient and target. The purpose of this work is to analyze the performance of the ExacTrac® (ETX) imaging system using infrared and fiducial markers. Materials and methods: To assure the accuracy of isocenter localization, a quality assurance procedure was applied using an infrared marker-based positioning system. Scans were acquired of an in-house agar-gel and solid-water phantom with infrared spheres. In the inner part of the phantom, three markers were delineated as references and one pellet was placed internally, which was assigned as the isocenter. Planning was performed in the iPlan® RT Dose treatment planning system and the images were exported to the ETX console. Images were then acquired with the ETX to check the correctness of the isocenter placement. Adjustments were made in 6D, and the reference markers were used to fuse the images. Couch shifts were registered, and the procedure was repeated for verification purposes. Results: The recorded verification data for translational and rotational movements showed averaged 3D spatial uncertainties of 0.31 ± 0.42 mm and 0.82° ± 0.46° in the phantom; the first correction of these uncertainties was 1.51 ± 1.14 mm and 1.37° ± 0.61°. Conclusions: This study shows high accuracy and repeatability in positioning the selected isocenter. The ETX system for verifying the treatment isocenter position is able to monitor the position of interest, making it possible to use it for SBRT positioning within an uncertainty of ≤ 1 mm.

  7. Dosimetric implications of inter- and intrafractional prostate positioning errors during tomotherapy: Comparison of gold marker-based registrations with native MVCT.

    Science.gov (United States)

    Wust, Peter; Joswig, Marc; Graf, Reinhold; Böhmer, Dirk; Beck, Marcus; Barelkowski, Thomasz; Budach, Volker; Ghadjar, Pirus

    2017-09-01

    For high-dose radiation therapy (RT) of prostate cancer, image-guided (IGRT) and intensity-modulated RT (IMRT) approaches are standard. Less is known regarding comparisons of different IGRT techniques and the resulting residual errors, as well as regarding their influences on dose distributions. A total of 58 patients who received tomotherapy-based RT up to 84 Gy for high-risk prostate cancer underwent IGRT based either on daily megavoltage CT (MVCT) alone (n = 43) or the additional use of gold markers (n = 15) under routine conditions. Planned Adaptive (Accuray Inc., Madison, WI, USA) software was used for elaborated offline analysis to quantify residual interfractional prostate positioning errors, along with systematic and random errors and the resulting safety margins after both IGRT approaches. Dosimetric parameters for clinical target volume (CTV) coverage and exposition of organs at risk (OAR) were also analyzed and compared. Interfractional as well as intrafractional displacements were determined. Particularly in the vertical direction, residual interfractional positioning errors were reduced using the gold marker-based approach, but dosimetric differences were moderate and the clinical relevance relatively small. Intrafractional prostate motion proved to be quite high, with displacements of 1-3 mm; however, these did not result in additional dosimetric impairments. Residual interfractional positioning errors were reduced using gold marker-based IGRT; however, this resulted in only slightly different final dose distributions. Therefore, daily MVCT-based IGRT without markers might be a valid alternative.

  8. Dosimetric implications of inter- and intrafractional prostate positioning errors during tomotherapy. Comparison of gold marker-based registrations with native MVCT

    Energy Technology Data Exchange (ETDEWEB)

    Wust, Peter; Joswig, Marc; Graf, Reinhold; Boehmer, Dirk; Beck, Marcus; Barelkowski, Thomasz; Budach, Volker; Ghadjar, Pirus [Charite Universitaetsmedizin Berlin, Department of Radiation Oncology and Radiotherapy, Berlin (Germany)

    2017-09-15

    For high-dose radiation therapy (RT) of prostate cancer, image-guided (IGRT) and intensity-modulated RT (IMRT) approaches are standard. Less is known regarding comparisons of different IGRT techniques and the resulting residual errors, as well as regarding their influences on dose distributions. A total of 58 patients who received tomotherapy-based RT up to 84 Gy for high-risk prostate cancer underwent IGRT based either on daily megavoltage CT (MVCT) alone (n = 43) or the additional use of gold markers (n = 15) under routine conditions. Planned Adaptive (Accuray Inc., Madison, WI, USA) software was used for elaborated offline analysis to quantify residual interfractional prostate positioning errors, along with systematic and random errors and the resulting safety margins after both IGRT approaches. Dosimetric parameters for clinical target volume (CTV) coverage and exposition of organs at risk (OAR) were also analyzed and compared. Interfractional as well as intrafractional displacements were determined. Particularly in the vertical direction, residual interfractional positioning errors were reduced using the gold marker-based approach, but dosimetric differences were moderate and the clinical relevance relatively small. Intrafractional prostate motion proved to be quite high, with displacements of 1-3 mm; however, these did not result in additional dosimetric impairments. Residual interfractional positioning errors were reduced using gold marker-based IGRT; however, this resulted in only slightly different final dose distributions. Therefore, daily MVCT-based IGRT without markers might be a valid alternative. (orig.)

  9. Role of IGRT in patient positioning and verification

    International Nuclear Information System (INIS)

    Mijnheer, Ben

    2008-01-01

    Image-guided radiation therapy is 'frequent imaging in the treatment room during a course of radiotherapy to guide the treatment process'. Instrumentation related to IGRT is highlighted. The lecture focused on clinical experience gained at the NKI-AVL, such as the use of EPIDs (electronic portal imaging devices) and CBCT (cone-beam computed tomography) and their comparison: good results were obtained for head-and-neck and prostate/bladder patients, so portal imaging was replaced by CBCT; after further investigation, convincing results were also obtained for lung patients, and portal imaging was likewise replaced by CBCT. Scan protocols were developed for these patient groups. Since February 2004, CBCT-based decision rules have been developed for: head and neck (bony anatomy); prostate (bony anatomy, soft-tissue registration); lung (bony anatomy, soft-tissue registration); brain (bony anatomy); and breast, bladder and liver (in progress). Final remarks: the introduction of various IGRT techniques allowed 3D verification of the position of target volumes and organs at risk just before or during treatment. Because the information is in 3D, or sometimes even in 4D, these IGRT approaches in principle provide more information than 2D verification methods (e.g. EPIDs). Clinical data are becoming available to assess quantitatively for which treatment techniques IGRT approaches are advantageous compared with conventional verification methods, taking the additional resources (time, money, manpower) into account. (P.A.)
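    A widely used family of offline decision rules of the kind mentioned above is the shrinking action level (SAL) protocol: the mean setup error over the first n fractions is corrected only if it exceeds a level that shrinks as 1/√n. The Python sketch below is illustrative; the parameter alpha and the measurements are assumptions, not the NKI-AVL clinical values.

```python
import numpy as np

def sal_correction(displacements, alpha=6.0):
    """Shrinking-action-level offline check: after n measured fractions,
    correct if the running mean 3D setup error (mm) exceeds alpha/sqrt(n).
    Returns the couch correction (mm) to apply, or None if within tolerance."""
    d = np.asarray(displacements, dtype=float)
    n = len(d)
    mean_error = d.mean(axis=0)             # systematic component estimate
    action_level = alpha / np.sqrt(n)       # level shrinks with more data
    if np.linalg.norm(mean_error) > action_level:
        return -mean_error                  # shift that cancels the mean error
    return None

# Example: three measured 3D setup errors (mm) for one patient
shifts = [[4.0, 1.0, 0.5], [5.0, 0.0, 1.0], [3.0, 2.0, 0.0]]
corr = sal_correction(shifts)
```

    Here the mean error magnitude (≈4.15 mm) exceeds the shrunken action level (6/√3 ≈ 3.46 mm), so a corrective couch shift is returned; random day-to-day variation is deliberately left uncorrected by an offline rule like this.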

  10. Automatic analysis of intrinsic positional verification films in brachytherapy using MATLAB

    International Nuclear Information System (INIS)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-01-01

    One of the essential tests in the quality control of brachytherapy afterloading equipment is the verification of the intrinsic positioning of the radioactive source. A classic method of evaluation is to use X-ray film and measure the distance between the marks left by autoradiography of the source and a reference. Our center has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in MATLAB, in order to optimize time and reduce measurement uncertainty. The purpose of this paper is to describe the method developed, assess its uncertainty and quantify its advantages over the manual method. (Author)
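    The measurement such a macro automates can be sketched as follows (in Python rather than MATLAB; the scan resolution, region-of-interest coordinates and synthetic mark positions are illustrative assumptions, not values from the paper): threshold-free, darkness-weighted centroids of the autoradiograph marks are located, and their separation is converted from pixels to millimetres via the scan resolution.

```python
import numpy as np

DPI = 300                          # assumed scan resolution
MM_PER_PX = 25.4 / DPI             # millimetres per pixel

def mark_centroid(img, roi):
    """Darkness-weighted centroid (row, col) of an autoradiograph mark
    inside a rectangular region of interest (r0, r1, c0, c1)."""
    r0, r1, c0, c1 = roi
    patch = 255.0 - img[r0:r1, c0:c1]          # invert: marks are dark
    rows, cols = np.mgrid[r0:r1, c0:c1]
    total = patch.sum()
    return (rows * patch).sum() / total, (cols * patch).sum() / total

# Synthetic "scan": white film (255) with two dark 3x3 marks
img = np.full((200, 200), 255.0)
img[50:53, 50:53] = 0.0            # reference mark
img[50:53, 168:171] = 0.0          # source autoradiograph mark
c1 = mark_centroid(img, (40, 60, 40, 60))
c2 = mark_centroid(img, (40, 60, 160, 180))
dist_mm = np.hypot(c2[0] - c1[0], c2[1] - c1[1]) * MM_PER_PX
```

    Sub-pixel centroiding is what gives the automated method its uncertainty advantage over reading the film with a ruler.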

  11. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist radiation therapists (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are currently in operation for head and neck cancer patients across Europe. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques currently used in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  12. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility, and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared with a central computer-driven system are detailed.

  13. Comparison of megavoltage position verification for prostate irradiation based on bony anatomy and implanted fiducials

    International Nuclear Information System (INIS)

    Nederveen, Aart J.; Dehnad, Homan; Heide, Uulke A. van der; Moorselaar, R. Jeroen A. van; Hofman, Pieter; Lagendijk, Jan J.W.

    2003-01-01

    Purpose: The patient position during radiotherapy treatment of prostate cancer can be verified with the help of portal images acquired during treatment. In this study we quantify the clinical consequences of the use of image-based verification based on the bony anatomy and the prostate target itself. Patients and methods: We analysed 2025 portal images and 23 computed tomography (CT) scans from 23 patients with prostate cancer. In all patients gold markers were implanted prior to CT scanning. Statistical data for both random and systematic errors were calculated for displacements of bones and markers, and we investigated the effectiveness of an off-line correction protocol. Results: Standard deviations for systematic marker displacement are 2.4 mm in the lateral (LR) direction, 4.4 mm in the anterior-posterior (AP) direction and 3.7 mm in the caudal-cranial (CC) direction. Application of off-line position verification based on the marker positions results in a shrinkage of the systematic error to well below 1 mm. Position verification based on the bony anatomy reduces the systematic target uncertainty to 50% in the AP direction and in the LR direction. No reduction was observed in the CC direction. For six out of 23 patients we found an increase of the systematic error after application of bony anatomy-based position verification. Conclusions: We show that even if correction based on the bony anatomy is applied, considerable margins have to be set to account for organ motion. Our study highlights that for individual patients the systematic error can increase after application of bony anatomy-based position verification, whereas the population standard deviation will decrease. Off-line target-based position verification effectively reduces the systematic error to well below 1 mm, thus enabling significant margin reduction.
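    The population statistics quoted in studies like this follow standard definitions: the systematic error Σ is the spread of per-patient mean displacements, and the random error σ is the root-mean-square of per-patient day-to-day spreads. A minimal Python sketch, with tiny made-up displacement data and, purely as an illustration, the well-known van Herk CTV-to-PTV margin recipe (the paper itself does not necessarily use this recipe):

```python
import numpy as np

def setup_error_stats(per_patient_shifts):
    """Per-axis population statistics from repeated setup measurements.
    Sigma = SD of per-patient mean errors (systematic component),
    sigma = RMS of per-patient SDs (random component)."""
    means = [np.mean(p) for p in per_patient_shifts]
    sds = [np.std(p, ddof=1) for p in per_patient_shifts]
    Sigma = np.std(means, ddof=1)
    sigma = np.sqrt(np.mean(np.square(sds)))
    return Sigma, sigma

# Illustrative AP marker displacements (mm) for three patients
shifts = [[1.0, 2.0, 3.0], [-1.0, 0.0, 1.0], [2.0, 2.0, 2.0]]
Sigma, sigma = setup_error_stats(shifts)
margin = 2.5 * Sigma + 0.7 * sigma   # van Herk margin recipe (illustrative)
```

    The recipe makes the paper's point concrete: the margin is dominated by the 2.5Σ term, so a correction protocol that shrinks the systematic error below 1 mm permits a much larger margin reduction than one that only reduces random error.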

  14. Position paper - peer review and design verification of selected activities

    International Nuclear Information System (INIS)

    Stine, M.D.

    1994-09-01

    This position paper develops and documents a position on the performance of independent peer reviews of selected design and analysis components of the Title I (preliminary) and Title II (detailed) design phases of the Multi-Function Waste Tank Facility project.

  15. Verification of Positional Accuracy of ZVS3003 Geodetic Control ...

    African Journals Online (AJOL)

    The International GPS Service (IGS) has provided GPS orbit products to the scientific community with increased precision and timeliness. Many users interested in geodetic positioning have adopted the IGS precise orbits to achieve centimeter level accuracy and ensure long-term reference frame stability. Positioning with ...

  16. Verification of the Patient Positioning in the Bellyboard Pelvic Radiotherapy

    International Nuclear Information System (INIS)

    Kasabasic, M.; Faj, D.; Smilovic Radojcic, D.; Svabic, M.; Ivkovic, A.; Jurkovic, S.

    2008-01-01

    The size and shape of the treatment fields applied in radiotherapy account for uncertainties in the daily set-up of patients during treatment. We investigated the accuracy of daily patient positioning in bellyboard pelvic radiotherapy in order to determine the magnitude of patient movement during treatment. Translational as well as rotational movements of the patients were explored. Portal film imaging was used to find patient positioning errors during treatment of the pelvic region. Patients were treated in the prone position using the bellyboard positioning device. Thirty-six patients were included in the study; 15 patients were followed during the whole treatment and 21 during the first 5 consecutive treatment days. Image acquisition was completed in 85 percent of cases, and systematic and random positioning errors in 453 images were analyzed. (author)

  17. Verification of the positioning of radiotherapy patients with two webcams

    Energy Technology Data Exchange (ETDEWEB)

    Píriz, Gustavo; Lucas, Manuela; Bertini, Diego; Doldán, Raquel, E-mail: oncosur@oncosur.com.uy [Oncosur, Florida (Uruguay); Banguero, Yolma; Frederico, Marcel, E-mail: ybanguero@cin.edu.uy [Unidad de Radioproteccion, Centro de Invetigaciones Nucleares, Universidad de la Republica, Montevideo (Uruguay)

    2017-07-01

    In radiotherapy, high doses are given to the volumes to be irradiated while the organs at risk have to be protected. This requires great care in setting up the patient's position and in reproducing it during all treatment sessions. In our center the position during treatment is constrained with thermoplastic masks, and the reproducibility between sessions is checked with daily photos taken during treatment. For the evaluation of the photos, we attach spheres of 8 mm diameter to the fixation masks. We capture photos with two webcams placed less than 90 cm from the isocenter, one on the left and the other on the right. A quick visual evaluation makes it possible to detect changes in the sphere's position. We present a methodology for absolute calibration of the sphere position with the two webcams. (author)
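    Recovering an absolute 3D position from two calibrated cameras is classically done by linear (DLT) triangulation: each view contributes two linear equations in the homogeneous 3D point, and the stacked system is solved by SVD. The sketch below uses idealised pinhole cameras; the projection matrices and point are illustrative, not the center's actual webcam calibration.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover one 3D point from its image
    coordinates uv1, uv2 in two cameras with 3x4 projection matrices."""
    rows = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        rows.append(u * P[2] - P[0])    # each view gives two equations
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]                          # null vector of the 4x4 system
    return X[:3] / X[3]                 # dehomogenise

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two idealised pinhole cameras; the second is shifted 5 units along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-5.0], [0.0], [0.0]])])
sphere = np.array([10.0, 20.0, 30.0])   # true 3D sphere position
estimate = triangulate(P1, P2, project(P1, sphere), project(P2, sphere))
```

    In practice the projection matrices would come from calibrating each webcam against markers at known room coordinates; triangulation then yields the absolute sphere position rather than a merely visual left/right comparison.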

  18. Stereotactic Target point Verification in Actual Treatment Position of Radiosurgery

    International Nuclear Information System (INIS)

    Yun, Hyong Geun; Lee, Hyun Koo

    1995-01-01

    Purpose: The authors tried to enhance the safety and accuracy of radiosurgery by verifying the stereotactic target point in the actual treatment position prior to irradiation. Materials and methods: Before the actual treatment, several sections of an anthropomorphic head phantom were used to create a condition of unknown coordinates of the target point. A film was sandwiched between the phantom sections and punctured by a sharp needle tip; the tip of the needle represented the target point. The head phantom was fixed to the stereotactic ring and CT scanning was done with the CT localizer attached to the ring. After the CT scanning, the stereotactic coordinates of the target point were determined. The head phantom was secured to the accelerator's treatment couch, and the laser isocenter was moved to the stereotactic coordinates determined by CT scanning using the target positioner. The accelerator's anteroposterior and lateral portal films were taken using angiographic localizers. The stereotactic coordinates determined by analysis of the portal films were compared with the stereotactic coordinates previously determined by CT scanning. Following correction of any discrepancy, the head phantom was irradiated using a stereotactic technique of several arcs. After the irradiation, the film sandwiched between the phantom sections was developed and the degree of coincidence of the center of the radiation distribution with the target point represented by the hole in the film was measured. In the treatment of actual patients, after comparing the two sets of stereotactic coordinates determined with the CT localizers and the angiographic localizers, we proceeded to the irradiation of the actual patient. Results: In the phantom study, the agreement between the center of the radiation distribution and the localized target point was very good. By measuring optical density profiles of the sandwiched film along axes that intersected the target point, the authors could confirm
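    The coordinate comparison at the heart of this verification step reduces to a 3D deviation check between the CT-derived and portal-film-derived target coordinates. A minimal Python sketch; the 1 mm tolerance and the coordinates are illustrative assumptions, not values from the paper.

```python
import numpy as np

TOLERANCE_MM = 1.0     # assumed action threshold, not from the paper

def coordinate_check(ct_xyz, portal_xyz, tol=TOLERANCE_MM):
    """Compare stereotactic target coordinates determined from the CT
    localizer with those from the portal films. Returns the 3D deviation
    vector (mm), its magnitude, and whether irradiation may proceed."""
    dev = np.subtract(portal_xyz, ct_xyz)
    mag = float(np.linalg.norm(dev))
    return dev, mag, mag <= tol

# Hypothetical coordinates (mm) in the stereotactic frame
dev, mag, ok = coordinate_check([102.3, 98.7, 115.0], [102.8, 98.5, 114.6])
```

    Only when the magnitude of the deviation is within tolerance, or after the discrepancy has been corrected, does treatment proceed, mirroring the workflow described above.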

  19. Source position verification and dosimetry in HDR brachytherapy using an EPID

    International Nuclear Information System (INIS)

    Smith, R. L.; Taylor, M. L.; McDermott, L. N.; Franich, R. D.; Haworth, A.; Millar, J. L.

    2013-01-01

    Purpose: Accurate treatment delivery in high dose rate (HDR) brachytherapy requires correct source dwell positions and dwell times to be administered relative to each other and to the surrounding anatomy. Treatment delivery inaccuracies predominantly occur for two reasons: (i) anatomical movement or (ii) human errors that are usually related to incorrect implementation of the planned treatment. Electronic portal imaging devices (EPIDs) were originally developed for patient position verification in external beam radiotherapy and their application has been extended to provide dosimetric information. The authors have characterized the response of an EPID for use with an ¹⁹²Ir brachytherapy source to demonstrate its use as a verification device, providing both source position and dosimetric information. Methods: Characterization of the EPID response using an ¹⁹²Ir brachytherapy source included investigations of reproducibility, linearity with dose rate, photon energy dependence, and charge build-up effects associated with exposure time and image acquisition time. Source position resolution in three dimensions was determined. To illustrate treatment verification, a simple treatment plan was delivered to a phantom and the measured EPID dose distribution compared with the planned dose. Results: The mean absolute source position error in the plane parallel to the EPID, for dwells measured at 50, 100, and 150 mm source-to-detector distance (SDD), was determined to be 0.26 mm. The resolution of the z coordinate (perpendicular distance from the detector plane) is SDD dependent, with 95% confidence intervals of ±0.1, ±0.5, and ±2.0 mm at SDDs of 50, 100, and 150 mm, respectively. The response of the EPID is highly linear with dose rate. The EPID exhibits an over-response to low-energy incident photons, and this nonlinearity is incorporated into the dose calibration procedure. A distance- (spectral-) dependent dose rate calibration procedure has been developed. The
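    How the perpendicular source distance z can be inferred from a planar dose image is easy to illustrate with a toy model: for an ideal point source, the dose rate at in-plane radius r from the source's foot point falls off as D ∝ 1/(z² + r²), so 1/D is linear in r² and z² is the ratio of intercept to slope. This is a simplified sketch under ideal assumptions (point source, no scatter, no energy dependence), not the authors' distance/spectral calibration procedure.

```python
import numpy as np

def source_distance_from_plane(dose_img, px_mm):
    """Estimate the perpendicular source-to-detector distance z from a
    dose image of a point source: fit 1/D linearly against r^2, where r
    is the in-plane distance from the dose peak; then z^2 = intercept/slope."""
    peak = np.unravel_index(np.argmax(dose_img), dose_img.shape)
    rr, cc = np.indices(dose_img.shape)
    r2 = ((rr - peak[0])**2 + (cc - peak[1])**2) * px_mm**2
    slope, intercept = np.polyfit(r2.ravel(), (1.0 / dose_img).ravel(), 1)
    return np.sqrt(intercept / slope)

# Synthetic EPID image of a point source 100 mm above the panel
z_true, px_mm = 100.0, 1.0
rows, cols = np.indices((101, 101))
r2 = ((rows - 50)**2 + (cols - 50)**2) * px_mm**2
dose = 1.0e6 / (z_true**2 + r2)      # ideal inverse-square falloff
z_est = source_distance_from_plane(dose, px_mm)
```

    The model also explains the SDD-dependent z resolution reported above: at larger z the in-plane falloff flattens, so the fit becomes less sensitive to z and the confidence interval widens.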

  20. Bundle 13 position verification tool description and on-reactor use

    Energy Technology Data Exchange (ETDEWEB)

    Onderwater, T G [Canadian General Electric Co. Ltd., Peterborough, ON (Canada)

    1997-12-31

    To address the Power Pulse problem, Bruce B uses Gap: a comprehensive monitoring program by the station to maintain the gap between the fuel string and the upstream shield plug. The gap must be maintained within a band. The gap must not be so large as to allow excessive reactivity increases or cause high impact forces during reverse flow events. It should also not be so small as to cause crushed fuel during rapid, differential reactor/fuel string cool downs. Rapid cool downs are infrequent. The Bundle 13 Position Verification Tool (BPV tool) role is to independently measure the position of the upstream bundle of the fuel string. The measurements are made on-reactor, on-power and will allow verification of the Gap Management system's calculated fuel string position. This paper reviews the reasons for developing the BPV tool. Design issues relevant to safe operation in the fuelling machine, fuel channel and fuel handling equipment are also reviewed. Tests ensuring no adverse effects on channel pressure losses are described and actual on-reactor, on-power results are discussed. (author). 4 figs.

  1. Bundle 13 position verification tool description and on-reactor use

    International Nuclear Information System (INIS)

    Onderwater, T.G.

    1996-01-01

    To address the Power Pulse problem, Bruce B uses Gap, a comprehensive monitoring program run by the station to maintain the gap between the fuel string and the upstream shield plug. The gap must be maintained within a band: it must not be so large as to allow excessive reactivity increases or cause high impact forces during reverse flow events, and it must not be so small as to cause crushed fuel during rapid, differential reactor/fuel string cool downs. Rapid cool downs are infrequent. The role of the Bundle 13 Position Verification Tool (BPV tool) is to independently measure the position of the upstream bundle of the fuel string. The measurements are made on-reactor, on-power and allow verification of the Gap Management system's calculated fuel string position. This paper reviews the reasons for developing the BPV tool. Design issues relevant to safe operation in the fuelling machine, fuel channel and fuel handling equipment are also reviewed. Tests ensuring no adverse effects on channel pressure losses are described, and actual on-reactor, on-power results are discussed. (author). 4 figs

  2. Device-independence for two-party cryptography and position verification

    DEFF Research Database (Denmark)

    Ribeiro, Jeremy; Thinh, Le Phuc; Kaniewski, Jedrzej

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography, whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we give device-independent security proofs of two-party cryptography and Position Verification for memoryless devices under different physical constraints on the adversary...

  3. SU-E-J-115: Graticule for Verification of Treatment Position in Neutron Therapy.

    Science.gov (United States)

    Halford, R; Snyder, M

    2012-06-01

    Until recently, treatment verification for patients undergoing fast neutron therapy at our facility was accomplished through a combination of neutron beam portal films aligned with a graticule mounted on an orthogonal x-ray tube. To eliminate uncertainty with respect to the relative positions of the x-ray graticule and the therapy beam, we have developed a graticule which is placed in the neutron beam itself. For a graticule to be visible on the portal film, the attenuation of the neutron beam by the graticule landmarks must be significantly greater than that of the material in which the landmarks are mounted. Various materials, thicknesses, and mounting points were tried to gain the largest contrast between the graticule landmarks and the mounting material. The final design involved 2 inch steel pins of 0.125 inch diameter captured between two parallel plates of 0.25 inch thick clear acrylic plastic. The distance between the two acrylic plates was 1.625 inches, held together at the perimeter with acrylic sidewall spacers. This allowed the majority of the length of the steel pins to be surrounded by air. The pins were set 1 cm apart and mounted at angles parallel to the divergence of the beam, dependent on their position within the array. The entire steel pin and acrylic plate assembly was mounted on an acrylic accessory tray to allow for graticule alignment. Despite the inherent difficulties in attenuating fast neutrons, our simple graticule design produces the required difference in attenuation between the array of landmarks and the mounting material. The graticule successfully provides an in-beam frame of reference for patient portal verification. © 2012 American Association of Physicists in Medicine.
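
    The tilt each pin needs in order to stay parallel to the diverging beam follows from simple geometry: the tilt angle is the arctangent of the pin's lateral offset over the source-to-graticule distance. In the hypothetical sketch below, the source-to-graticule distance (`sad_mm`) is an assumed placeholder; only the 1 cm pin spacing comes from the abstract.

    ```python
    import math

    # Illustrative geometry only: tilt from vertical that keeps a graticule
    # pin parallel to the diverging beam. sad_mm is an assumption.

    def pin_tilt_deg(offset_mm, sad_mm=1800.0):
        """Tilt (degrees) for a pin at a given lateral offset from the beam axis."""
        return math.degrees(math.atan2(offset_mm, sad_mm))

    # Pins 1 cm apart: the further off-axis, the larger the required tilt.
    tilts = [round(pin_tilt_deg(10.0 * k), 2) for k in range(4)]
    print(tilts)
    ```

    The monotonic increase with offset is why the abstract notes the mounting angles are "dependent on their position within the array".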

  4. Second COS FUV Lifetime Position: Verification of FUV Bright Object Aperture (BOA) Operations (FCAL4)

    Science.gov (United States)

    Debes, John H.

    2013-05-01

    As part of the calibration of the second lifetime position on the Cosmic Origins Spectrograph (COS) far-ultraviolet (FUV) detectors, observations of the external target, G191-B2B, were obtained with the G130M, G160M, and G140L gratings in combination with the Bright Object Aperture. The observations were designed to verify the performance of these spectroscopic modes by reproducing similar observations taken during the SM4 Servicing Mission Observatory Verification (SMOV) of COS. These observations allowed for a detailed determination of the spatial location and profile of the spectra from the three gratings, as well as a determination of the spectral resolution of the G130M grating prior to and after the lifetime move. In general, the negligible differences which exist between the two lifetime positions can be attributed to slight differences in the optical path. In particular, the spectral resolution appears to be slightly improved. The stability of the absolute and relative flux calibration was investigated for G130M as well using STIS echelle data of G191-B2B. We determine that the COS absolute flux calibration with the BOA is accurate to 10%, and flux calibrated data are reproducible at the 1-2% level since SMOV.

  5. Device independence for two-party cryptography and position verification with memoryless devices

    Science.gov (United States)

    Ribeiro, Jérémy; Thinh, Le Phuc; Kaniewski, Jedrzej; Helsen, Jonas; Wehner, Stephanie

    2018-06-01

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we improve the device-independent security proofs of Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004] for two-party cryptography (with memoryless devices) and we add a security proof for device-independent position verification (also memoryless devices) under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation, and, as for Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004], security can be attained for any violation of the Clauser-Horne-Shimony-Holt inequality.
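
    The device-quality test the proofs rely on can be made concrete: the CHSH quantity S is bounded by 2 for any classical (local deterministic) strategy, while quantum devices can reach 2√2 (Tsirelson's bound), and any observed violation certifies non-classical behaviour. A minimal sketch with illustrative correlators (not the protocol itself):

    ```python
    import itertools, math

    # CHSH quantity S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for measurement
    # settings x, y in {0, 1} and outcome correlators E.

    def chsh(correlators):
        """correlators: dict {(x, y): E} with E in [-1, 1]."""
        return (correlators[(0, 0)] + correlators[(0, 1)]
                + correlators[(1, 0)] - correlators[(1, 1)])

    # Best classical strategy: enumerate all deterministic outcome assignments.
    best_classical = max(
        chsh({(x, y): a[x] * b[y] for x in (0, 1) for y in (0, 1)})
        for a in itertools.product((-1, 1), repeat=2)
        for b in itertools.product((-1, 1), repeat=2)
    )

    # Ideal quantum correlators for the singlet state at the standard CHSH angles.
    q = 1 / math.sqrt(2)
    quantum = chsh({(0, 0): q, (0, 1): q, (1, 0): q, (1, 1): -q})

    print(best_classical, round(quantum, 3))  # 2 and 2.828 (Tsirelson's bound)
    ```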

  6. Suitability of markerless EPID tracking for tumor position verification in gated radiotherapy

    International Nuclear Information System (INIS)

    Serpa, Marco; Baier, Kurt; Guckenberger, Matthias; Cremers, Florian; Meyer, Juergen

    2014-01-01

    Purpose: To maximize the benefits of respiratory gated radiotherapy (RGRT) of lung tumors real-time verification of the tumor position is required. This work investigates the feasibility of markerless tracking of lung tumors during beam-on time in electronic portal imaging device (EPID) images of the MV therapeutic beam. Methods: EPID movies were acquired at ∼2 fps for seven lung cancer patients with tumor peak-to-peak motion ranges between 7.8 and 17.9 mm (mean: 13.7 mm) undergoing stereotactic body radiotherapy. The external breathing motion of the abdomen was synchronously measured. Both datasets were retrospectively analyzed in PortalTrack, an in-house developed tracking software. The authors define a three-step procedure to run the simulations: (1) gating window definition, (2) gated-beam delivery simulation, and (3) tumor tracking. First, an amplitude threshold level was set on the external signal, defining the onset of beam-on/-off signals. This information was then mapped onto a sequence of EPID images to generate stamps of beam-on/-hold periods throughout the EPID movies in PortalTrack, by obscuring the frames corresponding to beam-off times. Last, tumor motion in the superior-inferior direction was determined on portal images by the tracking algorithm during beam-on time. The residual motion inside the gating window as well as target coverage (TC) and the marginal target displacement (MTD) were used as measures to quantify tumor position variability. Results: Tumor position monitoring and estimation from beam's-eye-view images during RGRT was possible in 67% of the analyzed beams. For a reference gating window of 5 mm, deviations ranging from 2% to 86% (35% on average) were recorded between the reference and measured residual motion. TC (range: 62%–93%; mean: 77%) losses were correlated with false positives incidence rates resulting mostly from intra-/inter-beam baseline drifts, as well as sudden cycle-to-cycle fluctuations in exhale positions. Both
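
    Steps (1) and (2) of the procedure above amount to thresholding the external breathing trace and masking EPID frames acquired outside the gate. A minimal sketch with a synthetic breathing trace; the signal shape, threshold, and frame rate are stand-ins for the clinical data, not values from the study:

    ```python
    import numpy as np

    # Synthetic external trace sampled at 20 Hz; 4 s breathing period (a.u.).
    t = np.arange(0.0, 10.0, 0.05)
    breathing = np.sin(2 * np.pi * t / 4.0)
    threshold = -0.5                       # gate near end-exhale

    beam_on = breathing <= threshold       # amplitude-based gating window

    # EPID frames at ~2 fps: a frame is analysed only if the beam was on,
    # mirroring how beam-off frames are obscured in the movies.
    frame_times = np.arange(0.0, 10.0, 0.5)
    frame_idx = np.round(frame_times / 0.05).astype(int)
    frames_kept = frame_times[beam_on[frame_idx]]

    print(len(frame_times), len(frames_kept))
    ```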

  7. Verification of multileaf collimator leaf positions using an electronic portal imaging device

    International Nuclear Information System (INIS)

    Samant, Sanjiv S.; Zheng Wei; Parra, Nestor Andres; Chandler, Jason; Gopal, Arun; Wu Jian; Jain Jinesh; Zhu Yunping; Sontag, Marc

    2002-01-01

    An automated method is presented for determining individual leaf positions of the Siemens dual focus multileaf collimator (MLC) using the Siemens BEAMVIEW(PLUS) electronic portal imaging device (EPID). Leaf positions are computed with an error of 0.6 mm at one standard deviation (σ) using separate computations of pixel dimensions, image distortion, and radiation center. The pixel dimensions are calculated by superimposing the film image of a graticule with the corresponding EPID image. A spatial correction is used to compensate for the optical distortions of the EPID, reducing the mean distortion from 3.5 pixels (uncorrected) per localized x-ray marker to 2 pixels (1 mm) for a rigid rotation and 1 pixel for a third degree polynomial warp. A correction for a nonuniform dosimetric response across the field of view of the EPID images is not necessary due to the sharp intensity gradients across leaf edges. The radiation center, calculated from the average of the geometric centers of a square field at 0 deg. and 180 deg. collimator angles, is independent of graticule placement error. Its measured location on the EPID image was stable to within 1 pixel based on 3 weeks of repeated extensions/retractions of the EPID. The MLC leaf positions determined from the EPID images agreed to within a pixel of the corresponding values measured using film and ionization chamber. Several edge detection algorithms were tested: contour, Sobel, Roberts, Prewitt, Laplace, morphological, and Canny. These agreed with each other to within ≤1.2 pixels for the in-air EPID images. Using a test pattern, individual MLC leaves were found to be typically within 1 mm of the corresponding record-and-verify values, with a maximum difference of 1.8 mm, and standard deviations of <0.3 mm in the daily reproducibility. This method presents a fast, automatic, and accurate alternative to using film or a light field for the verification and calibration of the MLC
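
    The radiation-center construction described above cancels graticule placement error because rotating the collimator by 180° flips the sign of any field miscentring about the true rotation axis. A toy illustration (pixel coordinates and offsets are invented for the example):

    ```python
    import numpy as np

    # True rotation axis of the collimator, in EPID pixels (illustrative).
    rotation_axis = np.array([512.0, 384.0])
    # Unknown miscentring of the square field relative to the axis.
    field_offset = np.array([3.0, -2.0])

    # At collimator 180 deg the field offset flips sign about the axis.
    centre_0 = rotation_axis + field_offset
    centre_180 = rotation_axis - field_offset

    # Averaging the two geometric centres cancels the offset exactly,
    # which is why the result is independent of graticule placement error.
    radiation_centre = (centre_0 + centre_180) / 2
    print(radiation_centre)
    ```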

  8. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    International Nuclear Information System (INIS)

    Looe, H.K.; Uphoff, Y.; Poppe, B.; Carl von Ossietzky Univ., Oldenburg; Harder, D.; Willborn, K.C.

    2012-01-01

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal imaging device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)
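
    The deconvolution step can be sketched in one dimension: blur a sharp edge with a known kernel K, then restore it with a regularized (Wiener-style) inverse filter in Fourier space. The Gaussian kernel width and the regularization constant below are assumptions of this illustration, not the algorithm's actual parameters:

    ```python
    import numpy as np

    n = 256
    signal = np.zeros(n)
    signal[n // 2:] = 1.0                  # sharp "bone edge" in the image

    x = np.arange(n) - n // 2
    kernel = np.exp(-0.5 * (x / 4.0) ** 2) # assumed Gaussian blurring kernel K
    kernel /= kernel.sum()
    K = np.fft.fft(np.fft.ifftshift(kernel))

    # Forward blur (circular convolution via FFT).
    blurred = np.real(np.fft.ifft(np.fft.fft(signal) * K))

    # Regularized inverse filter: K* / (|K|^2 + eps) avoids noise blow-up
    # at frequencies where K is tiny.
    eps = 1e-3
    restored = np.real(np.fft.ifft(np.fft.fft(blurred) * np.conj(K)
                                   / (np.abs(K) ** 2 + eps)))

    # The edge is visibly sharpened: the maximum gradient increases.
    print(np.max(np.diff(blurred)) < np.max(np.diff(restored)))
    ```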

  9. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    Energy Technology Data Exchange (ETDEWEB)

    Looe, H.K.; Uphoff, Y.; Poppe, B. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy; Carl von Ossietzky Univ., Oldenburg (Germany). WG Medical Radiation Physics; Harder, D. [Georg August Univ., Goettingen (Germany). Medical Physics and Biophysics; Willborn, K.C. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy

    2012-02-15

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal imaging device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)

  10. MDEP Generic Common Position No DICWG-03. Common position on verification and validation throughout the life cycle of digital safety systems

    International Nuclear Information System (INIS)

    2013-01-01

    Verification and validation (V and V) is essential throughout the life cycle of nuclear power plant safety systems. This common position applies to V and V activities for digital safety systems throughout their life cycles. This encompasses both the software and hardware of such systems. The Digital Instrumentation and Controls Working Group (DICWG) has agreed that a common position on this topic is warranted given the use of Digital I and C in new reactor designs, its safety implications, and the need to develop a common understanding from the perspectives of regulatory authorities. This action follows the DICWG examination of the regulatory requirements of the participating members and of relevant industry standards and IAEA documents. The DICWG proposes a common position based on its recent experience with the new reactor application reviews and operating plant issues

  11. Verification of patient position and delivery of IMRT by electronic portal imaging

    International Nuclear Information System (INIS)

    Fielding, Andrew L.; Evans, Philip M.; Clark, Catharine H.

    2004-01-01

    Background and purpose: The purpose of the work presented in this paper was to determine whether patient positioning and delivery errors could be detected using electronic portal images of intensity modulated radiotherapy (IMRT). Patients and methods: We carried out a series of controlled experiments delivering an IMRT beam to a humanoid phantom using both the dynamic and multiple static field method of delivery. The beams were imaged, the images calibrated to remove the IMRT fluence variation and then compared with calibrated images of the reference beams without any delivery or position errors. The first set of experiments involved translating the position of the phantom both laterally and in a superior/inferior direction a distance of 1, 2, 5 and 10 mm. The phantom was also rotated 1 and 2 deg. For the second set of measurements the phantom position was kept fixed and delivery errors were introduced to the beam. The delivery errors took the form of leaf position and segment intensity errors. Results: The method was able to detect shifts in the phantom position of 1 mm, leaf position errors of 2 mm, and dosimetry errors of 10% on a single segment of a 15 segment IMRT step and shoot delivery (significantly less than 1% of the total dose). Conclusions: The results of this work have shown that the method of imaging the IMRT beam and calibrating the images to remove the intensity modulations could be a useful tool in verifying both the patient position and the delivery of the beam
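
    Detecting a phantom translation from calibrated portal images reduces to finding the offset that best aligns the measured image with the reference. A minimal 1D correlation sketch; the profile shape and pixel size are synthetic stand-ins, not the study's data:

    ```python
    import numpy as np

    pixel_mm = 0.5
    x = np.arange(256)
    # Calibrated reference profile (fluence variation already removed).
    reference = np.exp(-0.5 * ((x - 128) / 20.0) ** 2)

    shift_px = 2                           # simulated 1 mm phantom shift
    measured = np.roll(reference, shift_px)

    # Exhaustive search over small lags for the maximum-correlation offset.
    lags = range(-10, 11)
    scores = [np.dot(np.roll(measured, -lag), reference) for lag in lags]
    best = list(lags)[int(np.argmax(scores))]
    print(best * pixel_mm)                 # detected shift in mm
    ```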

  12. Verification of computed tomographic estimates of cochlear implant array position: a micro-CT and histologic analysis.

    Science.gov (United States)

    Teymouri, Jessica; Hullar, Timothy E; Holden, Timothy A; Chole, Richard A

    2011-08-01

    To determine the efficacy of clinical computed tomographic (CT) imaging to verify postoperative electrode array placement in cochlear implant (CI) patients. Nine fresh cadaver heads underwent clinical CT scanning, followed by bilateral CI insertion and postoperative clinical CT scanning. Temporal bones were removed, trimmed, and scanned using micro-CT. Specimens were then dehydrated, embedded in either methyl methacrylate or LR White resin, and sectioned with a diamond wafering saw. Histology sections were examined by 3 blinded observers to determine the position of individual electrodes relative to soft tissue structures within the cochlea. Electrodes were judged to be within the scala tympani, scala vestibuli, or in an intermediate position between scalae. The position of the array could be estimated accurately from clinical CT scans in all specimens using micro-CT and histology as a criterion standard. Verification using micro-CT yielded 97% agreement, and histologic analysis revealed 95% agreement with clinical CT results. A composite, 3-dimensional image derived from a patient's preoperative and postoperative CT images using a clinical scanner accurately estimates the position of the electrode array as determined by micro-CT imaging and histologic analyses. Information obtained using the CT method provides valuable insight into numerous variables of interest to patient performance such as surgical technique, array design, and processor programming and troubleshooting.

  13. A phantom for verification of dwell position and time of a high dose rate brachytherapy source

    International Nuclear Information System (INIS)

    Madebo, M.; Kron, T.; Pillainayagam, J.; Franich, R.

    2012-01-01

    Accuracy of dwell position and reproducibility of dwell time are critical in high dose rate (HDR) brachytherapy. A phantom was designed to verify dwell position and dwell time reproducibility for an Ir-192 HDR stepping source using Computed Radiography (CR). The central part of the phantom, incorporating thin alternating strips of lead and acrylic, was used to measure dwell positions. The outer part of the phantom features recesses containing different absorber materials (lead, aluminium, acrylic and polystyrene foam), and was used for determining reproducibility of dwell times. Dwell position errors of <1 mm were easily detectable using the phantom. The effect of bending a transfer tube was studied with this phantom and no change of clinical significance was observed when varying the curvature of the transfer tube in typical clinical scenarios. Changes of dwell time as low as 0.1 s, the minimum dwell time of the treatment unit, could be detected by choosing dwell times over the four materials that produce identical exposure at the CR detector.

  14. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope

    Energy Technology Data Exchange (ETDEWEB)

    Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp [Department of Radiation Oncology, Nippon Medical School Tamanagayama Hospital, Tama (Japan); Chatani, Masashi [Department of Radiation Oncology, Osaka Rosai Hospital, Sakai (Japan); Otani, Yuki [Department of Radiology, Kaizuka City Hospital, Kaizuka (Japan); Teshima, Teruki [Department of Radiation Oncology, Osaka Medical Center for Cancer and Cardiovascular Diseases, Osaka (Japan); Kumita, Shinichirou [Department of Radiology, Nippon Medical School Hospital, Tokyo (Japan)

    2017-03-15

    Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed with quarter-frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.

  15. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    Science.gov (United States)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
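
    The log-ratio technique mentioned above estimates beam position from the amplitudes of opposing pickup electrodes: the offset is approximately proportional to the logarithm of their signal ratio. A minimal sketch; the sensitivity constant `k_mm` is an assumed placeholder, not the LANSCE calibration value:

    ```python
    import math

    # Log-ratio position estimate for a beam position monitor (BPM):
    # a centred beam couples equally to both electrodes; an off-centre
    # beam couples more strongly to the nearer one.

    def log_ratio_position(a, b, k_mm=10.0):
        """Beam position in mm from opposing electrode signal amplitudes a, b."""
        return k_mm * math.log10(a / b)

    print(log_ratio_position(1.0, 1.0), round(log_ratio_position(1.5, 1.0), 2))
    ```

    Because the ratio cancels common-mode gain, this estimate is insensitive to overall beam intensity, which is part of why the technique suits pulsed beams; the calibrations described above then remove residual electronics drifts.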

  16. Verification of the patient positioning for evaluation of PTV margins in radiotherapy of prostate cancer

    International Nuclear Information System (INIS)

    Frohlich, B.F.; Peron, T.M.; Scheid, A.M.; Cardoso, F.; Alves, F.; Alves, M.S.; Dias, T.M.

    2016-01-01

    The work aimed to verify the relative displacements between the patient and the isocenter of the device based on the reproducibility of positioning, and to estimate PTV margins for radiotherapy treatments of prostate cancer. The displacement results were obtained from a sample of 30 patients and showed values in the vertical, longitudinal and lateral directions of -0.03 ± 0.48 cm, 0.12 ± 0.47 cm and 0.02 ± 0.53 cm, respectively. PTV margins were calculated, resulting in 0.97 cm for the vertical direction, 0.85 cm for the longitudinal, and 0.98 cm for the lateral. (author)
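
    One widely used way to turn set-up errors like these into PTV margins is van Herk's 2.5Σ + 0.7σ recipe, where Σ is the systematic and σ the random component. Whether the authors used this exact recipe is an assumption, and the split of the quoted standard deviations into components below is purely illustrative:

    ```python
    # Hedged sketch of the van Herk margin recipe (2.5*Sigma + 0.7*sigma).
    # The component values are invented for illustration.

    def van_herk_margin_cm(sigma_systematic, sigma_random):
        """PTV margin in cm from systematic (Sigma) and random (sigma) errors."""
        return 2.5 * sigma_systematic + 0.7 * sigma_random

    # Example: 0.30 cm systematic and 0.30 cm random set-up error.
    margin = van_herk_margin_cm(0.30, 0.30)
    print(round(margin, 2))  # cm
    ```

    Margins close to 1 cm, as reported above, are what this recipe produces for set-up errors of a few millimetres in each component.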

  17. THE POSITION OF INDIRECT EVIDENCE AS VERIFICATION TOOLS IN THE CARTEL CASE

    Directory of Open Access Journals (Sweden)

    Veri Antoni

    2014-06-01

    Indirect (circumstantial) evidence, either economic evidence or communication evidence, has been used in cartel cases in many countries, such as the United States of America, Japan, Australia, Brazil, and Malaysia. According to Indonesian criminal procedure law, indirect (circumstantial) evidence is categorized as an indication (clue), whereas according to Indonesian civil procedure law, it is categorized as a presumption. Considering the characteristics of antimonopoly law, which aims to find material truth, the position of indirect evidence is more properly said to be an indication. Owing to its status as an indication, indirect evidence should be presented together with other, direct evidence.

  18. Patient positioning and its verification: justification and state of the art of conventional practices

    International Nuclear Information System (INIS)

    Maingon, P.; Giraud, J.Y.

    2000-01-01

    Patient immobilization is considered an important part of the quality control program in radiation therapy. The need for patient immobilization according to the symptoms, ensuring both reproducibility and comfort, is described in the recommendations for level 1 published by the SFPM (Societe francaise de physique medicale) and the SFRO (Societe francaise de radiotherapie oncologique). A customized device, associated with regular controls, is required for level 2. Customized procedures are mandatory in clinical practice for head and neck and brain treatments. Internal margins and movements of patients must be analyzed and controlled as deviations of CTV placement relative to the isocenter of linacs. They could be reduced with efficient positioning devices and accurate customized immobilization systems. Different devices could be used according to tumor locations. Except for sub-clavicular tumors, their efficacy remains controversial. (authors)

  19. Invisible marker based augmented reality system

    Science.gov (United States)

    Park, Hanhoon; Park, Jong-Il

    2005-07-01

    Augmented reality (AR) has recently gained significant attention. Previous AR techniques usually need a fiducial marker with known geometry, or objects whose structure can be easily estimated, such as a cube. Placing a marker in the workspace of the user can be intrusive. To overcome this limitation, we present an AR system using invisible markers which are created/drawn with an infrared (IR) fluorescent pen. Two cameras are used, an IR camera and a visible camera, positioned on either side of a cold mirror so that their optical centers coincide with each other. We track the invisible markers using the IR camera and visualize AR in the view of the visible camera. Additional algorithms are employed for the system to achieve reliable performance in cluttered backgrounds. Experimental results are given to demonstrate the viability of the proposed system. As an application of the proposed system, the invisible marker can act as a Vision-Based Identity and Geometry (VBIG) tag, which can significantly extend the functionality of RFID. The invisible tag is the same as RFID in that it is not perceivable, while more powerful in that the tag information can be presented to the user by direct projection using a mobile projector or by visualizing AR on the screen of a mobile PDA.
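
    The IR-camera tracking step can be reduced, in its simplest form, to a threshold-plus-centroid operation, since the fluorescent marker is bright only in the IR band. A toy version with a synthetic IR frame; the real system presumably uses more robust marker detection and pose estimation:

    ```python
    import numpy as np

    # Synthetic IR frame: the invisible marker glows brightly in the IR band.
    ir = np.zeros((120, 160))
    ir[40:50, 70:80] = 1.0
    ir += 0.05 * np.random.default_rng(0).random(ir.shape)  # sensor noise

    # Threshold separates the marker from the dim background, and the
    # centroid gives the image position that the AR overlay is anchored to.
    mask = ir > 0.5
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    print(round(cy, 1), round(cx, 1))
    ```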

  20. Surgical clips for position verification and correction of non-rigid breast tissue in simultaneously integrated boost (SIB) treatments

    International Nuclear Information System (INIS)

    Penninkhof, Joan; Quint, Sandra; Boer, Hans de; Mens, Jan Willem; Heijmen, Ben; Dirkx, Maarten

    2009-01-01

    Background and purpose: The aim of this study is to investigate whether surgical clips in the lumpectomy cavity are representative for position verification of both the tumour bed and the whole breast in simultaneously integrated boost (SIB) treatments. Materials and methods: For a group of 30 patients treated with a SIB technique, kV and MV planar images were acquired throughout the course of the fractionated treatment. The 3D set-up error for the tumour bed was derived by matching the surgical clips (3-8 per patient) in two almost orthogonal planar kV images. By projecting the 3D set-up error derived from the planar kV images to the (u, v)-plane of the tangential beams, the correlation with the 2D set-up error for the whole breast, derived from the MV EPID images, was determined. The stability of relative clip positions during the fractionated treatment was investigated. In addition, for a subgroup of 15 patients, the impact of breathing was determined from fluoroscopic movies acquired at the linac. Results: The clip configurations were stable over the course of radiotherapy, showing an inter-fraction variation (1 SD) of 0.5 mm on average. Between the start and the end of the treatment, the mean distance between the clips and their center of mass was reduced by 0.9 mm. A decrease larger than 2 mm was observed in eight patients (17 clips). The top-top excursion of the clips due to breathing was generally less than 2.5 mm in all directions. The population averages of the difference (±1 SD) between kV and MV matches in the (u, v)-plane were 0.2 ± 1.8 mm and 0.9 ± 1.5 mm, respectively. In 30% of the patients, time trends larger than 3 mm were present over the course of the treatment in either or in both kV and MV match results. Application of the NAL protocol based on the clips reduced the population mean systematic error to less than 2 mm in all directions, both for the tumour bed and the whole breast. Due to the observed time trends, these systematic errors can
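
    Projecting the 3D clip-based set-up error into a tangential beam's (u, v)-plane, as done above to compare kV and MV matches, is a simple geometric projection. The axis conventions, gantry angle, and error vector below are illustrative assumptions, not the study's exact definitions:

    ```python
    import numpy as np

    def uv_projection(error_3d_mm, gantry_deg):
        """Project a (lateral, longitudinal, vertical) set-up error onto a
        beam's (u, v)-plane.

        Assumes an IEC-like geometry where the beam direction rotates with
        the gantry in the lateral-vertical plane; u is the in-plane axis
        perpendicular to the beam, v the patient longitudinal axis.
        """
        g = np.deg2rad(gantry_deg)
        lat, lon, vert = error_3d_mm
        u = lat * np.cos(g) + vert * np.sin(g)  # component seen by the imager
        v = lon                                 # longitudinal is unchanged
        return u, v

    u, v = uv_projection((2.0, 1.0, -1.0), gantry_deg=50.0)
    print(round(u, 2), round(v, 2))
    ```

    The component of the error along the beam axis is invisible to a single planar image, which is why two near-orthogonal kV views are needed for the full 3D set-up error.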

  1. Methodology to reduce 6D patient positional shifts into a 3D linear shift and its verification in frameless stereotactic radiotherapy

    Science.gov (United States)

    Sarkar, Biplab; Ray, Jyotirmoy; Ganesh, Tharmarnadar; Manikandan, Arjunan; Munshi, Anusheel; Rathinamuthu, Sasikumar; Kaur, Harpreet; Anbazhagan, Satheeshkumar; Giri, Upendra K.; Roy, Soumya; Jassal, Kanan; Kalyan Mohanti, Bidhu

    2018-04-01

    The aim of this article is to derive and verify a mathematical formulation for the reduction of the six-dimensional (6D) positional inaccuracies of patients (lateral, longitudinal, vertical, pitch, roll and yaw) to three-dimensional (3D) linear shifts. The formulation was mathematically and experimentally tested and verified for 169 stereotactic radiotherapy patients. The mathematical verification involves the comparison of any one of the calculated rotational coordinates with the corresponding value from the 6D shifts obtained by cone beam computed tomography (CBCT). The experimental verification involves three sets of measurements using an ArcCHECK phantom, where (i) the phantom was not moved (neutral position: 0MES), (ii) the phantom was shifted from the neutral position by the 6D shifts obtained from CBCT (6DMES) and (iii) the phantom was shifted from its neutral position by the 3D shifts reduced from the 6D shifts (3DMES). Dose volume histogram and statistical comparisons were made between the measured set-ups. The mathematical verification was performed by a comparison of the calculated and measured yaw (γ°) rotation values, which gave a straight line, Y = 1X, with goodness of fit R² = 0.9982. The verification based on measurements gave a planning target volume receiving 100% of the dose (V100%) of 99.1 ± 1.9%, 96.3 ± 1.8%, 74.3 ± 1.9% and 72.6 ± 2.8% for the calculated treatment planning system values TPSCAL, 0MES, 3DMES and 6DMES, respectively. The statistical significance (p-values: paired sample t-test) of V100% was found to be 0.03 and 0.01 for the two paired comparisons. In this paper, a mathematical method to reduce 6D shifts to 3D shifts is presented. The mathematical method is verified by the well-matched values between the measured and calculated γ°. Measurements done on the ArcCHECK phantom also proved that the proposed methodology is correct. The post-correction of the
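
    As a rough illustration of what such a 6D-to-3D reduction involves, the sketch below computes the pure translation that reproduces, at a chosen target point, the displacement implied by a full 6D rigid correction. The rotation order and axis conventions are assumptions for illustration; the paper's actual formulation may differ:

```python
import numpy as np

def rotation_matrix(pitch, roll, yaw):
    """Rotation about x (pitch), y (roll) and z (yaw), angles in degrees."""
    a, b, c = np.radians([pitch, roll, yaw])
    rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    rz = np.array([[np.cos(c), -np.sin(c), 0], [np.sin(c), np.cos(c), 0], [0, 0, 1]])
    return rz @ ry @ rx

def reduce_6d_to_3d(translation_mm, pitch, roll, yaw, target_offset_mm):
    """Equivalent pure 3D couch shift that reproduces, at the target point
    (given as an offset from the isocentre), the displacement implied by
    the full 6D correction."""
    t = np.asarray(translation_mm, float)
    p = np.asarray(target_offset_mm, float)
    return rotation_matrix(pitch, roll, yaw) @ p + t - p

# Small rotations barely change the shift for a target close to the isocentre
shift = reduce_6d_to_3d([1.0, -2.0, 0.5], 0.5, -0.3, 1.0, [20.0, 10.0, 30.0])
```

    The residual error of such a reduction grows with both the rotation magnitudes and the target's distance from the isocentre, which is why it is usually acceptable only for small rotational corrections.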

  2. Automatic analysis of intrinsic positional verification films brachytherapy using MATLAB; Analisis automatico de peliculas de verificacion posicional intrinsica en braqueterapia mediante MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-07-01

    One of the essential tests in the quality control of afterloading brachytherapy equipment is the verification of the intrinsic positioning of the radioactive source. A classic method of evaluation is to use X-ray film and measure the distance between the marks left by an autoradiograph of the source and a reference. Our centre has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in Matlab, in order to save time and reduce the uncertainty of the measurement. The purpose of this paper is to describe the method developed, assess its uncertainty and quantify its advantages over the manual method. (Author)
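
    A minimal sketch of this kind of automated film analysis, in Python rather than the MATLAB macro the authors describe, thresholds the scanned film, takes the centroid of the autoradiograph mark, and converts pixel offsets to millimetres using the scan resolution. The threshold and dpi values are illustrative:

```python
import numpy as np

def mark_centroid(image, threshold):
    """Centroid (row, col) of pixels darker than threshold, i.e. the mark."""
    rows, cols = np.nonzero(image < threshold)
    return rows.mean(), cols.mean()

def source_offset_mm(image, reference_rc, dpi=300, threshold=100):
    """Distance in mm between the detected mark centroid and a reference point."""
    r, c = mark_centroid(image, threshold)
    dr, dc = r - reference_rc[0], c - reference_rc[1]
    return np.hypot(dr, dc) * 25.4 / dpi  # pixel pitch = 25.4 / dpi mm

# Synthetic film: a dark 5x5 mark centred at (52, 100) on a white background
film = np.full((200, 200), 255, dtype=np.uint8)
film[50:55, 98:103] = 0
offset = source_offset_mm(film, reference_rc=(52, 100))
```

    Sub-pixel centroiding of the mark is what gives the scanned-film approach its uncertainty advantage over reading distances off the film with a ruler.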

  3. Online Kidney Position Verification Using Non-Contrast Radiographs on a Linear Accelerator with on Board KV X-Ray Imaging Capability

    International Nuclear Information System (INIS)

    Willis, David J.; Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M.

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional 'snapshots,' but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  4. Tailoring four-dimensional cone-beam CT acquisition settings for fiducial marker-based image guidance in radiation therapy.

    Science.gov (United States)

    Jin, Peng; van Wieringen, Niek; Hulshof, Maarten C C M; Bel, Arjan; Alderliesten, Tanja

    2018-04-01

    Use of four-dimensional cone-beam CT (4D-CBCT) and fiducial markers for image guidance during radiation therapy (RT) of mobile tumors is challenging due to the trade-off among image quality, imaging dose, and scanning time. This study aimed to investigate different 4D-CBCT acquisition settings for good visibility of fiducial markers in 4D-CBCT. Using these 4D-CBCTs, the feasibility of marker-based 4D registration for RT setup verification and manual respiration-induced motion quantification was investigated. For this, we applied a dynamic phantom with three different breathing motion amplitudes and included two patients with implanted markers. Irrespective of the motion amplitude, for a medium field of view (FOV), marker visibility was improved by reducing the imaging dose per projection and increasing the number of projection images; however, the scanning time was 4 to 8 min. For a small FOV, the total imaging dose and the scanning time were reduced (62.5% of the dose using a medium FOV, 2.5 min) without losing marker visibility. However, the body contour could be missing for a small FOV, which is not preferred in RT. The marker-based 4D setup verification was feasible for both the phantom and patient data. Moreover, manual marker motion quantification can achieve a high accuracy with a mean error of [Formula: see text].
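
    Once the markers are visible in every respiratory phase of a 4D-CBCT, quantifying respiration-induced motion amounts to taking the peak-to-peak marker excursion over the phases. A minimal sketch with synthetic positions (the axis labels are assumptions):

```python
import numpy as np

def peak_to_peak_motion(marker_positions_mm):
    """Peak-to-peak marker excursion per axis over the respiratory phases.
    marker_positions_mm: array of shape (n_phases, 3), e.g. (LR, SI, AP) in mm."""
    p = np.asarray(marker_positions_mm, float)
    return p.max(axis=0) - p.min(axis=0)

# Ten phases of a mostly superior-inferior (SI) breathing trace
phases = np.linspace(0, 2 * np.pi, 10, endpoint=False)
positions = np.stack([0.5 * np.sin(phases),   # LR
                      6.0 * np.sin(phases),   # SI, dominant component
                      1.5 * np.sin(phases)],  # AP
                     axis=1)
amplitude = peak_to_peak_motion(positions)
```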

  5. Comparison of measurement methods with a mixed effects procedure accounting for replicated evaluations (COM3PARE): method comparison algorithm implementation for head and neck IGRT positional verification.

    Science.gov (United States)

    Roy, Anuradha; Fuller, Clifton D; Rosenthal, David I; Thomas, Charles R

    2015-08-28

    Comparison of imaging measurement devices in the absence of a gold-standard comparator remains a vexing problem, especially in scenarios where multiple, non-paired, replicated measurements occur, as in image-guided radiotherapy (IGRT). As the number of commercially available IGRT systems grows, determining whether different IGRT methods may be used interchangeably becomes a challenge, and there is an unmet need for a conceptually parsimonious and statistically robust method to evaluate the agreement between two methods with replicated observations. Consequently, we sought to determine, using a previously reported head and neck positional verification dataset, the feasibility and utility of Comparison of Measurement Methods with the Mixed Effects Procedure Accounting for Replicated Evaluations (COM3PARE), a unified conceptual schema and analytic algorithm based upon Roy's linear mixed effects (LME) model with a Kronecker product covariance structure in a doubly multivariate set-up, for IGRT method comparison. An anonymized dataset consisting of 100 paired coordinate (X/Y/Z) measurements from a sequential series of head and neck cancer patients imaged near-simultaneously with cone beam CT (CBCT) and kilovoltage X-ray (KVX) imaging was used for model implementation. Software-suggested CBCT and KVX shifts for the lateral (X), vertical (Y) and longitudinal (Z) dimensions were evaluated for bias, inter-method (between-subject) variation, intra-method (within-subject) variation, and overall agreement using a script implementing COM3PARE with the MIXED procedure of the statistical software package SAS (SAS Institute, Cary, NC, USA). COM3PARE showed a statistically significant difference in bias and inter-method agreement between CBCT and KVX in the Z-axis (both p < 0.01). Intra-method and overall agreement differences were noted as statistically significant for both the X- and Z-axes (all p < 0.01). Using pre-specified criteria, based on intra-method agreement, CBCT was deemed
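
    COM3PARE itself fits Roy's full LME model in SAS; as a far simpler illustration of the bias component alone, a paired comparison along one axis can be sketched in Python with synthetic shifts (all numbers invented):

```python
import numpy as np

def paired_bias(method_a, method_b):
    """Mean paired difference (bias) between two methods along one axis,
    plus the paired t statistic for that bias."""
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
    return d.mean(), t

rng = np.random.default_rng(0)
cbct = rng.normal(0.0, 1.0, 100)                 # synthetic Z-axis shifts, mm
kvx = cbct + 0.8 + rng.normal(0.0, 0.5, 100)     # KVX biased by 0.8 mm + noise
bias, t_stat = paired_bias(kvx, cbct)
```

    The point of the full LME formulation is that it goes beyond this: it separates between-subject and within-subject variance components from replicated, possibly non-paired observations, which a simple paired test cannot do.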

  6. Multileaf collimator leaf position verification and analysis for adaptive radiation therapy using a video-optical method

    Science.gov (United States)

    Sethna, Sohrab B.

    External beam radiation therapy is commonly used to eliminate and control cancerous tumors. High-energy beams are shaped to match the patient's specific tumor volume, thereby maximizing the radiation dose to malignant cells and limiting the dose to normal tissue. A multileaf collimator (MLC) consisting of multiple pairs of tungsten leaves is used to conform the radiation beam to the desired treatment field. Advanced treatment methods utilize dynamic MLC settings to conform to multiple treatment fields and provide intensity modulated radiation therapy (IMRT). Future methods would further increase conformity by actively tracking tumor motion caused by patient cardiac and respiratory motion. Leaf position quality assurance for a dynamic MLC is critical, as variation between the planned and actual leaf positions could induce significant errors in radiation dose. The goal of this research project is to prototype a video-optical quality assurance system for MLC leaf positions. The system captures light-field images of MLC leaf sequences during dynamic therapy. Image acquisition and analysis software was developed to determine leaf edge positions. The mean absolute difference between QA prototype predicted and caliper measured leaf positions was found to be 0.6 mm with an uncertainty of +/- 0.3 mm. Maximum errors in predicted positions were below 1.0 mm for static fields. The prototype served as a proof of concept for quality assurance of future tumor tracking methods. Specifically, a lung tumor phantom was created to mimic a lung tumor's motion from respiration. The lung tumor video images were superimposed on MLC field video images for visualization and analysis. The toolbox is capable of displaying leaf position, leaf velocity, and tumor position, and of determining errors between planned and actual treatment fields for dynamic radiation therapy.
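
    The core measurement in such a video-optical QA system is locating a leaf edge in an intensity profile. A common approach, sketched here on synthetic data, is the half-maximum crossing with linear sub-pixel interpolation; this is not necessarily the prototype's exact algorithm:

```python
import numpy as np

def leaf_edge_position(intensity_profile, pixel_pitch_mm):
    """Locate the leaf edge in a 1D light-field profile as the half-maximum
    crossing, with linear sub-pixel interpolation. Bright = open field,
    dark = blocked by the leaf."""
    p = np.asarray(intensity_profile, float)
    half = 0.5 * (p.max() + p.min())
    i = int(np.argmax(p < half))                         # first pixel past the edge
    x = (i - 1) + (p[i - 1] - half) / (p[i - 1] - p[i])  # fractional crossing
    return x * pixel_pitch_mm

# Synthetic profile: open field (bright) then leaf (dark), edge near pixel 40
profile = np.concatenate([np.full(40, 1000.0), np.full(60, 100.0)])
edge_mm = leaf_edge_position(profile, pixel_pitch_mm=0.25)
```

    Sub-pixel interpolation is what allows a camera with a modest pixel pitch to resolve leaf positions well below 1 mm, consistent with the accuracies reported above.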

  7. Experimental verification of agreement between thermal and real time visual melt-solid interface positions in vertical Bridgman grown germanium

    Science.gov (United States)

    Barber, P. G.; Fripp, A. L.; Debnam, W. J.; Woodell, G.; Berry, R. F.; Simchick, R. T.

    1996-03-01

    Measurements of the liquid-solid interface position during crystal growth were made by observing the discontinuity of the temperature gradient with movable thermocouples in a centerline, quartz capillary placed inside a sealed quartz ampoule of germanium in a vertical Bridgman furnace. Simultaneously, in situ, real time visual observations, using X-ray imaging technology, determined the position of the melt-solid interface. The radiographically detected interface position was several millimeters from the thermal interface position and the direction of displacement depended upon the direction of thermocouple insertion. Minimization of this spurious heat flow was achieved by using an unclad thermocouple that had each of its two wire leads entering the capillary from different ends of the furnace. Using this configuration the visual interface coincided with the thermal interface. Such observations show the utility of using in situ, real time visualization to record the melt-solid interface shape and position during crystal growth; and they suggest improvements in furnace and ampoule designs for use in high thermal gradients.
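
    The thermal method described above locates the interface at the discontinuity of the axial temperature gradient. That detection can be sketched on a synthetic thermocouple profile (the gradient values are illustrative):

```python
import numpy as np

def interface_position(z_mm, temperature_c):
    """Estimate the melt-solid interface as the location of the largest
    change in the axial temperature gradient (the gradient discontinuity)."""
    dT = np.gradient(temperature_c, z_mm)   # local gradient
    d2T = np.gradient(dT, z_mm)             # change in gradient
    return z_mm[np.argmax(np.abs(d2T))]

# Synthetic profile: gradient 10 K/mm in the solid, 4 K/mm in the melt,
# with the kink (interface) at z = 50 mm
z = np.linspace(0.0, 100.0, 201)
temp = np.where(z < 50.0, 10.0 * z, 10.0 * 50.0 + 4.0 * (z - 50.0))
z_interface = interface_position(z, temp)
```

    The article's key caution applies here too: if the thermocouple itself conducts heat along the capillary, the measured kink is displaced from the true interface, which is exactly what the radiographic comparison revealed.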

  8. Patient Position Verification and Corrective Evaluation Using Cone Beam Computed Tomography (CBCT) in Intensity modulated Radiation Therapy

    International Nuclear Information System (INIS)

    Do, Gyeong Min; Jeong, Deok Yang; Kim, Young Bum

    2009-01-01

    Cone beam computed tomography (CBCT) using an on-board imager (OBI) can check movement and setup error in patient position and target volume by comparison with the computer simulation (planning) images in real time during patient treatment. This study therefore aimed to check changes and movement in patient position and target volume using CBCT in IMRT, calculate the differences from the treatment plan, correct the position using an automated match system, test the accuracy of the position correction using an electronic portal imaging device (EPID), and thereby examine the usefulness of CBCT in IMRT and the accuracy of the automatic match system. The subjects of this study were 3 head and neck patients and 1 pelvis patient sampled from the IMRT patients treated in our hospital. In order to investigate movement of the treatment position and the resultant displacement of the irradiated volume, we acquired CBCT using the OBI mounted on the linear accelerator. Before each IMRT treatment, we acquired CBCT and checked the difference from the treatment plan, coordinate by coordinate, by comparing it with the CT simulation image. We then made corrections through the 3D/3D automatic match system to match the treatment plan, and verified and evaluated them using the electronic portal imaging device. When CBCT was compared with the CT simulation image before treatment, the average differences by coordinate in the head and neck were 0.99 mm vertically, 1.14 mm longitudinally, 4.91 mm laterally, and 1.07 degrees in the rotational direction, showing somewhat different results by site. In testing after correction, when the electronic portal image was compared with the DRR image, it was found that correction had been made accurately, with error less than 0.5 mm. By comparing a CBCT image before treatment with a 3D image reconstructed into a volume, instead of a 2D image, for the patient's setup error and changes in the position of the organs and the target, we could measure and

  9. A Novel Marker Based Method to Teeth Alignment in MRI

    Science.gov (United States)

    Luukinen, Jean-Marc; Aalto, Daniel; Malinen, Jarmo; Niikuni, Naoko; Saunavaara, Jani; Jääsaari, Päivi; Ojalammi, Antti; Parkkola, Riitta; Soukka, Tero; Happonen, Risto-Pekka

    2018-04-01

    Magnetic resonance imaging (MRI) can precisely capture the anatomy of the vocal tract. However, the crowns of teeth are not visible in standard MRI scans. In this study, a marker-based teeth alignment method is presented and evaluated. Ten patients undergoing orthognathic surgery were enrolled. Supraglottal airways were imaged preoperatively using structural MRI. MRI-visible markers were developed and attached to the maxillary teeth and to the corresponding locations on the dental casts. Repeated measurements of intermarker distances in MRI and in a replica model were compared using linear regression analysis. Dental cast MRI and corresponding caliper measurements did not differ significantly. In contrast, the marker locations in vivo differed somewhat from the dental cast measurements, likely due to marker placement inaccuracies. The markers were clearly visible in MRI and allowed the dental models to be aligned to head and neck MRI scans.
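
    Aligning the dental-cast marker coordinates to the in-vivo MRI markers is a rigid point-set registration problem. The standard least-squares Kabsch/SVD solution, not necessarily the authors' exact procedure, can be sketched as:

```python
import numpy as np

def kabsch_align(source, target):
    """Least-squares rigid rotation + translation mapping source points
    (e.g. dental-cast marker coordinates) onto target points (MRI markers)."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    sc, tc = src.mean(axis=0), tgt.mean(axis=0)
    h = (src - sc).T @ (tgt - tc)               # cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = tc - r @ sc
    return r, t

# Cast markers, and the same markers after a known rotation + translation
cast = np.array([[0.0, 0, 0], [10, 0, 0], [0, 12, 0], [0, 0, 8]])
angle = np.radians(15.0)
rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                [np.sin(angle),  np.cos(angle), 0],
                [0, 0, 1.0]])
mri = cast @ rot.T + np.array([3.0, -2.0, 5.0])
r, t = kabsch_align(cast, mri)
aligned = cast @ r.T + t
```

    At least three non-collinear markers are needed for the rotation to be uniquely determined, which is one reason markers are attached to several teeth.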

  10. PCR-based verification of positive rapid diagnostic tests for intestinal protozoa infections with variable test band intensity.

    Science.gov (United States)

    Becker, Sören L; Müller, Ivan; Mertens, Pascal; Herrmann, Mathias; Zondie, Leyli; Beyleveld, Lindsey; Gerber, Markus; du Randt, Rosa; Pühse, Uwe; Walter, Cheryl; Utzinger, Jürg

    2017-10-01

    Stool-based rapid diagnostic tests (RDTs) for pathogenic intestinal protozoa (e.g. Cryptosporidium spp. and Giardia intestinalis) allow for prompt diagnosis and treatment in resource-constrained settings. Such RDTs can improve individual patient management and facilitate population-based screening programmes in areas without microbiological laboratories for confirmatory testing. However, RDTs are difficult to interpret in case of 'trace' results with faint test band intensities and little is known about whether such ambiguous results might indicate 'true' infections. In a longitudinal study conducted in poor neighbourhoods of Port Elizabeth, South Africa, a total of 1428 stool samples from two cohorts of schoolchildren were examined on the spot for Cryptosporidium spp. and G. intestinalis using an RDT (Crypto/Giardia DuoStrip; Coris BioConcept). Overall, 121 samples were positive for G. intestinalis and the RDT suggested presence of cryptosporidiosis in 22 samples. After a storage period of 9-10 months in cohort 1 and 2-3 months in cohort 2, samples were subjected to multiplex PCR (BD Max™ Enteric Parasite Panel, Becton Dickinson). Ninety-three percent (112/121) of RDT-positive samples for G. intestinalis were confirmed by PCR, with a correlation between RDT test band intensity and quantitative pathogen load present in the sample. For Cryptosporidium spp., all positive RDTs had faintly visible lines and these were negative on PCR. The performance of the BD Max™ PCR was nearly identical in both cohorts, despite the prolonged storage at disrupted cold chain conditions in cohort 1. The Crypto/Giardia DuoStrip warrants further validation in communities with a high incidence of diarrhoea. Copyright © 2017 Elsevier B.V. All rights reserved.
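
    The headline agreement figure is simple arithmetic: the positive percent agreement of the RDT against the confirmatory PCR:

```python
def positive_percent_agreement(rdt_positive, pcr_confirmed):
    """Share of RDT-positive samples confirmed by the reference PCR, in %."""
    return 100.0 * pcr_confirmed / rdt_positive

# G. intestinalis figures from the abstract: 112 of 121 RDT positives confirmed
ppa = positive_percent_agreement(121, 112)
```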

  11. Development of marker-based tracking methods for augmented reality applied to NPP maintenance work support and its experimental evaluation

    International Nuclear Information System (INIS)

    Ishii, H.; Fujino, H.; Bian, Z.; Sekiyama, T.; Shimoda, H.; Yoshikawa, H.

    2006-01-01

    In this study, two types of marker-based tracking methods for Augmented Reality have been developed: one employs line-shaped markers and the other employs circular-shaped markers. Both methods recognize the markers by means of image processing and calculate the relative position and orientation between the markers and the camera in real time. The line-shaped markers are suitable for pasting in buildings such as NPPs, where many pipes and tanks exist. The circular-shaped markers are suitable for cases with many obstacles, where line-shaped markers are difficult to use because the obstacles hide part of them. Both methods can extend the maximum distance between the markers and the camera compared to legacy marker-based tracking methods. (authors)
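
    Marker-based tracking of this kind ultimately rests on the pinhole-camera relation between a marker's physical size, its apparent size in the image, and its distance from the camera, which is also what limits the maximum tracking distance. A minimal sketch (all numbers illustrative):

```python
def marker_distance(focal_px, marker_size_mm, apparent_size_px):
    """Pinhole-camera estimate of the camera-to-marker distance:
    distance = focal_length * real_size / apparent_size."""
    return focal_px * marker_size_mm / apparent_size_px

# A 100 mm marker seen 40 px wide by a camera with an 800 px focal length
d = marker_distance(focal_px=800.0, marker_size_mm=100.0, apparent_size_px=40.0)
```

    As the marker shrinks below a few pixels the estimate degrades rapidly, which is why larger or better-detectable marker shapes extend the usable range.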

  12. Marker-Based Multi-Sensor Fusion Indoor Localization System for Micro Air Vehicles.

    Science.gov (United States)

    Xing, Boyang; Zhu, Quanmin; Pan, Feng; Feng, Xiaoxue

    2018-05-25

    A novel multi-sensor fusion indoor localization algorithm based on ArUco markers is designed in this paper. The proposed ArUco mapping algorithm can build and correct the map of markers online with the Grubbs criterion and K-means clustering, which avoids the map distortion caused by a lack of correction. Based on the concept of multi-sensor information fusion, a federated Kalman filter is utilized to synthesize the multi-source information from the markers, optical flow, ultrasonic and inertial sensors, which yields a continuous localization result and effectively reduces the position drift caused by long-term loss of markers in pure marker localization. The proposed algorithm can be easily implemented on hardware consisting of one Raspberry Pi Zero and two STM32 microcontrollers produced by STMicroelectronics (Geneva, Switzerland). Thus, a small-size and low-cost marker-based localization system is presented. The experimental results show that the speed estimation of the proposed system is better than Px4flow, and that it achieves centimeter accuracy in mapping and positioning. The presented system not only gives satisfying localization precision, but also has the potential to incorporate other sensors (such as visual odometry, ultra wideband (UWB) beacons and lidar) to further improve the localization performance. The proposed system can be reliably employed in Micro Aerial Vehicle (MAV) visual localization and robotics control.
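
    The master stage of a federated Kalman filter fuses the local estimates by inverse-covariance (information) weighting. A one-axis sketch with invented sensor variances:

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Information-weighted fusion of independent local estimates, as in the
    master stage of a federated Kalman filter: weight each estimate by the
    inverse of its variance."""
    w = 1.0 / np.asarray(variances, float)
    fused_var = 1.0 / w.sum()
    fused = fused_var * (w * np.asarray(estimates, float)).sum()
    return fused, fused_var

# Position along one axis from marker, optical-flow and ultrasonic estimates
fused, var = fuse_estimates([1.00, 1.20, 0.90], [0.01, 0.04, 0.09])
```

    The fused variance is smaller than that of the best individual sensor, which is what makes the fusion worthwhile; when the marker estimate drops out, the remaining sensors keep the result continuous, exactly the drift-reduction behaviour described above.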

  14. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework was developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.
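
    A toy version of the matching stage scores the fraction of template minutiae that have an aligned query minutia within tolerance; the coordinates and tolerance below are invented, and real matchers also compare minutia orientation:

```python
import numpy as np

def match_score(minutiae_a, minutiae_b, tol_px=10.0):
    """Fraction of template minutiae with an aligned query minutia nearby."""
    a = np.asarray(minutiae_a, float)
    b = np.asarray(minutiae_b, float)
    # Pairwise distances between every template and query minutia
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return (d.min(axis=1) <= tol_px).mean()

template = np.array([[10.0, 20], [40, 55], [70, 30], [25, 80]])
query = template + np.array([2.0, -1.0])   # small residual misalignment
score = match_score(template, query)
```

    Non-linear distortion is precisely what breaks this rigid-tolerance picture, which is why the paper follows the matcher with fuzzy clustering and a classifier rather than thresholding the raw score.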

  15. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
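
    The swarm idea can be caricatured in a few lines: many cheap, differently-seeded randomized searches over the same state space, where any single hit settles the question. The state space and walk below are toys, not the Spin-based technique of the paper:

```python
import random

def random_walk_finds(target, neighbors, seed, start=0, steps=200):
    """One swarm member: a depth-limited randomized walk with its own seed."""
    rng = random.Random(seed)
    state = start
    for _ in range(steps):
        if state == target:
            return True
        state = rng.choice(neighbors(state))
    return state == target

def swarm_search(target, neighbors, n_workers=20):
    """Run many cheap, differently-seeded searches; any hit suffices."""
    return any(random_walk_finds(target, neighbors, seed) for seed in range(n_workers))

# Toy state space: states 0..99, moves +1/+3/+7 modulo 100; hunt for state 42
found = swarm_search(42, lambda s: [(s + d) % 100 for d in (1, 3, 7)])
```

    Each worker on its own is an unreliable, partial search, but diversified seeds make the swarm's combined coverage far better than one long exhaustive run of the same total cost, which is the paper's central observation.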

  16. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods differ both in how they operate and in how they reach their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. In the review of static analysis, the deductive method and model checking are discussed and described, the pros and cons of each method are weighed, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in the code being verified using static methods. The article also addresses the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied when using them. 
    Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and
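
    The abstract-interpretation material above can be made concrete with the classic interval domain: each variable is tracked as a [lo, hi] range, and the states of two execution paths are merged with a join, which is exactly the path-merging situation the article discusses. A minimal sketch:

```python
def interval_add(a, b):
    """Abstract addition on intervals (lo, hi)."""
    return (a[0] + b[0], a[1] + b[1])

def interval_join(a, b):
    """Least upper bound of two intervals: the merge of two execution paths."""
    return (min(a[0], b[0]), max(a[1], b[1]))

# x in [0, 10]; one branch adds 1, the other adds 5.
x = (0, 10)
then_branch = interval_add(x, (1, 1))   # x in [1, 11]
else_branch = interval_add(x, (5, 5))   # x in [5, 15]
after_if = interval_join(then_branch, else_branch)
```

    The join loses the correlation between which branch ran and the resulting range; dependency tracking, as described above, is one way to recover some of that precision and cut false positives.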

  17. Expert model of decision-making system for efficient orientation of basketball players to positions and roles in the game--empirical verification.

    Science.gov (United States)

    Dezman, B; Trninić, S; Dizdar, D

    2001-06-01

    The purpose of the research was to empirically verify the expert model system designed for more efficient orientation of basketball players to particular positions and/or roles in the game (specialization). Participants were 60 randomly chosen male basketball players (12 players per position) from the 12 Croatian 1st league teams in season 1998/99. Data were gathered from 10 basketball coaches who estimated the overall performance (actual quality) of players on defense (7 variables) and on offense (12 variables). Variables were established by Trninić, Perica and Dizdar. A measure of body height was added to the aforementioned group of variables. The results obtained suggest that the proposed decision-making system can be used as an auxiliary instrument in orienting players to positions and roles in the game. It has been established that the players attained the highest grades of overall performance exactly at their primary playing positions in the game. The largest differences were determined between point guards (position 1) and centers (position 5). The greatest difficulties occurred in determining the optimal position for small forwards (position 3), then for shooting guards (position 2) and, last, for power forwards (position 4), because these basketball players are the most versatile ones. Therefore, the reliability of the system is lowest when it is applied to selecting and orienting players to these positions. Suitable body height significantly contributes to the aptitude of these players to play multiple positions and to assume multiple roles in the game. This research has reinforced the thesis that body height is the variable with the greatest influence on orientation of players to particular positions and roles in the game.

  18. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  19. Verification and analysis of the positioning of a source of brachytherapy high dose within an applicator gynecological interstitial fletcher Utrecht TC/RM

    International Nuclear Information System (INIS)

    Panedo Cobos, J. M.; Garcia castejon, M. A.; Huertas Martinez, C.; Gomez-Tejedor Alonso, S.; Rincon Perez, M.; Luna Tirado, J.; Perez Casas, A. M.

    2013-01-01

    Applicators are the guides through which brachytherapy sources travel and in which they are positioned within the patient. Applicators can suffer mechanical deformations due to sterilization processes or shocks, which may cause the source not to be placed within them precisely as planned; in such cases the treatment actually delivered deviates from the planned one. The object of this study is to verify, with the procedure described, that the position of the source inside the applicator coincides with the planned position. (Author)
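
    The verification described reduces to comparing measured source dwell positions against the planned ones and checking a tolerance. A minimal sketch with invented positions (the tolerance value is illustrative):

```python
def dwell_position_errors(planned_mm, measured_mm, tolerance_mm=1.0):
    """Deviation of each measured source dwell position from its planned
    position, and whether all deviations fall within tolerance."""
    errors = [m - p for p, m in zip(planned_mm, measured_mm)]
    return errors, all(abs(e) <= tolerance_mm for e in errors)

# Three dwell positions along the applicator, in mm of source travel
errors, ok = dwell_position_errors([1300.0, 1295.0, 1290.0],
                                   [1300.4, 1295.3, 1289.6])
```

    A deformed applicator typically shows up as a systematic, position-dependent pattern in these errors rather than random scatter.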

  20. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon state parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  2. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    already awash in fissile material and is increasingly threatened by the possible consequences of illicit trafficking in such material. The chemical field poses fewer problems. The ban on chemical weapons is a virtually complete post-Cold War regime, with state-of-the-art concepts and procedures of verification resulting from decades of negotiation. The detection of prohibited materials and activities is the common goal of the nuclear and chemical regimes for which the most intrusive and intensive procedures are activated by the three organizations. Accounting for the strictly peaceful application of dual-use items constitutes the bulk of the work of the inspectorates at the IAEA and the OPCW. A common challenge in both fields is the advance of science and technology in the vast nuclear and chemical industries and the ingenuity of some determined proliferators to deceive by concealing illicit activities under legitimate ones. Inspection procedures and technologies need to keep up with the requirement for flexibility and adaptation to change. The common objective of the three organizations is to assemble and analyze all relevant information in order to conclude reliably whether a State is or is not complying with its treaty obligations. The positive lessons learned from the IAEA's verification experience today are valuable in advancing concepts and technologies that might also benefit the other areas of WMD verification. Together with the emerging, more comprehensive verification practice of the OPCW, they may provide a useful basis for developing common standards, which may in turn help in evaluating the cost-effectiveness of verification methods for the Biological and Toxin Weapons Convention and other components of a WMD control regime

  3. Development of an iterative reconstruction method to overcome 2D detector low resolution limitations in MLC leaf position error detection for 3D dose verification in IMRT

    NARCIS (Netherlands)

    Visser, Ruurd; Godart, J.; Wauben, D.J.L.; Langendijk, J.; van 't Veld, A.A.; Korevaar, E.W.

    2016-01-01

    The objective of this study was to introduce a new iterative method to reconstruct multi leaf collimator (MLC) positions based on low resolution ionization detector array measurements and to evaluate its error detection performance. The iterative reconstruction method consists of a fluence model, a

  4. Instant screening and verification of carbapenemase activity in Bacteroides fragilis in positive blood culture, using matrix-assisted laser desorption ionization--time of flight mass spectrometry.

    Science.gov (United States)

    Johansson, Åsa; Nagy, Elisabeth; Sóki, József

    2014-08-01

    Rapid identification of isolates in positive blood cultures is of great importance to secure correct treatment of septicaemic patients. As antimicrobial resistance is increasing, rapid detection of resistance is crucial. Carbapenem resistance in Bacteroides fragilis associated with cfiA-encoded class B metallo-beta-lactamase is emerging. In our study we spiked blood culture bottles with 26 B. fragilis strains with various cfiA status and ertapenem MICs. By using main spectra specific for cfiA-positive and cfiA-negative B. fragilis strains, isolates could be screened for resistance. To verify strains that were positive in the screening, a carbapenemase assay was performed in which the specific peaks of intact and hydrolysed ertapenem were analysed with matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). We show here that it is possible to correctly identify B. fragilis and to screen for enzymic carbapenem resistance directly from the pellet of positive blood cultures. The carbapenemase assay to verify the presence of the enzyme was successfully performed on the pellet from the direct identification despite the presence of blood components. The result of the whole procedure was achieved in 3 h. The Bruker mass spectrometric β-lactamase assay (MSBL assay) prototype software was also shown not only to be based on an algorithm that correlated with manual inspection of the spectra, but also to improve interpretation by showing the variation in the dataset. © 2014 The Authors.
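    The peak-based readout described in this record can be sketched as follows. This is a minimal illustration only: the m/z values, the matching tolerance, and the decision threshold below are invented for the example and are not the published assay parameters.

    ```python
    # Hypothetical sketch of a hydrolysis readout: compare the intensities of
    # intact vs. hydrolysed drug peaks in a mass spectrum. All numeric
    # constants here are illustrative assumptions, not assay parameters.

    INTACT_MZ = 476.5       # assumed m/z of the intact drug (illustrative)
    HYDROLYSED_MZ = 494.5   # assumed m/z of the hydrolysis product (illustrative)
    TOLERANCE = 0.5         # m/z matching window

    def peak_intensity(spectrum, target_mz, tol=TOLERANCE):
        """Sum intensities of peaks within +/- tol of target_mz."""
        return sum(i for mz, i in spectrum if abs(mz - target_mz) <= tol)

    def carbapenemase_positive(spectrum, threshold=0.5):
        """Flag hydrolysis when the hydrolysed fraction exceeds the threshold."""
        intact = peak_intensity(spectrum, INTACT_MZ)
        hydrolysed = peak_intensity(spectrum, HYDROLYSED_MZ)
        total = intact + hydrolysed
        if total == 0:
            raise ValueError("neither drug peak found in spectrum")
        return hydrolysed / total > threshold

    # A synthetic spectrum dominated by the hydrolysis product screens positive.
    spectrum = [(476.4, 120.0), (494.6, 880.0), (300.0, 50.0)]
    print(carbapenemase_positive(spectrum))  # True (hydrolysis dominates)
    ```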

  5. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  6. Influence of surface position along the working range of conoscopic holography sensors on dimensional verification of AISI 316 wire EDM machined surfaces.

    Science.gov (United States)

    Fernández, Pedro; Blanco, David; Rico, Carlos; Valiño, Gonzalo; Mateos, Sabino

    2014-03-06

    Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.
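    The quality screening described in this record (keeping only points whose SNR and Total indicators meet proper values) can be sketched as follows. The point format and the threshold values are assumptions for illustration, not the sensor's actual interface.

    ```python
    # Minimal sketch of CH point-quality screening: keep only digitized points
    # whose Signal-to-Noise Ratio (SNR) and signal envelope (Total) exceed
    # chosen thresholds. Thresholds here are illustrative assumptions.

    def filter_points(points, snr_min=50.0, total_min=1000.0):
        """Return points meeting both quality criteria.

        Each point is (x, y, z, snr, total)."""
        return [p for p in points if p[3] >= snr_min and p[4] >= total_min]

    points = [
        (0.0, 0.0, 10.0, 85.0, 1500.0),   # good point near the stand-off distance
        (1.0, 0.0, 14.5, 30.0, 1200.0),   # low SNR near the edge of the WR
        (2.0, 0.0, 12.0, 60.0, 400.0),    # weak signal envelope
    ]
    good = filter_points(points)
    print(len(good))  # 1
    ```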

  7. Influence of Surface Position along the Working Range of Conoscopic Holography Sensors on Dimensional Verification of AISI 316 Wire EDM Machined Surfaces

    Directory of Open Access Journals (Sweden)

    Pedro Fernández

    2014-03-01

    Full Text Available Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.

  8. Patient set-up verification by infrared optical localization and body surface sensing in breast radiation therapy

    International Nuclear Information System (INIS)

    Spadea, Maria Francesca; Baroni, Guido; Riboldi, Marco; Orecchia, Roberto; Pedotti, Antonio; Tagaste, Barbara; Garibaldi, Cristina

    2006-01-01

    Background and purpose: The aim of the study was to investigate the clinical application of a technique for patient set-up verification in breast cancer radiotherapy, based on the 3D localization of a hybrid configuration of surface control points. Materials and methods: An infrared optical tracker provided the 3D position of two passive markers and 10 laser spots placed around and within the irradiation field on nine patients. A fast iterative constrained minimization procedure was applied to detect and compensate patient set-up errors, through the registration of the control points with reference data coming from the treatment plan (markers reference position, CT-based surface model). Results: The application of the corrective spatial transformation estimated by the registration procedure led to significant improvement of patient set-up. The median value of 3D errors affecting three additional verification markers within the irradiation field decreased from 5.7 to 3.5 mm. Error variability (25-75%) decreased from 3.2 to 2.1 mm. Laser spot registration on the reference surface model was documented to contribute substantially to set-up error compensation. Conclusions: Patient set-up verification through a hybrid set of control points and a constrained surface minimization algorithm was confirmed to be feasible in clinical practice and to provide valuable information for the improvement of the quality of patient set-up, with minimal requirement of operator-dependent procedures. The technique conveniently combines the advantages of passive-marker-based methods and surface registration techniques, by featuring immediate and robust estimation of the set-up accuracy from a redundant dataset

  9. Multivariate analysis for the estimation of target localization errors in fiducial marker-based radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Takamiya, Masanori [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501, Japan and Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Nakamura, Mitsuhiro, E-mail: m-nkmr@kuhp.kyoto-u.ac.jp; Akimoto, Mami; Ueki, Nami; Yamada, Masahiro; Matsuo, Yukinori; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Tanabe, Hiroaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047 (Japan); Kokubo, Masaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047, Japan and Department of Radiation Oncology, Kobe City Medical Center General Hospital, Kobe 650-0047 (Japan); Itoh, Akio [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501 (Japan)

    2016-04-15

    calculated from the MRA (MRA-TLE) increased as |TMD| and |aRM| increased and conversely decreased with each increment of n. The median 3D MRA-TLE was 2.0 mm (range, 0.6–4.3 mm) for n = 1, 1.8 mm (range, 0.4–4.0 mm) for n = 2, and 1.6 mm (range, 0.3–3.7 mm) for n = 3. Although statistical significance between n = 1 and n = 3 was observed in all directions, the absolute average difference and the standard deviation of the MRA-TLE between n = 1 and n = 3 were 0.5 and 0.2 mm, respectively. Conclusions: A large |TMD| and |aRM| increased the differences in TLE between each n; however, the difference in 3D MRA-TLEs was, at most, 0.6 mm. Thus, the authors conclude that it is acceptable to continue fiducial marker-based radiotherapy as long as |TMD| is maintained at ≤58.7 mm for a 3D |aRM| ≥ 10 mm.

  10. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
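    A core calculation in the convergence analysis this record describes is the observed order of accuracy from errors on two systematically refined grids. The sketch below is generic, not ASC project code.

    ```python
    import math

    # Observed order of convergence: given discretization errors on a coarse
    # and a fine grid related by refinement ratio r, p = log(e_c/e_f)/log(r).

    def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
        """Return the observed order of accuracy p."""
        return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

    # A second-order scheme should roughly quarter its error when h is halved.
    p = observed_order(4.0e-3, 1.0e-3, refinement_ratio=2.0)
    print(round(p, 2))  # 2.0
    ```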

  11. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
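    The remeasurement comparison this record describes can be sketched as paired differences between declared and remeasured values, whose mean estimates the systematic error of the facility's measurement system. The numbers below are synthetic, for illustration only.

    ```python
    from statistics import mean, stdev

    # Paired-difference estimate of measurement bias: facility-declared values
    # vs. inspector remeasurements of the same items. Data are synthetic.

    def measurement_bias(declared, remeasured):
        """Return (mean difference, std of differences) for paired values."""
        diffs = [d - r for d, r in zip(declared, remeasured)]
        return mean(diffs), stdev(diffs)

    declared   = [100.2, 99.8, 100.5, 100.1, 99.9]   # facility values (kg)
    remeasured = [100.0, 99.9, 100.1, 100.0, 100.0]  # inspector values (kg)
    bias, spread = measurement_bias(declared, remeasured)
    print(round(bias, 3))  # 0.1
    ```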

  12. A Real-Time Marker-Based Visual Sensor Based on a FPGA and a Soft Core Processor.

    Science.gov (United States)

    Tayara, Hilal; Ham, Woonchul; Chong, Kil To

    2016-12-15

    This paper introduces a real-time marker-based visual sensor architecture for mobile robot localization and navigation. A hardware acceleration architecture for the post video processing system was implemented on a field-programmable gate array (FPGA). The pose calculation algorithm was implemented in a System on Chip (SoC) with an Altera Nios II soft-core processor. For every frame, single-pass image segmentation and Features from Accelerated Segment Test (FAST) corner detection were used for extracting the predefined markers with known geometries in the FPGA. The coplanar POSIT algorithm was implemented on the Nios II soft-core processor supplied with floating-point hardware for accelerating floating-point operations. Trigonometric functions were approximated using Taylor series and cubic approximation using Lagrange polynomials. An inverse square root method was implemented for approximating square root computations. Real-time results were achieved and pixel streams were processed on the fly without any need to buffer the input frame.
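    Two of the approximations named in this record can be sketched in Python (the paper implements them in hardware-friendly form on the Nios II soft core; the term and iteration counts below are illustrative choices, not the paper's):

    ```python
    import math

    # Maclaurin-series sine and Newton-iteration inverse square root, the two
    # approximation techniques mentioned in the abstract, as plain sketches.

    def taylor_sin(x, terms=6):
        """sin(x) via its Maclaurin series: sum of (-1)^k x^(2k+1) / (2k+1)!."""
        result, term = 0.0, x
        for k in range(terms):
            result += term
            # Next term: multiply by -x^2 / ((2k+2)(2k+3)).
            term *= -x * x / ((2 * k + 2) * (2 * k + 3))
        return result

    def inv_sqrt(x, iterations=6):
        """1/sqrt(x) by Newton's method: y <- y * (1.5 - 0.5 * x * y * y)."""
        y = 1.0 / x if x > 1.0 else 1.0  # crude seed, adequate for demo values
        for _ in range(iterations):
            y = y * (1.5 - 0.5 * x * y * y)
        return y

    print(abs(taylor_sin(1.0) - math.sin(1.0)) < 1e-6)  # True
    print(abs(inv_sqrt(4.0) - 0.5) < 1e-6)              # True
    ```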

  13. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  14. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provision. This paper will explore the general concerns on FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  15. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing and in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal and the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  16. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  17. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  18. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  19. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  20. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  1. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  2. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  3. Intrafractional tracking accuracy in infrared marker-based hybrid dynamic tumour-tracking irradiation with a gimballed linac

    International Nuclear Information System (INIS)

    Mukumoto, Nobutaka; Nakamura, Mitsuhiro; Yamada, Masahiro; Takahashi, Kunio; Tanabe, Hiroaki; Yano, Shinsuke; Miyabe, Yuki; Ueki, Nami; Kaneko, Shuji; Matsuo, Yukinori; Mizowaki, Takashi; Sawada, Akira; Kokubo, Masaki; Hiraoka, Masahiro

    2014-01-01

    Purpose: To verify the intrafractional tracking accuracy in infrared (IR) marker-based hybrid dynamic tumour tracking irradiation (“IR Tracking”) with the Vero4DRT. Materials and methods: The gimballed X-ray head tracks a moving target by predicting its future position from displacements of IR markers in real time. Ten lung cancer patients who underwent IR Tracking were enrolled. The 95th percentiles of intrafractional mechanical (iE_M95), prediction (iE_P95), and overall targeting errors (iE_T95) were calculated from orthogonal fluoroscopy images acquired during tracking irradiation and from the synchronously acquired log files. Results: Averaged intrafractional errors were (left–right, cranio-caudal [CC], anterior–posterior [AP]) = (0.1 mm, 0.4 mm, 0.1 mm) for iE_M95, (1.2 mm, 2.7 mm, 2.1 mm) for iE_P95, and (1.3 mm, 2.4 mm, 1.4 mm) for iE_T95. By correcting systematic prediction errors in the previous field, the iE_P95 was reduced significantly, by an average of 0.4 mm in the CC (p < 0.05) and by 0.3 mm in the AP (p < 0.01) directions. Conclusions: Prediction errors were the primary cause of overall targeting errors, whereas mechanical errors were negligible. Furthermore, improvement of the prediction accuracy could be achieved by correcting systematic prediction errors in the previous field
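    The error summary this record relies on, the 95th percentile of per-frame 3D targeting errors, can be sketched as below. The (LR, CC, AP) tuples are synthetic and the log format is an assumption; the actual Vero4DRT log layout is not reproduced here.

    ```python
    # 95th-percentile 3D tracking error from synthetic per-frame log data.

    def error_3d(dx, dy, dz):
        """Euclidean 3D error from per-axis components (mm)."""
        return (dx * dx + dy * dy + dz * dz) ** 0.5

    def percentile95(values):
        """95th percentile by linear interpolation between sorted samples."""
        s = sorted(values)
        rank = 0.95 * (len(s) - 1)
        lo = int(rank)
        hi = min(lo + 1, len(s) - 1)
        frac = rank - lo
        return s[lo] * (1.0 - frac) + s[hi] * frac

    # Synthetic per-frame (left-right, cranio-caudal, anterior-posterior) errors.
    frames = [(0.2, 1.1, 0.6), (0.1, 2.0, 1.2), (0.3, 0.8, 0.4), (0.2, 2.6, 1.5)]
    e3d = [error_3d(*f) for f in frames]
    print(round(percentile95(e3d), 2))  # 2.91
    ```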

  4. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  5. Marker-based reconstruction of the kinematics of a chain of segments: a new method that incorporates joint kinematic constraints.

    Science.gov (United States)

    Klous, Miriam; Klous, Sander

    2010-07-01

    The aim of skin-marker-based motion analysis is to reconstruct the motion of a kinematical model from noisy measured motion of skin markers. Existing kinematic models for reconstruction of chains of segments can be divided into two categories: analytical methods that do not take joint constraints into account and numerical global optimization methods that do take joint constraints into account but require numerical optimization of a large number of degrees of freedom, especially when the number of segments increases. In this study, a new and largely analytical method for a chain of rigid bodies is presented, interconnected in spherical joints (chain-method). In this method, the number of generalized coordinates to be determined through numerical optimization is three, irrespective of the number of segments. This new method is compared with the analytical method of Veldpaus et al. [1988, "A Least-Squares Algorithm for the Equiform Transformation From Spatial Marker Co-Ordinates," J. Biomech., 21, pp. 45-54] (Veldpaus-method, a method of the first category) and the numerical global optimization method of Lu and O'Connor [1999, "Bone Position Estimation From Skin-Marker Co-Ordinates Using Global Optimization With Joint Constraints," J. Biomech., 32, pp. 129-134] (Lu-method, a method of the second category) regarding the effects of continuous noise simulating skin movement artifacts and regarding systematic errors in joint constraints. The study is based on simulated data to allow a comparison of the results of the different algorithms with true (noise- and error-free) marker locations. Results indicate a clear trend that accuracy for the chain-method is higher than the Veldpaus-method and similar to the Lu-method. Because large parts of the equations in the chain-method can be solved analytically, the speed of convergence in this method is substantially higher than in the Lu-method. With only three segments, the average number of required iterations with the chain
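    The least-squares registration at the heart of the methods compared in this record can be illustrated in a reduced setting. The sketch below solves the 2D analogue of the problem (the Veldpaus method solves the 3D case via a different derivation): recover the rotation and translation that best map reference marker coordinates onto measured ones, using the closed-form 2D optimal rotation.

    ```python
    import math

    # 2D least-squares rigid fit of marker sets: an illustrative analogue of
    # marker-based segment pose reconstruction (the cited methods work in 3D).

    def rigid_fit_2d(ref, meas):
        """Least-squares rotation angle and translation mapping ref -> meas."""
        n = len(ref)
        cx_r = sum(p[0] for p in ref) / n
        cy_r = sum(p[1] for p in ref) / n
        cx_m = sum(p[0] for p in meas) / n
        cy_m = sum(p[1] for p in meas) / n
        num = den = 0.0
        for (xr, yr), (xm, ym) in zip(ref, meas):
            ar, br = xr - cx_r, yr - cy_r      # centered reference point
            am, bm = xm - cx_m, ym - cy_m      # centered measured point
            num += ar * bm - br * am           # cross terms
            den += ar * am + br * bm           # dot terms
        theta = math.atan2(num, den)           # closed-form optimal rotation
        tx = cx_m - (cx_r * math.cos(theta) - cy_r * math.sin(theta))
        ty = cy_m - (cx_r * math.sin(theta) + cy_r * math.cos(theta))
        return theta, tx, ty

    # Markers rotated by 30 degrees and shifted: the fit recovers the motion.
    ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    a = math.radians(30.0)
    meas = [(x * math.cos(a) - y * math.sin(a) + 2.0,
             x * math.sin(a) + y * math.cos(a) - 1.0) for x, y in ref]
    theta, tx, ty = rigid_fit_2d(ref, meas)
    print(round(math.degrees(theta), 1))  # 30.0
    ```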

  6. Compatibility of pedigree-based and marker-based relationship matrices for single-step genetic evaluation

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    2012-01-01

    Single-step methods for genomic prediction have recently become popular because they are conceptually simple and in practice such a method can completely replace a pedigree-based method for routine genetic evaluation. An issue with single-step methods is compatibility between the marker-based rel...

  7. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  8. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  9. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  10. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
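
The manufactured-solutions approach to code verification recommended above can be illustrated with a toy benchmark. This is a generic sketch (a 1-D Poisson solver checked against a made-up manufactured solution), not one of the paper's benchmarks:

```python
import numpy as np

def solve_poisson(f, n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 via 2nd-order central differences."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)  # interior nodes
    # Tridiagonal system: (-u[i-1] + 2u[i] - u[i+1]) / h^2 = f(x[i])
    A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f(x))

u_exact = lambda x: np.sin(np.pi * x)       # manufactured solution
f = lambda x: np.pi**2 * np.sin(np.pi * x)  # source term it implies for -u'' = f

errors = []
for n in (32, 64):
    x, u = solve_poisson(f, n)
    errors.append(np.max(np.abs(u - u_exact(x))))

order = np.log2(errors[0] / errors[1])  # observed order of accuracy
print(f"observed order of accuracy ~ {order:.2f}")
```

The verification evidence is the observed order of accuracy: a second-order scheme should show the error shrinking by roughly 4x when the grid spacing halves, and a mismatch with the formal order signals a coding or discretization error.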

  11. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  12. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  13. Construction of an SSR and RAD-Marker Based Molecular Linkage Map of Vigna vexillata (L.) A. Rich.

    Science.gov (United States)

    Marubodee, Rusama; Ogiso-Tanaka, Eri; Isemura, Takehisa; Chankaew, Sompong; Kaga, Akito; Naito, Ken; Ehara, Hiroshi; Tomooka, Norihiko

    2015-01-01

    Vigna vexillata (L.) A. Rich. (tuber cowpea) is an underutilized crop grown for its tuber and mature seeds. The wild form of V. vexillata is a pan-tropical perennial herbaceous plant which has been used by local people as a food. Wild V. vexillata has also been considered a useful source of gene(s) for V. unguiculata (cowpea), since it was reported to have various resistance gene(s) for insects and diseases of cowpea. To exploit the potential of V. vexillata, an SSR-based linkage map of V. vexillata was developed. A total of 874 SSR markers successfully amplified a single DNA fragment in V. vexillata among 1,336 SSR markers developed from Vigna angularis (azuki bean), V. unguiculata and Phaseolus vulgaris (common bean). An F2 population of 300 plants derived from a cross between salt-resistant (V1) and susceptible (V5) accessions was used for mapping. A genetic linkage map was constructed using 82 polymorphic SSR marker loci, which could be assigned to 11 linkage groups spanning 511.5 cM in length with a mean distance of 7.2 cM between adjacent markers. To develop a higher-density molecular linkage map and to confirm SSR marker positions in the linkage map, RAD markers were developed and a combined SSR and RAD marker linkage map of V. vexillata was constructed. A total of 559 (84 SSR and 475 RAD) marker loci could be assigned to 11 linkage groups spanning 973.9 cM in length with a mean distance of 1.8 cM between adjacent markers. The linkage and genetic positions of all SSR markers in the SSR linkage map were confirmed. When the SSR genetic linkage map of V. vexillata was compared with those of V. radiata and V. unguiculata, it was suggested that the chromosome structure of V. vexillata was considerably differentiated. This map is the first SSR and RAD marker-based V. vexillata linkage map, which can be used for the mapping of useful traits.
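
The mean adjacent-marker distances quoted above follow directly from the map length and marker counts. A sketch of the usual calculation (total map length divided by the number of adjacent intervals, i.e. markers minus linkage groups, since each group of k markers contributes k - 1 intervals):

```python
def mean_marker_distance(total_length_cM, n_markers, n_linkage_groups):
    # Each linkage group with k markers has k - 1 adjacent intervals,
    # so the whole map has (n_markers - n_linkage_groups) intervals.
    n_intervals = n_markers - n_linkage_groups
    return total_length_cM / n_intervals

# Figures from the abstract:
print(round(mean_marker_distance(511.5, 82, 11), 1))   # SSR-only map -> 7.2 cM
print(round(mean_marker_distance(973.9, 559, 11), 1))  # combined map -> 1.8 cM
```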

  14. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  15. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability leads to inability to conceive critical tests; representativeness leads to overinterpretation of results; positive test strategies lead to confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  16. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  17. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
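
The type-level analysis behind bytecode verification can be sketched on a toy stack machine: the verifier executes instructions abstractly, tracking value *types* instead of values, and rejects programs whose operand types are wrong. The instruction set below is invented for illustration and is not real JVM bytecode or the authors' algorithm:

```python
def verify(code):
    """Abstractly interpret a straight-line toy program, tracking operand types."""
    stack = []                     # abstract operand stack of type names
    for op, *args in code:
        if op == "iconst":         # push an int constant
            stack.append("int")
        elif op == "aconst_null":  # push a reference
            stack.append("ref")
        elif op == "iadd":         # pop two ints, push an int
            if stack[-2:] != ["int", "int"]:
                return False       # type error: iadd on non-int operands
            stack[-2:] = ["int"]
        elif op == "ireturn":      # must return exactly one int
            return stack[-1:] == ["int"]
    return False                   # fell off the end without returning

good = [("iconst", 1), ("iconst", 2), ("iadd",), ("ireturn",)]
bad  = [("iconst", 1), ("aconst_null",), ("iadd",), ("ireturn",)]
print(verify(good), verify(bad))   # True False
```

Real verifiers additionally handle branches by iterating to a fixed point and merging the abstract stacks that reach each join point; this sketch covers only straight-line code.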

  18. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    Four games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify software. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50 …).

  19. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  20. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
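
An equal error rate like the 1.5% to 4% quoted above is the operating point where the false-reject rate on genuine signatures equals the false-accept rate on forgeries. A minimal sketch on made-up match scores (not the paper's data or matcher), where higher scores mean "more likely genuine":

```python
def eer(genuine, forgery):
    """Return the EER: the point where false-reject and false-accept rates cross."""
    best = (1.0, None)
    for t in sorted(set(genuine) | set(forgery)):
        frr = sum(g < t for g in genuine) / len(genuine)   # genuine rejected
        far = sum(f >= t for f in forgery) / len(forgery)  # forgeries accepted
        gap = abs(frr - far)
        if gap < best[0]:
            best = (gap, (frr + far) / 2)
    return best[1]

# Illustrative scores only:
genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
forgery = [0.3, 0.5, 0.4, 0.65, 0.2, 0.1]
print(f"EER ~ {eer(genuine, forgery):.2f}")   # EER ~ 0.17
```

Reporting a single EER summarizes the trade-off between customer convenience (few false rejects) and fraud prevention (few false accepts) at one balanced threshold.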

  1. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  2. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  3. Ghost marker detection and elimination in marker-based optical tracking systems for real-time tracking in stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Yan, Guanghua; Li, Jonathan; Huang, Yin; Mittauer, Kathryn; Lu, Bo; Liu, Chihray

    2014-01-01

    Purpose: To propose a simple model to explain the origin of ghost markers in marker-based optical tracking systems (OTS) and to develop retrospective strategies to detect and eliminate ghost markers. Methods: In marker-based OTS, ghost markers are virtual markers created due to the cross-talk between the two camera sensors, which can lead to system execution failure or inaccuracy in patient tracking. As a result, the users have to limit the number of markers and avoid certain marker configurations to reduce the chances of ghost markers. In this work, the authors propose retrospective strategies to detect and eliminate ghost markers. The two camera sensors were treated as mathematical points in space. The authors identified the coplanar within limit (CWL) condition as the necessary condition for ghost marker occurrence. A simple ghost marker detection method was proposed based on the model. Ghost marker elimination was achieved through pattern matching: a ghost marker-free reference set was matched with the optical marker set observed by the OTS; unmatched optical markers were eliminated as either ghost markers or misplaced markers. The pattern matching problem was formulated as a constraint satisfaction problem (using pairwise distances as constraints) and solved with an iterative backtracking algorithm. Wildcard markers were introduced to address missing or misplaced markers. An experiment was designed to measure the sensor positions and the limit for the CWL condition. The ghost marker detection and elimination algorithms were verified with samples collected from a five-marker jig and a nine-marker anthropomorphic phantom, rotated with the treatment couch from −60° to +60°. The accuracy of the pattern matching algorithm was further validated with marker patterns from 40 patients who underwent stereotactic body radiotherapy (SBRT). For this purpose, a synthetic optical marker pattern was created for each patient by introducing ghost markers, marker position
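
The pattern-matching step described above (matching observed optical markers to a ghost-free reference set using pairwise distances as constraints, solved by backtracking) can be sketched as follows. The coordinates and tolerance are made up, and the paper's wildcard handling and coplanar-within-limit test are omitted:

```python
import math

def match(reference, observed, tol=1.0):
    """Return, for each reference marker, the index of its observed match (or None)."""
    def backtrack(assigned):
        i = len(assigned)                      # next reference marker to place
        if i == len(reference):
            return assigned
        for j in range(len(observed)):
            if j in assigned:
                continue
            # Constraint: pairwise distances to all already-assigned markers
            # must agree with the reference geometry within tolerance.
            if all(abs(math.dist(observed[j], observed[assigned[k]])
                       - math.dist(reference[i], reference[k])) < tol
                   for k in range(i)):
                result = backtrack(assigned + [j])
                if result:
                    return result
        return None                            # dead end: undo and try elsewhere
    return backtrack([])

reference = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (5, 5, 8)]  # ghost-free set
observed  = reference + [(3, 3, 3)]                         # extra "ghost" marker
m = match(reference, observed)
ghosts = set(range(len(observed))) - set(m)
print(m, ghosts)   # observed markers left unmatched are flagged as ghosts
```

Because pairwise distances are invariant under couch rotation and translation, the same reference pattern can be matched at any couch angle; only markers that fit no consistent geometry are eliminated.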

  4. Ghost marker detection and elimination in marker-based optical tracking systems for real-time tracking in stereotactic body radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Guanghua, E-mail: yan@ufl.edu; Li, Jonathan; Huang, Yin; Mittauer, Kathryn; Lu, Bo; Liu, Chihray [Department of Radiation Oncology, University of Florida, Gainesville, Florida 32610 (United States)

    2014-10-15

    Purpose: To propose a simple model to explain the origin of ghost markers in marker-based optical tracking systems (OTS) and to develop retrospective strategies to detect and eliminate ghost markers. Methods: In marker-based OTS, ghost markers are virtual markers created due to the cross-talk between the two camera sensors, which can lead to system execution failure or inaccuracy in patient tracking. As a result, the users have to limit the number of markers and avoid certain marker configurations to reduce the chances of ghost markers. In this work, the authors propose retrospective strategies to detect and eliminate ghost markers. The two camera sensors were treated as mathematical points in space. The authors identified the coplanar within limit (CWL) condition as the necessary condition for ghost marker occurrence. A simple ghost marker detection method was proposed based on the model. Ghost marker elimination was achieved through pattern matching: a ghost marker-free reference set was matched with the optical marker set observed by the OTS; unmatched optical markers were eliminated as either ghost markers or misplaced markers. The pattern matching problem was formulated as a constraint satisfaction problem (using pairwise distances as constraints) and solved with an iterative backtracking algorithm. Wildcard markers were introduced to address missing or misplaced markers. An experiment was designed to measure the sensor positions and the limit for the CWL condition. The ghost marker detection and elimination algorithms were verified with samples collected from a five-marker jig and a nine-marker anthropomorphic phantom, rotated with the treatment couch from −60° to +60°. The accuracy of the pattern matching algorithm was further validated with marker patterns from 40 patients who underwent stereotactic body radiotherapy (SBRT). For this purpose, a synthetic optical marker pattern was created for each patient by introducing ghost markers, marker position

  5. Marker-based standardization of polyherbal formulation (SJT-DI-02) by high-performance thin-layer chromatography method

    Directory of Open Access Journals (Sweden)

    Bhakti J Ladva

    2014-01-01

    Background: Preparation of highly standardized herbal products with respect to chemical composition and biological activity is considered a valuable approach in this field. The SJT-DI-02 polyherbal formulation was successfully developed at our institute and filed for patent at the Mumbai patent office. Objective: The present work was the marker-based standardization of the patented, novel and efficacious polyherbal formulation SJT-DI-02 for the treatment of diabetes. SJT-DI-02 comprised dried extracts of rhizomes of Acorus calamus, leaves of Aegle marmelose, fruits of Benincasa hispida, roots of Chlorophytum arendinaceum, seeds of Eugenia jambolana, leaves of Ocimum sanctum, pericarp of Punica granatum, and seeds of Tamarindus indica. Selected plants were collected, dried and extracted with suitable solvents. The formulation was prepared by mixing different fractions of the extracts. Materials and Methods: Selection and procurement were carried out first. Selection was done on the basis of therapeutic efficacy and the amount of the marker present in the particular plant part. During procurement, phytochemical screening and estimation of phytoconstituents were carried out in parallel. After completion of preliminary screening using characterized markers, TLC systems were developed using selected solvent compositions, and the well-developed TLC systems were then applied in HPTLC. In the present study the polyherbal formulation was standardized using four different markers. TLC densitometric methods were developed using HPTLC for the quantification of these marker compounds. Solvent systems were optimized to achieve the best resolution of the marker compounds from the other components of the sample extract. The identity of the bands in the sample extracts was confirmed by comparing the Rf values and the absorption spectra, overlaying their UV absorption spectra with those of their respective standards. The

  6. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  7. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it

  8. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  9. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  10. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  11. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies a user matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  12. Is identity per se irrelevant? A contrarian view of self-verification effects

    OpenAIRE

    Gregg, Aiden P.

    2008-01-01

    Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, and thereby ...

  13. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  14. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    Science.gov (United States)

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continued search for better alternatives is encouraged to enhance marker-based predictions for individual quantitative traits in molecular plant breeding. PMID:28729875
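
    The record above describes predicting a quantitative trait from SNP markers. As a rough sketch of the underlying idea only (not the authors' method, and far simpler than genomic selection in practice), per-marker effects can be estimated by simple regression on a training set and combined to score a new individual. All genotype codings (0/1/2) and data values below are invented for illustration.

```python
# Minimal sketch of marker-based trait prediction (illustrative only):
# per-marker effects are estimated by univariate regression on a training
# set, then combined to predict new individuals. Genotypes are coded
# 0/1/2 (copies of one allele); all data values are invented.

def fit(genotypes, phenotypes):
    n = len(phenotypes)
    mu = sum(phenotypes) / n
    means, effects = [], []
    for j in range(len(genotypes[0])):
        col = [g[j] for g in genotypes]
        mx = sum(col) / n
        cov = sum((x - mx) * (y - mu) for x, y in zip(col, phenotypes))
        var = sum((x - mx) ** 2 for x in col)
        means.append(mx)
        effects.append(cov / var if var > 0 else 0.0)  # per-marker slope
    return mu, means, effects

def predict(genotype, mu, means, effects):
    # Sum of centered marker contributions around the trait mean.
    return mu + sum(b * (x - mx) for b, x, mx in zip(effects, genotype, means))

train_g = [[0, 2, 1], [1, 1, 0], [2, 0, 2], [1, 2, 1]]   # 4 individuals, 3 markers
train_y = [10.0, 12.0, 15.0, 12.5]                        # trait values
mu, means, effects = fit(train_g, train_y)
print(round(predict([2, 1, 1], mu, means, effects), 2))   # → 15.32
```

    A real genomic-selection model would fit all markers jointly (e.g. ridge regression or GBLUP); the one-marker-at-a-time fit here only illustrates how marker effects translate into a trait score.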

  15. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    Directory of Open Access Journals (Sweden)

    Yong-Bi Fu

    2017-07-01

    Full Text Available Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continued search for better alternatives is encouraged to enhance marker-based predictions for individual quantitative traits in molecular plant breeding.

  17. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most suitable product shall be selected from the various software options available on the market. Before a particular version of a simulation tool is used to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained. - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities; defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for the intended use). - Sufficient documentation shall be available. - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - the equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena within the perimeter of intended use. The functional validation can

  18. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.

  19. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.

  20. Mixed Marker-Based/Marker-Less Visual Odometry System for Mobile Robots

    Directory of Open Access Journals (Sweden)

    Fabrizio Lamberti

    2013-05-01

    Full Text Available When moving in generic indoor environments, robotic platforms generally rely solely on information provided by onboard sensors to determine their position and orientation. However, the lack of absolute references often introduces severe drifts into the computed estimates, making autonomous operation hard to accomplish. This paper proposes a solution that alleviates the impact of these issues by combining two vision-based pose estimation techniques working in relative and absolute coordinate systems, respectively. In particular, the unknown ground features in the images captured by the vertical camera of a mobile platform are processed by a vision-based odometry algorithm, which is capable of estimating the relative frame-to-frame movements. Errors accumulated in this step are then corrected using artificial markers placed at known positions in the environment. The markers are framed from time to time, which allows the robot to keep the drift bounded while additionally providing it with the navigation commands needed for autonomous flight. The accuracy and robustness of the designed technique are demonstrated using an off-the-shelf quadrotor via extensive experimental tests.
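
    The mixed relative/absolute idea in this record can be sketched in a few lines (an illustration of the concept only, not the paper's implementation): frame-to-frame motions are composed into a dead-reckoned pose, and an observed marker with a known absolute pose resets the estimate, bounding the accumulated drift. The 2D pose representation and all motion values are invented for illustration.

```python
import math

# Sketch of mixed marker-based/marker-less odometry (illustrative only):
# relative frame-to-frame motions are composed into a 2D pose estimate
# (x, y, heading); when a marker with a known absolute pose is observed,
# the estimate snaps to it, removing accumulated drift.

def compose(pose, delta):
    """Apply a relative motion (dx, dy, dtheta) expressed in the robot frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def update(pose, delta, marker_pose=None):
    """One odometry step; an observed marker overrides dead reckoning."""
    return marker_pose if marker_pose is not None else compose(pose, delta)

pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = update(pose, (1.0, 0.0, 0.01))   # forward motion with heading drift
pose = update(pose, (0.0, 0.0, 0.0), marker_pose=(10.0, 0.0, 0.0))
print(pose)  # drift removed by the absolute marker fix
```

    A real system would fuse the two sources (e.g. with a Kalman filter) rather than hard-reset on every marker sighting, but the drift-bounding mechanism is the same.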

  1. Variability of Marker-Based Rectal Dose Evaluation in HDR Cervical Brachytherapy

    International Nuclear Information System (INIS)

    Wang Zhou; Jaggernauth, Wainwright; Malhotra, Harish K.; Podgorsak, Matthew B.

    2010-01-01

    In film-based intracavitary brachytherapy for cervical cancer, position of the rectal markers may not accurately represent the anterior rectal wall. This study was aimed at analyzing the variability of rectal dose estimation as a result of interfractional variation of marker placement. A cohort of five patients treated with multiple-fraction tandem and ovoid high-dose-rate (HDR) brachytherapy was studied. The cervical os point and the orientation of the applicators were matched among all fractional plans for each patient. Rectal points obtained from all fractions were then input into each clinical treated plan. New fractional rectal doses were obtained and a new cumulative rectal dose for each patient was calculated. The maximum interfractional variation of distances between rectal dose points and the closest source positions was 1.1 cm. The corresponding maximum variability of fractional rectal dose was 65.5%. The percentage difference in cumulative rectal dose estimation for each patient was 5.4%, 19.6%, 34.6%, 23.4%, and 13.9%, respectively. In conclusion, care should be taken when using rectal markers as reference points for estimating rectal dose in HDR cervical brachytherapy. The best estimate of true rectal dose for each fraction should be determined by the most anterior point among all fractions.

  2. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, MODUS produces model-verification inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  3. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems, such as accountancy scales and process load cells, is also being investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  4. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF{sub 6} cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems, such as accountancy scales and process load cells, is also being investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  5. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.

    Science.gov (United States)

    Nguyen, Phong Ha; Kim, Ki Wan; Lee, Young Won; Park, Kang Ryoung

    2017-08-30

    Unmanned aerial vehicles (UAVs), commonly known as drones, have proved to be useful not only on battlefields, where manned flight is considered too risky or difficult, but also in everyday life for purposes such as surveillance, monitoring, rescue, unmanned cargo, aerial video, and photography. More advanced drones make use of global positioning system (GPS) receivers during the navigation and control loop, which allows for smart GPS features of drone navigation. However, problems arise if drones operate in areas with no GPS signal, so it is important to research the development of UAVs with autonomous navigation and landing guidance using computer vision. In this research, we determined how to safely land a drone in the absence of GPS signals using our remote marker-based tracking algorithm based on a visible-light camera sensor. The proposed method uses a unique marker designed as a tracking target during landing procedures. Experimental results show that our method significantly outperforms state-of-the-art object trackers in terms of both accuracy and processing time, and we performed tests on an embedded system in various environments.
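
    The core loop of marker-based landing guidance can be illustrated with a toy example (not the authors' tracker): locate the marker in each frame and steer toward the offset between its centroid and the image center. Here the marker is crudely modeled as the brightest region of a small grayscale frame; the threshold, frame size, and pixel values are all invented for illustration.

```python
# Toy illustration of marker-based guidance for landing (not the
# paper's algorithm): the marker is taken to be the brightest region
# of a grayscale frame, and the guidance command is the offset of its
# centroid from the image center.

def marker_centroid(frame, threshold=200):
    pts = [(x, y) for y, row in enumerate(frame)
           for x, v in enumerate(row) if v >= threshold]
    if not pts:
        return None  # marker lost; caller should hold position or search
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return cx, cy

def guidance_offset(frame):
    """Pixel offset (right, down) from image center to the marker."""
    c = marker_centroid(frame)
    if c is None:
        return None
    h, w = len(frame), len(frame[0])
    return c[0] - (w - 1) / 2, c[1] - (h - 1) / 2

frame = [[0] * 5 for _ in range(5)]
frame[1][3] = frame[1][4] = 255   # bright marker in the upper-right
print(guidance_offset(frame))     # → (1.5, -1.0)
```

    A real tracker would detect the specific marker pattern robustly under lighting and scale changes; this sketch only shows how a detection turns into a landing correction.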

  6. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

    include MLC's and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Now, several academic centers are treating patients with IMRT using conventional MLC's to modulate the field. IMRT using conventional MLC's has the advantage that the patient is stationary during the treatment and the MLC's can be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation of both the NOMOS Peacock tomotherapy system and conventional MLC's for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy, one relied on portal images to determine whether the patient was set up correctly and the beams were correctly positioned. With IMRT, the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT, it is impossible to perform hand calculations of monitor units, and dosimeters placed on the patient's surface are prone to error due to high gradients in the beam intensity. NOMOS has developed a verification phantom that allows multiple sheets of film to be placed in a light-tight box that is irradiated with the same beam pattern that is used to treat the patient. The optical density of the films is adjusted, normalized, and calibrated, and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous and can be shown to be useful for most treatment sites, then there is a need to design treatment units dedicated to IMRT delivery and verification. Helical tomotherapy is such a redesign.
    Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  7. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  8. Clinical verification in homeopathy and allergic conditions.

    Science.gov (United States)

    Van Wassenhoven, Michel

    2013-01-01

    The literature on clinical research in allergic conditions treated with homeopathy includes a meta-analysis of randomised controlled trials (RCT) for hay fever with positive conclusions and two positive RCTs in asthma. Cohort surveys using validated Quality of Life questionnaires have shown improvement in asthma in children, general allergic conditions and skin diseases. Economic surveys have shown positive results in eczema, allergy, seasonal allergic rhinitis, asthma, food allergy and chronic allergic rhinitis. This paper reports clinical verification of homeopathic symptoms in all patients and especially in various allergic conditions in my own primary care practice. For preventive treatments in hay fever patients, Arsenicum album was the most effective homeopathic medicine followed by Nux vomica, Pulsatilla pratensis, Gelsemium, Sarsaparilla, Silicea and Natrum muriaticum. For asthma patients, Arsenicum iodatum appeared most effective, followed by Lachesis, Calcarea arsenicosa, Carbo vegetabilis and Silicea. For eczema and urticaria, Mezereum was most effective, followed by Lycopodium, Sepia, Arsenicum iodatum, Calcarea carbonica and Psorinum. The choice of homeopathic medicine depends on the presence of other associated symptoms and 'constitutional' features. Repertories should be updated by including results of such clinical verifications of homeopathic prescribing symptoms. Copyright © 2012 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  9. Error analysis of marker-based object localization using a single-plane XRII

    International Nuclear Information System (INIS)

    Habets, Damiaan F.; Pollmann, Steven I.; Yuan, Xunhua; Peters, Terry M.; Holdsworth, David W.

    2009-01-01

    The role of imaging and image guidance is increasing in surgery and therapy, including treatment planning and follow-up. Fluoroscopy is used for two-dimensional (2D) guidance or localization; however, many procedures would benefit from three-dimensional (3D) guidance or localization. Three-dimensional computed tomography (CT) using a C-arm mounted x-ray image intensifier (XRII) can provide high-quality 3D images; however, patient dose and the required acquisition time restrict the number of 3D images that can be obtained. C-arm based 3D CT is therefore limited in applications for x-ray based image guidance or dynamic evaluations. 2D-3D model-based registration, using a single-plane 2D digital radiographic system, does allow for rapid 3D localization. It is our goal to investigate, over a clinically practical range, the impact of x-ray exposure on the resulting range of 3D localization precision. In this paper it is assumed that the tracked instrument incorporates a rigidly attached 3D object with a known configuration of markers. A 2D image is obtained by a digital fluoroscopic x-ray system and corrected for XRII distortions (±0.035 mm) and mechanical C-arm shift (±0.080 mm). A least-squares projection-Procrustes analysis is then used to calculate the 3D position from the measured 2D marker locations. The effect of x-ray exposure on the precision of 2D marker localization and on 3D object localization was investigated using numerical simulations and x-ray experiments. The results show a nearly linear relationship between 2D marker localization precision and 3D localization precision. However, a significant amplification of error, nonuniformly distributed among the three major axes, occurs, and this is demonstrated. To obtain a 3D localization error of less than ±1.0 mm for an object with 20 mm marker spacing, the 2D localization precision must be better than ±0.07 mm. This requirement was met for all investigated nominal x-ray exposures at 28 cm FOV, and
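
    The least-squares Procrustes step in this record fits a rigid transform between a known marker configuration and its measured locations. A simplified 2D analogue (not the paper's 3D projection-Procrustes method) can be written in closed form: center both point sets, recover the rotation from the cross- and dot-product sums, then recover the translation. The marker coordinates below are invented, reusing the 20 mm marker spacing mentioned in the record.

```python
import math

# Simplified 2D analogue of marker-based rigid localization (illustration
# only): fit a rotation + translation, in the least-squares sense, between
# a known marker configuration and its measured image locations.

def fit_rigid_2d(model, measured):
    n = len(model)
    mx = sum(p[0] for p in model) / n          # model centroid
    my = sum(p[1] for p in model) / n
    qx = sum(p[0] for p in measured) / n       # measured centroid
    qy = sum(p[1] for p in measured) / n
    s, c = 0.0, 0.0
    for (px, py), (ux, uy) in zip(model, measured):
        ax, ay = px - mx, py - my
        bx, by = ux - qx, uy - qy
        s += ax * by - ay * bx                 # cross-product sum
        c += ax * bx + ay * by                 # dot-product sum
    theta = math.atan2(s, c)                   # least-squares rotation
    tx = qx - (mx * math.cos(theta) - my * math.sin(theta))
    ty = qy - (mx * math.sin(theta) + my * math.cos(theta))
    return theta, tx, ty

model = [(0, 0), (20, 0), (0, 20), (20, 20)]                  # 20 mm spacing
measured = [(5.0, 3.0), (25.0, 3.0), (5.0, 23.0), (25.0, 23.0)]  # pure shift
theta, tx, ty = fit_rigid_2d(model, measured)
print(round(theta, 6), round(tx, 3), round(ty, 3))  # → 0.0 5.0 3.0
```

    Perturbing the measured points then shows directly how 2D localization noise propagates into the fitted pose, the error amplification the paper quantifies for the 3D case.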

  10. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system helps protect personnel by reducing the need for hands-on inventories. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur.

  11. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving spaceflight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravity/inertia-like Wheeler-Feynman radiation reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during the acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, he also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, are also presented.

  12. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  13. Marker-based or model-based RSA for evaluation of hip resurfacing arthroplasty? A clinical validation and 5-year follow-up.

    Science.gov (United States)

    Lorenzen, Nina Dyrberg; Stilling, Maiken; Jakobsen, Stig Storgaard; Gustafson, Klas; Søballe, Kjeld; Baad-Hansen, Thomas

    2013-11-01

    The stability of implants is vital to ensure long-term survival. RSA determines micro-motions of implants as a predictor of early implant failure. RSA can be performed as a marker-based or model-based analysis. So far, CAD and RE model-based RSA have not been validated for use in hip resurfacing arthroplasty (HRA). A phantom study determined the precision of marker-based and CAD and RE model-based RSA on an HRA implant. In a clinical study, 19 patients were followed with stereoradiographs until 5 years after surgery. Analysis of double-examination migration results determined the clinical precision of marker-based and CAD model-based RSA, and at the 5-year follow-up, results of the total translation (TT) and the total rotation (TR) for marker- and CAD model-based RSA were compared. The phantom study showed that marker-based RSA analysis was more precise than model-based RSA analysis in TT (p CAD RSA analysis (p = 0.002), but showed no difference between the marker- and CAD model-based RSA analysis regarding TR (p = 0.91). Comparing the mean signed values for TT and TR at the 5-year follow-up in 13 patients, TT was lower (p = 0.03) and TR higher (p = 0.04) with marker-based RSA than with CAD model-based RSA. The precision of marker-based RSA was significantly better than that of model-based RSA. However, problems with occluded markers led to the exclusion of many patients, which was not a problem with model-based RSA. The HRA implants were stable at the 5-year follow-up. The detection limit was 0.2 mm TT and 1° TR for marker-based RSA and 0.5 mm TT and 1° TR for CAD model-based RSA for HRA.

  14. Verification of the patient positioning for evaluation of PTV margins in radiotherapy of prostate cancer; Verificacao do posicionamento do paciente para a avaliacao das margens de PTV em radioterapia do cancer de prostata

    Energy Technology Data Exchange (ETDEWEB)

    Frohlich, B.F.; Peron, T.M.; Scheid, A.M.; Cardoso, F.; Alves, F.; Alves, M.S.; Dias, T.M. [Hospital de Clinicas de Porto Alegre, RS (Brazil)

    2016-07-01

    The work aimed to verify the relative displacements between the patient and the device isocenter, based on the reproducibility of positioning, and to estimate PTV margins for radiotherapy treatments of prostate cancer. The displacement results were obtained from a sample of 30 patients and showed values in the vertical, longitudinal and lateral directions of -0.03 ± 0.48 cm, 0.12 ± 0.47 cm and 0.02 ± 0.53 cm, respectively. The calculated PTV margins were 0.97 cm in the vertical direction, 0.85 cm in the longitudinal direction, and 0.98 cm in the lateral direction. (author)
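
    The record does not state which margin recipe was used, but one widely used population formula (the van Herk recipe) computes PTV margin = 2.5 Σ + 0.7 σ, where Σ is the standard deviation of systematic setup errors and σ that of random errors. As a sketch only: the split of the reported overall SDs into systematic and random components below is a hypothetical example, not the paper's data.

```python
# Illustrative PTV margin calculation using the van Herk population
# recipe: margin = 2.5 * Sigma + 0.7 * sigma, where Sigma is the SD of
# systematic setup errors and sigma the SD of random errors. The split
# of an overall SD into the two components below is hypothetical; the
# record reports only overall displacement statistics.

def ptv_margin_cm(systematic_sd_cm, random_sd_cm):
    return 2.5 * systematic_sd_cm + 0.7 * random_sd_cm

# hypothetical decomposition of a ~0.48 cm overall SD (vertical direction)
print(round(ptv_margin_cm(0.30, 0.37), 2))  # → 1.01
```

    The recipe weights systematic errors far more heavily than random ones, which is why separating the two components matters when deriving margins from positioning data.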

  15. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve the quality and control of drug products. This involved an increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  16. Behavioral biometrics for verification and recognition of malicious software agents

    Science.gov (United States)

    Yampolskiy, Roman V.; Govindaraju, Venu

    2008-04-01

    Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.

  17. Partner verification: restoring shattered images of our intimates.

    Science.gov (United States)

    De La Ronde, C; Swann, W B

    1998-08-01

    When spouses received feedback that disconfirmed their impressions of their partners, they attempted to undermine that feedback during subsequent interactions with these partners. Such partner verification activities occurred whether partners construed the feedback as overly favorable or overly unfavorable. Furthermore, because spouses tended to see their partners as their partners saw themselves, their efforts to restore their impressions of partners often worked hand-in-hand with partners' efforts to verify their own views. Finally, support for self-verification theory emerged in that participants were more intimate with spouses who verified their self-views, whether their self-views happened to be positive or negative.

  18. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  19. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  20. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for feature difference between an image pair...... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definitive constrain on the transformation matrix while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public...... databases show that the proposed method combined with a SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...

  1. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge base of a large and complex system, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  2. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge base of a large and complex system, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  3. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  4. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  5. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  6. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  7. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  8. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity-modulated radiation therapy) plan, as used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of the IMRT technique and of plan preparation in the IMRT planning system CORVUS 6.0, the MIMiC device (multileaf intensity-modulating collimator), and the overall process of verifying the created plan. The aim of verification is in particular good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  9. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  10. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
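The single-state, fixed-rule-count monitor described above can be illustrated with a minimal forward-chaining sketch: Horn rules in implication form fire whenever all body atoms are in the fact base, adding the head. The rules and event names below are invented for illustration; this is the general technique, not the paper's actual monitor.

```python
# Illustrative Horn rules in implication form: body atoms -> head atom
RULES = [
    ({"request", "grant"}, "served"),   # request & grant -> served
    ({"served", "ack"}, "complete"),    # served & ack -> complete
]

def forward_chain(facts):
    """Saturate the fact base under RULES and return it."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in RULES:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

trace = set()
for event in ["request", "grant", "ack"]:   # events observed at runtime
    trace.add(event)                        # the trace expands monotonically
facts = forward_chain(trace)
print("complete" in facts)  # True: the monitored property is satisfied
```

Because rule firing only ever adds facts, the monitor never branches, which is the key to the scalability claimed in the abstract.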

  11. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  12. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  13. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
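The likelihood-ratio decision rule in the two records above can be sketched in a few lines: accept a claimed identity when p(x | genuine) / p(x | impostor) exceeds a threshold. The Gaussian class models, parameter values, and one-dimensional "feature" below are illustrative assumptions, not the paper's models.

```python
import math

def gaussian_pdf(x, mean, sd):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def likelihood_ratio(x, genuine=(0.0, 1.0), impostor=(3.0, 1.5)):
    """LR for a 1-D feature; for a vector, take the product over dimensions."""
    return gaussian_pdf(x, *genuine) / gaussian_pdf(x, *impostor)

def verify(x, threshold=1.0):
    """Accept the identity claim when the likelihood ratio beats the threshold."""
    return likelihood_ratio(x) > threshold

print(verify(0.2))   # True: close to the genuine model, so accepted
print(verify(3.1))   # False: close to the impostor model, so rejected
```

By the Neyman–Pearson lemma this thresholded ratio is the optimal similarity measure, which is the point both records make.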

  14. Vehicle usage verification system

    NARCIS (Netherlands)

    Scanlon, W.G.; McQuiston, Jonathan; Cotton, Simon L.

    2012-01-01

    EN)A computer-implemented system for verifying vehicle usage comprising a server capable of communication with a plurality of clients across a communications network. Each client is provided in a respective vehicle and with a respective global positioning system (GPS) by which the client can

  15. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  16. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  17. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  18. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure quality of safety critical software, software should be developed in accordance with software development procedures and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing of checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. New software verification methodology was developed and was applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by Korea Atomic Energy Research Institute(KAERI) and Atomic Energy of Canada Limited(AECL) in order to satisfy new regulation requirements of Atomic Energy Control Boars(AECB). Software verification methodology applied to SDS1 for Wolsung 2,3 and 4 project will be described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by software designer. Outputs from Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  19. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  20. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  1. Portal verification for breast cancer radiotherapy

    International Nuclear Information System (INIS)

    Petkovska, Sonja; Pejkovikj, Sasho; Apostolovski, Nebojsha

    2013-01-01

    At the University Clinic in Skopje, breast cancer irradiation is planned and performed using a mono-isocentric method, meaning that a single isocenter (IC) is used for all irradiation fields. The goal of this paper is to present the patient's position in all coordinates before the first treatment session, relative to the position determined during CT simulation. Deviations of up to 5 mm are allowed. The analysis was made using portal verification. Sixty female patients, selected at random, were reviewed. The matching results show that every patient deviated on at least one axis. The largest deviations were in the longitudinal (head-feet) direction, up to 4 mm, mean 1.8 mm. In 60 of the 85 analysed fields the deviation was towards the head. In the lateral direction the median deviation was 1.1 mm, and in 65% of the analysed portals these deviations were in the medial direction, towards the contralateral breast, which can increase the dose in the lung and in the contralateral breast. For the supraclavicular field this deviation can increase the dose in the spinal cord. Although these doses are well below the limit, this should be taken into account when setting the treatment fields. The final conclusion of the research is that, even though the deviations are small, when positioning accuracy is verified with portal imaging, portal verification needs to be performed in the following weeks of treatment as well, not only before the first treatment. This provides information on intra-fractional set-up deviation. (Author)

  2. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    Science.gov (United States)

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  3. Decentralized position verification in geographic ad hoc routing

    NARCIS (Netherlands)

    Leinmüller, Tim; Schoch, Elmar; Kargl, Frank; Maihöfer, Christian

    Inter-vehicle communication is regarded as one of the major applications of mobile ad hoc networks (MANETs). Compared to MANETs or wireless sensor networks (WSNs), these so-called vehicular ad hoc networks (VANETs) have unique requirements on network protocols. The requirements result mainly from

  4. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
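The margin-versus-uncertainty comparison described above can be sketched numerically: if the nominal design margin is treated as the mean of a normal distribution whose standard deviation is the statistically derived uncertainty, reliability is the probability that the actual margin stays positive. The numbers below are illustrative assumptions, not values from the paper.

```python
import math

def reliability(nominal_margin, margin_sd):
    """P(actual margin > 0), assuming margin ~ Normal(nominal, sd).

    Equivalent to the standard normal CDF evaluated at nominal/sd.
    """
    z = nominal_margin / margin_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# e.g. a hypothetical 20% nominal thrust margin with an 8% (1-sigma) uncertainty
print(round(reliability(0.20, 0.08), 4))
```

A large nominal margin with a small uncertainty yields high estimated reliability and can justify a longer periodic verification interval; a margin comparable to its uncertainty argues for more frequent testing.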

  5. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs

  6. Fostering group identification and creativity in diverse groups: the role of individuation and self-verification.

    Science.gov (United States)

    Swann, William B; Kwan, Virginia S Y; Polzer, Jeffrey T; Milton, Laurie P

    2003-11-01

    A longitudinal study examined the interplay of identity negotiation processes and diversity in small groups of master's of business administration (MBA) students. When perceivers formed relatively positive impressions of other group members, higher diversity predicted more individuation of targets. When perceivers formed relatively neutral impressions of other group members, however, higher diversity predicted less individuation of targets. Individuation at the outset of the semester predicted self-verification effects several weeks later, and self-verification, in turn, predicted group identification and creative task performance. The authors conclude that contrary to self-categorization theory, fostering individuation and self-verification in diverse groups may maximize group identification and productivity.

  7. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization.

    Science.gov (United States)

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F; Pal, Saikat; McWalter, Emily J; Beaupré, Gary S; Maier, Andreas

    2013-09-01

    Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate a complicated realistic lower body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and image quality improvement were investigated after applying each method. An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied unique affine transforms to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers' location at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics hardware accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4-12) and marker distribution
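Of the three correction methods listed, the simplest, projection shifting in 2D, amounts to translating each projection by the mean displacement of the detected fiducial markers relative to their reference positions. The sketch below illustrates that idea only; the marker coordinates are invented and are not the study's data.

```python
def mean_shift(reference, detected):
    """Mean 2D displacement (du, dv) of detected markers vs. reference."""
    n = len(reference)
    du = sum(d[0] - r[0] for r, d in zip(reference, detected)) / n
    dv = sum(d[1] - r[1] for r, d in zip(reference, detected)) / n
    return du, dv

def shift_projection(points, shift):
    """Apply the inverse of the estimated shift, restoring reference positions."""
    du, dv = shift
    return [(u - du, v - dv) for u, v in points]

# Hypothetical marker positions (detector pixels): all moved by (1.5, -1.0)
ref = [(10.0, 20.0), (40.0, 22.0), (25.0, 60.0)]
det = [(11.5, 19.0), (41.5, 21.0), (26.5, 59.0)]
shift = mean_shift(ref, det)
print(shift)                                 # (1.5, -1.0)
print(shift_projection(det, shift) == ref)   # True after correction
```

The 2D warping and 3D rigid-body methods generalize this by fitting a deformation or a full rotation-plus-translation to the same marker correspondences rather than a single translation.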

  8. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.

  9. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  10. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin Fangfang; Gao Qinghuai; Xie Huchen; Nelson, Diana F.; Yu Yan; Kwok, W. Edmund; Totterman, Saara; Schell, Michael C.; Rubin, Philip

    1998-01-01

Purpose: To investigate a method for the generation of digitally reconstructed radiographs directly from MR images (DRR-MRI) to guide a computerized portal verification procedure. Methods and Materials: Several major steps were developed to perform an MR image-guided portal verification procedure. Initially, a wavelet-based multiresolution adaptive thresholding method was used to segment the skin slice-by-slice in MR brain axial images. Some selected anatomical structures, such as target volume and critical organs, were then manually identified and were reassigned to relatively higher intensities. Interslice information was interpolated with a directional method to achieve comparable display resolution in three dimensions. Next, a ray-tracing method was used to generate a DRR-MRI image at the planned treatment position, and the ray tracing was simply performed by summation of voxel intensities along the ray. The skin and its relative positions were also projected to the DRR-MRI and were used to guide the search for similar features in the portal image. A Canny edge detector was used to enhance the brain contour in both portal and simulation images. The skin in the brain portal image was then extracted using a knowledge-based searching technique. Finally, a Chamfer matching technique was used to correlate features between DRR-MRI and portal image. Results: The MR image-guided portal verification method was evaluated using a brain phantom case and a clinical patient case. Both DRR-CT and DRR-MRI were generated using CT and MR phantom images with the same beam orientation and then compared. The matching result indicated that the maximum deviation of internal structures was less than 1 mm. The segmented results for brain MR slice images indicated that a wavelet-based image segmentation technique provided a reasonable estimation of the brain skin. For the clinical patient case with a given portal field, the MR image-guided verification method provided an excellent match between
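The DRR generation step described above (summation of voxel intensities along each ray) can be illustrated with a minimal orthographic sketch; the volume below is synthetic, and a clinical implementation would trace divergent rays from the beam source through the planning volume.

```python
import numpy as np

def drr_orthographic(volume, axis=0):
    """Generate a digitally reconstructed radiograph by summing voxel
    intensities along parallel rays (orthographic projection)."""
    return volume.sum(axis=axis)

# Synthetic 4x4x4 "MR" volume with a bright 2x2x2 insert standing in for
# an anatomical structure reassigned to a higher intensity
vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, 1:3] = 1.0
drr = drr_orthographic(vol, axis=0)   # 4x4 projection image
```

Each pixel of `drr` is the line integral (here, a plain sum) of the voxels pierced by its ray, which is why manually boosting structure intensities before projection makes them visible in the DRR.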

  11. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  12. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  13. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  14. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology, which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it will check both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it otherwise requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the considerable time of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results of the fabricated devices have been compared to the derived models and show very good agreement. Moreover, the match can reach 100% by calibrating certain parameters in the model.

  15. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  16. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  17. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  18. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  19. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  20. KNGR core protection calculator software verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. This document is intended for a verifier or reviewer who is involved in performing software verification and validation activities in KNGR projects. It includes the basic philosophy, performance of the V and V effort, software testing techniques, and criteria for review and audit of safety software V and V activities. Major review topics on safety software address three kinds of characteristics based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14: management characteristics, implementation characteristics, and resource characteristics when reviewing the SVVP. Based on the major topics of this document, we have produced a list of evaluation items, provided as a checklist in Appendix A.

  1. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  2. Verification test of control rod system for HTR-10

    International Nuclear Information System (INIS)

    Zhou Huizhong; Diao Xingzhong; Huang Zhiyong; Cao Li; Yang Nianzu

    2002-01-01

There are 10 sets of control rods and driving devices in the 10 MW High Temperature Gas-cooled Test Reactor (HTR-10). The control rod system is the control and shutdown system of HTR-10, which is designed for reactor criticality, operation, and shutdown. In order to guarantee technical feasibility, a series of verification tests was performed, including a room temperature test, a thermal test, a test after the control rod system was installed in HTR-10, and a test of the control rod system before HTR-10 first criticality. All the test data showed that the driving devices worked well and that the control rods ran smoothly up and down, settled reliably at any set position, and indicated their positions exactly.

  3. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  4. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  5. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  6. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  7. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  8. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  9. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  10. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  11. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  12. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  13. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
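The derived decision rule can be written out directly. The thresholds follow the abstract (major criterion: fingerprint dystrophy area of ≥ 25%; minor criteria: long horizontal lines and long vertical lines); the function name and the exact risk labels are paraphrases for illustration.

```python
def verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Predict fingerprint verification failure risk from the major and
    minor criteria of the prediction model described in the abstract."""
    if dystrophy_pct >= 25:
        return "almost always fails"          # major criterion present
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk of failure"         # both minor criteria
    if minors == 1:
        return "low risk of failure"          # one minor criterion
    return "almost always passes"             # no criteria met
```

Expressed this way, the model is a short ordered rule list, which is what makes it clinically convenient compared with evaluating the underlying logistic regression directly.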

  14. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  15. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  16. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  17. Initial clinical experience with infrared-reflecting skin markers in the positioning of patients treated by conformal radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Soete, Guy; Steene, Jan van de; Verellen, Dirk; Vinh-Hung, Vincent; Berge, Dirk van den; Michielsen, Dirk; Keuppens, Frans; Roover, Patricia de; Storme, Guy

    2002-01-01

    Purpose: To evaluate an infrared (IR) marker-based positioning system in patients receiving conformal radiotherapy for prostate cancer. Methods and materials: During 553 treatments, the ability of the IR system to automatically position the isocenter was recorded. Setup errors were measured by means of orthogonal verification films and compared to conventional positioning (using skin drawings and lasers) in 184 treatments. Results: The standard deviation of anteroposterior (AP) and lateral setup errors was significantly reduced with IR marker positioning compared to conventional: 2 vs. 4.8 mm AP (p<0.01) and 1.6 vs. 3.5 mm laterally (p<0.01). Longitudinally, the difference was not significant (3.5 vs. 3.0 mm). Systematic errors were on the average smaller AP and laterally for the IR method: 4.1 vs. 7.8 mm AP (p=0.01) and 3.1 vs. 5.6 mm lateral (p=0.07). Longitudinally, the IR system resulted in somewhat larger systematic errors: 5.0 vs. 3.4 mm for conventional positioning (p=0.03). The use of an off-line correction protocol, based on the average deviation measured over the first four fractions, allowed virtual elimination of systematic errors. Inability of the IR system to correctly locate the markers, leading to an executional failure, occurred in 21% of 553 fractions. Conclusion: IR marker-assisted patient positioning significantly improves setup accuracy along the AP and lateral axes. Executional failures need to be reduced
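The off-line correction protocol above estimates the systematic setup error as the average deviation over the first four fractions and subtracts it thereafter. A minimal one-axis sketch, with hypothetical AP deviations:

```python
import numpy as np

def offline_correction(deviations_mm, n_ref=4):
    """Estimate the systematic setup error as the mean deviation over the
    first n_ref fractions, and return it with the corrected deviations."""
    systematic = np.mean(deviations_mm[:n_ref], axis=0)
    return systematic, deviations_mm - systematic

# Hypothetical AP setup deviations (mm) over eight fractions
devs = np.array([4.0, 5.0, 3.0, 4.0, 4.5, 3.5, 5.0, 4.0])
sys_err, corrected = offline_correction(devs)
```

After subtracting `sys_err`, the mean of the reference fractions is zero by construction, which is the sense in which such a protocol "virtually eliminates" systematic errors; the remaining scatter is the random component.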

  18. Linear and nonlinear verification of gyrokinetic microstability codes

    Science.gov (United States)

    Bravenec, R. V.; Candy, J.; Barnes, M.; Holland, C.

    2011-12-01

    the nonlinear fluxes without collisions. With collisions, the differences between the time-averaged fluxes are larger than the uncertainties defined as the oscillations of the fluxes, with the GS2 fluxes consistently larger (or more positive) than those from GYRO. However, the electrostatic fluxes are much smaller than those without collisions (the electromagnetic energy flux is negligible in both cases). In fact, except for the electron energy fluxes, the absolute magnitudes of the differences in fluxes with collisions are the same or smaller than those without. None of the fluxes exhibit large absolute differences between codes. Beyond these results, the specific linear and nonlinear benchmarks proposed here, as well as the underlying methodology, provide the basis for a wide variety of future verification efforts.

  19. Comparison of 3D Joint Angles Measured With the Kinect 2.0 Skeletal Tracker Versus a Marker-Based Motion Capture System.

    Science.gov (United States)

    Guess, Trent M; Razu, Swithin; Jahandar, Amirhossein; Skubic, Marjorie; Huo, Zhiyu

    2017-04-01

    The Microsoft Kinect is becoming a widely used tool for inexpensive, portable measurement of human motion, with the potential to support clinical assessments of performance and function. In this study, the relative osteokinematic Cardan joint angles of the hip and knee were calculated using the Kinect 2.0 skeletal tracker. The pelvis segments of the default skeletal model were reoriented and 3-dimensional joint angles were compared with a marker-based system during a drop vertical jump and a hip abduction motion. Good agreement between the Kinect and marker-based system were found for knee (correlation coefficient = 0.96, cycle RMS error = 11°, peak flexion difference = 3°) and hip (correlation coefficient = 0.97, cycle RMS = 12°, peak flexion difference = 12°) flexion during the landing phase of the drop vertical jump and for hip abduction/adduction (correlation coefficient = 0.99, cycle RMS error = 7°, peak flexion difference = 8°) during isolated hip motion. Nonsagittal hip and knee angles did not correlate well for the drop vertical jump. When limited to activities in the optimal capture volume and with simple modifications to the skeletal model, the Kinect 2.0 skeletal tracker can provide limited 3-dimensional kinematic information of the lower limbs that may be useful for functional movement assessment.
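The agreement metrics quoted above (correlation coefficient and cycle RMS error between two joint-angle traces) can be computed as follows; the knee-flexion curves here are synthetic stand-ins for the Kinect and marker-based traces.

```python
import numpy as np

def agreement(ref_deg, test_deg):
    """Correlation coefficient and RMS error between two joint-angle traces."""
    r = np.corrcoef(ref_deg, test_deg)[0, 1]
    rms = np.sqrt(np.mean((ref_deg - test_deg) ** 2))
    return r, rms

t = np.linspace(0.0, 1.0, 100)          # normalized movement cycle
ref = 60.0 * np.sin(np.pi * t)          # synthetic marker-based knee flexion
test = ref + 5.0                        # synthetic Kinect trace, 5 deg offset
r, rms = agreement(ref, test)
```

Note that a constant offset leaves the correlation coefficient at 1.0 while the RMS error reflects it fully, which is why the study reports both quantities.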

  20. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    Science.gov (United States)

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement to this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising to promote the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
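The risk assessment above uses the revised NIOSH lifting equation, which multiplies a load constant by task-geometry multipliers. A sketch in metric units, with the frequency and coupling multipliers supplied as inputs rather than looked up from the published tables:

```python
def niosh_rwl(H, V, D, A, FM=1.0, CM=1.0):
    """Recommended weight limit (kg) from the revised NIOSH lifting equation.
    H: horizontal distance (cm), V: vertical height of the hands (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees).
    FM, CM: frequency and coupling multipliers from the published tables."""
    LC = 23.0                           # load constant (kg)
    HM = min(1.0, 25.0 / H)             # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)    # vertical multiplier
    DM = min(1.0, 0.82 + 4.5 / D)       # distance multiplier
    AM = 1.0 - 0.0032 * A               # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl_kg):
    """Lifting index; values above 1 indicate elevated risk."""
    return load_kg / rwl_kg
```

The "risk multipliers" compared between the two capture systems are exactly the geometric terms (HM, VM, DM, AM), which is why posture-measurement accuracy feeds directly into the computed risk.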

  1. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system

  2. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
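
    One simple check of the kind the abstract calls for is verifying that a computed equilibrium phase assemblage conserves the elemental inventory. A minimal sketch follows; the function name and data layout are illustrative, not the tool's API:

    ```python
    def mass_balance_residuals(n_species, stoich, b):
        """Relative elemental mass-balance residuals for a computed equilibrium.

        n_species : moles of each species in the equilibrium assemblage
        stoich    : stoich[i][j] = atoms of element j in species i
        b         : total input moles of each element

        A correct equilibrium solution must return residuals near machine
        precision for every element.
        """
        n_elem = len(b)
        resid = []
        for j in range(n_elem):
            total = sum(n_species[i] * stoich[i][j] for i in range(len(n_species)))
            resid.append((total - b[j]) / b[j])
        return resid
    ```

    For example, an H–O system whose input inventory is 3 mol H and 1.5 mol O is conserved by the assemblage {1 H2O, 0.5 H2, 0.25 O2}, so all residuals vanish.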

  3. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  4. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  5. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  6. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
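
    The algorithm verified in this work is the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith. Its core adjustment, the "egocentric mean" that tolerates Byzantine-faulty clocks by discarding implausible readings, can be sketched as follows (a minimal illustration, not the verified EHDM specification):

    ```python
    def egocentric_mean(diffs, delta):
        """Interactive-convergence clock adjustment: average the observed
        differences between peer clocks and our own, replacing any reading
        larger in magnitude than delta (a presumed-faulty clock) with 0.
        Each node applies the returned correction to its own clock."""
        clipped = [d if abs(d) <= delta else 0.0 for d in diffs]
        return sum(clipped) / len(clipped)
    ```

    With at most one faulty clock out of three, a wildly wrong reading (say 100 time units when delta is 10) is clipped to 0 and contributes nothing to the correction.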

  7. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  9. Self-verification and depression among youth psychiatric inpatients.

    Science.gov (United States)

    Joiner, T E; Katz, J; Lew, A S

    1997-11-01

    According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.

  10. Technology for bolus verification in proton therapy

    Science.gov (United States)

    Shipulin, K. N.; Mytsin, G. V.; Agapov, A. V.

    2015-01-01

    To ensure the conformal depth-dose distribution of a proton beam within a target volume, complex shaped range shifters (so-called boluses), which account for the heterogeneous structure of patient tissue and organs in the beam path, were calculated and manufactured. The precise manufacturing of proton compensators used for patient treatment is a vital step in quality assurance in proton therapy. In this work a software-hardware complex that verifies the quality and precision of bolus manufacturing at the Medico-Technical Complex (MTC) was developed. The verification complex consisted of a positioning system with two photoelectric sensors. We evaluated 20 boluses used in proton therapy of five patients. A total number of 2562 experimental points were measured, of which only two points had values that differed from the calculated value by more than 0.5 mm. The other data points displayed a deviation within ±0.5 mm from the calculated value. The technology for bolus verification developed in this work can be used for the high precision testing of geometrical parameters of proton compensators in radiotherapy.
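
    The pass/fail criterion described (deviation from the calculated thickness within ±0.5 mm) amounts to a tolerance check over the measured points. A minimal sketch with illustrative names:

    ```python
    def check_bolus(measured_mm, calculated_mm, tol=0.5):
        """Return (index, deviation) for every measured bolus point whose
        deviation from the calculated thickness exceeds the tolerance (mm)."""
        return [(i, m - c)
                for i, (m, c) in enumerate(zip(measured_mm, calculated_mm))
                if abs(m - c) > tol]
    ```

    An empty result means the compensator passes; in the study above, 2560 of 2562 points would pass such a check.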

  11. Smartphone User Identity Verification Using Gait Characteristics

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2016-09-01

    Smartphone-based biometrics offers a wide range of possible solutions, which could be used to authenticate users and thus to provide an extra level of security and theft prevention. We propose a method for positive identification of a smartphone user’s identity using the user’s gait characteristics captured by embedded smartphone sensors (gyroscopes, accelerometers). The method is based on the application of the Random Projections method for feature dimensionality reduction to just two dimensions. Then, a probability distribution function (PDF) of the derived features is calculated and compared against the known user PDF. The Jaccard distance is used to evaluate the distance between the two distributions, and the decision is taken based on thresholding. The results for subject recognition are at an acceptable level: we have achieved a grand mean Equal Error Rate (EER) for subject identification of 5.7% (using the USC-HAD dataset). Our findings represent a step towards improving the performance of gait-based user identity verification technologies.
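
    The verification pipeline described (random projection to two dimensions, PDF estimation, Jaccard distance, thresholding) can be sketched as follows. The projection matrix, histogram representation, and threshold value are illustrative assumptions, not the authors' exact implementation:

    ```python
    import random

    def random_projection(features, dim=2, seed=0):
        """Project each high-dimensional feature vector to `dim` dimensions
        using a fixed random Gaussian matrix (Random Projections method)."""
        rng = random.Random(seed)
        d = len(features[0])
        R = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(dim)]
        return [[sum(r_k * x_k for r_k, x_k in zip(row, x)) for row in R]
                for x in features]

    def jaccard_distance(p, q):
        """Jaccard distance between two discrete distributions (histograms
        over the same bins): 1 - sum(min)/sum(max)."""
        inter = sum(min(a, b) for a, b in zip(p, q))
        union = sum(max(a, b) for a, b in zip(p, q))
        return 1.0 - inter / union

    def verify(pdf_claim, pdf_enrolled, threshold=0.3):
        """Accept the identity claim if the gait PDFs are close enough."""
        return jaccard_distance(pdf_claim, pdf_enrolled) <= threshold
    ```

    Identical distributions give distance 0 (accept); disjoint distributions give distance 1 (reject). Tuning the threshold trades false accepts against false rejects, which is what the EER summarises.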

  12. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  13. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  14. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  15. X-ray-assisted positioning of patients treated by conformal arc radiotherapy for prostate cancer: Comparison of setup accuracy using implanted markers versus bony structures

    International Nuclear Information System (INIS)

    Soete, Guy; Cock, Mieke de; Verellen, Dirk; Michielsen, Dirk; Keuppens, Frans; Storme, Guy

    2007-01-01

    Purpose: The aim of this study was to compare setup accuracy of NovalisBody stereoscopic X-ray positioning using implanted markers in the prostate vs. bony structures in patients treated with dynamic conformal arc radiotherapy for prostate cancer. Methods and Materials: Random and systematic setup errors (RE and SE) of the isocenter with regard to the center of gravity of three fiducial markers were measured by means of orthogonal verification films in 120 treatment sessions in 12 patients. Positioning was performed using NovalisBody semiautomated marker fusion. The results were compared with a control group of 261 measurements in 15 patients who were positioned with NovalisBody automated bone fusion. In addition, interfraction and intrafraction prostate motion was registered in the patients with implanted markers. Results: Marker-based X-ray positioning resulted in a reduction of RE as well as SE in the anteroposterior, craniocaudal, and left-right directions compared with those in the control group. The interfraction prostate displacements with regard to the bony pelvis that could be avoided by marker positioning ranged between 1.6 and 2.8 mm for RE and between 1.3 and 4.3 mm for SE. Intrafraction random and systematic prostate movements ranged between 1.4 and 2.4 mm and between 0.8 and 1.3 mm, respectively. Conclusion: The problem of interfraction prostate motion can be solved by using implanted markers. In addition, the NovalisBody X-ray system performs more accurately with markers compared with bone fusion. Intrafraction organ motion has become the limiting factor for margin reduction around the clinical target volume
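
    The random and systematic setup errors (RE and SE) reported above are conventionally derived from per-fraction displacements using the van Herk formalism: the population systematic error is the spread of the per-patient means, and the random error is the root mean square of the per-patient standard deviations. A minimal sketch for one axis (illustrative, not the authors' exact analysis):

    ```python
    import math

    def setup_error_components(displacements_by_patient):
        """Population systematic (Sigma) and random (sigma) setup error
        along one axis, from per-fraction displacements (mm) grouped by
        patient. Sigma = SD of the per-patient means; sigma = RMS of the
        per-patient SDs (van Herk formalism)."""
        means, sds = [], []
        for d in displacements_by_patient:
            m = sum(d) / len(d)
            means.append(m)
            sds.append(math.sqrt(sum((x - m) ** 2 for x in d) / (len(d) - 1)))
        grand = sum(means) / len(means)
        Sigma = math.sqrt(sum((m - grand) ** 2 for m in means) / (len(means) - 1))
        sigma = math.sqrt(sum(s * s for s in sds) / len(sds))
        return Sigma, sigma
    ```

    Two patients with perfectly reproducible but offset setups (all fractions at 0 mm vs. all at 2 mm) yield a purely systematic error and zero random error.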

  16. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  17. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  18. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  19. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  20. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  2. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  3. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  4. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  5. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  6. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  7. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  8. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  9. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  10. Rapid identification of red-flesh loquat cultivars using EST-SSR markers based on manual cultivar identification diagram strategy.

    Science.gov (United States)

    Li, X Y; Xu, H X; Chen, J W

    2014-04-29

    Manual cultivar identification diagram is a new strategy for plant cultivar identification based on DNA markers, providing information to efficiently separate cultivars. We tested 25 pairs of apple EST-SSR primers for amplification of PCR products from loquat cultivars. These EST-SSR primers provided clear amplification products from the loquat cultivars, with a relatively high transferability rate of 84% to loquat; 11 pairs of primers amplified polymorphic products. After analysis of 24 red-fleshed loquat accessions, we found that only 7 pairs of primers could clearly separate all of them. A cultivar identification diagram of the 24 cultivars was constructed using polymorphic bands from the DNA fingerprints and EST-SSR primers. Any two of the 24 cultivars could be rapidly separated from each other, according to the polymorphic bands from the cultivars; the corresponding primers were marked in the correct position on the cultivar identification diagram. This red-flesh loquat cultivar identification diagram can separate the 24 red-flesh loquat cultivars, which is of benefit for loquat cultivar identification for germplasm management and breeding programs.

  11. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  12. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients, which make it possible to spare organs at risk while escalating the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution or to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. To measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films), and the film characteristics were carefully examined. Several types of tissue-equivalent phantoms were developed. A methodology for comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement, using the so-called gamma formalism. The results of these comparisons for a group of over 600 patients are presented; agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
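    The 3%/3 mm comparison mentioned above is the gamma formalism. A minimal 1D sketch of a global gamma computation (an illustration, not the Centre's actual implementation) could look like:

```python
import numpy as np

def gamma_1d(positions, dose_ref, dose_eval, dd=0.03, dta=3.0):
    """Global 1D gamma index: dd is the dose-difference criterion as a
    fraction of the maximum reference dose, dta the distance-to-agreement
    in mm. Positions are in mm, on a common grid for both distributions."""
    d_crit = dd * dose_ref.max()
    gamma = np.empty_like(dose_ref, dtype=float)
    for i in range(len(positions)):
        dist_term = ((positions - positions[i]) / dta) ** 2
        dose_term = ((dose_eval - dose_ref[i]) / d_crit) ** 2
        # Minimum over all evaluated points of the combined criterion:
        gamma[i] = np.sqrt((dist_term + dose_term).min())
    return gamma
```

A point passes when its gamma value is at most 1; the pass rate over all points, `(gamma <= 1).mean()`, is the figure usually reported.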

  13. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin, F.-F.; Gao, Q.H.; Xie, H.; Nelson, D.F.; Yu, Y.; Kwok, W.E.; Totterman, S.; Schell, M.C.; Rubin, P.

    1996-01-01

    and marrow information within the skull. Next, a ray-tracing method is used to generate a projection (pseudo-portal) image at the planned treatment position; in this situation, the ray-tracing is performed on pixel values rather than attenuation coefficients. The skull and its relative positions are also projected onto the pseudo-portal image and are used as a 'hint' for the search for similar features in the portal images. A Canny edge detector is applied to the region of the treatment field to enhance the brain contour and skull. The skull is then identified using a snake technique guided by the 'hint', the projected features from the MR images. Finally, a Chamfer matching technique is used to correlate features between the MR projection and portal images. Results: The MR image-guided portal verification technique is evaluated on a clinical case of a patient with an astrocytoma brain tumor treated by radiation therapy. The segmentation results for the brain MR slice images indicate that a wavelet-based image segmentation technique provides a reasonable estimate of the skull. Compared with the brain portal image, the projection images generated with this method reproduce the skull structure to within about 3 mm. Overall matching results are within 2 mm of those obtained between portal and simulation images. In addition, the tumor volume can be accurately visualized in the projection image and mapped over to the portal images for treatment verification with this approach. Conclusions: A method for MR image-guided portal verification of brain treatment fields is being developed. Although the projection image from MR images does not have the same radiographic appearance as portal images, it provides certain essential anatomical features (landmarks and gross tumor) as well as their relative locations to be used as references for computerized portal verification
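    Chamfer matching scores a template against detected edges by the distance from each template point to its nearest edge point. A brute-force sketch, where the hypothetical `best_shift` grid search stands in for the iterative optimization a real Chamfer matcher would use:

```python
import numpy as np

def chamfer_score(edge_pts, template_pts, shift):
    """Mean distance from each shifted template point to its nearest
    detected edge point; lower means a better match."""
    shifted = template_pts + shift
    d = np.linalg.norm(edge_pts[None, :, :] - shifted[:, None, :], axis=2)
    return d.min(axis=1).mean()

def best_shift(edge_pts, template_pts, candidates):
    """Exhaustive search over candidate shifts (a stand-in for gradient or
    coarse-to-fine optimization)."""
    return min(candidates,
               key=lambda s: chamfer_score(edge_pts, template_pts,
                                           np.asarray(s, float)))
```

In practice the edge map's distance transform is precomputed so each candidate pose is scored in constant time per template point; the brute-force pairwise distances here keep the sketch self-contained.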

  14. Confidence-increasing elements in user instructions: Seniors' reactions to verification steps and personal stories

    NARCIS (Netherlands)

    Loorbach, N.R.; Karreman, Joyce; Steehouder, M.F.

    2013-01-01

    Purpose: Research shows that confidence-increasing elements in user instructions positively influence senior users' task performance and motivation. We added verification steps and personal stories to user instructions for a cell phone, to find out how seniors (between 60 and 70 years) perceive

  15. Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.

    Science.gov (United States)

    Okun, Morris A.; Fournet, Lee M.

    1993-01-01

    The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)

  16. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  17. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast, such that the effect of motion blurring on the detected position of a moving field edge is limited. In the past, it was shown that the errors in the position of a moving field edge detected by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice, and a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability for leaf trajectory verification during dynamic IMRT. We found the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s; hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied for the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that had also been found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s
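    The reported leaf positional errors are consistent with a simple first-order model in which a constant control-loop delay makes a moving leaf lag its planned position by speed times delay:

```python
def leaf_positional_error(leaf_speed_cm_s, control_delay_s):
    """First-order model: a constant control-loop delay makes a leaf moving
    at constant speed lag its planned position by speed * delay."""
    return leaf_speed_cm_s * control_delay_s

# The reported 0.7 cm error at 0.8 cm/s is consistent, under this model,
# with a control-system delay of about 0.9 s:
implied_delay_s = 0.7 / 0.8  # = 0.875 s
```

This back-of-the-envelope relation is an interpretation of the abstract's numbers, not a model the authors state explicitly.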

  18. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  19. Verification of Kaplan turbine cam curves realization accuracy at power plant

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane

    2016-01-01

    Sustaining an approximately constant Kaplan turbine efficiency over relatively large net head changes is a result of the turbine runner's variable geometry. The dependence of runner blade position on guide vane opening represents the turbine cam curve. The accuracy of cam curve realization is of great importance for the efficient and proper exploitation of the turbines and, consequently, of the complete units. For these reasons, special attention has been given to the tests designed for cam curve verification. The goal of this paper is to describe the methodology and the results of the tests performed in the process of Kaplan turbine cam curve verification.
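    On-cam verification amounts to comparing the measured runner blade position with the cam-curve value at the same guide vane opening. A sketch with a hypothetical cam curve (real curves are tabulated per net head; the numbers below are invented):

```python
import numpy as np

# Hypothetical cam curve: runner blade angle (deg) tabulated against guide
# vane opening (%), for one fixed net head.
openings = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
blade_angles = np.array([-10.0, -2.0, 6.0, 14.0, 22.0])

def cam_blade_angle(opening_pct):
    """Blade angle prescribed by the cam curve (linear interpolation)."""
    return float(np.interp(opening_pct, openings, blade_angles))

def cam_deviation(opening_pct, measured_angle_deg):
    """Verification quantity: measured blade angle minus the cam-curve
    value at the same guide vane opening."""
    return measured_angle_deg - cam_blade_angle(opening_pct)
```

A verification test then reports `cam_deviation` across the operating range and checks it stays within the tolerance agreed for the unit.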

  20. FIR signature verification system characterizing dynamics of handwriting features

    Science.gov (United States)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system that characterizes the time-frequency behavior of dynamic handwriting features. First, the barycenter determined from the center point of the signature and two adjacent pen-point positions in the signing process, instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. Among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, that relation is represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system serves as the individual feature for a particular signature. In short, a signature is verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
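    The core identification step, finding the FIR impulse response that maps one signal onto another, can be sketched with ordinary least squares. The paper applies this to wavelet coefficients of the handwriting features; this sketch uses plain synthetic signals:

```python
import numpy as np

def fit_fir(x, y, taps):
    """Least-squares FIR identification: find h minimizing
    || y[n] - sum_k h[k] * x[n-k] ||  over all n."""
    # Column k of X is the input delayed by k samples (zero-padded).
    X = np.column_stack([np.concatenate([np.zeros(k), x[:len(x) - k]])
                         for k in range(taps)])
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

# Recover a known 3-tap response from synthetic input/output data:
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
h_true = np.array([0.5, -0.25, 0.1])
y = np.convolve(x, h_true)[:len(x)]
h_est = fit_fir(x, y, 3)
```

Verification then compares the impulse response estimated from a questioned signature against the one stored for the reference signature, e.g. by a norm of their difference against a threshold.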

  1. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  2. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item entering a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime
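    As a toy illustration of the attribute idea, a measured 240Pu/239Pu ratio can be turned into a grade attribute under a two-isotope simplification. The 7% threshold below is a commonly cited weapons-grade boundary, assumed here for illustration rather than taken from the abstract:

```python
def pu240_fraction(ratio_240_239):
    """Two-isotope simplification: the 240Pu fraction of the combined
    239Pu + 240Pu inventory implied by a measured 240Pu/239Pu ratio.
    Real assays account for the other Pu isotopes as well."""
    return ratio_240_239 / (1.0 + ratio_240_239)

def looks_weapons_grade(ratio_240_239, threshold=0.07):
    # 7% 240Pu is a commonly cited weapons-grade boundary; the threshold
    # here is an assumption for illustration, not a figure from the paper.
    return pu240_fraction(ratio_240_239) < threshold
```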

  3. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    International Nuclear Information System (INIS)

    Acero, R; Pueo, M; Santolaria, J; Aguilar, J J; Brau, A

    2015-01-01

    High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points, and of the reference lengths derived from them, relies on the indexed metrology platform and on high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. It is the measuring instrument, together with the indexed metrology platform, that remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker on the indexed metrology platform. The results obtained for the volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures. (paper)
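    The virtual-mesh idea can be sketched by applying known indexing rotations to a single materialized point and taking pairwise distances as reference lengths. This toy version assumes a single-axis rotation, whereas the real platform's pose knowledge is a full six-degree-of-freedom transform:

```python
import numpy as np
from itertools import combinations

def rot_z(deg):
    """Rotation matrix about the z axis (the assumed indexing axis)."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def virtual_points(p, index_angles_deg):
    """Apply the platform's known indexing rotations to one materialized
    point, yielding a mesh of virtual points around the fixed instrument."""
    return [rot_z(a) @ np.asarray(p, float) for a in index_angles_deg]

def virtual_distances(points):
    """Reference lengths: all pairwise distances between virtual points."""
    return [float(np.linalg.norm(a - b)) for a, b in combinations(points, 2)]
```

For a point at unit radius and 60-degree indexing steps, the generated reference lengths are the chords 2*sin(delta/2) of the rotation angles between positions.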

  4. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    Science.gov (United States)

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points, and of the reference lengths derived from them, relies on the indexed metrology platform and on high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. It is the measuring instrument, together with the indexed metrology platform, that remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker on the indexed metrology platform. The results obtained for the volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures.

  5. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes on an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it was shown to successfully check most of the anomalies that can occur in a knowledge base
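    One anomaly a reachability analysis catches is dead (unfireable) rules. As a much simpler stand-in for the enhanced colored Petri net machinery, a forward-chaining sketch over hypothetical rules:

```python
def dead_rules(rules, initial_facts):
    """Forward-chain from the initial facts; rules that can never fire are
    dead (unreachable) knowledge, one of the anomaly classes a
    reachability analysis looks for. Each rule is (conditions, conclusion)."""
    known = set(initial_facts)
    fired = set()
    changed = True
    while changed:
        changed = False
        for i, (conditions, conclusion) in enumerate(rules):
            if i not in fired and set(conditions) <= known:
                fired.add(i)
                known.add(conclusion)
                changed = True
    return sorted(set(range(len(rules))) - fired)

# Hypothetical alarm-processing rules; names are invented for illustration.
rules = [(["valve_open"], "flow"),          # rule 0
         (["flow"], "alarm_reset"),         # rule 1
         (["never_true_input"], "orphan")]  # rule 2: dead if its input
                                            # can never become known
```

A Petri net encoding additionally detects conflicts and cycles; this sketch only shows the reachability side of the analysis.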

  6. Independent verification of the delivered dose in High-Dose Rate (HDR) brachytherapy

    International Nuclear Information System (INIS)

    Portillo, P.; Feld, D.; Kessler, J.

    2009-01-01

    An important aspect of a Quality Assurance program in clinical dosimetry is an independent verification of the dosimetric calculation done by the treatment planning system for each radiation treatment. The present paper is aimed at creating a spreadsheet for verification of the dose recorded at a point in an implant with radioactive sources in HDR treatment of gynecological lesions. A GammaMedplus (Varian Medical Systems) HDR automatic remote afterloading unit with a 192Ir source, installed at the Angel H. Roffo Oncology Institute, has been used. The planning system used to obtain the dose distribution is BraquiVision. The source coordinates, as well as those of the calculation point (rectum), are entered into the Excel-based verification program, assuming a point source at each of the applicator positions. This calculation point has been selected because the rectum is an organ at risk and therefore constrains the treatment planning. Dose verification is performed at points whose distance from the sources is at least twice the sources' active length, so that the sources may be regarded as point sources. Most sources used in HDR brachytherapy with 192Ir have a 5 mm active length for all equipment brands; consequently, the dose verification distance must be at least 10 mm. (author)
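    Under the stated point-source condition (verification points at least twice the 5 mm active length away, i.e. at least 10 mm), an independent check reduces to an inverse-square sum over dwell positions. A schematic sketch that deliberately omits the TG-43 dose-rate constant, anisotropy and radial dose functions a clinical spreadsheet would include:

```python
import math

def point_dose_sum(dwells, point):
    """Point-source check: sum each dwell position's contribution, weighted
    by dwell time and falling off as 1/r^2. Valid only where the distance r
    is at least ~2x the active source length (>= 10 mm for 5 mm sources).
    Units and constants are schematic, not clinical."""
    total = 0.0
    for x, y, z, time_weight in dwells:
        r = math.dist((x, y, z), point)
        total += time_weight / r ** 2
    return total
```

The spreadsheet value at the rectum point is then compared against the TPS dose at the same coordinates, with an agreed tolerance for the point-source approximation.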

  7. A feasible method for clinical delivery verification and dose reconstruction in tomotherapy

    International Nuclear Information System (INIS)

    Kapatoes, J.M.; Olivera, G.H.; Ruchala, K.J.; Smilowitz, J.B.; Reckwerdt, P.J.; Mackie, T.R.

    2001-01-01

    Delivery verification is the process in which the energy fluence delivered during a treatment is verified. This verified energy fluence can be used in conjunction with an image in the treatment position to reconstruct the full three-dimensional dose deposited. A method for delivery verification that utilizes a measured database of detector signal is described in this work. This database is a function of two parameters, radiological path-length and detector-to-phantom distance, both of which are computed from a CT image taken at the time of delivery. Such a database was generated and used to perform delivery verification and dose reconstruction. Two experiments were conducted: a simulated prostate delivery on an inhomogeneous abdominal phantom, and a nasopharyngeal delivery on a dog cadaver. For both cases, it was found that the verified fluence and dose results using the database approach agreed very well with those using previously developed and proven techniques. Delivery verification with a measured database and CT image at the time of treatment is an accurate procedure for tomotherapy. The database eliminates the need for any patient-specific, pre- or post-treatment measurements. Moreover, such an approach creates an opportunity for accurate, real-time delivery verification and dose reconstruction given fast image reconstruction and dose computation tools
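    The database lookup described, detector signal as a function of radiological path-length and detector-to-phantom distance, can be sketched as bilinear interpolation over a measured grid. Axis values and the database itself are hypothetical here:

```python
import numpy as np

def detector_signal(db, pathlengths, distances, pl, dist):
    """Bilinear interpolation in a measured detector-signal database
    indexed by radiological path-length and detector-to-phantom distance
    (both computed from the CT image taken at the time of delivery)."""
    i = int(np.clip(np.searchsorted(pathlengths, pl) - 1, 0, len(pathlengths) - 2))
    j = int(np.clip(np.searchsorted(distances, dist) - 1, 0, len(distances) - 2))
    tx = (pl - pathlengths[i]) / (pathlengths[i + 1] - pathlengths[i])
    ty = (dist - distances[j]) / (distances[j + 1] - distances[j])
    return ((1 - tx) * (1 - ty) * db[i, j] + tx * (1 - ty) * db[i + 1, j]
            + (1 - tx) * ty * db[i, j + 1] + tx * ty * db[i + 1, j + 1])
```

Comparing the interpolated (expected) signal against the detector reading verifies the delivered energy fluence without any patient-specific pre- or post-treatment measurement.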

  8. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The different measurement methods and their limits of accuracy are analyzed in the paper

  9. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that the LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable independent verification of LLW shipping records and so ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  10. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving it by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  11. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach to feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of the facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, consisting of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in face verification performance.
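    The final verification step, deciding kin or not-kin from learned representations, can be sketched generically as thresholded cosine distance. This stands in for the paper's multi-layer neural network on fcDBN features; the threshold is an invented illustration value:

```python
import numpy as np

def verify_kin(feat_a, feat_b, threshold=0.5):
    """Declare 'kin' when the cosine distance between learned face
    representations falls below a threshold. A generic stand-in for the
    paper's trained classifier; 0.5 is an arbitrary illustration value."""
    a = feat_a / np.linalg.norm(feat_a)
    b = feat_b / np.linalg.norm(feat_b)
    return bool(1.0 - a @ b < threshold)
```

In a real system the threshold is chosen on a validation set to balance false accepts against false rejects (the equal-error-rate operating point).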

  12. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    OpenAIRE

    Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.

    2012-01-01

    The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived ...

  13. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  14. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

  15. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  16. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  17. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  18. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of core power capability verification for pressurized water reactor nuclear power plant reloads are introduced. The radial and axial power distributions for normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the points of view of reactor physics and thermal-hydraulics, and thus the category I operating domain and the category II protection set points are verified. Besides, the verification results of a reference NPP are also given.

  19. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  20. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  1. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  2. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  3. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  4. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  6. IAEA inspectors complete verification of nuclear material in Iraq

    International Nuclear Information System (INIS)

    2004-01-01

    Full text: At the request of the Government of Iraq and pursuant to the NPT Safeguards Agreement with Iraq, a team of IAEA safeguards inspectors has completed the annual Physical Inventory Verification of declared nuclear material in Iraq, and is returning to Vienna. The material - natural or low-enriched uranium - is not sensitive from a proliferation perspective and is consolidated at a storage facility near the Tuwaitha complex, south of Baghdad. This inspection was conducted with the logistical and security assistance of the Multinational Force and the Office of the UN Security Coordinator. Inspections such as this are required by safeguards agreements with every non-nuclear-weapon state party to the NPT that has declared holdings of nuclear material, to verify the correctness of the declaration, and that material has not been diverted to any undeclared activity. Such inspections have been performed in Iraq on a continuing basis. The most recent took place in June 2003, following reports of looting of nuclear material at the Tuwaitha complex; IAEA inspectors recovered, repackaged and resealed all but a minute amount of material. NPT safeguards inspections are limited in scope and coverage as compared to the verification activities carried out in 1991-98 and 2002-03 by the IAEA under Security Council resolution 687 and related resolutions. 'This week's mission was a good first step,' IAEA Director General Mohamed ElBaradei said. 'Now we hope to be in a position to complete the mandate entrusted to us by the Security Council, to enable the Council over time to remove all sanctions and restrictions imposed on Iraq - so that Iraq's rights as a full-fledged member of the international community can be restored.' The removal of remaining sanctions is dependent on completion of the verification process by the IAEA and the UN Monitoring, Verification and Inspection Commission (UNMOVIC). It should be noted that IAEA technical assistance to Iraq has been resumed over

  7. Adobe Photoshop images software in the verification of radiation portal

    International Nuclear Information System (INIS)

    Ouyang Shuigen; Wang Xiaohu; Liu Zhiqiang; Wei Xiyi; Qi Yong

    2010-01-01

    Objective: To investigate the value of Adobe Photoshop image software in the verification of radiation portals. Methods: The portal and simulation films or CT reconstruction images were imported into a computer using a scanner. The image size, gray scale and contrast scale were adjusted with Adobe Photoshop image software, then image registration and measurement were completed. Results: By comparison between portal images and simulation images, the set-up errors in the right-left, superior-inferior and anterior-posterior directions were (1.11 ± 1.37) mm, (1.33 ± 1.25) mm and (0.83 ± 0.79) mm in the head and neck; (1.44 ± 1.03) mm, (1.6 ± 1.52) mm and (1.34 ± 1.17) mm in the thorax; (1.53 ± 0.86) mm, (1.83 ± 1.19) mm and (1.67 ± 0.68) mm in the abdomen; and (1.93 ± 1.83) mm, (1.59 ± 1.07) mm and (0.85 ± 0.72) mm in the pelvic cavity. Conclusions: Accurate radiation portal verification and position measurement can be completed using Adobe Photoshop, which is a simple, safe and reliable method. (authors)
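    The figures above are the mean ± standard deviation of per-fraction translational offsets measured between portal and simulation images. As a minimal sketch of how such a summary is computed (the offset values below are hypothetical, not the study's data):

    ```python
    # Minimal sketch: summarising set-up errors as mean ± sample SD.
    # The offset values are hypothetical, not taken from the study.
    from statistics import mean, stdev

    offsets_mm = [0.2, 1.5, 2.3, 0.9, 1.1, 0.7]  # per-fraction right-left shifts (mm)
    summary = f"({mean(offsets_mm):.2f} ± {stdev(offsets_mm):.2f}) mm"
    print(summary)  # (1.12 ± 0.72) mm
    ```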

  8. Gated Treatment Delivery Verification With On-Line Megavoltage Fluoroscopy

    International Nuclear Information System (INIS)

    Tai An; Christensen, James D.; Gore, Elizabeth; Khamene, Ali; Boettger, Thomas; Li, X. Allen

    2010-01-01

    Purpose: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. Methods and Materials: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. Results: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. Conclusions: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.

  9. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  10. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  11. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for specification and verification of distributed systems. µCRL allows one to describe temporal properties of distributed systems but it has no explicit reference to time. In this work we propose a manner of introducing discrete time without extending the language.

  12. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  13. Verification of Software Components: Addressing Unbounded Paralelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  14. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires...

  15. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  16. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article. Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  17. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01

    pg. 215-226, Springer-Verlag: London, UK, 2001. [4] Nicolas Halbwachs and Louis Mandel, Simulation and Verification of Asynchronous Systems by...Huang, S. A. Smolka, W. Tan, and S. Tripakis, Deep Random Search for Efficient Model Checking of Timed Automata, in Proceedings of the 13th Monterey

  18. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords: formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware; Software

  19. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  20. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  1. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  2. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed

  3. Zero leakage quantization scheme for biometric verification

    NARCIS (Netherlands)

    Groot, de J.A.; Linnartz, J.P.M.G.

    2011-01-01

    Biometrics gain increasing interest as a solution for many security issues, but privacy risks exist in case we do not protect the stored templates well. This paper presents a new verification scheme, which protects the secrets of the enrolled users. We will show that zero leakage is achieved if

  4. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper, which encloses the cylinder completely or partially filled with water from below, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all this paper solves a theoretical…

  5. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office', and tested a wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. In the verification test, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking and dense-graded asphalt pavement when applied to roads. When applied to rubble, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found attached to the fine sludge scraped off from the decontamination object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less of that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  6. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    Full Text Available This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  7. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models where also a combination of models is taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de
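    Materials-balance verification of this kind rests on testing whether the measured MUF (material unaccounted for) of a balance period is significant relative to its measurement uncertainty. A hedged single-period sketch (the figures, the function name and the 1.96-sigma threshold are illustrative assumptions, not the paper's models or plant data):

    ```python
    # Illustrative sketch of a single-period materials-balance significance test.
    # All figures and the alarm threshold are hypothetical.
    def muf_alarm(begin_inv, receipts, shipments, end_inv, sigma_muf, k=1.96):
        """Return the MUF and whether it exceeds k standard deviations."""
        muf = begin_inv + receipts - shipments - end_inv  # material unaccounted for
        return muf, abs(muf) > k * sigma_muf

    muf, alarm = muf_alarm(100.0, 50.0, 48.0, 101.2, sigma_muf=0.5)
    print(round(muf, 1), alarm)  # 0.8 False
    ```

    A diversion model would then trade off the false-alarm rate (set by k) against the detection probability for a postulated diverted amount.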

  8. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Abstract. Background: A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods: A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area-under-the-curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results: In a single simulated data set, varying false negatives from 0 to 4 led to verification bias corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th – 97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion: Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
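    The instability described can be illustrated with inverse-probability weighting of the Begg–Greenes kind: each verified test-negative stands in for 1/p unverified ones, so a change of one or two verified false negatives swings the corrected estimate. A sketch with hypothetical counts, using corrected sensitivity as a simpler analogue of the study's AUC simulations:

    ```python
    # Hedged sketch: verification-bias correction by inverse-probability weighting.
    # Counts are hypothetical; the study itself simulates AUC, not sensitivity.
    def corrected_sensitivity(n_tp_verified, n_fn_verified, p_verify_negatives):
        """Weight each verified test-negative by 1/p to stand in for unverified ones."""
        weighted_fn = n_fn_verified / p_verify_negatives
        return n_tp_verified / (n_tp_verified + weighted_fn)

    # 990 verified true positives; test-negatives verified at a 10% rate.
    # Varying the verified false negatives from 0 to 4 swings the estimate widely:
    estimates = [round(corrected_sensitivity(990, k, 0.10), 3) for k in (0, 1, 2, 4)]
    print(estimates)  # [1.0, 0.99, 0.98, 0.961]
    ```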

  9. Position Information

    Data.gov (United States)

    Social Security Administration — The Position Information Data Asset provides the ability to search for active SSA position descriptions using various search criteria. An individual may search by PD...

  10. Verification of the websites optimality

    OpenAIRE

    Hozjan, Boštjan

    2016-01-01

    Today, search engines are an important source of information for internet users. Whenever user performs a search, search engines display a vast number of results. Results are ranked by search engines’ own algorithms, which are not public. If a website owner wants his website to be found in search engines and consequently wants to generate traffic to the website through search engines, the website must appear among the first search results. Website’s position within the search engine result...

  11. Positive Psychology

    Science.gov (United States)

    Peterson, Christopher

    2009-01-01

    Positive psychology is a deliberate correction to the focus of psychology on problems. Positive psychology does not deny the difficulties that people may experience but does suggest that sole attention to disorder leads to an incomplete view of the human condition. Positive psychologists concern themselves with four major topics: (1) positive…

  12. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  13. Ubiquitous positioning

    CERN Document Server

    Mannings, Robin

    2008-01-01

    This groundbreaking resource offers a practical, in-depth understanding of Ubiquitous Positioning - positioning systems that identify the location and position of people, vehicles and objects in time and space in the digitized networked economy. The future and growth of ubiquitous positioning will be fueled by the convergence of many other areas of technology, from mobile telematics, Internet technology, and location systems, to sensing systems, geographic information systems, and the semantic web. This first-of-its-kind volume explores ubiquitous positioning from a convergence perspective, of

  14. A semi-automated 2D/3D marker-based registration algorithm modelling prostate shrinkage during radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Budiharto, Tom; Slagmolen, Pieter; Hermans, Jeroen; Maes, Frederik; Verstraete, Jan; Heuvel, Frank Van den; Depuydt, Tom; Oyen, Raymond; Haustermans, Karin

    2009-01-01

    Background and purpose: Currently, most available patient alignment tools based on implanted markers use manual marker matching and rigid registration transformations to measure the needed translational shifts. To quantify the particular effect of prostate gland shrinkage, implanted gold markers were tracked during a course of radiotherapy including an isotropic scaling factor to model prostate shrinkage. Materials and methods: Eight patients with prostate cancer had gold markers implanted transrectally and seven were treated with (neo) adjuvant androgen deprivation therapy. After patient alignment to skin tattoos, orthogonal electronic portal images (EPIs) were taken. A semi-automated 2D/3D marker-based registration was performed to calculate the necessary couch shifts. The registration consists of a rigid transformation combined with an isotropic scaling to model prostate shrinkage. Results: The inclusion of an isotropic shrinkage model in the registration algorithm cancelled the corresponding increase in registration error. The mean scaling factor was 0.89 ± 0.09. For all but two patients, a decrease of the isotropic scaling factor during treatment was observed. However, there was almost no difference in the translation offset between the manual matching of the EPIs to the digitally reconstructed radiographs and the semi-automated 2D/3D registration. A decrease in the intermarker distance was found correlating with prostate shrinkage rather than with random marker migration. Conclusions: Inclusion of shrinkage in the registration process reduces registration errors during a course of radiotherapy. Nevertheless, this did not lead to a clinically significant change in the proposed table translations when compared to translations obtained with manual marker matching without a scaling correction
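    The registration described combines a rigid transformation with an isotropic scaling fitted to the implanted-marker positions. As a simplified sketch (translation plus isotropic scale only, omitting the rotation component; marker coordinates are hypothetical), the shrinkage factor has a closed-form least-squares estimate:

    ```python
    # Hedged sketch: least-squares fit of measured ≈ s * planning + t over paired
    # marker positions. Rotation is omitted for brevity; coordinates are hypothetical.
    def fit_scale_translation(planning_pts, measured_pts):
        dims = len(planning_pts[0])
        n = len(planning_pts)
        cp = [sum(p[d] for p in planning_pts) / n for d in range(dims)]  # centroids
        cq = [sum(q[d] for q in measured_pts) / n for d in range(dims)]
        num = sum((p[d] - cp[d]) * (q[d] - cq[d])
                  for p, q in zip(planning_pts, measured_pts) for d in range(dims))
        den = sum((p[d] - cp[d]) ** 2 for p in planning_pts for d in range(dims))
        s = num / den                                 # isotropic scale (shrinkage)
        t = [cq[d] - s * cp[d] for d in range(dims)]  # translation (couch shift)
        return s, t

    # Markers shrunk by 11% about their centroid and shifted 2 mm right-left:
    planning = [(0, 0, 0), (30, 0, 0), (0, 30, 0), (0, 0, 30)]
    measured = [(0.89 * x + 2, 0.89 * y, 0.89 * z) for x, y, z in planning]
    s, t = fit_scale_translation(planning, measured)
    print(round(s, 2))  # 0.89
    ```

    A scale factor below 1 models gland shrinkage, mirroring the mean 0.89 reported; the fitted translation is what would drive the couch shift.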

  15. Positioning consumption

    DEFF Research Database (Denmark)

    Halkier, Bente; Keller, Margit

    2014-01-01

    This article analyses the ways in which media discourses become a part of contested consumption activities. We apply a positioning perspective with practice theory to focus on how practitioners relate to media discourse as a symbolic resource in their everyday practices. A typology of performance positionings emerges based on empirical examples of research in parent–children consumption. Positionings are flexible discursive fixations of the relationship between the performances of the practitioner, other practitioners, media discourse and consumption activities. The basic positioning types are the practice maintenance and the practice change position, with different sorts of adapting in between. Media discourse can become a resource for a resistant position against social control or for an appropriating position in favour of space for action. Regardless of the current relation to a particular media...

  16. Irradiation position-control equipment for the HIMAC

    Energy Technology Data Exchange (ETDEWEB)

    Higashi, Seiichi; Kuma, Shoichiro [Mitsubishi Electric Corp., Tokyo (Japan); Nomura, Kazuaki; Endo, Masahiro; Minohara, Shin-ichi

    1995-02-01

    Use of heavy-ion beams to mount a pinpoint attack on unhealthy tissue requires that the target tissue be placed in the precise location specified by the therapy planning equipment. The article reports on the detailed specifications, positioning mechanism, position verification method and the interface with the therapy planning equipment. (author).

  17. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
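    Verification of a numerical solution, as discussed in this record, often quantifies discretization error from systematically refined grids. A minimal sketch of one standard tool, the observed order of accuracy with Richardson extrapolation, assuming a scalar output computed on three grids with constant refinement ratio r (the function names are illustrative):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three
    systematically refined grids with constant refinement ratio r
    (h_coarse = r * h_medium = r**2 * h_fine)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson estimate of the grid-converged value, given the
    observed (or formal) order of accuracy p."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)
```

    If the observed order matches the scheme's formal order, the solutions are in the asymptotic range and the extrapolated value can serve as a highly accurate reference for error estimation.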

  18. Dose delivery verification and accuracy assessment of stereotaxy in stereotactic radiotherapy and radiosurgery

    International Nuclear Information System (INIS)

    Pelagade, S.M.; Bopche, T.T.; Namitha, K.; Munshi, M.; Bhola, S.; Sharma, H.; Patel, B.K.; Vyas, R.K.

    2008-01-01

    The outcome of stereotactic radiotherapy (SRT) and stereotactic radiosurgery (SRS) in both benign and malignant tumors within the cranial region depends highly on precision in dosimetry, dose delivery and the accuracy assessment of stereotaxy associated with the unit. The BRW (Brown-Roberts-Wells) and GTC (Gill-Thomas-Cosman) frames can facilitate accurate patient positioning as well as precise targeting of tumours. The implementation of this technique may result in a significant benefit as compared to conventional therapy. As the target localization accuracy is improved, the demand for treatment planning accuracy of a TPS is also increased. The accuracy of the stereotactic X-Knife treatment planning system has two components to verify: (i) dose delivery; and (ii) the accuracy of stereotaxy, i.e. ensuring that the associated Cartesian coordinate system is well established within the TPS for accurate determination of a target position. Both dose delivery accuracy and target positional accuracy affect the dose delivered to a defined target. Hence there is a need to verify these two components in a quality assurance protocol. The main intention of this paper is to present our dose delivery verification procedure using a cylindrical wax phantom and the accuracy assessment (target position) of stereotaxy using a geometric phantom on Elekta's Precise linear accelerator for stereotactic installation

  19. Monitoring, reporting and verification for national REDD + programmes: two proposals

    International Nuclear Information System (INIS)

    Herold, Martin; Skutsch, Margaret

    2011-01-01

    Different options have been suggested by Parties to the UNFCCC (United Nations Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  20. Monitoring, reporting and verification for national REDD + programmes: two proposals

    Energy Technology Data Exchange (ETDEWEB)

    Herold, Martin [Center for Geoinformation, Department of Environmental Science, Wageningen University, Droevendaalsesteeg 3, 6708 PB Wageningen (Netherlands); Skutsch, Margaret, E-mail: martin.herold@wur.nl [Centro de Investigaciones en GeografIa Ambiental, UNAM Campus Morelia (Mexico)

    2011-01-15

    Different options have been suggested by Parties to the UNFCCC (United Nations Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  1. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete

  2. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  3. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  4. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  5. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  6. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling...... and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized...... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
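    As a toy illustration of the swarm idea, not of Uppaal itself: several independently seeded randomized searches run in parallel, each pruning paths that already meet or exceed its best-known accumulated duration, and the swarm reports the best cost any member found. All names and the example graph are hypothetical; the cooperative variants in the record would additionally exchange intermediate best costs between members during the run:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Toy "time optimal reachability" instance: find a minimal-duration path
# from node 0 to the goal node in a weighted graph (edge weight = time).
EDGES = {0: [(1, 4), (2, 1)], 1: [(3, 1)], 2: [(1, 1), (3, 6)], 3: []}
GOAL = 3

def randomized_search(seed, budget=2000):
    """One swarm member: repeated randomized walks, keeping the best
    accumulated duration found within its budget (branch-and-bound:
    walks whose cost reaches the incumbent are abandoned)."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(budget):
        node, cost = 0, 0
        while node != GOAL:
            succ = EDGES[node]
            if not succ or cost >= best:   # dead end or bound exceeded: prune
                cost = float("inf")
                break
            node, dt = rng.choice(succ)
            cost += dt
        best = min(best, cost)
    return best

def swarm(n_workers=4):
    """Run independent members with different seeds; the swarm's answer
    is the best cost any member found."""
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        return min(ex.map(randomized_search, range(n_workers)))
```

    On this graph the minimal accumulated duration is 3 (via nodes 0, 2, 1, 3), which the randomized members find easily within their budgets.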

  7. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  8. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete

  9. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  10. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  11. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    International Nuclear Information System (INIS)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H

    2015-01-01

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
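    The 2D gamma index evaluation reported in this record can be sketched as a brute-force global gamma analysis (e.g. a 3%/3 mm criterion). This is a generic textbook formulation, not the authors' implementation; it assumes both dose distributions lie on the same grid, and all names are illustrative:

```python
import numpy as np

def gamma_pass_rate(dose_eval, dose_ref, spacing_mm, dta_mm=3.0, dd_pct=3.0):
    """Brute-force global 2D gamma analysis: for every reference grid
    point, search all evaluated points for the minimal combined
    dose-difference / distance-to-agreement metric; a reference point
    passes when its minimal gamma is <= 1. Returns the pass rate in %."""
    dd = dd_pct / 100.0 * dose_ref.max()      # global dose-difference criterion
    ny, nx = dose_ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]             # grid coordinates (in voxels)
    pass_count = 0
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            ddiff2 = (dose_eval - dose_ref[iy, ix]) ** 2
            gamma2 = dist2 / dta_mm ** 2 + ddiff2 / dd ** 2
            pass_count += gamma2.min() <= 1.0
    return 100.0 * pass_count / (ny * nx)
```

    An identical pair of distributions passes at 100%; a large uniform output error such as the >2% linac deviations mentioned above drives points above gamma = 1 and lowers the pass rate.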

  12. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H [University Medical Center Mannheim, University of Heidelberg, Mannheim, Baden-Wuerttemberg (Germany)

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  13. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    Science.gov (United States)

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  14. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms has steadily improved over the past few years, with models such as the MLP and, more recently, the SVM. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: the MLP and the SVM. These two algorithms are tested on a benchmark database, namely XM2VTS. Results show that the MLP performs better than the SVM on this particular task.

  15. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs

  16. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by 'smart' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  17. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  18. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  19. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ...... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available....

  20. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  1. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Mayo, Jackson

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  2. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
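The likelihood-ratio decision rule described above can be sketched as follows. This is a minimal illustration assuming independent Gaussian score models per modality (the paper fits a joint probabilistic model over all 12 scores); all parameter values are hypothetical.

```python
import math

def log_likelihood_ratio(scores, genuine_params, impostor_params):
    """Sum per-modality log-likelihood ratios. Assumes independent
    Gaussian score distributions per biometric (a simplification of
    the paper's joint model); params are (mean, std) tuples."""
    def log_normal_pdf(x, mu, sigma):
        return -0.5 * math.log(2 * math.pi * sigma ** 2) \
               - (x - mu) ** 2 / (2 * sigma ** 2)
    llr = 0.0
    for s, (mu_g, sd_g), (mu_i, sd_i) in zip(scores, genuine_params, impostor_params):
        llr += log_normal_pdf(s, mu_g, sd_g) - log_normal_pdf(s, mu_i, sd_i)
    return llr

def verify(scores, genuine_params, impostor_params, threshold):
    """Accept as genuine when the accumulated evidence exceeds the
    threshold chosen to meet the false accept rate constraint."""
    return log_likelihood_ratio(scores, genuine_params, impostor_params) >= threshold
```

Raising the threshold lowers the false accept rate at the cost of more false rejects; a two-stage policy would acquire additional biometrics when the ratio lands near the threshold.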

  3. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  4. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computation integrated within the core, designed to reduce cost and complexity. The 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth multiplier added to the integer core of the ARM7. The binary representati...

  5. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
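The dose/distance-to-agreement criterion recommended above (e.g., 3%/3 mm) is commonly evaluated with a gamma index. The following is a minimal 1-D sketch with global dose normalisation, not the authors' implementation:

```python
import numpy as np

def gamma_index_1d(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """1-D gamma analysis: for each reference point, minimise the combined
    dose-difference / distance-to-agreement metric over the evaluated profile."""
    positions = np.arange(len(ref_dose)) * spacing_mm
    gamma = np.empty(len(ref_dose))
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dd = (eval_dose - d_r) / (dose_tol * ref_dose.max())  # dose difference, global normalisation
        dx = (positions - x_r) / dist_tol_mm                  # spatial offset in tolerance units
        gamma[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gamma  # a point passes where gamma <= 1
```

The pass rate over the measured plane then serves as the quantitative acceptability parameter, displayed alongside a map of where discrepancies occur.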

  6. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  7. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  8. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%, however the spiral phantom QA technique provides a more complete dosimetric verification while being less time consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan

  9. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine if protocols were followed and that the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed as are aspects for data management and analyses methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  10. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Verification is the task of checking whether a given quantum state is close to an ideal state. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of quantum computational supremacy demonstrations with IQP models, and verifiable blind quantum computing.

  11. Margin Evaluation in the Presence of Deformation, Rotation, and Translation in Prostate and Entire Seminal Vesicle Irradiation With Daily Marker-Based Setup Corrections

    International Nuclear Information System (INIS)

    Mutanga, Theodore F.; Boer, Hans C.J. de; Wielen, Gerard J. van der; Hoogeman, Mischa S.; Incrocci, Luca; Heijmen, Ben J.M.

    2011-01-01

    Purpose: To develop a method for margin evaluation accounting for all measured displacements during treatment of prostate cancer. Methods and Materials: For 21 patients treated with stereographic targeting marker-based online translation corrections, dose distributions with varying margins and gradients were created. Sets of possible cumulative delivered dose distributions were simulated by moving voxels and accumulating dose per voxel. Voxel motion was simulated consistent with measured distributions of systematic and random displacements due to stereographic targeting inaccuracies, deformation, rotation, and intrafraction motion. The method of simulation maintained measured correlation of voxel motions due to organ deformation. Results: For the clinical target volume including prostate and seminal vesicles (SV), the probability that some part receives <95% of the prescribed dose, the changes in minimum dose, and volume receiving 95% of prescription dose compared with planning were 80.5% ± 19.2%, 9.0 ± 6.8 Gy, and 3.0% ± 3.7%, respectively, for the smallest studied margins (3 mm prostate, 5 mm SV) and steepest dose gradients. Corresponding values for largest margins (5 mm prostate, 8 mm SV) with a clinical intensity-modulated radiotherapy dose distribution were 46.5% ± 34.7%, 6.7 ± 5.8 Gy, and 1.6% ± 2.3%. For prostate-only clinical target volume, the values were 51.8% ± 17.7%, 3.3 ± 1.6 Gy, and 0.6% ± 0.5% with the smallest margins and 5.2% ± 7.4%, 1.8 ± 0.9 Gy, and 0.1% ± 0.1% for the largest margins. Addition of three-dimensional rotation corrections only improved these values slightly. All rectal planning constraints were met in the actual reconstructed doses for all studied margins. Conclusion: We developed a system for margin validation in the presence of deformations. In our population, a 5-mm margin provided sufficient dosimetric coverage for the prostate. In contrast, an 8-mm SV margin was still insufficient owing to deformations. Addition of
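The simulation idea, accumulating dose per voxel while sampling systematic and random displacements, can be illustrated with a 1-D rigid-shift toy model. The study itself simulates correlated voxel motion including deformation and rotation; the standard deviations below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ctv_coverage(dose_profile, ctv_slice, spacing_mm,
                          sigma_sys=1.5, sigma_rand=2.0,
                          n_patients=200, n_fractions=35):
    """Monte Carlo coverage check: shift a 1-D dose profile per fraction
    (systematic per-patient shift plus random per-fraction shift),
    accumulate dose over fractions, and report the probability that
    some CTV voxel receives <95% of the prescription dose."""
    x = np.arange(len(dose_profile)) * spacing_mm
    prescription = dose_profile.max()
    underdosed = 0
    for _ in range(n_patients):
        sys_shift = rng.normal(0, sigma_sys)
        accumulated = np.zeros_like(dose_profile)
        for _ in range(n_fractions):
            shift = sys_shift + rng.normal(0, sigma_rand)
            accumulated += np.interp(x + shift, x, dose_profile)
        mean_dose = accumulated / n_fractions
        if mean_dose[ctv_slice].min() < 0.95 * prescription:
            underdosed += 1
    return underdosed / n_patients
```

The returned fraction estimates the probability that part of the CTV receives <95% of the prescribed dose, the coverage statistic reported in the Results.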

  12. Game Theory Models for the Verification of the Collective Behaviour of Autonomous Cars

    OpenAIRE

    Varga, László Z.

    2017-01-01

    The collective of autonomous cars is expected to generate almost optimal traffic. In this position paper we discuss the multi-agent models and the verification results of the collective behaviour of autonomous cars. We argue that non-cooperative autonomous adaptation cannot guarantee optimal behaviour. The conjecture is that intention aware adaptation with a constraint on simultaneous decision making has the potential to avoid unwanted behaviour. The online routing game model is expected to b...

  13. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel

    Science.gov (United States)

    Fonseca, Gabriel P.; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R.; Lutgens, Ludy; Vanneste, Ben G. L.; Voncken, Robert; Van Limbergen, Evert J.; Reniers, Brigitte; Verhaegen, Frank

    2017-07-01

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. Detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector, so that 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires 7.14 fps to verify the dwell times, dwell positions and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections and wrong dwell times) were simulated to test the proposed verification system. Cartesian source positions (panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk; therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times and dwell positions.
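The error checks the panel performs (shifts, wrong dwell times and exchanged connections all show up as position or timing mismatches) can be sketched as a plan-versus-measurement comparison. The tolerances below are hypothetical, chosen loosely around the reported 0.02–0.16 cm position uncertainties and 0.2 s timing error:

```python
import math

def check_dwell_sequence(planned, measured, pos_tol_cm=0.2, time_tol_s=0.3):
    """Compare planned vs panel-measured dwell points.
    Each dwell is (x, y, z, dwell_time) in cm and seconds.
    Returns a list of human-readable error strings (empty = pass)."""
    errors = []
    if len(planned) != len(measured):
        errors.append(f"dwell count mismatch: {len(planned)} planned, {len(measured)} measured")
        return errors
    for i, (p, m) in enumerate(zip(planned, measured)):
        dist = math.dist(p[:3], m[:3])  # 3D position deviation
        if dist > pos_tol_cm:
            errors.append(f"dwell {i}: position off by {dist:.2f} cm")
        if abs(p[3] - m[3]) > time_tol_s:
            errors.append(f"dwell {i}: time off by {abs(p[3] - m[3]):.1f} s")
    return errors
```

A shifted applicator or an exchanged channel connection would surface here as a consistent position error across all dwells of one channel.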

  14. Weak lensing magnification in the Dark Energy Survey Science Verification data

    Science.gov (United States)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshifts. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.

  15. Researcher positioning

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Khawaja, Iram

    2009-01-01

    Abstract: This article focuses on the complex and multi-layered process of researcher positioning, specifically in relation to the politically sensitive study of marginalised and ‘othered' groups such as Muslims living in Denmark. We discuss the impact of different ethnic, religious and racial...... political and personal involvement by the researcher, which challenges traditional perspectives on research and researcher positioning. A key point in this regard is the importance of constant awareness of and reflection on the multiple ways in which one's positioning as a researcher influences the research...

  16. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  18. Verification and analysis of the positioning of a high-dose-rate brachytherapy source within a Fletcher interstitial Utrecht CT/MR gynecological applicator

    Energy Technology Data Exchange (ETDEWEB)

    Panedo Cobos, J. M.; Garcia castejon, M. A.; Huertas Martinez, C.; Gomez-Tejedor Alonso, S.; Rincon Perez, M.; Luna Tirado, J.; Perez Casas, A. M.

    2013-07-01

    Applicators are the guides through which brachytherapy sources travel and within which they are positioned inside the patient. Applicators can suffer mechanical deformation due to sterilization processes or impacts, which may prevent the source from being positioned precisely where planned. In such cases the treatment actually delivered deviates from the planned treatment. The object of this study is to verify, with the procedure described, that the position of the source within the applicator coincides with the planned position. (Author)

  19. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need to use a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  20. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need to use a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.
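The core of the evaluation, comparing AACMM-measured lengths against reference distances derived from the platform model, can be sketched as follows. This is a simplified illustration; the actual procedure builds virtual gauges from ball-bar measurements taken in the six platform positions.

```python
import itertools
import math

def distance_deviations(reference_pts, measured_pts):
    """For every point pair, compare the AACMM-measured distance with the
    reference ('virtual') distance derived from the platform model.
    Points are (x, y, z) tuples in mm; returns signed deviations in mm."""
    devs = []
    for (i, r1), (j, r2) in itertools.combinations(enumerate(reference_pts), 2):
        ref_d = math.dist(r1, r2)                         # virtual reference distance
        meas_d = math.dist(measured_pts[i], measured_pts[j])  # AACMM-measured distance
        devs.append(meas_d - ref_d)
    return devs
```

The maximum absolute deviation then serves as a volumetric performance indicator analogous to a physical gauge test, without tying up a calibrated artefact in every position.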

  1. Radiographic positioning

    International Nuclear Information System (INIS)

    Eisenberg, R.L.; Dennis, C.A.; May, C.

    1989-01-01

    This book concentrates on the routine radiographic examinations commonly performed. It details the wide variety of examinations possible and their place in initial learning and in the radiology department as references for those occasions when an unusual examination is requested. This book provides information ranging from basic terminology to skeletal positioning to special procedures. Positions are discussed and supplemented with a picture of a patient, the resulting radiograph, and a labeled diagram. Immobilization and proper shielding of the patient are also shown

  2. Position encoder

    International Nuclear Information System (INIS)

    Goursky, Vsevolod

    1975-01-01

    Circuitry for deriving the quotient of signals delivered by position-sensitive detectors is described. Digital output is obtained in the form of 10- to 12-bit words. Impact position may be determined with 0.25% accuracy when the dynamic range of the energy signal is less than 1:10, and 0.5% accuracy when the dynamic range is 1:20. The division requires an average time of 5 μs for 10-bit words
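The quotient computation that the encoder digitises can be sketched as below; the normalisation x = B/(A+B) is the usual charge-division form, and the 10-bit word size follows the description (the analog divider circuit itself is of course not reproduced here):

```python
def charge_division_position(signal_a, signal_b, bits=10):
    """Impact position on a charge-division position-sensitive detector:
    the quotient x = B / (A + B), quantised to an n-bit digital word."""
    total = signal_a + signal_b  # proportional to the energy signal
    if total <= 0:
        raise ValueError("no energy signal")
    ratio = signal_b / total                  # normalised position in [0, 1]
    return round(ratio * ((1 << bits) - 1))   # n-bit word (0 .. 2^n - 1)
```

The stated accuracy limits reflect that the quotient degrades as the energy signal's dynamic range grows, since the divider's relative error scales with the denominator's range.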

  3. Position encoder

    International Nuclear Information System (INIS)

    Goursky, V.

    1975-05-01

    This paper describes circuitry for deriving the quotient of signals delivered by position-sensitive detectors. Digital output is obtained in the form of 10 to 12 bit words. Impact position may be determined with 0.25% accuracy when the dynamic range of the energy signal is less than 1:10, and 0.5% accuracy when the dynamic range is 1:20. The division requires an average time of 5μs for 10-bit words [fr

  4. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling-analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate verification and validation of on-board satellite software, and it has been implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumentation points. The constraints are then fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
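At its simplest, checking on-target evidence against the scheduling analysis reduces to comparing instrumented execution times with their analysed budgets. A hedged sketch (the actual system expresses constraints as finite timed automata; the task names and budgets below are hypothetical):

```python
def check_timing_constraints(evidence, constraints):
    """evidence: dict task -> list of measured execution times (ms)
    from instrumentation points; constraints: dict task -> budget (ms)
    from the scheduling analysis. The analysis is valid only if every
    observation respects its budget; violations are returned per task."""
    violations = {}
    for task, budget in constraints.items():
        bad = [t for t in evidence.get(task, []) if t > budget]
        if bad:
            violations[task] = bad
    return violations
```

An empty result means the deployed system's observed timing is consistent with the model; any entry invalidates the scheduling analysis for that task.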

  5. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals, using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm alone. Simulation resul...

  6. Verification techniques and dose distribution for computed tomographic planned supine craniospinal radiation therapy

    International Nuclear Information System (INIS)

    Chang, Eric L.; Wong Peifong; Forster, Kenneth M.; Petru, Mark D.; Kowalski, Alexander V.; Maor, Moshe H.

    2003-01-01

    A modified 3-field technique was designed with opposed cranial fields and a single spinal field encompassing the entire spinal axis. Two methods of plan verification were performed before the first treatment. First, a system of orthogonal rulers plus the thermoplastic head holder was used to visualize the light fields at the craniospinal junction. Second, film phantom measurements were taken to visualize the gap between the fields at the level of the spinal cord. Treatment verification entailed use of a posterior-anterior (PA) portal film and placement of radiopaque wire on the inferior border of the cranial field. More rigorous verification required a custom-fabricated orthogonal film holder. The isocenter positions of both fields when they matched were recorded using a record-and-verify system. A single extended-distance spinal field, collimated at 42°, encompassed the entire spinal neuraxis. Data were collected from 40 fractions of craniospinal irradiation (CSI). The systematic error observed for the actual daily treatments was -0.5 mm (underlap), while the stochastic error was represented by a standard deviation of 5.39 mm. Measured data across the gapped craniospinal junction, with junction shifts included, revealed a dose ranging from 89.3% to 108%. CSI can be performed without direct visualization of the craniospinal junction by using the verification methods described. While the use of rigorous film verification for the supine technique may have reduced the systematic error, the inability to visualize the supine craniospinal junction on skin appears to have increased the stochastic error compared to published data on such errors associated with prone craniospinal irradiation.
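The systematic and stochastic errors reported above are simply the mean and the standard deviation of the daily junction offsets; a minimal sketch with invented offsets (not the study's 40-fraction data):

```python
# Summarizing daily junction-match offsets: the systematic error is the
# mean offset, the stochastic error the standard deviation. The offsets
# below are invented for illustration, not the study's measurements.
from statistics import mean, stdev

offsets_mm = [-1.2, 4.0, -6.5, 0.3, -3.1, 7.2, -2.9, 1.4]  # daily over/underlap

systematic_mm = mean(offsets_mm)   # bias: negative means average underlap
stochastic_mm = stdev(offsets_mm)  # day-to-day setup variation
```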

  7. Advancing the Fork detector for quantitative spent nuclear fuel verification

    Science.gov (United States)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. 
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false-positive alarms.
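A hedged sketch of the kind of quantitative go/no-go criterion discussed above: accept an operator declaration when the measured neutron count rate agrees with the prediction within k combined standard uncertainties. The function name, the k = 3 threshold, and the numbers are assumptions for illustration, not the analysis module's actual criterion.

```python
# Illustrative go/no-go acceptance test: the declaration passes when the
# measured rate matches the declaration-based prediction within k
# combined standard uncertainties (all values invented).
import math

def verify_declaration(measured, predicted, u_meas, u_pred, k=3.0):
    """Return True (go) when |measured - predicted| <= k * combined uncertainty."""
    u_comb = math.hypot(u_meas, u_pred)   # quadrature sum of uncertainties
    return abs(measured - predicted) <= k * u_comb

ok = verify_declaration(measured=1050.0, predicted=1000.0, u_meas=30.0, u_pred=40.0)
```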

  8. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional approach to that of classical verification, showing a huge difference...

  9. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provide assistance to the subject...

  10. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Design process of computing systems gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure that encompasses definition of the specification language and denotation and execution of conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  11. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  12. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
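Two of the surveyed criteria, verification limits and the delta check, can be sketched as a simple rule chain; the analyte limits and the delta threshold below are invented for illustration, not taken from the survey.

```python
# Two autoverification rules chained: a verification-limit check and a
# delta check against the patient's previous result. Limits are invented.

def autoverify(value, low, high, previous=None, delta_limit=None):
    """Release a result automatically only if every enabled rule passes."""
    if not (low <= value <= high):               # verification limits
        return False
    if previous is not None and delta_limit is not None:
        if abs(value - previous) > delta_limit:  # delta check
            return False
    return True

# Potassium 4.1 mmol/L, previous 4.3, limits 2.5-6.5, delta limit 1.0:
released = autoverify(4.1, 2.5, 6.5, previous=4.3, delta_limit=1.0)
```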

  13. Positional games

    CERN Document Server

    Hefetz, Dan; Stojaković, Miloš; Szabó, Tibor

    2014-01-01

    This text serves as a thorough introduction to the rapidly developing field of positional games. This area constitutes an important branch of combinatorics, whose aim it is to systematically develop an extensive mathematical basis for a variety of two-player perfect information games. These range from such popular games as Tic-Tac-Toe and Hex to purely abstract games played on graphs and hypergraphs. The subject of positional games is strongly related to several other branches of combinatorics such as Ramsey theory, extremal graph and set theory, and the probabilistic method. These notes cover a variety of topics in positional games, including both classical results and recent important developments. They are presented in an accessible way and are accompanied by exercises of varying difficulty, helping the reader to better understand the theory. The text will benefit both researchers and graduate students in combinatorics and adjacent fields.

  14. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems that intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances that cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper presents a design methodology for hybrid systems as an example of their specification and verification. The methodology is based on cooperation between the two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is guaranteed analytically or experimentally by them. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computer scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops.

  15. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium, and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows, as well as crew procedures and crew training videos for plant activities on orbit, have been established. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  16. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
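The power-law smoothing idea can be sketched as follows: each simulated event contributes to every grid cell with a weight that decays as a power law of epicentral distance. The kernel form (1 + d/d0)^(-q) and its parameters are assumptions for illustration, not the paper's calibrated values.

```python
# Sketch of power-law rate-map smoothing: sum a distance-decaying kernel
# over all simulated events for each grid cell (kernel and parameters
# are illustrative assumptions).
import math

def rate_map(events, grid, d0=1.0, q=1.5):
    """events: (x, y, weight) triples; grid: (x, y) cell centers."""
    rates = []
    for gx, gy in grid:
        r = 0.0
        for ex, ey, weight in events:
            d = math.hypot(gx - ex, gy - ey)       # epicentral distance
            r += weight * (1.0 + d / d0) ** (-q)   # power-law kernel
        rates.append(r)
    return rates

events = [(0.0, 0.0, 1.0)]          # one simulated event at the origin
grid = [(0.0, 0.0), (3.0, 4.0)]     # cell at the epicenter, cell 5 km away
rates = rate_map(events, grid)      # rate decays with distance
```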

  17. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and their changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. It also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  18. Researcher Positioning

    DEFF Research Database (Denmark)

    Khawaja, Iram; Mørck, Line Lerche

    2009-01-01

    involvement by the researcher, which challenges traditional perspectives on research and researcher positioning. A key point in this regard is the importance of constant awareness of and reflection on the multiple ways in which one's positioning as a researcher influences the research process. Studying the other...

  19. Position detector

    International Nuclear Information System (INIS)

    Hayakawa, Toshifumi.

    1985-01-01

    Purpose: To enable stable, highly accurate digital detection of the position of a moving object in a control rod position detector, free from the undesired effects of environmental conditions such as the reactor temperature. Constitution: Coils connected in parallel with each other are disposed along the passage of the moving object, and a variable resistor and a relay are connected in series with each coil. A light-emitting diode is connected in series with the contacts of each relay. The resistance values of the variable resistors are adjusted depending on changes in the environmental conditions and temperature distribution before carrying out position detection. When the object is inserted into a coil, the corresponding relay is de-energized, closing the relay contacts and lighting the diode. In the same manner, as the object is successively inserted into the coils, the diodes light up in turn, enabling highly accurate and stable digital position detection, free from the undesired effects of the environmental conditions. (Horiuchi, T.)

  20. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: the maximum share of technical entity operating probabilities, in the case of the Ackoff-Sasieni [1] method; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the Asturio-Baldin [2] model; and the minimum number of renewals, i.e. preventive and/or corrective maintenance operations [3

  1. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions

  2. Turf Conversion Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings as a result of water conservation measures (WCMs) in energy performance contracts associated with converting turfgrass or other water-intensive plantings to water-wise and sustainable landscapes. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.
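The protocol's core comparison is arithmetic: savings are the baseline-period use minus the reporting-period use. A minimal sketch with invented meter readings:

```python
# Baseline-vs-reporting-period comparison at the heart of the protocol;
# the meter readings are invented for illustration.

baseline_kgal = 1200.0    # metered outdoor water use, baseline year
reporting_kgal = 780.0    # metered use after the conservation measure

savings_kgal = baseline_kgal - reporting_kgal         # volume saved
savings_pct = 100.0 * savings_kgal / baseline_kgal    # relative savings
```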

  3. Outdoor Irrigation Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  4. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

    Process models were developed to characterize the waste vitrification process at West Valley in terms of process operating constraints and the glass compositions achievable. The need for verification of compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical being a glass durability model. Both process and product models were used in developing a target composition for the waste glass, designed to ensure that glasses made to this target will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  5. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analysis of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of its verification. In 2006 a project was launched in which the objective was to reorganise all the material related to the existing verification cases and place it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added into the testing system in such a way that the whole test documentation (this report) can easily be generated in postscript or pdf format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems.

  6. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run both in Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with the optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16λ rms (λ = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  7. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  8. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.

  9. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  10. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on compliance with new standards and regulations, and on lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process, and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. Management problems, more than hardware problems, are the focus. Some general conclusions are then presented as the final results of the whole work.

  11. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    Science.gov (United States)

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in the breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting the mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis region when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced from the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
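The score-fusion step mentioned above can be sketched as a weighted combination of the detection and verification scores per candidate; the averaging rule, the 0.5 weight, and the threshold are assumptions for illustration, not the paper's exact fusion.

```python
# Illustrative fusion of detection and verification scores: keep only
# candidates whose weighted-average score clears a threshold (weight
# and threshold are invented).

def fuse_scores(candidates, w=0.5, threshold=0.5):
    """candidates: list of (detection_score, verification_score) pairs."""
    kept = []
    for det, ver in candidates:
        fused = w * det + (1.0 - w) * ver
        if fused >= threshold:
            kept.append(fused)
    return kept

# A true mitosis (high on both networks) survives; a false positive
# rejected by the verification network is filtered out:
kept = fuse_scores([(0.9, 0.8), (0.9, 0.05)])
```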

  12. Is identity per se irrelevant? A contrarian view of self-verification effects.

    Science.gov (United States)

    Gregg, Aiden P

    2009-01-01

    Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, and thereby to hinder therapeutic recovery. Despite the widespread acceptance of SVT, I contend that the evidence for it is weak and circumstantial. In particular, I contend that most or all major findings cited in support of SVT can be more economically explained in terms of raison oblige theory (ROT). ROT posits that people with negative self-views solicit critical feedback, not because they want it, but because their self-view inclines them to regard it as probative, a necessary condition for considering it worth obtaining. Relevant findings are reviewed and reinterpreted with an emphasis on depression, and some new empirical data are reported. (c) 2008 Wiley-Liss, Inc.

  13. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    Science.gov (United States)

    Rosen, Lisa H; Principe, Connor P; Langlois, Judith H

    2013-02-13

    The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.

  14. First Images from VLT Science Verification Programme

    Science.gov (United States)

    1998-09-01

    morning of September 1 when the telescope was returned to the Commissioning Team that has since continued its work. The FORS instrument is now being installed and the first images from this facility are expected shortly. Observational circumstances During the two-week SV period, a total of 154 hours were available for astronomical observations. Of these, 95 hours (62%) were used to collect scientific data, including calibrations, e.g. flat-fielding and photometric standard star observations. 15 hours (10%) were spent solving minor technical problems, while another 44 hours (29%) were lost due to adverse meteorological conditions (clouds or wind exceeding 15 m/sec). The amount of telescope technical downtime is very small at this moment of the UT1 commissioning. This fact provides an impressive indication of the high technical reliability that has been achieved, which will be further consolidated during the coming months. The meteorological conditions encountered at Paranal during this period were unfortunately below average compared to data from the same calendar period in earlier years. There was an excess of bad seeing and fewer good seeing periods than normal; see, however, ESO PR Photo 35c/98 with 0.26 arcsec image quality. Nevertheless, the measured image quality on the acquired frames was often better than the seeing measured outside the enclosure by the Paranal seeing monitor. Part of this very positive effect is due to "active field stabilization", now performed during all observations by rapid motion (10 - 70 times per second) of the 1.1-m beryllium secondary mirror (M2), compensating for the "twinkling" of stars. Science Verification data soon to be released A great amount of valuable data was collected during the SV programme. 
The available programme time was distributed as follows: Hubble Deep Field - South [HDF-S; NICMOS and STIS Fields] (37.1 hrs); Lensed QSOs (3.2 hrs); High-z Clusters (6.2 hrs); Host Galaxies of Gamma-Ray Bursters (2

  15. Optimizing of verification photographs by using the so-called tangential field technique

    International Nuclear Information System (INIS)

    Proske, H.; Merte, H.; Kratz, H.

    1991-01-01

    When irradiating with high-voltage units, verification photographs are difficult to take if the gantry is not positioned at 0deg or 180deg, since the patient is being irradiated diagonally. Under these conditions it is extremely difficult to align the X-ray cartridge perpendicular to the central beam of the therapeutic radiation. This results in, among other problems, misprojections, so that definite dimensions of portrayed organ structures become practically impossible to determine. This paper describes how we have solved these problems on our high-voltage units (tele-gamma cobalt unit and linear accelerator). By using simple accessories, determination of the dimensions of organ structures, as shown on the verification photographs, is made possible. We illustrate our method using the so-called tangential field technique for irradiating mamma carcinoma. (orig.) [de

  16. Maintaining positive

    OpenAIRE

    Gheorghe Gh. IONESCU; Adina Letitia NEGRUSA

    2004-01-01

    Maintaining positive work-force relationships includes effective labor-management relations and making appropriate responses to current employee issues. Among the major current employee issues are protection from arbitrary dismissal, drug and alcohol abuse, privacy rights, and family matters and their impact on work. In our paper we discuss two problems: first, the meanings of industrial democracy; second, the three principal operational concepts of industrial democracy (1) industrial democracy t...

  17. A quality assurance index for brachytherapy treatment plan verification

    International Nuclear Information System (INIS)

    Simpson, J.B.; Clarke, J.P.

    2000-01-01

    A method is described which provides an independent verification of a brachytherapy treatment plan. The method is applicable to any common geometric configuration and utilises a simple equation derived from a common form of nonlinear regression. The basis for the index value is the relationship between the treatment time, prescribed dose, source strength and plan geometry. This relationship may be described mathematically as: Total Treatment Time ∝ (Prescribed Dose/Source Strength) × (a geometric term), with the geometric term incorporating three geometric components, namely the distance from source positions to points of dose normalisation (d), the total length of the dwell positions (L), and the number of source trains or catheters (N). A general equation of the form GF = k·d^(-α)·L^(-β)·N^(-γ) is used to describe the plan geometry, where GF is what we have termed the geometric factor, k is a constant of proportionality and the exponents are derived from the non-linear regression process. The resulting index is simple to calculate prior to patient treatment and sensitive enough to identify significant error whilst being robust enough to allow for a normal degree of geometric distortion
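    The index calculation follows directly from the relationships above. In the sketch below, the constant k and the exponents alpha, beta, gamma stand in for values that would come from the nonlinear regression; the defaults shown are placeholders for illustration only.

```python
# Minimal sketch of the geometric-factor QA check described above.
# k, alpha, beta, gamma are placeholder values, not the fitted ones.

def geometric_factor(d, L, N, k=1.0, alpha=1.0, beta=0.5, gamma=0.3):
    """d: mean source-to-normalisation-point distance, L: total dwell
    length, N: number of source trains or catheters."""
    return k * d ** -alpha * L ** -beta * N ** -gamma

def qa_index(treatment_time, prescribed_dose, source_strength, gf):
    """Time is proportional to (Dose / Strength) * GF, so this ratio should
    be roughly constant across correct plans; a large deviation flags a
    plan for review."""
    expected = prescribed_dose / source_strength * gf
    return treatment_time / expected
```

    In use, an index far from the value typical of previous verified plans would trigger an independent recheck before treatment.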

  18. Approach to 3D dose verification by utilizing autoactivation

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Yasunori, E-mail: yasunori.nkjm@gmail.com [Tokyo Institute of Technology, Yokohama-shi (Japan); Kohno, Toshiyuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Inaniwa, Taku; Sato, Shinji; Yoshida, Eiji; Yamaya, Taiga [National Institute of Radiological Sciences, Chiba-shi (Japan); Tsuruta, Yuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Sihver, Lembit [Chalmers University of Technology, Gothenburg (Sweden)

    2011-08-21

    To evaluate the deposited dose distribution in a target, we have proposed to utilize the annihilation gamma-rays emitted from the positron emitters distributed in the target irradiated with stable heavy-ion beams. Verification of one-dimensional (1-D) dose distributions along and perpendicular to a beam axis was achieved in our previous works. The purpose of this work is to verify 3-D dose distributions. As a first attempt, uniform PMMA targets of simple rectangular parallelepiped shape were irradiated, and the annihilation gamma-rays were detected with a PET scanner. By comparing the detected annihilation gamma-ray distributions with the calculated ones, the dose distributions were estimated. As a result, the estimated positions of the distal edges of the dose distributions agreed with the measured ones within 1 mm. However, the estimated positions of the proximal edges differed from the measured ones by 5-9 mm, depending on the thickness of the irradiation field.
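    A common way to extract an edge position from a 1-D depth profile, and hence to compare estimated and measured edges as above, is the depth at which the signal falls to half its maximum. The sketch below illustrates that kind of comparison under this half-maximum assumption; it is not the authors' algorithm.

```python
# Hedged sketch: locate the distal edge of a depth profile as the depth
# where the signal first drops below 50% of its maximum beyond the peak,
# with linear interpolation between samples. Illustrative only.

def distal_edge(depths, values, fraction=0.5):
    """depths, values: equal-length sequences describing a 1-D profile.
    Returns the interpolated depth of the falling edge."""
    vmax = max(values)
    imax = values.index(vmax)
    level = fraction * vmax
    for i in range(imax, len(values) - 1):
        if values[i] >= level > values[i + 1]:
            # linear interpolation between the bracketing samples
            t = (values[i] - level) / (values[i] - values[i + 1])
            return depths[i] + t * (depths[i + 1] - depths[i])
    return depths[-1]
```

    Applying the same function to the measured and the calculated profiles gives an edge-position difference comparable to the millimetre agreement quoted above.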

  19. SORO post-simulations of Bruce A Unit 4 in-core flux detector verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Braverman, E.; Nainer, O. [Bruce Power, Nuclear Safety Analysis and Support Dept., Toronto, Ontario (Canada)]. E-mail: Evgeny.Braverman@brucepower.com; Ovidiu.Nainer@brucepower.com

    2004-07-01

    During the plant equipment assessment prior to requesting approval for restart of Bruce A Units 3 and 4 it was determined that all in-core flux detectors needed to be replaced. Flux detector verification tests were performed to confirm that the newly installed detectors had been positioned according to design specifications and that their response closely follows the calculated flux shape changes caused by selected reactivity mechanism movements. By comparing the measured and post-simulated RRS and NOP detector responses to various perturbations, it was confirmed that the new detectors are wired and positioned correctly. (author)

  20. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature not only provides first-line recognition by the person on the street, but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  1. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
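    The integration of subsystem results can be illustrated, under a strong independence assumption, by combining the subsystems' detection probabilities into a system-level probability of detection. This is only the simplest possible illustration; IVSEM's actual model is considerably more detailed.

```python
# Illustrative only: combine detection probabilities of independent sensor
# subsystems (e.g. seismic, infrasound, radionuclide, hydroacoustic) into a
# system-level probability of detection.
from functools import reduce

def combined_pod(subsystem_pods):
    """P(system detects) = 1 - product of each subsystem's miss
    probability, assuming the subsystems detect independently."""
    miss = reduce(lambda acc, p: acc * (1.0 - p), subsystem_pods, 1.0)
    return 1.0 - miss

print(combined_pod([0.9, 0.5, 0.2]))  # → approximately 0.96
```

    The synergy the abstract mentions shows up here: even a weak subsystem (0.2) raises the combined probability above the best single sensor.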

  2. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  3. Advanced Technologies for Design Information Verification

    International Nuclear Information System (INIS)

    Watkins, Michael L.; Sheen, David M.; Rose, Joseph L.; Cumblidge, Stephen E.

    2009-01-01

    This paper discusses several technologies that have the potential to enhance facilities design verification. These approaches have shown promise in addressing the challenges associated with the verification of sub-component geometry and material composition for structures that are not directly accessible for physical inspection. A simple example is a pipe that extends into or through a wall or foundation. Both advanced electromagnetic and acoustic modalities will be discussed. These include advanced radar imaging, transient thermographic imaging, and guided acoustic wave imaging. Examples of current applications are provided. The basic principles and mechanisms of these inspection techniques are presented along with the salient practical features, advantages, and disadvantages of each technique. Other important considerations, such as component geometries, materials, and degree of access are also treated. The importance of, and strategies for, developing valid inspection models are also discussed. Beyond these basic technology adaptation and evaluation issues, important user interface considerations are outlined, along with approaches to quantify the overall performance reliability of the various inspection methods.

  4. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
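    The hydrostatic balance underlying such a model can be sketched as follows, treating each fluid segment as an incompressible column. This ignores nitrogen compressibility, temperature effects, and interface movement, which the real model handles; all values below are illustrative.

```python
# Toy illustration of a hydrostatic column balance: wellhead pressure equals
# the pressure at the fluid interface minus the weight of the fluid columns
# above it. Densities and heights are hypothetical.
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(interface_pressure_pa, fluid_segments):
    """fluid_segments: list of (density_kg_m3, height_m) tuples for the
    column segments between the interface and the wellhead."""
    column = sum(rho * G * h for rho, h in fluid_segments)
    return interface_pressure_pa - column

# e.g. 600 m of nitrogen (unrealistically treated as constant density)
# above a brine interface at 12 MPa:
print(wellhead_pressure(12e6, [(150.0, 600.0)]))
```

    A persistent drift of measured wellhead pressure away from such a prediction is the kind of signal that distinguishes a small leak from normal tight-well behavior.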

  5. Automatic quality verification of the TV sets

    Science.gov (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag

    2010-01-01

    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set in order to assess the captured picture quality and therefore the acceptability of TV set quality. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment) and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various TV-set-specific digital picture degradations arising from TV system hardware and software failures and from erroneous operational modes and settings. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring and other types of degradations that are due to defects within the TV set video chain.
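    The block-based comparison described above can be sketched as follows. The block size, the weighting of the variance term, and the threshold are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch of a block-based full-reference comparison: for each block,
# compute the mean square error and the difference in local variance between
# the reference and captured frames, and flag blocks whose combined score
# exceeds a threshold. All parameters are illustrative.
import numpy as np

def degraded_blocks(ref, test, block=8, w_var=1.0, threshold=100.0):
    """ref, test: 2-D grayscale arrays of equal shape. Returns a list of
    (row, col) coordinates of blocks flagged as degraded."""
    flagged = []
    h, w = ref.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            rb = ref[r:r + block, c:c + block].astype(float)
            tb = test[r:r + block, c:c + block].astype(float)
            mse = np.mean((rb - tb) ** 2)          # pointwise error
            dvar = abs(rb.var() - tb.var())        # local-variance mismatch
            if mse + w_var * dvar > threshold:
                flagged.append((r, c))
    return flagged
```

    Combining MSE with a local-variance term is one way to catch both pointwise errors and texture loss (e.g. blurring) that a plain MSE can understate.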

  6. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements

  7. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  8. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, as secondary confinement of underground storage tanks, to direct or contain subsurface contaminant plumes and to restrict remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and depending on use, have few or no breaches. A breach may be formed through numerous pathways including: discontinuous grout application, from joints between panels and from cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers makes detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification

  9. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  10. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector) were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  11. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  12. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  13. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42mm miniature replica step gauge developed for optical scanner verification. Errors quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  14. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  15. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...

  16. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; a verification model of the system is model checked instead. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  17. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project aimed at the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to the neutron beam is inadequate

  18. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  19. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  20. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  1. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    Under the safeguards of the Non-Proliferation Treaty, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are continually improved. The Core Inventory Verification method is therefore being developed as an indirect method for verifying the core inventory and checking the declared operation of research reactors

  2. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  3. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  4. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  5. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  6. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements

  7. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
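
The two verification ingredients named in this abstract, the method of manufactured solutions and Richardson extrapolation, can be illustrated on a toy discretization. The sketch below (Python with NumPy; an illustration under simple assumptions, not taken from GBS) measures the error of a 3-point second-derivative stencil against the manufactured solution u(x) = sin(x) on two grids and recovers the observed order of accuracy.

```python
import numpy as np

def error_norm(n):
    """Max-norm error of the 3-point second-derivative stencil applied to
    the manufactured solution u(x) = sin(x), whose exact second
    derivative is -sin(x), on a periodic grid of n points."""
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    h = 2.0 * np.pi / n
    u = np.sin(x)
    d2u = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / h**2  # periodic stencil
    return np.max(np.abs(d2u + np.sin(x)))

# Richardson-style order estimate from two grid levels with
# refinement ratio r = 2: p = log(e_coarse / e_fine) / log(r).
p = np.log(error_norm(64) / error_norm(128)) / np.log(2.0)
```

For this second-order stencil the observed order p should come out close to 2; a significant departure would signal an implementation error, which is exactly the kind of defect code verification is meant to expose.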

  8. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  9. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, including precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  10. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  11. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method. (paper)
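
The core operation underlying such systems, whitening the cross-power spectrum so that only phase information survives, can be sketched numerically. The code below (Python with NumPy) is a generic single-step phase-only correlation, not the paper's dual-correlation scheme with nonlinear encoding.

```python
import numpy as np

def phase_only_correlation(a, b):
    """Phase-only correlation of two equally shaped real 2-D arrays:
    normalize the cross-power spectrum to unit magnitude, keeping only
    phase, then transform back. Matching inputs give a sharp peak."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12  # discard magnitude, keep phase
    return np.real(np.fft.ifft2(cross))
```

For identical inputs the output is a delta at the origin with height close to 1; a cyclic shift of one input moves the peak accordingly, which is why the peak height and position serve as a discrimination signal.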

  12. Independent tube verification and dynamic tracking in et inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWRs), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independent of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS-TM). The system employs a model-based tube-shape detection algorithm and dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS-TM is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, without using any robot position parameters. This process independently counts the tubes in the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but it is automated. GENESIS-ITVS-TM works independent of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  13. Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification.

    Science.gov (United States)

    Swann, W B; Pelham, B W; Krull, D S

    1989-11-01

    Three studies asked why people sometimes seek positive feedback (self-enhance) and sometimes seek subjectively accurate feedback (self-verify). Consistent with self-enhancement theory, people with low self-esteem as well as those with high self-esteem indicated that they preferred feedback pertaining to their positive rather than negative self-views. Consistent with self-verification theory, the very people who sought favorable feedback pertaining to their positive self-conceptions sought unfavorable feedback pertaining to their negative self-views, regardless of their level of global self-esteem. Apparently, although all people prefer to seek feedback regarding their positive self-views, when they seek feedback regarding their negative self-views, they seek unfavorable feedback. Whether people self-enhance or self-verify thus seems to be determined by the positivity of the relevant self-conceptions rather than their level of self-esteem or the type of person they are.

  14. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  15. Study of Measurement Strategies of Geometric Deviation of the Position of the Threaded Holes

    Science.gov (United States)

    Drbul, Mário; Martikan, Pavol; Sajgalik, Michal; Czan, Andrej; Broncek, Jozef; Babik, Ondrej

    2017-12-01

    Verification of product and quality control is an integral part of the current production process. In terms of functional requirements and product interoperability, it is necessary to analyze products' dimensional and geometric specifications. Threaded holes, a substantial part of detachable screw connections with a broad presence in engineering products, are among the verified elements. This paper analyses measurement strategies for verifying the geometric deviation of the position of threaded holes by the indirect method of measuring threaded pins; the choice of measurement strategy can affect the result of the product verification.

  16. Design verification for reactor head replacement

    International Nuclear Information System (INIS)

    Dwivedy, K.K.; Whitt, M.S.; Lee, R.

    2005-01-01

    This paper outlines the challenges of design verification for reactor head replacement for PWR plants and the program for qualification from the perspective of the utility design engineering group. This paper is based on the experience with the design confirmation of four reactor head replacements for two plants, and their interfacing components, parts, appurtenances, and support structures. The reactor head replacement falls under the jurisdiction of the applicable edition of the ASME Section XI code, with particular reference to repair/replacement activities. Under any repair/replacement activities, demands may be encountered in the development of the program and plan for replacement due to the vintage of the original design/construction Code and the design reports governing the component qualifications. Because of the obvious importance of the reactor vessel, these challenges take on an added significance. Additional complexities are introduced to the project when the replacement components are fabricated by vendors different from the original vendor. Specific attention is needed with respect to compatibility with the original design and construction of the part and interfacing components. The program for reactor head replacement requires evaluation of welding procedures and of the applicable examination, test, and acceptance criteria for material, welds, and the components. Also, the design needs to take into consideration the life of the replacement components with respect to the extended period of operation of the plant after license renewal and other plant improvements. Thus, the verification of acceptability of reactor head replacement provides challenges for the development and maintenance of a program and plan, design specification, design report, manufacturer's data report and material certification, and a report of reconciliation.
The technical need may also be compounded by other challenges such as widely scattered global activities and organizational barriers, which

  17. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal in the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; for example, criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  18. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)
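
Because Monte Carlo tallies carry statistical uncertainty, "produces the same results as before" means agreement within combined error bars rather than bitwise identity. A minimal sketch of such a comparison (Python; the three-sigma criterion is an illustrative choice, not MCNP's documented test procedure):

```python
def same_within_sigma(old, old_err, new, new_err, n_sigma=3.0):
    """True when two tally means agree within n_sigma combined
    standard deviations, with the 1-sigma errors of the old and new
    results combined in quadrature."""
    combined = (old_err**2 + new_err**2) ** 0.5
    return abs(new - old) <= n_sigma * combined
```

A regression suite would apply a check like this tally by tally across the verification problems, flagging any pair whose difference is statistically significant.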

  19. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  20. Security Protocols: Specification, Verification, Implementation, and Composition

    DEFF Research Database (Denmark)

    Almousa, Omar

    An important aspect of Internet security is the security of cryptographic protocols that it deploys. We need to make sure that such protocols achieve their goals, whether in isolation or in composition, i.e., security protocols must not suffer from any flaw that enables hostile intruders to break...... results. The most important generalization is the support for all security properties of the geometric fragment proposed by [Gut14]....... their security. Among others, tools like OFMC [MV09b] and Proverif [Bla01] are quite efficient for the automatic formal verification of a large class of protocols. These tools use different approaches such as symbolic model checking or static analysis. Either approach has its own pros and cons, and therefore, we...

  1. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in situations where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, the Analytical Hierarchy Process. It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to that of the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
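
AHP quantifies such payoffs by reducing a matrix of pairwise comparisons to a priority vector, the normalized principal eigenvector. A minimal sketch of that step (Python with NumPy; the comparison matrix is an invented example, not the paper's data):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix:
    the principal eigenvector (largest eigenvalue), normalized to
    sum to one, per Saaty's Analytic Hierarchy Process."""
    m = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(m)
    k = np.argmax(vals.real)          # principal eigenvalue
    w = np.abs(vecs[:, k].real)       # its eigenvector, made positive
    return w / w.sum()
```

For a perfectly consistent matrix built from the weight ratio 4:2:1, the function recovers exactly those weights; for real, inconsistent judgments the principal eigenvalue also yields Saaty's consistency index (lambda_max - n) / (n - 1).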

  2. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01%, and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
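
The error rates quoted here, FAR and FRR, are defined on impostor and genuine matching scores respectively, at a chosen decision threshold. A hedged sketch of their computation (Python; the score values in the test are invented for illustration):

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """False Accept Rate: fraction of impostor scores at or above the
    acceptance threshold. False Reject Rate: fraction of genuine
    scores below it. Assumes higher score = better match."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr
```

Sweeping the threshold trades FAR against FRR; systems are often reported at the equal error rate, where the two curves cross.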

  3. Consortium for Verification Technology Fellowship Report.

    Energy Technology Data Exchange (ETDEWEB)

    Sadler, Lorraine E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-06-01

    As one recipient of the Consortium for Verification Technology (CVT) Fellowship, I spent eight days as a visiting scientist at the University of Michigan, Department of Nuclear Engineering and Radiological Sciences (NERS). During this time, I participated in multiple department and research group meetings and presentations, met with individual faculty and students, toured multiple laboratories, and taught one-half of a one-unit class on Risk Analysis in Nuclear Arms control (six 1.5 hour lectures). The following report describes some of the interactions that I had during my time as well as a brief discussion of the impact of this fellowship on members of the consortium and on me/my laboratory’s technical knowledge and network.

  4. Remedial activities effectiveness verification in tailing areas.

    Science.gov (United States)

    Kluson, J; Thinova, L; Neznal, M; Svoboda, T

    2015-06-01

    The complex radiological study of the basin of sludge from the uranium ore mining and preprocessing was done. Air kerma rates (including its spectral analysis) at the reference height of 1 m above ground over the whole area were measured and radiation fields mapped during two measuring campaigns (years 2009 and 2014). K, U and Th concentrations in sludge and concentrations in depth profiles (including radon concentration and radon exhalation rates) in selected points were determined using gamma spectrometry for in situ as well as laboratory samples measurement. Results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. Efficiency of the sludge basin covering by the inert material was modelled using MicroShield code. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Remedial activities effectiveness verification in tailing areas

    International Nuclear Information System (INIS)

    Kluson, J.; Thinova, L.; Svoboda, T.; Neznal, M.

    2015-01-01

    The complex radiological study of the basin of sludge from the uranium ore mining and preprocessing was done. Air kerma rates (including its spectral analysis) at the reference height of 1 m above ground over the whole area were measured and radiation fields mapped during two measuring campaigns (years 2009 and 2014). K, U and Th concentrations in sludge and concentrations in depth profiles (including radon concentration and radon exhalation rates) in selected points were determined using gamma spectrometry for in situ as well as laboratory samples measurement. Results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. Efficiency of the sludge basin covering by the inert material was modelled using MicroShield code. (authors)

  6. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  7. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
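
The patent abstract leaves the two graph-comparison metrics unspecified. One common choice for comparing aligned heartbeat traces is zero-lag normalized cross-correlation; the sketch below (Python with NumPy; an assumed stand-in for the unspecified metric, not the patented method) accepts a candidate trace when its correlation with the template reaches a threshold.

```python
import numpy as np

def resembles(template, candidate, threshold=0.9):
    """Zero-lag normalized cross-correlation between two aligned,
    equal-length traces; True when it reaches the threshold.
    Each trace is standardized to zero mean and unit variance."""
    t = np.asarray(template, dtype=float)
    c = np.asarray(candidate, dtype=float)
    t = (t - t.mean()) / t.std()
    c = (c - c.mean()) / c.std()
    return float(np.dot(t, c) / len(t)) >= threshold
```

Standardizing first makes the metric invariant to amplitude scaling and baseline offset, which is desirable when comparing PQRST traces recorded under different gains.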

  8. 300 Area Process Trenches Verification Package

    International Nuclear Information System (INIS)

    Lerch, J.A.

    1998-03-01

    The purpose of this verification package is to document achievement of the remedial action objectives for the 300 Area Process Trenches (300 APT) located within the 300-FF-1 Operable Unit (OU). The 300 APT became active in 1975 as a replacement for the North and South Process Pond system that is also part of the 300-FF-1 OU. The trenches received 300 Area process effluent from the uranium fuel fabrication facilities. Waste from the 300 Area laboratories that was determined to be below discharge limits based on monitoring performed at the 307 retention basin was also released to the trenches. Effluent flowed through the headworks sluice gates, down a concrete apron, and into the trenches. From the beginning of operations in 1975 until 1993, a continuous, composite sampler was located at the headwork structure to analyze process effluent at the point of discharge to the environment

  9. Status and verification strategy for ITER neutronics

    Energy Technology Data Exchange (ETDEWEB)

    Loughlin, Michael, E-mail: michael.loughlin@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Angelone, Maurizio [Associazione EURATOM-ENEA Sulla Fusione, Via E. Fermi 45, I-00044 Frascati, Roma (Italy); Batistoni, Paola [Associazione EURATOM-ENEA Sulla Fusione, Via E. Fermi 45, I-00044 Frascati, Roma (Italy); JET-EFDA, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Bertalot, Luciano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Eskhult, Jonas [Studsvik Nuclear AB, SE-611 Nyköping (Sweden); Konno, Chikara [Japan Atomic Energy Agency Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan); Pampin, Raul [F4E Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, Barcelona 08019 (Spain); Polevoi, Alexei; Polunovskiy, Eduard [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2014-10-15

    The paper summarizes the current status of neutronics at ITER and describes a first set of proposals for experimental programmes to be conducted in the early operational lifetime of ITER in the more crucial areas. These include a TF coil heating benchmark, a streaming benchmark and streaming measurements by activation on ITER itself. Also on ITER, the measurement of activated water from triton burn-up should be planned and performed. This will require the measurement of triton burn-up in the DD phase. Measurements of neutron flux in the tokamak building during DD operations should also be carried out. The use of JET for verification of shutdown dose rate estimates is desirable. Other facilities to examine the production and behaviour of activated corrosion products and the shielding properties of concretes to high energy (6 MeV) gamma-rays are recommended.

  10. Verification of adolescent self-reported smoking.

    Science.gov (United States)

    Kentala, Jukka; Utriainen, Pekka; Pahkala, Kimmo; Mattila, Kari

    2004-02-01

    The validity of self-reported smoking information is often questioned, given the widespread belief that adolescents tend to under- or over-report the habit. The aim here was to verify smoking habits as reported in a questionnaire given in conjunction with dental examinations, by asking participants directly whether they smoked and by biochemical measurement of thiocyanate in saliva and carbon monoxide in expired air. The series consisted of 150 pupils in the ninth grade (age 15 years). The questionnaire reports provided a reliable estimate of adolescent smoking, with a sensitivity of 81-96% and a specificity of 77-95%. Biochemical verification of smoking proved unnecessary in normal dental practice. Accepting the information the patient offers provides a good starting point for health education and for work that motivates and supports self-directed quitting.
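    The sensitivity and specificity figures above come from comparing the questionnaire answers against the biochemical reference. A minimal sketch of how the two rates are computed from a confusion matrix; the counts below are hypothetical, chosen only to fall inside the reported ranges, not the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of true smokers the
    questionnaire identifies.  Specificity = TN / (TN + FP):
    fraction of non-smokers it correctly clears."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration -- not the study's data.
sens, spec = sensitivity_specificity(tp=27, fn=3, tn=114, fp=6)
print(f"sensitivity={sens:.0%} specificity={spec:.0%}")  # 90% / 95%
```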

  11. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device consisting of a photoelectric autocollimator, a self-designed mechanical drive car and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture and the wheels; the control system comprises hardware, based on a single-chip microcontroller, and software implementing the photoelectric autocollimator measurement process and the automatic data acquisition process. The device acquires right-angle measurement data automatically. Its reliability is verified by experimental comparison, and the results meet the requirements of the right-angle verification procedure.

  12. Computer Generated Inputs for NMIS Processor Verification

    International Nuclear Information System (INIS)

    J. A. Mullens; J. E. Breeding; J. A. McEvers; R. W. Wysor; L. G. Chiang; J. R. Lenarduzzi; J. T. Mihalczo; J. K. Mattingly

    2001-01-01

    Proper operation of the Nuclear Materials Identification System (NMIS) processor can be verified using computer-generated inputs [BIST (Built-In Self-Test)] at the digital inputs. Preselected sequences of input pulses to all channels, with known correlation functions, are compared to the output of the processor. Verifications of this type have been used in NMIS-type correlation processors at the Oak Ridge National Laboratory since 1984. The use of this test confirmed a malfunction in an NMIS processor at the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF) in 1998. The NMIS processor boards were returned to the U.S. for repair and subsequently used in NMIS passive and active measurements with Pu at VNIIEF in 1999

  13. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available The Multifunction Vehicle Bus (MVB) is a critical component of the Train Communication Network (TCN), which is widely used in modern train technology. Ensuring the correctness of MVB has therefore become an important issue, and traditional testing alone cannot guarantee it. This paper is concerned with modeling and verification of the MVB system, using Petri nets and model checking. A Hierarchical Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical errors in the system logic are found. Experimental results show the efficiency of our methods.
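    The safety properties checked for a protocol like Master Transfer often reduce to reachability questions: can the system ever reach a bad state? A minimal explicit-state sketch of that check follows; the real M3C platform and the HCPN model are far richer, and the states and transitions below are invented purely for illustration:

```python
from collections import deque

def check_unreachable(initial, transitions, bad_states):
    """Explicit-state reachability: explore every state reachable
    from `initial` via `transitions` (breadth-first) and report the
    first bad state found, or None if the property holds."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if s in bad_states:
            return s
        for t in transitions.get(s, ()):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None

# Toy master-handover cycle (hypothetical states): the property is
# that no state with two simultaneously active masters is reachable.
transitions = {
    "idle": ["m1_active"],
    "m1_active": ["handover"],
    "handover": ["m2_active"],
    "m2_active": ["idle"],
}
print(check_unreachable("idle", transitions, {"both_active"}))  # prints None
```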

  14. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    and the environment behaves. Synthesis of strategies in games can thus be used for automatic generation of correct-by-construction programs from specifications. We consider verification and synthesis problems for several well-known game-based models. This includes both model-checking problems and satisfiability...... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm...... corresponds directly to a program for the corresponding entity of the system. A strategy for a player which ensures that the player wins no matter how the other players behave then corresponds to a program ensuring that the specification of the entity is satisfied no matter how the other entities...

  15. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Forrest B, Brown [Los Alamos National Laboratory (United States)

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  16. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based method for feature extraction and dynamic time warping (DTW) for matching. The system was tested on 900 signatures belonging to 50 participants: 3 signatures as references and 5 test signatures each from the original user, simple imposters and trained imposters. With 50 participants and 3 references, the system accuracy without imposters is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%. With imposters, the accuracy is 80.1361% at threshold 27, with an FNMR of 15.6% and an average FMR of 4.263946%, broken down as follows: 0.391837% for original users, 3.2% for simple imposters and 9.2% for trained imposters.
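    The matching step of such a system can be illustrated with the classic dynamic time warping recurrence. The feature extraction and the thresholds (44 and 27) are specific to the paper, so this is only a generic DTW sketch over 1-D sequences with invented sample data:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences,
    with absolute difference as the local cost (O(len(a)*len(b)))."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # stretch a
                                 D[i][j - 1],      # stretch b
                                 D[i - 1][j - 1])  # step both
    return D[n][m]

# A claimed signature is accepted when its distance to the
# reference falls below a tuned threshold.
reference = [0.0, 1.0, 2.0, 1.0, 0.0]
genuine = [0.0, 1.1, 2.0, 0.9, 0.0]
forgery = [2.0, 2.0, 2.0, 2.0, 2.0]
print(dtw_distance(reference, genuine) < dtw_distance(reference, forgery))  # True
```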

  17. Towards Formal Verification of a Separation Microkernel

    Science.gov (United States)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  18. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between mathematicians creating the theory and researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control.   The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from a theory; (3) discuss their advantages and drawbacks, areas of applicability, give recommendations and examples.

  19. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R and D, is required to estimate the precision and the quality of computed results, all the more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, a tool that is as non-intrusive as possible is needed, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. A basic implementation of the matrix product using stochastic types leads to an overhead greater than 1000 for a 1024 × 1024 matrix compared to the standard version and to commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, Dgemm-CADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). We then focus on the numerical verification of Telemac-2D computed results. Numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product, based on compensated algorithms, is presented in this work. We show that implementing these algorithms to improve the accuracy of computed results does not degrade the code's performance. (author)
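    The compensated dot product mentioned above can be illustrated with Kahan-Babuska (Neumaier) summation of the products. This is a simplified sketch: it compensates only the rounding error of each addition, whereas a full compensated dot product such as Ogita-Rump's Dot2 also compensates the rounding error of each multiplication.

```python
def dot_kahan(x, y):
    """Dot product with compensated (Kahan-Babuska/Neumaier)
    summation: the rounding error of each addition is accumulated
    in `c` instead of being lost."""
    s = 0.0
    c = 0.0
    for xi, yi in zip(x, y):
        prod = xi * yi
        t = s + prod
        if abs(s) >= abs(prod):
            c += (s - t) + prod   # recover the low-order part of the sum
        else:
            c += (prod - t) + s
        s = t
    return s + c

# Ill-conditioned case: naive summation loses the 1.0 entirely.
x = [1e16, 1.0, -1e16]
y = [1.0, 1.0, 1.0]
print(sum(xi * yi for xi, yi in zip(x, y)))  # 0.0 (naive)
print(dot_kahan(x, y))                       # 1.0
```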

  20. Verification of hyperthermia treatment planning in cervix carcinoma patients using invasive thermometry

    International Nuclear Information System (INIS)

    Haaren Van, P.M.A.; Kok, H.P.; Zum Voerde Sive Voerding, P.J.; Oldenborg, S.; Stalpers, L.J.A.; Crezee, J.; Berg Van den, C.A.T; Leeuw De, A.A.C.

    2005-01-01

    Full text: Hyperthermia treatment planning (HTP) is a useful tool for improvement of clinical hyperthermia treatments. Aim of this study was to determine the correlation between HTP and measurements during hyperthermia treatments. We compared the calculated specific absorption rate (SAR) with clinically measured SAR-values, from ΔT-measurements, in cervix carcinoma patients. General difficulties for such clinical verifications are changes in the anatomy during the different steps and possible movement of the catheters. We used one fixed invasive catheter in the tumor additional to the usual non-invasive catheters in the vagina, bladder and rectum, for insertion of multisensor thermocouple probes. A special CT-scan with the patient in treatment position and the catheters in situ was made for the HTP. We performed these verifications in a total of 11 treatments in 7 patients. The main difficulties for accurate verification were of clinical nature: difficulties arising from the use of gynaecological tampon and the limited number of measurements in tissue. Remaining air in the vagina and sub-optimal tissue contact of the catheters resulted in bad thermal contact between thermocouples and tissue, causing measurement artefacts that are difficult to correlate with calculations. These artefacts are probably not specific for thermocouple measurements, but more general for intraluminal temperature and SAR measurements. (author)

  1. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either in-house designs or systems based on commercial modules, such as last-generation redundant programmable logic controllers (PLCs). During this study it was seen that this plan can also serve as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, together with the advantages of the preliminary validation and verification plan

  2. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock-and-tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is intended that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF

  3. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation to discover these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.
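    The inductive conditions behind loop-invariant discovery, that a candidate invariant holds in the initial state and is preserved by each loop iteration, can be sketched with a bounded, testing-based check. The paper itself uses symbolic execution and quantified formulas; this exhaustive-state stand-in, with a made-up summation loop, is only illustrative:

```python
def check_invariant(inv, init, step, guard, states):
    """Bounded check of a candidate loop invariant: it must hold in
    the initial state, and one loop iteration from every sampled
    state satisfying the guard and the invariant must preserve it."""
    if not inv(init):
        return False
    return all(inv(step(s)) for s in states if guard(s) and inv(s))

# Loop: i, total = 0, 0; while i < n: total += i; i += 1
# Candidate invariant: total == i*(i-1)//2
n = 10
init = (0, 0)
inv = lambda s: s[1] == s[0] * (s[0] - 1) // 2
step = lambda s: (s[0] + 1, s[1] + s[0])
guard = lambda s: s[0] < n
states = [(i, i * (i - 1) // 2) for i in range(n)]
print(check_invariant(inv, init, step, guard, states))  # True
```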

  4. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  5. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Poucet, A.; Contini, S.; Bellezza, F.

    2001-01-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. Information dealt with is that requested by both INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. Main characteristics of SIT are: Capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; Easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; Selected access to open source information via Internet; Capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; Capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; Capability to easily add and run external models for e.g. material data accounting, completeness check, air dispersion models, material flow graph generation and to describe results in graphical form; Capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, to allow open source information retrieval from geographical maps. 
The paper will describe the main features of SIT and the advantages of

  6. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, S.J.; University of Newcastle, NSW

    2004-01-01

    Full text: Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres, which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. The enhanced dynamic wedge factor (EDWF) presents some significant problems for accurate MU calculation, particularly for points away from the centre of field (COF). This paper describes the development of an independent MU program, concentrating on the implementation of the EDW component. The difficult case of non-COF points under the EDW was studied in detail. A survey of Australasian centres regarding the use of independent MU check systems was conducted. The MUCalculator was developed with reference to MU calculations made by the Pinnacle 3D RTP system (Philips) for 4 MV, 6 MV and 18 MV X-ray beams from Varian machines used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. Ionisation chamber measurements in Solid Water™ and liquid water were performed based on a published test data set. Published algorithms combined with a depth-dependent profile correction were applied in an attempt to match measured data with maximum accuracy. The survey results are presented. Substantial data are presented in tabular form, with extensive comparison to published data. Several different methods for calculating EDWF are examined. A small systematic error was detected in the Gibbon equation used for the EDW calculations. Generally, calculations were within ±2% of measured values, although some setups exceeded this variation. Results indicate that COF
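    An independent MU check of the kind described divides the prescribed dose by the product of the machine calibration and the relevant correction factors, including the wedge factor. A generic sketch follows; all factor names and values are illustrative placeholders, and the actual calculation (including the EDWF and the Gibbon equation) is machine- and setup-specific:

```python
def monitor_units(prescribed_dose_cGy, ref_dose_rate_cGy_per_MU,
                  output_factor, wedge_factor, tissue_ratio):
    """Generic independent MU check: MU = D / (k * Scp * WF * TPR).
    The factor names and the values below are illustrative
    placeholders, not NMMH beam data."""
    return prescribed_dose_cGy / (ref_dose_rate_cGy_per_MU
                                  * output_factor
                                  * wedge_factor
                                  * tissue_ratio)

# 200 cGy with a 0.5 wedge factor and a 0.8 tissue ratio:
mu = monitor_units(200.0, 1.0, 1.0, 0.5, 0.8)
print(round(mu))  # 500
```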

  7. A GIS support system for declaration and verification

    Energy Technology Data Exchange (ETDEWEB)

    Poucet, A; Contini, S; Bellezza, F [European Commission, Joint Research Centre, Institute for Systems Informatics and Safety (ISIS), Ispra (Italy)

    2001-07-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. Information dealt with is that requested by both INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. Main characteristics of SIT are: Capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; Easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; Selected access to open source information via Internet; Capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; Capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; Capability to easily add and run external models for e.g. material data accounting, completeness check, air dispersion models, material flow graph generation and to describe results in graphical form; Capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, to allow open source information retrieval from geographical maps. 
The paper will describe the main features of SIT and the advantages of

  8. A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.

    Science.gov (United States)

    Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B

    2018-02-01

    The measurement of depth dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures for heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require the use of detectors with high spatial resolution. The aim of this work is to characterize a one-dimensional monolithic silicon detector array called the "serial Dose Magnifying Glass" (sDMG) as an independent ion beam energy and range verification system used for quality assurance conducted for ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes each with an effective size of 2 mm × 50 μm × 100 μm fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring the response of the detector when irradiated with a 290 MeV/u ¹²C ion broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the ¹²C ion beam incident on the detector and the residual energy of an ion beam incident on the phantom was determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector were found to have good agreement between experiment and simulation with the position of the Bragg peak determined to fall within 0.2 mm or 1.1% of the range in the detector for the two cases. The energy of the beam incident on the detector was found to vary less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles. These values coincide with the expected energy of 281 MeV/u. The sDMG detector
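    Locating the Bragg peak in a sampled depth profile, as the sDMG does at a 200 μm pitch, can be done by taking the maximum channel and refining it with a parabola through the peak and its two neighbours. The paper does not specify its peak-finding method, so this is only one common approach, shown on a synthetic profile:

```python
def bragg_peak_position(depths_mm, dose):
    """Estimate the Bragg peak position as the depth of maximum
    response, refined by a parabola fit through the peak channel
    and its two neighbours (sub-pitch interpolation, since a
    finite detector pitch otherwise limits resolution to one
    channel)."""
    i = max(range(len(dose)), key=dose.__getitem__)
    if 0 < i < len(dose) - 1:
        y0, y1, y2 = dose[i - 1], dose[i], dose[i + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            offset = 0.5 * (y0 - y2) / denom  # vertex of the parabola
            pitch = depths_mm[i + 1] - depths_mm[i]
            return depths_mm[i] + offset * pitch
    return depths_mm[i]

# Synthetic profile sampled at a 0.2 mm pitch with a peak at
# 10.07 mm, i.e. between two channels:
depths = [0.2 * i for i in range(100)]
dose = [100.0 - (d - 10.07) ** 2 for d in depths]
print(bragg_peak_position(depths, dose))  # ~10.07
```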

  9. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Science.gov (United States)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET

  10. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Knopf, A; Paganetti, H; Cascio, E; Bortfeld, T [Department of Radiation Oncology, MGH and Harvard Medical School, Boston, MA 02114 (United States); Parodi, K [Heidelberg Ion Therapy Center, Heidelberg (Germany); Bonab, A [Department of Radiology, MGH and Harvard Medical School, Boston, MA 02114 (United States)

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  11. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    Science.gov (United States)

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the
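    The range-comparison step described in this abstract can be sketched numerically. The following is a minimal illustration, not the study's actual analysis: it extracts a simple range estimate (the depth of the distal 50% falloff) from two hypothetical activity depth profiles and reports the discrepancy. The profile shapes and all numbers are invented for the example.

```python
import numpy as np

def distal_falloff_depth(depths_mm, activity, fraction=0.5):
    """Depth at which the activity profile falls to `fraction` of its
    maximum on the distal (deep) side, found by linear interpolation."""
    a = np.asarray(activity, dtype=float)
    level = fraction * a.max()
    i_max = int(np.argmax(a))
    # walk distally from the peak until the profile drops below the level
    for i in range(i_max, len(a) - 1):
        if a[i] >= level > a[i + 1]:
            # linear interpolation between samples i and i+1
            t = (a[i] - level) / (a[i] - a[i + 1])
            return depths_mm[i] + t * (depths_mm[i + 1] - depths_mm[i])
    return float(depths_mm[-1])

# synthetic example profiles (illustrative only, 1 mm sampling)
z = np.arange(0.0, 100.0, 1.0)            # depth in mm
measured  = np.exp(-((z - 40.0) / 15.0) ** 2)
simulated = np.exp(-((z - 42.0) / 15.0) ** 2)

shift = distal_falloff_depth(z, simulated) - distal_falloff_depth(z, measured)
print(f"range discrepancy: {shift:.1f} mm")  # -> range discrepancy: 2.0 mm
```

    A millimetre-scale discrepancy between measured and simulated falloff depths is exactly the kind of quantity the study's reproducibility and consistency tests bound.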

  12. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  13. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  14. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    At the Environmental Radioactivity Research Institute, verification and investigation of the wind velocity field forecast model 'EXPRESS-1' have been carried out since 1991. In fiscal year 1994, as the general analysis, the validity of the weather observation data, the local features of the wind field, and the validity of the positions of the monitoring stations were investigated. The EXPRESS model, which had until then used a 500 m mesh, was refined to a 250 m mesh, the resulting gain in forecast accuracy was examined, and a comparison with another wind velocity field forecast model, 'SPEEDI', was carried out. As a result, the correlation with other measurement points was found to be high at some locations and low at others, and it was found that the accuracy of the wind velocity field forecast improves when the data of poorly correlated points are excluded, or when simplified observation stations are installed so that their data can be taken in. The outline of the investigation, the general analysis of the weather observation data, and the improvements of the wind velocity field forecast model and its forecast accuracy are reported. (K.I.)

  15. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    Science.gov (United States)

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  16. Verification of cloud cover forecast with INSAT observation over ...

    Indian Academy of Sciences (India)

    and recent trends in forecast quality, improving ... general framework for forecast verification based on the joint ... clouds is given by POD, and FAR offers a metric for how often the .... Kain J S and Fritsch J M 1993 Convective parameterizations.

  17. Hawk-I - First results from science verification

    NARCIS (Netherlands)

    Kissler-Patig, M.; Larsen, S.S.|info:eu-repo/dai/nl/304833347; Wehner, E.M.|info:eu-repo/dai/nl/314114688

    2008-01-01

    The VLT wide-field near-infrared imager HAWK-I was commissioned in 2007 and Science Verification (SV) programmes were conducted in August 2007. A selection of results from among the twelve Science Verification proposals is summarised.

  18. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

    .... The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  19. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. Evaluation on Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: GREEN BUILDING TECHNOLOGIES

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  1. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. For the case where no full resist model is available for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
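    As a small illustration of one of the verification metrics mentioned, MEEF (mask error enhancement factor) is commonly estimated by finite differences as the change in wafer CD per unit change in mask CD, with the mask CD quoted at wafer (1x) scale. The sketch below uses invented CD values, not data from the paper:

```python
def meef(cd_wafer_nominal, cd_wafer_perturbed, mask_bias_wafer_scale):
    """Mask Error Enhancement Factor via finite differences:
    MEEF = delta(wafer CD) / delta(mask CD), mask CD at wafer (1x) scale."""
    return (cd_wafer_perturbed - cd_wafer_nominal) / mask_bias_wafer_scale

# hypothetical simulated CDs (nm) for a +1 nm (wafer scale) mask bias
print(meef(45.0, 47.5, 1.0))  # -> 2.5
```

    A MEEF well above 1 signals that small mask errors are amplified on the wafer, which is why it appears alongside DOF as a source-verification quality metric.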

  2. VERIFICATION TESTING OF AIR POLLUTION CONTROL TECHNOLOGY QUALITY MANAGEMENT PLAN

    Science.gov (United States)

    This document is the basis for quality assurance for the Air Pollution Control Technology Verification Center (APCT Center) operated under the U.S. Environmental Protection Agency (EPA). It describes the policies, organizational structure, responsibilities, procedures, and qualit...

  3. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for the verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms that try to solve reachability problems, and demonstrate the use of an external countermodel finder for solving some of the problems.

  4. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray tracing algorithm, statistical analysis, tests of real-time system operation, and other technical evaluation processes...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EXEL INDUSTRIAL AIRMIX SPRAY GUN

    Science.gov (United States)

    The Environmental Technology Verification Program has partnered with Concurrent Technologies Corp. to verify innovative coatings and coating equipment technologies for reducing air emissions. This report describes the performance of EXEL Industrial's Kremlin Airmix high transfer ...

  6. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
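    Baseline-model accuracy in M&V work is commonly summarized with goodness-of-fit statistics such as CV(RMSE) and NMBE; the report's exact methodology may differ. A minimal sketch with invented consumption data:

```python
import math

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE, as a percentage of the mean."""
    n = len(actual)
    mean = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100.0 * rmse / mean

def nmbe(actual, predicted):
    """Normalized mean bias error, as a percentage of the mean."""
    n = len(actual)
    mean = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean)

# invented monthly energy use (e.g. kWh): metered vs. baseline-model prediction
actual    = [100.0, 110.0, 90.0, 105.0]
predicted = [ 98.0, 112.0, 92.0, 103.0]
print(round(cv_rmse(actual, predicted), 2), round(nmbe(actual, predicted), 2))
```

    A low CV(RMSE) indicates small scatter around the baseline model, while NMBE near zero indicates the model is not systematically biased, both prerequisites for trusting model-predicted savings.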

  7. Stability and Concentration Verification of Ammonium Perchlorate Dosing Solutions

    National Research Council Canada - National Science Library

    Tsui, David

    1998-01-01

    Stability and concentration verification was performed for the ammonium perchlorate dosing solutions used in the on-going 90-Day Oral Toxicity Study conducted by Springborn Laboratories, Inc. (SLI Study No. 3433.1...

  8. Verification of Linear (In)Dependence in Finite Precision Arithmetic

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2014-01-01

    Roč. 8, č. 3-4 (2014), s. 323-328 ISSN 1661-8289 Institutional support: RVO:67985807 Keywords : linear dependence * linear independence * pseudoinverse matrix * finite precision arithmetic * verification * MATLAB file Subject RIV: BA - General Mathematics

  9. Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques

    National Research Council Canada - National Science Library

    Fridrich, Jessica

    2002-01-01

    In this report, we describe an algorithm for robust visual hash functions with applications to digital image watermarking for authentication and integrity verification of video data and still images...

  10. Towards real-time verification of CO2 emissions

    Science.gov (United States)

    Peters, Glen P.; Le Quéré, Corinne; Andrew, Robbie M.; Canadell, Josep G.; Friedlingstein, Pierre; Ilyina, Tatiana; Jackson, Robert B.; Joos, Fortunat; Korsbakken, Jan Ivar; McKinley, Galen A.; Sitch, Stephen; Tans, Pieter

    2017-12-01

    The Paris Agreement has increased the incentive to verify reported anthropogenic carbon dioxide emissions with independent Earth system observations. Reliable verification requires a step change in our understanding of carbon cycle variability.

  11. INF and IAEA: A comparative analysis of verification strategy

    International Nuclear Information System (INIS)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  12. Cleanup Verification Package for the 300-18 Waste Site

    International Nuclear Information System (INIS)

    Capron, J.M.

    2005-01-01

    This cleanup verification package documents completion of remedial action for the 300-18 waste site. This site was identified as containing radiologically contaminated soil, metal shavings, nuts, bolts, and concrete

  13. Spent-fuel verification with the Los Alamos fork detector

    International Nuclear Information System (INIS)

    Rinard, P.M.; Bosler, G.E.

    1987-01-01

    The Los Alamos fork detector for the verification of spent-fuel assemblies has generated precise, reproducible data. The data analyses have now evolved to the point of placing tight restrictions on a diverter's actions

  14. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation

  15. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  16. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    Science.gov (United States)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  17. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs

  18. Distributed Engine Control Empirical/Analytical Verification Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase I project, Impact Technologies, in collaboration with Prof. R.K. Yedavalli, propose a novel verification environment for eventual rapid certification...

  19. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature on analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  20. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with ever-growing amount of data. To address this, we propose a new kinship verification approach by learning a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive to the state-of-the-art alternatives in terms of verification accuracy, yet it is superior in terms of scalability for practical applications.

  1. Container Verification Using Optically Stimulated Luminescence

    International Nuclear Information System (INIS)

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-01-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today's technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  2. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenizing the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  3. Verification of the uncertainty principle by using diffraction of light waves

    International Nuclear Information System (INIS)

    Nikolic, D; Nesic, Lj

    2011-01-01

    We describe a simple idea for the experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty, taking the uncertainty in position to be the slit width. For the acquisition of the experimental data and their further analysis, we used a computer. Because of its simplicity, this experiment is very suitable for demonstration, as well as for a quantitative exercise at universities and in the final year of high school.
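    The quantitative check described in this abstract can be reproduced numerically. In this sketch (the wavelength and slit width are illustrative, not the authors' values), the position uncertainty is the slit width a, and the wave-number spread is taken from the angular half-width of the central maximum, sin θ = λ/a:

```python
import math

wavelength = 632.8e-9   # He-Ne laser wavelength (m); illustrative value
a = 50e-6               # slit width (m), taken as the position uncertainty dx

# first diffraction minimum of a single slit: sin(theta) = wavelength / a
sin_theta = wavelength / a
# transverse wave-number spread across the central maximum: dk = k * sin(theta)
k = 2 * math.pi / wavelength
dk = k * sin_theta

product = a * dk        # dx * dk
print(product, product >= 0.5)
```

    The product dx * dk comes out to 2π independently of the chosen wavelength and slit width, comfortably above the quantum-mechanical bound dx * dk ≥ 1/2.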

  4. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Science.gov (United States)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, semi-active RFID watt-hour meters are applied to automatic verification lines and intelligent warehouse management. Through the transmission, test, auxiliary and monitoring systems, functions such as meter scheduling, binding, control and data exchange are realized, giving more accurate positioning, more efficient management and rapid data updates, with all information available at a glance. This effectively improves the quality, efficiency and automation of verification, and enables more efficient data and warehouse management.

  5. Intraoperative verification of hysterosalpingography and laparoscopy in cases of oviductal infertility

    International Nuclear Information System (INIS)

    Cislo, M.; Murawski, M.; Popiela, A.

    1993-01-01

    An analysis was made of 45 cases of oviductal infertility in women qualified for surgical treatment. In search of the causes of infertility, these patients underwent hysterosalpingography (HSG), and 24 of them additionally underwent diagnostic laparoscopy with chromopertubation. Positive intraoperative verification of HSG and laparoscopy, confirming tubal obstruction, was obtained in 40 women (88.9%), which made it possible to carry out microsurgical operations on them. Such a high percentage of correct diagnoses confirms that both HSG and laparoscopy are indispensable for properly qualifying a patient for microsurgical treatment of oviductal infertility. (author)

  6. Automated synthesis and verification of configurable DRAM blocks for ASIC's

    Science.gov (United States)

    Pakkurti, M.; Eldin, A. G.; Kwatra, S. C.; Jamali, M.

    1993-01-01

    A highly flexible embedded DRAM compiler is developed which can generate DRAM blocks in the range of 256 bits to 256 Kbits. The compiler is capable of automatically verifying the functionality of the generated DRAM modules. The fully automated verification capability is a key feature that ensures the reliability of the generated blocks. The compiler's architecture, algorithms, verification techniques and the implementation methodology are presented.

  7. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
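    For concreteness, tree dimension is often defined in Horton-Strahler style: a leaf has dimension 0, and an internal node takes the maximum dimension of its children, incremented by one when that maximum is attained by two or more children. The sketch below uses our own toy encoding of trees as (label, children) tuples, not the paper's implementation:

```python
def dimension(tree):
    """Tree dimension: 0 for a leaf; otherwise the maximum child
    dimension, plus 1 when that maximum is attained by at least
    two children (Horton-Strahler-style definition)."""
    _label, children = tree
    if not children:
        return 0
    dims = [dimension(c) for c in children]
    m = max(dims)
    return m + 1 if dims.count(m) >= 2 else m

leaf = ("l", [])
linear = ("a", [("b", [leaf])])                               # a chain
balanced = ("a", [("b", [leaf, leaf]), ("c", [leaf, leaf])])  # full binary
print(dimension(linear), dimension(balanced))  # -> 0 2
```

    Under this definition, linear (chain-like) derivation trees have dimension zero while highly branching ones have larger dimension, which is what makes the measure useful for decomposing a CHC verification problem by dimension.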

  8. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  9. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  10. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  11. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  12. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Raratonga relevant to IAEA safeguards concerns support for the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  14. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
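    The kind of statistical comparison described here, two results, each carrying an uncertainty, tested for a significant difference, can be sketched as follows. The values and the conventional 1.96 critical value (two-sided, 95%) are illustrative; actual radiochemical validation protocols are more involved:

```python
import math

def differ_significantly(x1, u1, x2, u2, z_crit=1.96):
    """Compare two radiochemical results (value, 1-sigma uncertainty).
    Returns (z, significant), where z is the difference in units of the
    combined standard uncertainty."""
    z = abs(x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)
    return z, z > z_crit

# hypothetical duplicate analyses of one sample (Bq/kg)
z, sig = differ_significantly(12.4, 1.1, 14.9, 1.0)
print(round(z, 2), sig)  # -> 1.68 False
```

    Because each result carries its own uncertainty, the decision is made relative to the combined uncertainty rather than by simple value comparison, which is the quantitative edge radiochemical validation has over validation of hazardous chemical data.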

  15. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could in principle be verified, and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensive research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  16. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid-operated valves (SOVs) are widely used in many applications because of their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs, aimed at design engineers who otherwise depend on experience and experiment during the design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of an SOV from the viewpoint of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested verification methodology is appropriate for designing new solenoid models. We believe this verification process is useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
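    A safety-factor check of the kind the abstract describes could be sketched as below. This is not the paper's actual method; it uses the idealized single-gap electromagnet relationship F = mu0*(N*I)**2*A / (2*g**2), which neglects core saturation and fringing, and all parameter values are hypothetical:

    ```python
    import math

    MU0 = 4 * math.pi * 1e-7  # permeability of free space [H/m]

    def attraction_force(turns, current, pole_area, air_gap):
        """Magnetic attraction force on the plunger [N] for an idealized
        solenoid (no saturation, no fringing): F = mu0*(N*I)^2*A / (2*g^2)."""
        return MU0 * (turns * current) ** 2 * pole_area / (2 * air_gap ** 2)

    def safety_factor(turns, current, pole_area, air_gap, required_force):
        """Ratio of available attraction force to the force needed to overcome
        the return spring and friction, evaluated at the worst-case air gap."""
        return attraction_force(turns, current, pole_area, air_gap) / required_force

    # Hypothetical design point: 1000 turns, 0.5 A, 1 cm^2 pole face, 1 mm gap,
    # against a 5 N spring-plus-friction load.
    sf = safety_factor(1000, 0.5, 1e-4, 1e-3, required_force=5.0)
    ```

    A design would pass this verification step when `sf` exceeds the chosen design margin (for example 2) at the largest air gap the plunger must close.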

  17. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve the process and design products. These included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List, to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVRs and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  18. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    The Universal Verification Methodology (UVM) is a standardized approach to verifying integrated-circuit designs that targets Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. Based on the experience from these projects, this paper briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
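    UVM itself is a SystemVerilog library, but the CDV loop the abstract describes (random legal stimuli, a self-checking scoreboard, and coverage bins) can be mirrored in a language-neutral sketch. Everything here is illustrative: the 4-bit adder DUT, the bin names, and the stimulus count are all assumptions, not anything from the cited projects:

    ```python
    import random

    def dut_adder(a, b):
        """Stand-in device under test: a hypothetical 4-bit adder that wraps."""
        return (a + b) & 0xF

    def run_cdv(num_stimuli=200, seed=1):
        """Minimal coverage-driven loop: drive random legal stimuli into the
        DUT, check each response against a reference model (the scoreboard),
        and record which carry-out cases were exercised (the coverage bins)."""
        rng = random.Random(seed)
        coverage = {"no_carry": 0, "carry": 0}
        for _ in range(num_stimuli):
            a, b = rng.randrange(16), rng.randrange(16)
            result = dut_adder(a, b)
            assert result == (a + b) % 16, "scoreboard mismatch"  # self-checking
            coverage["carry" if a + b > 15 else "no_carry"] += 1
        return coverage
    ```

    An empty bin after the run would flag non-exercised functionality, which is exactly the signal a CDV testbench developer uses to refine the stimulus constraints.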

  19. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can, in most cases, only be assessed by making sure that the DRAGON data structures, which correspond to the output generated by a module of the code, contain the required information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate the performance of specific functions in the code using independent software. Here, we describe the global verification process that was followed in order to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of it. (author)

  20. Precise positioning of patients for radiation therapy

    International Nuclear Information System (INIS)

    Verhey, L.J.; Goitein, M.; McNulty, P.; Munzenrider, J.E.; Suit, H.D.

    1982-01-01

    A number of immobilization schemes that permit precise daily positioning of patients for radiation therapy are discussed. Pretreatment and post-treatment radiographs were taken with the patient in the treatment position and analyzed to determine the amount of intratreatment movement. Studies of patients in the supine, seated, and decubitus positions indicate mean movements of less than 1 mm with a standard deviation of less than 1 mm. Patients immobilized in the seated position with a bite block and a mask show a mean movement of about 0.5 mm +/- 0.3 mm (s.d.), and patients immobilized in the supine position with their necks hyperextended for submental therapy show a mean movement of about 1.4 mm +/- 0.9 mm (s.d.). With the exception of those used for the decubitus position, the immobilization devices are simply fabricated from thermoplastic casting materials readily available from orthopedic supply houses. A study of day-to-day reproducibility of patient position, using laser alignment and pretreatment radiographs for final verification of position, indicates that the initial laser alignment can position a patient within 2.2 mm +/- 1.4 mm (s.d.) of the intended position. These results indicate that rigid immobilization devices can improve the precision of radiotherapy, which would be advantageous with respect to both tumor and normal tissue coverage in certain situations
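    The mean +/- s.d. movement figures quoted above come from comparing matched pre- and post-treatment marker positions. A minimal sketch of that computation, assuming planar (x, y) coordinates in mm digitized from the paired radiographs (the data layout is an assumption, not taken from the paper):

    ```python
    import math

    def displacement_stats(pre_positions, post_positions):
        """Mean and (population) standard deviation of intratreatment movement
        [mm], given matched (x, y) marker coordinates from pre- and
        post-treatment radiographs of the same fraction."""
        moves = [math.dist(p, q) for p, q in zip(pre_positions, post_positions)]
        mean = sum(moves) / len(moves)
        sd = math.sqrt(sum((m - mean) ** 2 for m in moves) / len(moves))
        return mean, sd

    # Two fractions whose marker moved 1 mm and 3 mm -> mean 2.0 mm, s.d. 1.0 mm
    mean_mm, sd_mm = displacement_stats([(0, 0), (0, 0)], [(1, 0), (3, 0)])
    ```

    Reporting the pair (mean, s.d.) per immobilization scheme is what allows the direct comparisons the abstract makes, e.g. 0.5 +/- 0.3 mm seated versus 1.4 +/- 0.9 mm hyperextended supine.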