WorldWideScience

Sample records for management independent verification

  1. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much independent verification is performed as at power plants in the other countries included in the study, if not more. Additional requirements in this area are not proposed for Canadian stations. (author)

  2. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and is now in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configuration. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved.

  3. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  4. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  5. Independent verification in operations at nuclear power plants: Summaries of site visits

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries was conducted and is detailed in volume 1. This paper is a compilation of the site visits made to study independent verification in operations at nuclear power plants in Canada, the USA, Japan, the United Kingdom, France and Germany. 3 tabs., 22 figs.

  6. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  7. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    International Nuclear Information System (INIS)

    Weaver, P.C.

    2009-01-01

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified 'hot spot' cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: (1) performing radiological walkover surveys, and (2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  8. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, SJ.

    2005-01-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non-centre-of-field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle3 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for 4MV, 6MV and 18MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non-COF situation with similar accuracy to that achieved with profile-corrected methods. Further collaborative work with other centres is planned to extend these findings.

  9. Verification of Linear (In)Dependence in Finite Precision Arithmetic

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2014-01-01

    Vol. 8, No. 3-4 (2014), pp. 323-328. ISSN 1661-8289. Institutional support: RVO:67985807. Keywords: linear dependence * linear independence * pseudoinverse matrix * finite precision arithmetic * verification * MATLAB file. Subject RIV: BA - General Mathematics

  10. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures in radiotherapy. This study evaluated the monitor unit (MU) calculation accuracy of a third-party QA software package and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS, according to the International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. Test plans were delivered and measured in the phantom. The delivered doses were input to the QA software and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 +/- 0.9)%; the largest difference occurred in the plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 +/- 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 +/- 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.

  11. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    International Nuclear Information System (INIS)

    Tachibana, H; Tachibana, R

    2015-01-01

    Purpose: Lung SBRT planning has shifted to a volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more strongly affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with and without the correction and the SMU was assessed. Results: In the point dose evaluation for the centre of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
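
    The radiological path length mentioned in this abstract is typically obtained by ray tracing through the CT-derived density grid from the source towards the calculation point. The sketch below is a minimal illustration of that step, assuming a 3D grid of relative electron densities and a simple fixed-step ray march; the function name, grid values and step size are illustrative assumptions and not part of the SMU program.

```python
import numpy as np

def radiological_path_length(density, spacing_mm, src_mm, pt_mm, step_mm=1.0):
    """Approximate water-equivalent (radiological) path length from a source
    position to a calculation point by marching through a 3D grid of relative
    electron densities (1.0 = water). Positions are in mm in grid coordinates.
    A fixed-step ray march for illustration, not an exact Siddon-style tracer."""
    src = np.asarray(src_mm, dtype=float)
    pt = np.asarray(pt_mm, dtype=float)
    vec = pt - src
    geom_len = np.linalg.norm(vec)
    if geom_len == 0.0:
        return 0.0
    direction = vec / geom_len
    n_steps = max(int(np.ceil(geom_len / step_mm)), 1)
    step = geom_len / n_steps
    rpl = 0.0
    for i in range(n_steps):
        pos = src + direction * (i + 0.5) * step             # mid-point sampling
        idx = np.round(pos / np.asarray(spacing_mm)).astype(int)
        idx = np.clip(idx, 0, np.array(density.shape) - 1)   # clamp to grid
        rpl += density[tuple(idx)] * step                     # water-equivalent mm
    return rpl

# Illustrative use: a 100 mm water cube with a 20 mm low-density (lung-like) slab.
grid = np.ones((50, 50, 50))          # 2 mm voxels -> 100 mm cube of water
grid[:, :, 20:30] = 0.3               # relative density 0.3 over ~20 mm
print(radiological_path_length(grid, (2.0, 2.0, 2.0),
                               src_mm=(50.0, 50.0, 0.0),
                               pt_mm=(50.0, 50.0, 99.0)))    # ~85 mm, not 99 mm
```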

  12. Independent tube verification and dynamic tracking in ET inspection of nuclear steam generator

    International Nuclear Information System (INIS)

    Xiongzi, Li; Zhongxue, Gan; Lance, Fitzgibbons

    2001-01-01

    The full text follows. In the examination of pressure boundary tubes in steam generators of commercial pressurized water nuclear power plants (PWR's), it is critical to know exactly which particular tube is being accessed. There are no definitive landmarks or markings on the individual tubes. Today this is done manually; it is tedious, interrupts the normal inspection work, and is difficult due to the presence of water on the tube surface, plug ends instead of tube openings in the field of view, and varying lighting quality. In order to eliminate human error and increase the efficiency of operation, there is a need to identify tube position during the inspection process, independently of robot encoder position and motion. A process based on a Cognex MVS-8200 system and its application function package has been developed to independently identify tube locations. ABB Combustion Engineering Nuclear Power's Outage Services group, USPPL, in collaboration with ABB Power Plant Laboratories' Advanced Computers and Controls department, has developed a new vision-based Independent Tube Verification system (GENESIS-ITVS™). The system employs a model-based tube-shape detection algorithm and dynamic tracking methodology to detect the true tool position and its offsets from the identified tube location. GENESIS-ITVS™ is an automatic Independent Tube Verification System (ITVS). Independent tube verification is a tube validation technique using computer vision, and not using any robot position parameters. This process independently counts the tubes in the horizontal and vertical axes of the plane of the steam generator tube sheet as the work tool is moved. Thus it knows the true position in the steam generator, given a known starting point. This is analogous to the operator's method of counting tubes for verification, but it is automated. GENESIS-ITVS™ works independently of the robot position, velocity, or acceleration. The tube position information is solely obtained from

  13. Independent Verification Survey Report For Zone 1 Of The East Tennessee Technology Park In Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    King, David A.

    2012-01-01

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  14. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    Energy Technology Data Exchange (ETDEWEB)

    Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Tachibana, R; Uchida, Y [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Sugawara, Y; Kotabe, K [National Center for Global Health and Medicine, Shinjuku, Tokyo (Japan); Kamima, T [Cancer Institute Hospital Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Ishibashi, S [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: In general, beam data for each individual linac are measured for the independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered in the independent verification software program for each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation using the Model-GBD-based calculation and the All-GBD-based calculation was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from an audit point of view. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  15. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Tachibana, R; Uchida, Y; Yamashita, M; Shimizu, H; Sugawara, Y; Kotabe, K; Kamima, T; Takahashi, R; Ishibashi, S; Tachibana, H

    2016-01-01

    Purpose: In general, beam data for each individual linac are measured for the independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered in the independent verification software program for each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation using the Model-GBD-based calculation and the All-GBD-based calculation was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from an audit point of view. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  16. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  17. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, S.J.; University of Newcastle, NSW

    2004-01-01

    Full text: Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. The enhanced dynamic wedge factor (EDWF) presents some significant problems for accurate MU calculation, particularly for non-centre-of-field (COF) positions. This paper describes development of an independent MU program, concentrating on the implementation of the EDW component. The difficult case of non-COF points under the EDW was studied in detail. A survey of Australasian centres regarding the use of independent MU check systems was conducted. The MUCalculator was developed with reference to MU calculations made by the Pinnacle 3D RTP system (Philips) for 4MV, 6MV and 18MV X-ray beams from Varian machines used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. Ionisation chamber measurements in Solid Water™ and liquid water were performed based on a published test data set. Published algorithms combined with a depth-dependent profile correction were applied in an attempt to match measured data with maximum accuracy. The survey results are presented. Substantial data are presented in tabular form, with extensive comparison against published data. Several different methods for calculating the EDWF are examined. A small systematic error was detected in the Gibbon equation used for the EDW calculations. Generally, calculations were within 2% of measured values, although some setups exceeded this variation. Results indicate that COF

  18. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    International Nuclear Information System (INIS)

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-01-01

    Purpose: AAPM TG-114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head and neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed on patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.

  19. TH-AB-201-01: A Feasibility Study of Independent Dose Verification for CyberKnife

    International Nuclear Information System (INIS)

    Sato, A; Noda, T; Keduka, Y; Kawajiri, T; Itano, M; Yamazaki, T; Tachibana, H

    2016-01-01

    Purpose: CyberKnife irradiation is composed of tiny, multiple, intensity-modulated beams compared to conventional linacs. Few publications on independent dose calculation verification for CyberKnife have been reported. In this study, we evaluated the feasibility of independent dose verification for CyberKnife treatment as a secondary check. Methods: The following were measured: test plans using several static single beams, and clinical plans in a phantom and on patients' CTs. 75 patient plans were collected from several treatment sites: brain, lung, liver and bone. For the test plans and the phantom plans, a pinpoint ion-chamber measurement was performed to assess the dose deviation for the treatment planning system (TPS) and an independent verification program, Simple MU Analysis (SMU). For the clinical plans, the dose deviation between the SMU and the TPS was assessed. Results: In the test plans, the dose deviations were 3.3±4.5% and 4.1±4.4% for the TPS and the SMU, respectively. In the phantom measurements for the clinical plans, the dose deviations were −0.2±3.6% for the TPS and −2.3±4.8% for the SMU. In the clinical plans using the patients' CTs, the dose deviation was −3.0±2.1% (mean±1SD). The systematic difference was partially derived from the inverse square law and the penumbra calculation. Conclusion: The independent dose calculation for CyberKnife shows −3.0±4.2% (mean±2SD), and in our study the confidence limit was within the 5% tolerance level from AAPM Task Group 114 for non-IMRT treatment. Thus, it may be feasible to use independent dose calculation verification for CyberKnife treatment as the secondary check. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  20. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    International Nuclear Information System (INIS)

    Yamashita, M; Kokubo, M; Takahashi, R; Takayama, K; Tanabe, H; Sueoka, M; Okuuchi, N; Ishii, M; Iwamoto, Y; Tachibana, H

    2016-01-01

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL for the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and the check can be performed with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  1. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    Energy Technology Data Exchange (ETDEWEB)

    Yamashita, M; Kokubo, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takayama, K [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tanabe, H; Sueoka, M; Okuuchi, N [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Ishii, M; Iwamoto, Y [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL for the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and the check can be performed with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  2. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted into MU and MLC position information at more finely segmented control points. The performance of the SMU was assessed by point dose measurement in non-IMRT and IMRT plans (a simple target and a mock prostate plan). Subsequently, 30 patients' treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in the non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For the patients' plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  3. Criterion of independence applied to personnel responsible for in-house verification

    International Nuclear Information System (INIS)

    Pavaux, F.

    1982-01-01

    Framatome's experience has shown that one of the most difficult criteria to interpret in applying quality assurance programmes is that of "organization". In particular, this requires that personnel responsible for in-house verification should have "sufficient independence". The author examines how Framatome interprets the criterion of sufficient independence. It may seem easy to deal with this problem on paper, by redistributing the boxes of the organizational chart, but to do so is both unrealistic and deceptive; the development of reference models runs into difficulties when it comes to practical application and these difficulties alone justify trying another approach to the problem. The method advocated here consists in analysing each situation as it arises, taking into account the criterion in question, and disregarding any pre-defined model or reference situation. The analysis should involve all quality assurance functions and not, as is too often the case, only the independence of the quality assurance service. The analysis should also examine organizational freedom and independence from direct pressures of cost and schedule considerations. To support this recommendation, three standard cases are described (manufacturing control, design verification, on-site inspection team) which demonstrate how these criteria can give rise to different difficulties in different cases. The author concludes that, in contrast to other criteria so often applied by successive approximations, organizational changes should only be decided upon when absolutely necessary and after a detailed analysis of the particular case in question has been performed. (author)

  4. Results of the independent radiological verification survey at 4400 Piehl Road, Ottawa Lake, Michigan (BTO002)

    Energy Technology Data Exchange (ETDEWEB)

    Murray, M.E.; Brown, K.S.

    1996-04-01

    At the request of the US Department of Energy (DOE), a team from Oak Ridge National Laboratory (ORNL) conducted an independent radiological verification survey at Ottawa Lake, Michigan. The survey was performed in November and December of 1994. The purpose of the survey was to verify that the site was remediated to levels below the DOE guidelines for FUSRAP sites. Results of the independent radiological verification survey at Ottawa Lake, Michigan confirm that the residual uranium contamination at the site is below DOE FUSRAP guidelines for unrestricted use.

  5. Combining Multiple Features for Text-Independent Writer Identification and Verification

    OpenAIRE

    Bulacu , Marius; Schomaker , Lambert

    2006-01-01

    http://www.suvisoft.com; In recent years, we proposed a number of new and very effective features for automatic writer identification and verification. They are probability distribution functions (PDFs) extracted from the handwriting images and characterize writer individuality independently of the textual content of the written samples. In this paper, we perform an extensive analysis of feature combinations. In our fusion scheme, the final unique distance between two handwritten samples is c...

  6. Getting ready for final disposal in Finland - Independent verification of spent fuel

    International Nuclear Information System (INIS)

    Tarvainen, Matti; Honkamaa, Tapani; Martikka, Elina; Varjoranta, Tero; Hautamaeki, Johanna; Tiitta, Antero

    2001-01-01

    Full text: Final disposal of spent nuclear fuel has long been known to be the solution for the back end of the fuel cycle in Finland. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK and the parties above them, e.g. the Ministry for Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel rests with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans and, by using different tools of a quality audit approach, to verify that the requirements will be complied with in practice. Responsibility for safeguards issues is similar, with the addition of the roles of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in the increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of operator-declared data is the key verification element of safeguards. This will remain the case under Integrated Safeguards (IS) in the future. It is believed that the importance of high-quality measurements will increase rather than decrease as the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified. One of the cornerstones of the high quality of the Finnish SSAC activities is

  7. Independent verification of the delivered dose in High-Dose Rate (HDR) brachytherapy

    International Nuclear Information System (INIS)

    Portillo, P.; Feld, D.; Kessler, J.

    2009-01-01

    An important aspect of a quality assurance program in clinical dosimetry is an independent verification of the dosimetric calculation done by the treatment planning system for each radiation treatment. The present paper is aimed at creating a spreadsheet for the verification of the dose recorded at a point of an implant with radioactive sources and HDR in gynecological lesions. An 192Ir automatic remote afterloading unit (GammaMedplus model, Varian Medical Systems) with HDR, installed at the Angel H. Roffo Oncology Institute, has been used. The planning system used to obtain the dose distribution is BraquiVision. The source coordinates, as well as those of the calculation point (rectum), are entered into the Excel-based verification program, assuming a point source at each of the applicator positions. This calculation point has been selected because the rectum is an organ at risk and therefore constrains the treatment planning. The dose verification is performed at points whose distance from the sources is at least twice the active length of the sources, so that they may be regarded as point sources. Most of the 192Ir sources used in HDR brachytherapy have a 5 mm active length, for all equipment brands. Consequently, the dose verification distance must be at least 10 mm. (author)
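
    The spreadsheet check described above treats each dwell position as a point source. Below is a minimal sketch of such a calculation, assuming a TG-43-style point-source approximation in which the dose rate scales with the air-kerma strength, the dose-rate constant and the inverse square of the distance; the radial dose and anisotropy corrections are folded into a single user-supplied factor, and all numerical values are illustrative assumptions rather than the program's actual data.

```python
import math

def point_source_dose(dwell_positions_cm, dwell_times_s, calc_point_cm,
                      air_kerma_strength_U, dose_rate_constant_cGy_per_hU,
                      radial_anisotropy_factor=1.0):
    """Sum the dose (cGy) at a calculation point from a set of HDR dwell
    positions, treating each dwell as a TG-43 point source:
        dose_rate(r) = Sk * Lambda * (1 cm / r)^2 * correction(r)
    Valid only when r is at least ~2x the source active length."""
    xc, yc, zc = calc_point_cm
    total_dose = 0.0
    for (x, y, z), t in zip(dwell_positions_cm, dwell_times_s):
        r = math.dist((x, y, z), (xc, yc, zc))               # cm
        if r < 1.0:   # 5 mm active length -> keep points >= 10 mm away
            raise ValueError("calculation point too close for point-source model")
        dose_rate = (air_kerma_strength_U * dose_rate_constant_cGy_per_hU
                     * (1.0 / r) ** 2 * radial_anisotropy_factor)   # cGy/h
        total_dose += dose_rate * t / 3600.0                  # hours -> seconds
    return total_dose

# Illustrative numbers only: a 3-dwell channel and a rectum point ~2 cm away.
dwells = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.5), (0.0, 0.0, 1.0)]
times = [12.0, 15.0, 12.0]                                    # seconds
print(point_source_dose(dwells, times, (2.0, 0.0, 0.5),
                        air_kerma_strength_U=40000.0,         # roughly a 10 Ci 192Ir source
                        dose_rate_constant_cGy_per_hU=1.11))
```

    A spreadsheet implementation performs the same sum with one row per dwell position, which is what makes this kind of check practical as a routine second calculation.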

  8. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a means of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  9. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  10. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
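
    A fluence-based approach like the one described above starts from an MU-weighted fluence map built from the MLC segment shapes, which is then convolved with a pencil-beam kernel. The sketch below shows only that first step under simplifying assumptions (no rounded leaf ends, tongue-and-groove or head scatter); the segment format, grid size and transmission value are illustrative assumptions, not the MUV parameters.

```python
import numpy as np

def accumulate_fluence(segments, grid_shape=(200, 200), pixel_mm=1.0,
                       leaf_width_mm=5.0, transmission=0.015):
    """Build a relative fluence map for a step-and-shoot IMRT beam by summing
    MU-weighted segment apertures. Open pixels get weight 1, pixels under the
    leaves get the transmission factor. 'segments' is a list of
    (mu, left_positions_mm, right_positions_mm) tuples, one left/right leaf
    position per leaf row, with positions relative to the central axis."""
    ny, nx = grid_shape
    fluence = np.zeros(grid_shape)
    x_mm = (np.arange(nx) - nx / 2 + 0.5) * pixel_mm          # pixel centres
    y_mm = (np.arange(ny) - ny / 2 + 0.5) * pixel_mm
    for mu, lefts, rights in segments:
        seg = np.full(grid_shape, transmission)               # leaf leakage everywhere
        for row, (xl, xr) in enumerate(zip(lefts, rights)):
            y_lo = (row - len(lefts) / 2) * leaf_width_mm      # extent of this leaf row
            y_hi = y_lo + leaf_width_mm
            rows = (y_mm >= y_lo) & (y_mm < y_hi)
            cols = (x_mm >= xl) & (x_mm < xr)
            seg[np.ix_(rows, cols)] = 1.0                      # open part of the row
        fluence += mu * seg
    return fluence

# Illustrative two-segment beam: 40 leaf pairs, simple rectangular openings.
seg1 = (50.0, [-30.0] * 40, [30.0] * 40)    # 50 MU, 6 cm wide opening
seg2 = (30.0, [-10.0] * 40, [40.0] * 40)    # 30 MU, offset opening
fmap = accumulate_fluence([seg1, seg2])
print(fmap.max(), fmap.min())               # 80.0 in the fully open region, 1.2 under the leaves
```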

  11. Independent Verification Survey Report for the Operable Unit-1 Landfill Miamisburg Closure Project, Miamisburg, OH

    International Nuclear Information System (INIS)

    Weaver, P.

    2008-01-01

    The objectives of the independent verification survey were to confirm that remedial actions have been effective in meeting established release criteria and that documentation accurately and adequately describes the current radiological and chemical conditions of the MCP site.

  12. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  13. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed by evaluating relative root mean square values and making graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and the correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

  14. SU-E-T-455: Impact of Different Independent Dose Verification Software Programs for Secondary Check

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Kosaka, M; Kobayashi, N; Yamashita, M; Ishibashi, S; Higuchi, Y; Tachibana, H

    2015-01-01

    Purpose: There have been many reports on different dose calculation algorithms for treatment planning systems (TPS). An independent dose verification program (IndpPro) is essential to verify clinical plans from the TPS. However, the accuracy of different independent dose verification programs has not been evident. We conducted a multi-institutional study to reveal the impact of different IndpPros used with different TPSs. Methods: Three institutes participated in this study. They used two different IndpPros (RADCALC and Simple MU Analysis (SMU)), both of which implement the Clarkson algorithm. RADCALC required as input the radiological path length (RPL) computed by the TPS (Eclipse or Pinnacle3); the SMU used CT images to compute the RPL independently of the TPS. An ion-chamber measurement in a water-equivalent phantom was performed to evaluate the accuracy of the two IndpPros and the TPS at each institute. Next, the accuracy of dose calculation using the two IndpPros compared to the TPS was assessed for clinical plans. Results: The accuracy of the IndpPros and the TPSs in the homogeneous phantom was within ±1% of the measurement. 1543 treatment fields were collected from the patients treated at the institutes. RADCALC showed better accuracy (0.9 ± 2.2%) than the SMU (1.7 ± 2.1%). However, the accuracy depended on the TPS (Eclipse: 0.5%, Pinnacle3: 1.0%). The accuracy of RADCALC with Eclipse was similar to that of the SMU at one of the institutes. Conclusion: Depending on the independent dose verification program, the accuracy shows a systematic variation even though the measurement comparison showed similar variation. The variation was affected by the radiological path length calculation: the IndpPro used with Pinnacle3 shows a different variation because Pinnacle3 computes the RPL using physical density, whereas Eclipse and the SMU use electron density.

  15. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created with the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
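
    A TG-114-style independent check of this kind multiplies a reference dose rate by a chain of output, scatter, attenuation and inverse-square factors. The sketch below shows the shape of such a calculation for a simple isocentric (SAD) field; the factor names follow common hand-calculation convention and the numerical values are placeholders, not the beam data commissioned in this study.

```python
def sad_mu_check(dose_cGy, ref_dose_rate_cGy_per_MU, Sc, Sp, TPR,
                 wedge_factor=1.0, off_axis_ratio=1.0,
                 sad_cm=100.0, scd_cm=100.0):
    """Independent MU estimate for an isocentric (SAD) field, following the
    multiplicative factor chain used in TG-114-style hand checks:
        MU = D / (D'_0 * Sc * Sp * TPR * WF * OAR * (SCD / SPD)^2)
    where D'_0 is the reference dose rate (cGy/MU), Sc and Sp are the
    collimator and phantom scatter factors, TPR is the tissue-phantom ratio
    at the calculation depth, and SPD is the source-to-point distance
    (equal to the SAD for a point at isocentre)."""
    inverse_square = (scd_cm / sad_cm) ** 2
    return dose_cGy / (ref_dose_rate_cGy_per_MU * Sc * Sp * TPR
                       * wedge_factor * off_axis_ratio * inverse_square)

# Placeholder values for a 10x10 cm open 6 MV field, 200 cGy at 10 cm depth:
mu = sad_mu_check(dose_cGy=200.0, ref_dose_rate_cGy_per_MU=1.0,
                  Sc=1.00, Sp=1.00, TPR=0.738)
print(round(mu, 1))    # ~271 MU; compare with the TPS value and flag large differences
```

    In practice the factor lookups (Sc, Sp, TPR, wedge and off-axis data) come from the commissioned beam tables, which is exactly what the spreadsheet described in the abstract stores.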

  16. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of data management and the analysis methods required for the large amount of data collected during these surveys. Recommendations were made for the implementation of future surveys and for reporting the data from those surveys, in order to ensure that remediation was complete. (authors)

  17. Comparison of monitor units calculated by radiotherapy treatment planning system and an independent monitor unit verification software.

    Science.gov (United States)

    Sellakumar, P; Arun, C; Sanjay, S S; Ramesh, S B

    2011-01-01

    In radiation therapy, the monitor units (MU) needed to deliver a treatment plan are calculated by treatment planning systems (TPS). An essential part of quality assurance is to verify the MU with an independent monitor unit calculation to correct any potential errors prior to the start of treatment. In this study, we compared the MU calculated by the TPS with that from independent MU verification software. The MU verification software was commissioned and tested for data integrity to ensure that the correct beam data were used for the MU calculations. The accuracy of the calculations was tested by creating a series of test plans and comparing them with ion chamber measurements. The results show that there is good agreement between the two. The MU difference (MUdiff) between the monitor unit calculations of the TPS and the independent MU verification system was calculated for 623 fields from 245 patients and was analyzed by treatment site for head & neck, thorax, breast, abdomen and pelvis. A mean MUdiff of -0.838% with a standard deviation of 3.04% was observed over all 623 fields. The site-specific standard deviation of MUdiff was as follows: abdomen and pelvis (<1.75%), head & neck (2.5%), thorax (2.32%) and breast (6.01%). The disparities were analyzed and different correction methods were used to reduce them. © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
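
    The per-field comparison reported here reduces to a percent difference between the two MU values, aggregated by treatment site. The sketch below shows one common convention for that statistic; the sign convention and the example field values are assumptions for illustration, since the abstract does not state the exact definition used.

```python
from statistics import mean, stdev
from collections import defaultdict

def mu_diff_percent(mu_tps, mu_indep):
    """Percent MU difference between the TPS and the independent calculation.
    One common convention is (TPS - independent) / independent * 100."""
    return (mu_tps - mu_indep) / mu_indep * 100.0

def summarize_by_site(records):
    """records: iterable of (site, mu_tps, mu_indep) tuples.
    Returns {site: (mean MUdiff %, standard deviation %)}."""
    by_site = defaultdict(list)
    for site, mu_tps, mu_indep in records:
        by_site[site].append(mu_diff_percent(mu_tps, mu_indep))
    return {site: (mean(d), stdev(d) if len(d) > 1 else 0.0)
            for site, d in by_site.items()}

# Hypothetical fields only, to show the shape of the output:
fields = [("pelvis", 101.0, 100.0), ("pelvis", 99.0, 100.5),
          ("breast", 112.0, 105.0), ("breast", 98.0, 103.0)]
print(summarize_by_site(fields))    # e.g. {'pelvis': (mean, sd), 'breast': (mean, sd)}
```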

  18. Device-independence for two-party cryptography and position verification

    DEFF Research Database (Denmark)

    Ribeiro, Jeremy; Thinh, Le Phuc; Kaniewski, Jedrzej

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography, whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we give device-independent security proofs of two-party cryptography and Position Verification for memoryless devices under different physical constraints on the adversary...

  19. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    This work presents a program that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are automatically loaded into the program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
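
    As an illustration of what such an independent check computes, the sketch below evaluates the dose at a point from a list of dwell positions using only air-kerma strength, a dose-rate constant and inverse-square fall-off. A program of the kind described above would follow the full TG-43 formalism (the radial dose and anisotropy functions are omitted here), so this is a simplified stand-in; the 1.109 cGy/(h U) constant and the 40 000 U source strength are illustrative values only.

        # Simplified point-source sketch of an independent brachytherapy dose check.
        # Not the authors' algorithm: TG-43 radial dose and anisotropy terms omitted.
        import math

        def point_dose(calc_point, dwells, air_kerma_strength, dose_rate_constant=1.109):
            """Dose (Gy) at calc_point from a list of (x, y, z, dwell_time_s) dwells.

            air_kerma_strength in U (cGy cm^2 / h); dose_rate_constant in cGy/(h U).
            """
            dose_cgy = 0.0
            for x, y, z, t_s in dwells:
                r = math.dist(calc_point, (x, y, z))  # cm
                if r <= 0.0:
                    raise ValueError("calculation point coincides with a dwell position")
                # Inverse-square fall-off relative to the 1 cm reference distance.
                dose_rate = air_kerma_strength * dose_rate_constant * (1.0 / r) ** 2
                dose_cgy += dose_rate * (t_s / 3600.0)
            return dose_cgy / 100.0  # Gy

        # Example: one dwell of 300 s, 2 cm from the calculation point, 40 000 U source.
        print(point_dose((0.0, 0.0, 0.0), [(2.0, 0.0, 0.0, 300.0)], 40000.0))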

  20. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Directory of Open Access Journals (Sweden)

    Peter Zajác

    2010-07-01

    Full Text Available This work presents the verification of the food safety management system in a deep-frozen food production plant. The main emphasis is on creating a set of verification questions based on the articles of the standard STN EN ISO 22000:2006 and on assessing the effectiveness of the food safety management system. Information was acquired from scientific literature sources and points out the importance of implementing and maintaining an effective food safety management system. doi:10.5219/28

  1. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  2. A method for online verification of adapted fields using an independent dose monitor

    International Nuclear Information System (INIS)

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-01-01

    Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying to account for distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of adapted fields. The system consists of a large area ion-chamber with a spatial gradient in electrode separation to provide a spatially sensitive signal for each beam segment, mounted below the MLC, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of the measured IQM signals agreed with the predicted value to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.
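
    The field-modification rule described in the Methods section amounts to shifting every MLC-defined aperture by the in-plane setup error and scaling it for the change in distance to the marked isocentre. The sketch below shows that idea in Python; the data layout and the simple SAD-based magnification factor are assumptions for illustration, not the authors' exact implementation.

        # Illustrative aperture adaptation: translate by the setup error, then magnify.
        def adapt_aperture(leaf_edges_mm, shift_xy_mm, shift_z_mm, sad_mm=1000.0):
            """leaf_edges_mm: list of (x_left, x_right, y) leaf-pair positions at isocentre.

            shift_xy_mm: setup error in the beam's-eye-view plane.
            shift_z_mm: setup error along the beam axis (towards the source is positive).
            """
            # Magnification accounts for the changed distance to the marked isocentre.
            mag = sad_mm / (sad_mm - shift_z_mm)
            dx, dy = shift_xy_mm
            adapted = []
            for x_left, x_right, y in leaf_edges_mm:
                adapted.append(((x_left + dx) * mag, (x_right + dx) * mag, (y + dy) * mag))
            return adapted

        # Example: 3 mm lateral and 5 mm along-beam setup error.
        print(adapt_aperture([(-20.0, 20.0, 0.0), (-25.0, 25.0, 10.0)], (3.0, 0.0), 5.0))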

  3. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  4. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    International Nuclear Information System (INIS)

    Takahashi, R; Kamima, T; Tachibana, H; Baba, H; Itano, M; Yamazaki, T; Ishibashi, S; Higuchi, Y; Shimizu, H; Yamamoto, T; Yamashita, M; Sugawara, Y; Sato, A; Nishiyama, S; Kawai, D; Miyaoka, S

    2015-01-01

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle 3 (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0±3.7%, 2.0±2.5% and 6.2±4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level; however, there were systematic differences (4.0±4.0% and 2.5±5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms used by the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.
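
    The confidence limit used above is simply the mean and twice the standard deviation of the per-field percentage dose differences between the TPS and the independent calculation. A minimal example of that calculation (with invented numbers) follows.

        # Sketch of the CL = mean ± 2SD confidence-limit calculation.
        import statistics

        def confidence_limit(dose_tps, dose_independent):
            diffs = [100.0 * (ind - tps) / tps
                     for tps, ind in zip(dose_tps, dose_independent)]
            mean = statistics.mean(diffs)
            sd = statistics.stdev(diffs)
            return mean, 2.0 * sd  # report as mean ± 2SD

        tps = [2.00, 1.98, 2.05, 1.95, 2.10]          # Gy per field, made-up values
        independent = [2.02, 1.99, 2.01, 1.97, 2.06]
        mean, two_sd = confidence_limit(tps, independent)
        print(f"CL = {mean:+.1f} % ± {two_sd:.1f} %")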

  5. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, R; Kamima, T [The Cancer Institute Hospital of JFCR, Koto-ku, Tokyo (Japan); Tachibana, H; Baba, H [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Ishibashi, S; Higuchi, Y [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Yamamoto, T [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Sugawara, Y [The National Center for Global Health and Medicine, Shinjuku-ku, Tokyo (Japan); Sato, A [Itabashi Central General Hospital, Itabashi-ku, Tokyo (Japan); Nishiyama, S [Kuki General Hospital, Kuki, Saitama (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Miyaoka, S [Kamitsuga General Hospital, Kanuma, Tochigi (Japan)

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle 3 (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0±3.7%, 2.0±2.5% and 6.2±4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level; however, there were systematic differences (4.0±4.0% and 2.5±5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms used by the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.

  6. Results of the independent radiological verification survey at the former Bridgeport Brass Company Facility, Seymour, Connecticut (SSC001)

    International Nuclear Information System (INIS)

    Foley, R.D.; Rice, D.E.; Allred, J.F.; Brown, K.S.

    1995-03-01

    At the request of the USDOE, a team from ORNL conducted an independent radiological verification survey at the former Bridgeport Brass Company Facility, Seymour, Connecticut, from September 1992 to March 1993. The purpose of the survey was to determine whether residual levels of radioactivity inside the Ruffert Building and selected adjacent areas were remediated to levels below DOE guidelines for FUSRAP sites. The property was contaminated with radioactive residues of 238U from uranium processing experiments conducted by Reactive Metals, Inc., from 1962 to 1964 for the Atomic Energy Commission. A previous radiological survey did not characterize the entire floor space because equipment which could not be moved at the time made it inaccessible for radiological surveys. During the remediation process, additional areas of elevated radioactivity were discovered under stationary equipment, which required additional remediation and further verification. Results of the independent radiological verification survey confirm that, with the exception of the drain system inside the building, residual uranium contamination has been remediated to levels below DOE guidelines for unrestricted release of property at FUSRAP sites inside and outside the Ruffert Building. However, certain sections of the drain system retain uranium contamination above DOE surface guideline levels. These sections of pipe are addressed in separate, referenced documentation.

  7. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  8. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  9. Text-independent writer identification and verification using textural and allographic features.

    Science.gov (United States)

    Bulacu, Marius; Schomaker, Lambert

    2007-04-01

    The identification of a person on the basis of scanned images of handwriting is a useful biometric modality with application in forensic and historic document analysis and constitutes an exemplary study area within the research field of behavioral biometrics. We developed new and very effective techniques for automatic writer identification and verification that use probability distribution functions (PDFs) extracted from the handwriting images to characterize writer individuality. A defining property of our methods is that they are designed to be independent of the textual content of the handwritten samples. Our methods operate at two levels of analysis: the texture level and the character-shape (allograph) level. At the texture level, we use contour-based joint directional PDFs that encode orientation and curvature information to give an intimate characterization of individual handwriting style. In our analysis at the allograph level, the writer is considered to be characterized by a stochastic pattern generator of ink-trace fragments, or graphemes. The PDF of these simple shapes in a given handwriting sample is characteristic for the writer and is computed using a common shape codebook obtained by grapheme clustering. Combining multiple features (directional, grapheme, and run-length PDFs) yields increased writer identification and verification performance. The proposed methods are applicable to free-style handwriting (both cursive and isolated) and have practical feasibility, under the assumption that a few text lines of handwritten material are available in order to obtain reliable probability estimates.
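
    A toy version of the texture-level idea above is sketched below: build a probability distribution over ink-contour orientations and compare two samples with a chi-squared distance. Real systems use joint orientation/curvature PDFs, grapheme codebooks and combined features, so this only illustrates the PDF-matching principle; the synthetic "writers" are generated data.

        # Toy contour-orientation PDF and chi-squared comparison for writer matching.
        import numpy as np

        def direction_pdf(contour_points, n_bins=16):
            """contour_points: (N, 2) array of ordered (x, y) points along ink contours."""
            pts = np.asarray(contour_points, dtype=float)
            d = np.diff(pts, axis=0)
            angles = np.arctan2(d[:, 1], d[:, 0]) % np.pi   # undirected orientation
            hist, _ = np.histogram(angles, bins=n_bins, range=(0.0, np.pi))
            return hist / max(hist.sum(), 1)

        def chi2_distance(p, q, eps=1e-12):
            return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))

        # Two synthetic "writers": one favouring ~0.5 rad strokes, one ~1.0 rad.
        rng = np.random.default_rng(0)
        a = np.cumsum(rng.normal([np.cos(0.5), np.sin(0.5)], 0.2, size=(200, 2)), axis=0)
        b = np.cumsum(rng.normal([np.cos(1.0), np.sin(1.0)], 0.2, size=(200, 2)), axis=0)
        print(chi2_distance(direction_pdf(a), direction_pdf(a[:100])))  # same writer: smaller
        print(chi2_distance(direction_pdf(a), direction_pdf(b)))        # different: larger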

  10. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, R; Kamima, T [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: To investigate the effect of trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and Rapid Arc (RA) was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis shows that the HN IMRT plans were the most sensitive, with a 0.2 mm systematic error producing a 0.7% dose deviation on average. The MLC random errors did not affect the dose error. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect the MLC position error corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
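
    The error-injection test described above can be pictured as perturbing the MLC positions read from the trajectory log before recomputing dose with the independent algorithm. The sketch below shows that perturbation step and the resulting dose-deviation metric; the array layout and the toy example are assumptions, not the SMU implementation.

        # Illustrative MLC error injection into trajectory-log leaf positions.
        import numpy as np

        def inject_mlc_errors(leaf_positions_mm, systematic_mm=0.0, random_sd_mm=0.0, seed=0):
            """leaf_positions_mm: (n_control_points, n_leaves) actual positions from the log."""
            rng = np.random.default_rng(seed)
            perturbed = np.asarray(leaf_positions_mm, dtype=float) + systematic_mm
            if random_sd_mm > 0.0:
                perturbed = perturbed + rng.normal(0.0, random_sd_mm, perturbed.shape)
            return perturbed

        def dose_deviation_percent(dose_from_plan, dose_from_log):
            return 100.0 * (dose_from_log - dose_from_plan) / dose_from_plan

        # Example: apply a 0.2 mm systematic shift to a small synthetic log.
        log = np.zeros((3, 60))  # 3 control points, 60 leaves, all at 0 mm
        print(inject_mlc_errors(log, systematic_mm=0.2)[0, :5])
        print(dose_deviation_percent(2.00, 2.014))  # e.g. 0.7 % deviation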

  11. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  12. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as left radiological conditions.

  13. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the ''test program'' for pilot testing the value added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial sector ''best practice'' for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCS). A semi-structured interview process was employed by the team of three ''verifiers'' who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An ''MSV Pilot Effectiveness Survey'' also was completed by each panel participant immediately following the interview

  14. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the ''value added'' benefit of direct public involvement; (3) to evaluate the ''value added'' benefit of direct worker involvement; (4) to evaluate the ''value added'' benefit of the panel-to-panel review approach; and, (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved the conduct of two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five ''reviewers'' who directed open-ended questions to the panels which focused on: (1) evidence of management commitment, accountability, and involvement; and, (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of a panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation, and, (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  15. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
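
    As a very rough illustration of the two ingredients named above, the toy sketch below uses a random search over a fault parameter (a stand-in for the swarm search) to find the most marginal case, and then plain Monte Carlo (not the rare-event sequential variant the authors use) to estimate how often a response-time requirement is violated. The vehicle model, deadline and distributions are entirely invented and far simpler than the Orion FM system.

        # Toy search-then-verify illustration; every model detail here is made up.
        import random

        def response_time(fault_magnitude, detection_noise):
            # Larger faults are detected faster; noise models sensor/crew variability.
            return 2.0 + 10.0 / (1.0 + fault_magnitude) + detection_noise

        def violates_requirement(fault_magnitude, rng, deadline_s=8.0):
            return response_time(fault_magnitude, rng.gauss(0.0, 1.0)) > deadline_s

        def search_most_marginal(deadline_s=8.0, n_candidates=2000, seed=1):
            """Random search (a stand-in for the swarm search) for the fault whose
            nominal response time sits closest to the requirement deadline."""
            rng = random.Random(seed)
            candidates = [rng.uniform(0.1, 10.0) for _ in range(n_candidates)]
            return min(candidates, key=lambda m: abs(response_time(m, 0.0) - deadline_s))

        def estimate_violation_probability(fault_magnitude, n_samples=100_000, seed=2):
            rng = random.Random(seed)
            failures = sum(violates_requirement(fault_magnitude, rng) for _ in range(n_samples))
            p = failures / n_samples
            stderr = (p * (1.0 - p) / n_samples) ** 0.5
            return p, stderr

        marginal = search_most_marginal()
        print("most marginal fault magnitude:", round(marginal, 2))
        print("violation probability (p, stderr):", estimate_violation_probability(marginal))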

  16. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  17. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented

  18. The science and management of sex verification in sport | Tucker ...

    African Journals Online (AJOL)

    The verification of gender eligibility in sporting competition poses a biological and management challenge for sports science and medicine, as well as for sporting authorities. It has been established that in most sporting events, the strength and power advantage possessed by males as a result of the virilising action of ...

  19. Independent verification of plutonium decontamination on Johnston Atoll (1992--1996)

    International Nuclear Information System (INIS)

    Wilson-Nichols, M.J.; Wilson, J.E.; McDowell-Boyer, L.M.; Davidson, J.R.; Egidi, P.V.; Coleman, R.L.

    1998-05-01

    The Field Command, Defense Special Weapons Agency (FCDSWA) (formerly FCDNA) contracted Oak Ridge National Laboratory (ORNL) Environmental Technology Section (ETS) to conduct an independent verification (IV) of the Johnston Atoll (JA) Plutonium Decontamination Project by an interagency agreement with the US Department of Energy in 1992. The main island is contaminated with the transuranic elements plutonium and americium, and soil decontamination activities have been ongoing since 1984. FCDSWA has selected a remedy that employs a system of sorting contaminated particles from the coral/soil matrix, allowing uncontaminated soil to be reused. The objective of IV is to evaluate the effectiveness of remedial action. The IV contractor's task is to determine whether the remedial action contractor has effectively reduced contamination to levels within established criteria and whether the supporting documentation describing the remedial action is adequate. ORNL conducted four interrelated tasks from 1992 through 1996 to accomplish the IV mission. This document is a compilation and summary of those activities, in addition to a comprehensive review of the history of the project

  20. Independent verification survey report for exposure units Z2-24, Z2-31, Z2-32, AND Z2-36 in zone 2 of the East Tennessee technology park Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    King, David A. [Oak Ridge Inst. for Science and Education (ORISE), Oak Ridge, TN (United States)

    2013-10-01

    The U.S. Department of Energy (DOE) Oak Ridge Office of Environmental Management selected Oak Ridge Associated Universities (ORAU), through the Oak Ridge Institute for Science and Education (ORISE) contract, to perform independent verification (IV) at Zone 2 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. ORAU has concluded IV surveys, per the project-specific plan (PSP) (ORAU 2013a) covering exposure units (EUs) Z2-24, -31, -32, and -36. The objective of this effort was to verify the target EUs comply with requirements in the Zone 2 Record of Decision (ROD) (DOE 2005), as implemented by using the dynamic verification strategy presented in the dynamic work plan (DWP) (BJC 2007); and confirm commitments in the DWP were adequately implemented, as verified via IV surveys and soil sampling.

  1. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    Science.gov (United States)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    When designing a time management algorithm for DVEs, researchers are often made inefficient by the distraction of implementing the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable; however, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the platform incurs only a small overhead and allows researchers to focus on improving their algorithm designs.
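
    Of the three services compared above, timestamp order (TSO) is the one that requires buffering: an event may be delivered only when its timestamp does not exceed the granted logical time, and deliveries are made in timestamp order. The sketch below shows a minimal TSO queue; it is a simplification for illustration, not the HLA interface the platform implements.

        # Minimal timestamp-ordered (TSO) event delivery queue.
        import heapq

        class TsoQueue:
            def __init__(self):
                self._heap = []
                self._seq = 0  # tie-breaker keeps insertion order for equal timestamps

            def send(self, timestamp, event):
                heapq.heappush(self._heap, (timestamp, self._seq, event))
                self._seq += 1

            def grant(self, logical_time):
                """Deliver every queued event with timestamp <= the granted logical time."""
                delivered = []
                while self._heap and self._heap[0][0] <= logical_time:
                    ts, _, event = heapq.heappop(self._heap)
                    delivered.append((ts, event))
                return delivered

        q = TsoQueue()
        q.send(5.0, "update B")
        q.send(2.0, "update A")
        q.send(9.0, "update C")
        print(q.grant(6.0))   # [(2.0, 'update A'), (5.0, 'update B')]
        print(q.grant(10.0))  # [(9.0, 'update C')]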

  2. The verification of PWR-fuel code for PWR in-core fuel management

    International Nuclear Information System (INIS)

    Surian Pinem; Tagor M Sembiring; Tukiran

    2015-01-01

    In-core fuel management for a PWR is not easy because the core contains as many as 192 fuel assemblies, so there are many possibilities for the placement of fuel in the core. The configuration of fuel assemblies in the core must be precise and accurate so that the reactor operates safely and economically. It is therefore necessary to verify the PWR-FUEL code that will be used for PWR in-core fuel management. The PWR-FUEL code is based on neutron transport theory and is solved with a multi-dimensional, multi-group nodal diffusion method and a finite difference diffusion method (FDM). The goal is to check whether the program works properly, especially for core design and in-core fuel management for PWR. Verification was done with an equilibrium core search model under three conditions: boron-free, 1000 ppm boron concentration and critical boron concentration. The average fuel assembly burnup distribution and power distribution at BOC and EOC showed a consistent trend in which fuel with high power at BOC produces high burnup at EOC. For the boron-free core a high multiplication factor is obtained because of the absence of boron in the core; the effect of fission products on the core is around 3.8 %. The reactivity effect of the 1000 ppm boron solution at BOC and EOC is 6.44 % and 1.703 %, respectively. The neutron flux and power density distributions obtained with the NODAL and FDM methods give the same result. The results show that the PWR-FUEL code works properly, especially for core design and in-core fuel management for PWR. (author)
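
    The boron-worth figures quoted above come from straightforward reactivity bookkeeping: reactivity is derived from the effective multiplication factor, and a worth is the difference between two states. The snippet below shows that arithmetic with invented keff values, not the values computed by PWR-FUEL.

        # Reactivity and boron worth from effective multiplication factors.
        def reactivity_percent(keff):
            return 100.0 * (keff - 1.0) / keff

        def worth_percent(keff_without, keff_with):
            """Reactivity worth of a change (e.g. adding 1000 ppm boron)."""
            return reactivity_percent(keff_without) - reactivity_percent(keff_with)

        keff_no_boron, keff_boron = 1.0700, 1.0000   # illustrative values only
        print(round(reactivity_percent(keff_no_boron), 2), "% reactivity without boron")
        print(round(worth_percent(keff_no_boron, keff_boron), 2), "% boron worth")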

  3. Device independence for two-party cryptography and position verification with memoryless devices

    Science.gov (United States)

    Ribeiro, Jérémy; Thinh, Le Phuc; Kaniewski, Jedrzej; Helsen, Jonas; Wehner, Stephanie

    2018-06-01

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we improve the device-independent security proofs of Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004] for two-party cryptography (with memoryless devices) and we add a security proof for device-independent position verification (also memoryless devices) under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation, and, as for Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004], security can be attained for any violation of the Clauser-Holt-Shimony-Horne inequality.

  4. Management of citation verification requests for multiple projects at Sandia National Laboratories

    International Nuclear Information System (INIS)

    Crawford, C.S.

    1995-01-01

    Sandia National Laboratories' (SNL) Technical Library is now responsible for providing citation verification management support for all references cited in technical reports issued by the Nuclear Waste Management (NWM) Program. This paper describes how this process is managed for the Yucca Mountain Site Characterization Project (YMP), Waste Isolation Pilot Plant (WIPP), Idaho National Engineering Laboratory (INEL), and Greater Confinement Disposal (GCD) projects. Since technical reports are the main product of these projects, emphasis is placed on meeting the constantly evolving needs of these customers in a timely and cost-effective manner.

  5. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as to demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs
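
    A typical benchmark for this kind of code verification is one-dimensional advective-dispersive transport with a constant-concentration inlet, for which the Ogata-Banks analytical solution is available. The sketch below evaluates that solution so a numerical profile could be compared against it; it is a generic benchmark illustration, not one of the fourteen MOTIF cases specifically.

        # Ogata-Banks analytical solution for 1-D advective-dispersive transport.
        import math

        def ogata_banks(x, t, v, d, c0=1.0):
            """Relative concentration at distance x (m) and time t (s).

            v: average linear groundwater velocity (m/s); d: dispersion coefficient (m^2/s).
            """
            denom = 2.0 * math.sqrt(d * t)
            term1 = math.erfc((x - v * t) / denom)
            term2 = math.exp(v * x / d) * math.erfc((x + v * t) / denom)
            return 0.5 * c0 * (term1 + term2)

        # Concentration profile after 10 days for v = 0.1 m/day, D = 0.05 m^2/day.
        day = 86400.0
        for x in (0.5, 1.0, 1.5, 2.0):
            print(x, round(ogata_banks(x, 10 * day, 0.1 / day, 0.05 / day), 4))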

  6. Reprocessing Close Range Terrestrial and Uav Photogrammetric Projects with the Dbat Toolbox for Independent Verification and Quality Control

    Science.gov (United States)

    Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the easier capability to acquire aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess some photogrammetric projects that were processed using the commercial software Agisoft Photoscan. Several scenarios were experimented on in order to see the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects in several configurations of self-calibration setting. Results show that DBAT managed to reprocess PS projects and generate metrics which can be useful for project verification.
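
    One simple metric that independent reprocessing can provide is the RMS reprojection error recomputed from the estimated camera calibration, pose and 3-D points. The sketch below shows that check for a distortion-free pinhole camera with synthetic data; it is a generic illustration, not DBAT's own computation or report format.

        # Recompute RMS reprojection error from a bundle-adjustment result (pinhole model).
        import numpy as np

        def reproject(K, R, t, points_3d):
            """Project world points into an image (pinhole model, no distortion)."""
            cam = R @ points_3d.T + t.reshape(3, 1)   # world -> camera frame
            img = K @ cam
            return (img[:2] / img[2]).T               # perspective division

        def rms_reprojection_error(K, R, t, points_3d, observed_uv):
            residuals = reproject(K, R, t, points_3d) - observed_uv
            return float(np.sqrt(np.mean(np.sum(residuals ** 2, axis=1))))

        # Synthetic example: identity pose, 1000-pixel focal length.
        K = np.array([[1000.0, 0.0, 500.0], [0.0, 1000.0, 500.0], [0.0, 0.0, 1.0]])
        R, t = np.eye(3), np.zeros(3)
        pts = np.array([[0.1, 0.0, 5.0], [-0.2, 0.1, 6.0], [0.0, -0.1, 4.0]])
        obs = reproject(K, R, t, pts) + 0.5           # pretend observations are 0.5 px off
        print(rms_reprojection_error(K, R, t, pts, obs))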

  7. REPROCESSING CLOSE RANGE TERRESTRIAL AND UAV PHOTOGRAMMETRIC PROJECTS WITH THE DBAT TOOLBOX FOR INDEPENDENT VERIFICATION AND QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    A. Murtiyoso

    2017-11-01

    Full Text Available Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the easier capability to acquire aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess some photogrammetric projects that were processed using the commercial software Agisoft Photoscan. Several scenarios were experimented on in order to see the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects in several configurations of self-calibration setting. Results show that DBAT managed to reprocess PS projects and generate metrics which can be useful for project verification.

  8. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISM Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  9. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error that is possible in a complex, high-stress situation.

  10. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  11. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISM Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  12. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  13. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the Laws and Regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent wastes are generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is part of CLEVES, has been developed to support measurement, evaluation, and the preparation and recording of documents for clearance level verification. In addition, the evaluation results of the CDMS were validated by inputting data from actual clearance activities at the JAEA. Clearance level verification is easily applied by using the CDMS for clearance activities. (author)
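
    The core comparison behind clearance level verification is the sum-of-fractions (unity) rule: the measured activity concentration of each nuclide divided by its clearance level, summed over nuclides, must not exceed one. The sketch below shows that check; the clearance-level numbers are illustrative placeholders, not the values in the Japanese regulations or in CLEVES.

        # Sum-of-fractions check against clearance levels (placeholder values).
        def sum_of_fractions(measured_bq_per_g, clearance_levels_bq_per_g):
            total = 0.0
            for nuclide, concentration in measured_bq_per_g.items():
                total += concentration / clearance_levels_bq_per_g[nuclide]
            return total

        clearance_levels = {"Co-60": 0.1, "Cs-137": 0.1}   # Bq/g, placeholder values
        measured = {"Co-60": 0.03, "Cs-137": 0.02}         # Bq/g, example measurement

        fraction = sum_of_fractions(measured, clearance_levels)
        print(f"sum of fractions = {fraction:.2f} ->",
              "clearable" if fraction <= 1.0 else "not clearable")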

  14. Risk management for independent power projects

    International Nuclear Information System (INIS)

    Owen, J.L.

    1993-01-01

    Independent Power, where electric utilities or other bulk electric power users contract with individual electric power generation facilities to meet their projected long-term power needs, has grown dramatically over the past ten years or more. The concept of contracting with Independent Power Producers (IPPs) is not new and in fact goes back to the early formation of the electric power industry in this country and worldwide. Successful risk management is the foundation for ultimate project completion and operation in fulfilling the expectations of all parties. The primary risks associated with the development of Independent Power projects include: predicting long-term fuel availability and cost; predicting the long-term price for delivered electricity; site selection, site characteristics and permitting; innovative or evolving technology; project execution (design and construction); and lifetime O&M costs and plant reliability. This paper focuses on the risks inherent in the development of IPPs and addresses the management of these risks.

  15. Results of the independent radiological verification survey of the lower Sheffield Brook floodplain, Wayne, New Jersey

    International Nuclear Information System (INIS)

    Yalcintas, M.G.; Carrier, R.F.

    1989-05-01

    Prior to 1971, the W.R. Grace Company processed and stored radioactive materials at Wayne, New Jersey, under license to the Atomic Energy Commission. Decontamination of structures and storage of waste materials on the property at the Wayne Interim Storage Site (WISS) took place in 1974. Surveys by the State of New Jersey Department of Environmental Protection and by Oak Ridge Associated Universities for the NRC in 1982 indicated that properties adjacent to the WISS contained surface contamination by radioactive residuals in amounts exceeding those acceptable under US Department of Energy (DOE) remedial action guidelines. At the request of DOE, remedial actions have been conducted by Bechtel National, Inc., to remove radioactive residuals from properties adjacent to the site. It is the policy of DOE to assign an independent verification contractor to ensure the effectiveness of remedial actions performed within the Formerly Utilized Sites Remedial Action Program. This report describes the methods and results of those studies that were conducted by the Measurement Applications and Development Group of the Oak Ridge National Laboratory for the lower Sheffield Brook floodplain west of the WISS. Based upon post-remedial action and verification survey data, it was concluded that residual soil concentrations and gamma levels following excavation and backfilling of the area are within the limits prescribed by DOE radiological guidelines. 12 refs., 6 figs., 8 tabs

  16. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    International Nuclear Information System (INIS)

    Kawai, D; Takahashi, R; Kamima, T; Baba, H; Yamamoto, T; Kubo, Y; Ishibashi, S; Higuchi, Y; Takahashi, H; Tachibana, H

    2015-01-01

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans as described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute the radiological path length. The Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC) and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of the inhomogeneity correction. The AAA also showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.
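
    The radiological path length mentioned in the Methods section replaces physical depth with the integral of relative density along the ray from the source to the calculation point. The sketch below shows a simple uniform-step version of that integral; a CT-based program such as the one described above would do this more carefully, so treat it as an illustration only.

        # Uniform-step ray marching for a radiological (water-equivalent) depth.
        import numpy as np

        def radiological_depth(density_volume, voxel_mm, source, point, step_mm=1.0):
            """Water-equivalent depth (mm) from `source` to `point` (both in mm)."""
            source, point = np.asarray(source, float), np.asarray(point, float)
            direction = point - source
            length = np.linalg.norm(direction)
            direction /= length
            n_steps = int(length / step_mm)
            depth = 0.0
            for i in range(n_steps):
                pos = source + (i + 0.5) * step_mm * direction
                idx = tuple((pos / voxel_mm).astype(int))
                if all(0 <= idx[k] < density_volume.shape[k] for k in range(3)):
                    depth += density_volume[idx] * step_mm
            return depth

        # Example: 100 mm of tissue with a 40 mm lung-like slab (density 0.3) in the middle.
        vol = np.ones((100, 100, 100))          # 1 mm voxels, relative density 1.0
        vol[:, :, 30:70] = 0.3
        print(radiological_depth(vol, 1.0, (50, 50, 0), (50, 50, 100)))  # ~72 mm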

  17. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Takahashi, R; Kamima, T [The Cancer Institute Hospital of JFCR, Koutou-ku, Tokyo (Japan); Baba, H [The National Cancer Center Hospital East, Kashiwa-city, Chiba prefecture (Japan); Yamamoto, T; Kubo, Y [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Ishibashi, S; Higuchi, Y [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Takahashi, H [St Lukes International Hospital, Chuou-ku, Tokyo (Japan); Tachibana, H [National Cancer Center Hospital East, Kashiwa, Chiba (Japan)

    2015-06-15

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans as described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute the radiological path length. The Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC) and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of the inhomogeneity correction. The AAA also showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.

  18. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  19. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine

    International Nuclear Information System (INIS)

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (PTV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, −2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of the prescription dose) a difference in mean dose of up to +21.1% was observed between the XiO and Pinnacle calculations. A dose matrix comparison of the original and recalculated plans in the XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria. The mean and standard deviation of pixels passing
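
    The gamma analysis mentioned above combines a dose-difference criterion and a distance-to-agreement criterion into a single pass/fail index per point. The following is a minimal 1D sketch of a global gamma pass-rate calculation under 3%/3 mm criteria; the profiles and the simplified exhaustive search are illustrative assumptions and do not reproduce the study's software.

        import numpy as np

        def gamma_pass_rate(ref, evl, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
            """Global 1D gamma pass rate; dose criterion is relative to the reference maximum."""
            dose_tol = dose_crit * ref.max()
            positions = np.arange(len(ref)) * spacing_mm
            gamma = np.empty(len(ref))
            for i, d_ref in enumerate(ref):
                dist_term = (positions - positions[i]) / dist_crit_mm
                dose_term = (evl - d_ref) / dose_tol
                gamma[i] = np.sqrt(dist_term ** 2 + dose_term ** 2).min()
            return 100.0 * np.mean(gamma <= 1.0)

        # Hypothetical 1D profiles from the original and the recalculated plan
        reference = np.exp(-np.linspace(-3.0, 3.0, 121) ** 2)
        recalculated = reference * 1.02          # a uniform 2% dose difference
        print(gamma_pass_rate(reference, recalculated, spacing_mm=1.0))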

  20. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

    ...mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry dealing...

  1. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    Energy Technology Data Exchange (ETDEWEB)

    Ding, A; Han, B; Bush, K; Wang, L; Xing, L [Stanford University School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on a combined use of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: It was found that the proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: an average γ-index passing rate of 99.2 ± 0.6% (3 mm/3%), 97.4 ± 2.4% (2 mm/2%), and 72.6 ± 8.4% (1 mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.

  2. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    International Nuclear Information System (INIS)

    Ding, A; Han, B; Bush, K; Wang, L; Xing, L

    2015-01-01

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on a combined use of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: It was found that the proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: an average γ-index passing rate of 99.2 ± 0.6% (3 mm/3%), 97.4 ± 2.4% (2 mm/2%), and 72.6 ± 8.4% (1 mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients

  3. Multifractal analysis of managed and independent float exchange rates

    Science.gov (United States)

    Stošić, Darko; Stošić, Dusan; Stošić, Tatijana; Stanley, H. Eugene

    2015-06-01

    We investigate multifractal properties of daily price changes in currency rates using multifractal detrended fluctuation analysis (MF-DFA). We analyze managed and independent floating currency rates in eight countries, and determine the changes in the multifractal spectrum when transitioning between the two regimes. We find that after the transition from the managed to the independent float regime the changes in the multifractal spectrum (position of maximum and width) indicate an increase in market efficiency. The observed changes are more pronounced for developed countries that have a well-established trading market. After shuffling the series, we find that the multifractality is due to both the probability density function and long-term correlations for the managed float regime, while for the independent float regime the multifractality is in most cases caused by the broad probability density function.
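
    MF-DFA proceeds by integrating the mean-subtracted series, detrending it in windows of varying scale, and estimating how the q-th order fluctuation function scales with window size; the spread of the resulting generalized Hurst exponents h(q) gives the width of the multifractal spectrum. A minimal sketch is given below; the synthetic return series and the chosen scales and q values are illustrative assumptions, not the data or parameters of the study.

        import numpy as np

        def mfdfa(x, scales, q_values, order=1):
            """Minimal MF-DFA: returns the generalized Hurst exponent h(q) for each q."""
            x = np.asarray(x, dtype=float)
            profile = np.cumsum(x - x.mean())
            n = len(profile)
            fq = np.zeros((len(q_values), len(scales)))
            for si, s in enumerate(scales):
                variances = []
                for v in range(n // s):
                    seg = profile[v * s:(v + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
                    variances.append(np.mean((seg - trend) ** 2))
                variances = np.array(variances)
                for qi, q in enumerate(q_values):
                    if q == 0:
                        fq[qi, si] = np.exp(0.5 * np.mean(np.log(variances)))
                    else:
                        fq[qi, si] = np.mean(variances ** (q / 2.0)) ** (1.0 / q)
            # h(q) is the slope of log F_q(s) versus log s
            return [np.polyfit(np.log(scales), np.log(fq[qi]), 1)[0]
                    for qi in range(len(q_values))]

        # Hypothetical daily log-returns of a currency rate
        returns = np.random.default_rng(0).normal(size=4096)
        print(mfdfa(returns, scales=[16, 32, 64, 128, 256], q_values=[-4, -2, 2, 4]))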

  4. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provisions. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Some safeguards measures, such as environmental sampling, might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  5. Safety assessment and verification for nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2001-01-01

    verification' are used differently in different countries. The way that these terms have been used in this Safety Guide is explained in Section 2. The term 'design' as used here includes the specifications for the safe operation and management of the plant. This Safety Guide identifies the key recommendations for carrying out the safety assessment and the independent verification. It provides detailed guidance in support of IAEA, Safety of Nuclear Power Plants: Design, Safety Standards Series No. NS-R-1 (2000), particularly in the area of safety analysis. However, this does not include all the technical details which are available and reference is made to other IAEA publications on specific design issues and safety analysis methods. Specific deterministic or probabilistic safety targets or radiological limits can vary in different countries and are the responsibility of the regulatory body. This Safety Guide provides some references to targets and limits established by international organizations. Operators, and sometimes designers, may also set their own safety targets which may be more stringent than those set by the regulator or may address different aspects of safety. In some countries operators are expected to do this as part of their 'ownership' of the entire safety case. This Safety Guide does not include specific recommendations for the safety assessment of those plant systems for which dedicated Safety Guides exist. Section 2 defines the terms 'safety assessment', 'safety analysis' and 'independent verification' and outlines their relationship. Section 3 gives the key recommendations for the safety assessment of the principal and plant design requirements. Section 4 gives the key recommendations for safety analysis. It describes the identification of postulated initiating events (PIEs), which are used throughout the safety assessment including the safety analysis, the deterministic transient analysis and severe accident analysis, and the probabilistic safety analysis

  6. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  7. Methodology, results and experience of independent brachytherapy plan verifications based on DICOM standard; Implementacion, resultados y experiencia de una verificacion independiente de tratamientos de braquiterapia basada en el estandar DICOM

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando Sanchez, A.; Pardo Perez, E.; Castro Novals, J.; Casa de Julian, M. A. de la; Cabello Murillo, E.; Diaz Fuentes, R.; Molina Lopez, M. Y.

    2013-09-01

    The use of a high dose rate source together with afterloading treatment delivery in brachytherapy plans allows for dose modulation while minimizing the dose to staff. An independent verification of the data exported to the treatment station is required by local regulations (and is also a widely accepted recommendation in the international literature). We have developed an in-house code to import DICOM treatment data into an Excel spreadsheet that calculates the dose at given reference points using the TG-43 formalism of the AAPM [3-5]. It employs analytic fits of the anisotropy factor and the radial dose function for different sources. The implementation presented here merges an independent verification and a treatment printout into a single step. The use of the DICOM standard makes our code versatile and provides greater compatibility with current treatment planning systems. (Author)
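
    In the 1D (point-source) approximation of the TG-43 formalism, the dose rate at a distance r reduces to the product of the air-kerma strength, the dose-rate constant, an inverse-square geometry factor, the radial dose function g(r) and the 1D anisotropy factor. The sketch below illustrates that product; the tabulated values and the dose-rate constant are placeholders, not consensus data for any real source.

        import numpy as np

        # Hypothetical tabulated data for an HDR source; real values come from
        # published consensus data sets, not from this sketch.
        R_TAB  = np.array([0.5, 1.0, 2.0, 3.0, 5.0])       # distance, cm
        G_RAD  = np.array([1.04, 1.00, 0.97, 0.93, 0.85])  # radial dose function g(r)
        PHI_AN = np.array([0.97, 0.98, 0.99, 0.99, 0.99])  # 1D anisotropy factor

        def dose_rate_tg43_1d(r_cm, sk, dose_rate_constant):
            """TG-43 1D point-source dose rate: Sk * Lambda * (r0/r)^2 * g(r) * phi_an(r)."""
            r0 = 1.0  # reference distance, cm
            g = np.interp(r_cm, R_TAB, G_RAD)
            phi = np.interp(r_cm, R_TAB, PHI_AN)
            return sk * dose_rate_constant * (r0 / r_cm) ** 2 * g * phi

        # Example with illustrative numbers: Sk = 40000 U, Lambda = 1.108 cGy/(h*U)
        print(dose_rate_tg43_1d(2.0, sk=40000.0, dose_rate_constant=1.108), "cGy/h at 2 cm")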

  8. Stakeholder Analysis on Community Forest Management Partnership and Independent

    Directory of Open Access Journals (Sweden)

    Dimas Alfred Pasetia

    2018-03-01

    Full Text Available Timber from community forests is one of the alternative supplies needed by the wood processing industries. Partnership-based and independent community forest management can be realized in the relationship between farmers and industry. However, parts of the community forest system are represented by different stakeholders, which are interrelated within a system. This study analyzed stakeholder interests, influence and the relationships between partnership-based and independent community forest management. The study was conducted in Probolinggo District and respondents were selected using snowball sampling. There were 15 stakeholders identified as being involved in partnership-based community forest management, of which 4 were classified as key players, 2 as context setters, 5 as subjects and 5 as crowd. There were 12 stakeholders identified as being involved in independent community forest management, of which 3 were classified as key players, 1 as a context setter, 5 as subjects and 3 as crowd. The performance of each stakeholder can be controlled if the integration of relationships and rules has been established. Keywords: community forest, independent, partnership, stakeholders

  9. The role of independent inspection in verification activities

    International Nuclear Information System (INIS)

    Fuerste, W.

    1982-01-01

    Independent inspection has its origin in the early Middle Ages, when kings prescribed to textile manufacturers a quality control by authorized inspectors. Independent inspection became more important with respect to the safety of workers in the industrial era beginning in the 18th century, when the high-pressure steam engines were introduced and many accidents and explosions of steam boilers occurred. Nowadays, independent inspection is of utmost importance, because the safety of the public is endangered by industrial plants located near highly populated areas. To better understand the role of independent inspection, the paper reviews its historical development in Europe from the Middle Ages to the 20th century. Most of the independent inspection associations in European countries were established after the second half of the 19th century and began with the surveillance of steam boilers. Nowadays, organizations like the TUEVs (Technical Inspection Agencies) in the Federal Republic of Germany perform expertise and inspection on behalf of the government in various fields of industrial safety. To show how independent inspection is involved in quality assurance, the methods practised in the United States of America and the Federal Republic of Germany are compared and future trends discussed. For developing countries starting with nuclear energy, it is important to consider the information and recommendations regarding independent inspection and its regulation as laid down in a number of publications in the IAEA NUSS Programme. (author)

  10. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  11. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is the task of checking whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that identically and independently distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.
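
    Stabilizer-based verification rests on the fact that the ideal target state is the unique +1 eigenstate of a set of commuting Pauli operators, so estimating each stabilizer's expectation value from single-qubit measurements bounds the fidelity. The statevector sketch below only checks the ideal expectation values for a three-qubit GHZ state; it is an illustration of the stabilizer-test idea, not the adaptive measurement protocol of the paper.

        import numpy as np

        # Single-qubit Pauli matrices
        I2 = np.eye(2)
        X = np.array([[0.0, 1.0], [1.0, 0.0]])
        Z = np.diag([1.0, -1.0])

        def kron_all(ops):
            """Tensor product of a list of single-qubit operators."""
            out = np.array([[1.0]])
            for op in ops:
                out = np.kron(out, op)
            return out

        def stabilizer_expectations(state, stabilizers):
            """<psi|S|psi> for each stabilizer; the ideal target state gives +1 for all."""
            return [float(np.real(state.conj() @ kron_all(s) @ state)) for s in stabilizers]

        # Three-qubit GHZ state and its stabilizer generators XXX, ZZI, IZZ
        ghz = np.zeros(8)
        ghz[0] = ghz[7] = 1.0 / np.sqrt(2.0)
        print(stabilizer_expectations(ghz, [[X, X, X], [Z, Z, I2], [I2, Z, Z]]))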

  12. Defining an Inteligent Information System for Monitoring and Verification of Energy Management in Cities

    International Nuclear Information System (INIS)

    Tomsic, Z.; Gasic, I.; Lugaric, L.; Cacic, G.

    2011-01-01

    Improving the efficiency of energy consumption (EC) is a central theme of any energy policy. Improved energy efficiency (EE) meets three energy policy goals: security of supply, competitiveness and protection of the environment. Systematic energy management is a body of knowledge and skills based on an organizational structure that links people with assigned responsibilities, efficiency monitoring procedures and continuous measurement and improvement of energy efficiency. This body of knowledge must be supported by appropriate ICT for gathering, processing and disseminating data on EC, EE targets and information. The Energy Management Information System (EMIS) is a web application for monitoring and analysis of energy and water consumption in public buildings and is an indispensable tool for systematic energy management. The EMIS software tool connects the processes of gathering data on buildings and their energy consumption, monitoring consumption indicators, setting energy efficiency targets and reporting energy and water consumption savings. The project Intelligent Information System for Monitoring and Verification of Energy Management in Cities (ISEMIC) will distribute the EMIS software tool in the region (BiH, Slovenia and Serbia). The project also aims to improve the software system for utilizing EC measurements, from both smart meters and traditional measurement devices, and the subsequent data processing and analysis, in order to facilitate, upgrade and eventually replace the energy management system currently used for public buildings in Croatia. ISEMIC will enable the use of smart meters within energy management for the first time in BiH, Slovenia and Serbia, along with an analytical part that enables intelligent estimation of energy consumption based on multiple criteria. EMIS/ISEMIC will enable: continuous updating and maintenance of a database of information on buildings; continuous entry and monitoring of consumption data for all energy sources and water in buildings; calculation of

  13. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  14. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification or to evaluate, using independent software, the performance of specific functions in the code. Here, we will describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We will also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  15. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  16. Towards real-time verification of CO2 emissions

    Science.gov (United States)

    Peters, Glen P.; Le Quéré, Corinne; Andrew, Robbie M.; Canadell, Josep G.; Friedlingstein, Pierre; Ilyina, Tatiana; Jackson, Robert B.; Joos, Fortunat; Korsbakken, Jan Ivar; McKinley, Galen A.; Sitch, Stephen; Tans, Pieter

    2017-12-01

    The Paris Agreement has increased the incentive to verify reported anthropogenic carbon dioxide emissions with independent Earth system observations. Reliable verification requires a step change in our understanding of carbon cycle variability.

  17. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
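
    In the Dempster-Shafer setting described above, each module contributes a basic belief assignment over the states correct, incorrect and unknown (mass on the whole frame), and the modules' assignments are fused with Dempster's rule of combination before the final decision is made. The following sketch shows that combination step for two hypothetical modules; the mass values and the reduction to a single rule are illustrative assumptions, not the authors' implementation.

        # Frame of discernment: {correct, incorrect}; "unknown" is mass on the full frame.
        def combine_dempster(m1, m2):
            """Dempster's rule for basic belief masses keyed 'correct', 'incorrect', 'unknown'."""
            c, i, u = 'correct', 'incorrect', 'unknown'
            # Conflict: one module supports correct while the other supports incorrect
            conflict = m1[c] * m2[i] + m1[i] * m2[c]
            norm = 1.0 - conflict
            return {
                c: (m1[c] * m2[c] + m1[c] * m2[u] + m1[u] * m2[c]) / norm,
                i: (m1[i] * m2[i] + m1[i] * m2[u] + m1[u] * m2[i]) / norm,
                u: (m1[u] * m2[u]) / norm,
            }

        # Hypothetical outputs of two road-verification modules for one database object
        module_a = {'correct': 0.6, 'incorrect': 0.1, 'unknown': 0.3}
        module_b = {'correct': 0.5, 'incorrect': 0.2, 'unknown': 0.3}
        print(combine_dempster(module_a, module_b))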

  18. Bundle 13 position verification tool description and on-reactor use

    Energy Technology Data Exchange (ETDEWEB)

    Onderwater, T G [Canadian General Electric Co. Ltd., Peterborough, ON (Canada)

    1997-12-31

    To address the Power Pulse problem, Bruce B uses Gap: a comprehensive monitoring program by the station to maintain the gap between the fuel string and the upstream shield plug. The gap must be maintained within a band. The gap must not be so large as to allow excessive reactivity increases or cause high impact forces during reverse flow events. It should also not be so small as to cause crushed fuel during rapid, differential reactor/fuel string cool downs. Rapid cool downs are infrequent. The Bundle 13 Position Verification Tool (BPV tool) role is to independently measure the position of the upstream bundle of the fuel string. The measurements are made on-reactor, on-power and will allow verification of the Gap Management system's calculated fuel string position. This paper reviews the reasons for developing the BPV tool. Design issues relevant to safe operation in the fuelling machine, fuel channel and fuel handling equipment are also reviewed. Tests ensuring no adverse effects on channel pressure losses are described and actual on-reactor, on-power results are discussed. (author). 4 figs.

  19. Bundle 13 position verification tool description and on-reactor use

    International Nuclear Information System (INIS)

    Onderwater, T.G.

    1996-01-01

    To address the Power Pulse problem, Bruce B uses Gap: a comprehensive monitoring program by the station to maintain the gap between the fuel string and the upstream shield plug. The gap must be maintained within a band. The gap must not be so large as to allow excessive reactivity increases or cause high impact forces during reverse flow events. It should also not be so small as to cause crushed fuel during rapid, differential reactor/fuel string cool downs. Rapid cool downs are infrequent. The Bundle 13 Position Verification Tool (BPV tool) role is to independently measure the position of the upstream bundle of the fuel string. The measurements are made on-reactor, on-power and will allow verification of the Gap Management system's calculated fuel string position. This paper reviews the reasons for developing the BPV tool. Design issues relevant to safe operation in the fuelling machine, fuel channel and fuel handling equipment are also reviewed. Tests ensuring no adverse effects on channel pressure losses are described and actual on-reactor, on-power results are discussed. (author). 4 figs

  20. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Science.gov (United States)

    2010-10-01

    ... standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure Mode and Effects... for each product vulnerability cited by the reviewer; (4) Identification of any documentation or... not properly followed; (6) Identification of the software verification and validation procedures, as...

  1. The design and verification of probabilistic safety analysis platform NFRisk

    International Nuclear Information System (INIS)

    Hu Wenjun; Song Wei; Ren Lixia; Qian Hongtao

    2010-01-01

    To increase the technical capability in the Probabilistic Safety Analysis (PSA) field in China, it is necessary and important to study and develop an indigenous professional PSA platform. Following the principle of 'from structure simplification to modularization to production of cut sets to minimization of cut sets', the algorithms, including the simplification algorithm, the modularization algorithm, the algorithm for conversion from fault tree to binary decision diagram (BDD), the cut-set solving algorithm, the cut-set minimization algorithm, and so on, were designed and developed independently; the design of the data management and operation platform was completed in-house; and the verification and validation of the NFRisk platform, based on 3 typical fault trees, was completed independently. (authors)
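
    The chain from a fault-tree structure to minimal cut sets can be illustrated without a BDD: the cut sets of an OR gate are the union of the children's cut sets, the cut sets of an AND gate are the element-wise unions across children, and minimization removes any cut set that contains another. The sketch below shows that logic on a small hypothetical tree; production tools such as NFRisk use BDD-based algorithms for efficiency, which this sketch does not attempt.

        from itertools import product

        # A fault tree as nested tuples: ('AND'|'OR', child, child, ...) or a basic-event name.
        def cut_sets(node):
            """Return the cut sets of a fault-tree node as a set of frozensets of basic events."""
            if isinstance(node, str):                      # basic event
                return {frozenset([node])}
            op, *children = node
            child_sets = [cut_sets(c) for c in children]
            if op == 'OR':                                 # union of the children's cut sets
                return set().union(*child_sets)
            # AND: cross-product, merging one cut set taken from every child
            return {frozenset().union(*combo) for combo in product(*child_sets)}

        def minimize(sets):
            """Keep only minimal cut sets (drop any set that is a superset of another)."""
            return {s for s in sets if not any(other < s for other in sets)}

        # Hypothetical top event: (pump fails OR valve stuck) AND (power lost OR operator error)
        top = ('AND', ('OR', 'PUMP', 'VALVE'), ('OR', 'POWER', 'OPERATOR'))
        print(minimize(cut_sets(top)))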

  2. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  3. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident

  4. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  5. Independent assembly technology for DCS safety equipment in nuclear power plant

    International Nuclear Information System (INIS)

    Hao Aixia

    2014-01-01

    An independent assembly technology for identification and verification was proposed, which included a special process, a wiring process and a verification process. The safety, reliability and practicability of the proposed technology were verified through its application in the FirmSys assembly implemented by China Techenergy Co., Ltd. (author)

  6. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    International Nuclear Information System (INIS)

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-01-01

    Purpose: The use of sophisticated dose calculation procedure in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. Methods: The computational time of finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, dose distribution from each beamlet is mathematically modeled such that the sizes of beamlets to represent an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of different sized and minimal number of beamlets. In addition, the authors included the model parameters to consider MLC for its rounded edge and transmission. Results: Root mean square error (RMSE) between treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a

  7. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT.

    Science.gov (United States)

    Park, Justin C; Li, Jonathan G; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-01

    The use of sophisticated dose calculation procedure in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. The computational time of finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, dose distribution from each beamlet is mathematically modeled such that the sizes of beamlets to represent an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of different sized and minimal number of beamlets. In addition, the authors included the model parameters to consider MLC for its rounded edge and transmission. Root mean square error (RMSE) between treatment planning system and conventional FSPB on a 10 × 10 cm(2) square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm(2), where RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a volumetric modulated arc
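
    The core of any FSPB-type calculation is the superposition of per-beamlet kernel contributions over the dose grid, and the adaptive variant simply allows those beamlets to have different sizes. The sketch below superposes rectangular beamlets convolved with a Gaussian stand-in kernel; the kernel, the beamlet list and the parameters are illustrative assumptions and not the AB-FSPB model of the paper.

        import numpy as np
        from scipy.special import erf

        def fspb_dose(grid_x, grid_y, beamlets, kernel_sigma_cm=0.3):
            """Superpose separable pencil-beam kernels on a 2D dose grid.
            Each beamlet is (x_center, y_center, width, height, weight) in cm."""
            xx, yy = np.meshgrid(grid_x, grid_y)
            dose = np.zeros_like(xx)
            s = np.sqrt(2.0) * kernel_sigma_cm
            for cx, cy, w, h, wt in beamlets:
                # Rectangular beamlet fluence convolved with a Gaussian kernel,
                # expressed with error functions along each axis.
                px = 0.5 * (erf((xx - cx + w / 2) / s) - erf((xx - cx - w / 2) / s))
                py = 0.5 * (erf((yy - cy + h / 2) / s) - erf((yy - cy - h / 2) / s))
                dose += wt * px * py
            return dose

        # Two differently sized beamlets approximating an irregular MLC opening (hypothetical)
        x = np.linspace(-5.0, 5.0, 101)
        y = np.linspace(-5.0, 5.0, 101)
        field = [(-1.0, 0.0, 2.0, 4.0, 1.0), (1.5, -1.0, 3.0, 2.0, 1.0)]
        print(fspb_dose(x, y, field).max())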

  8. Advanced Test Reactor Safety Basis Upgrade Lessons Learned Relative to Design Basis Verification and Safety Basis Management

    International Nuclear Information System (INIS)

    G. L. Sharp; R. T. McCracken

    2004-01-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The reactor also provides other irradiation services such as radioisotope production. The ATR and its support facilities are located at the Test Reactor Area of the Idaho National Engineering and Environmental Laboratory (INEEL). An audit conducted by the Department of Energy's Office of Independent Oversight and Performance Assurance (DOE OA) raised concerns that design conditions at the ATR were not adequately analyzed in the safety analysis and that legacy design basis management practices had the potential to further impact safe operation of the facility.1 The concerns identified by the audit team, and issues raised during additional reviews performed by ATR safety analysts, were evaluated through the unreviewed safety question process resulting in shutdown of the ATR for more than three months while these concerns were resolved. Past management of the ATR safety basis, relative to facility design basis management and change control, led to concerns that discrepancies in the safety basis may have developed. Although not required by DOE orders or regulations, not performing design basis verification in conjunction with development of the 10 CFR 830 Subpart B upgraded safety basis allowed these potential weaknesses to be carried forward. Configuration management and a clear definition of the existing facility design basis have a direct relation to developing and maintaining a high quality safety basis which properly identifies and mitigates all hazards and postulated accident conditions. These relations and the impact of past safety basis management practices have been reviewed in order to identify lessons learned from the safety basis upgrade process and appropriate actions to resolve possible concerns with respect to the current ATR safety

  9. Tax management in strengthening financial independence of local government

    Directory of Open Access Journals (Sweden)

    Badida M.P.

    2017-05-01

    Full Text Available The article analyses the directions of tax management in the context of budgetary decentralization. The paper aims to study tax management in the system of local budget formation under the decentralization of financial resources and to analyze the current tools of tax management at the local level. In spite of a great number of studies, the issue of tax management in strengthening the financial independence of local budgets has not been studied sufficiently; therefore, to achieve the financial independence of local communities it is necessary to adopt new normative and legal acts concerning the decentralization of financial resources, taking into account the tax potential of each territorial community. The lack of financial resources raises the problem of the optimal redistribution of power between central and local authorities. The author proposes new mechanisms for solving the current problems, using the example of the local budget of the city of Uzgorod, to support budgetary decentralization. The principal idea in conducting the reform of the budgetary process must be the adoption of changes in budgetary legislation that shift the mechanisms of budgetary financing and the equalization of local budgets. The author concludes that the progressive improvement of well-being depends entirely on a well-developed concept of regional economic policy. The article emphasizes the importance of taking into account the tax potential of a territory and characterizes the directions for strengthening the financial independence of local government.

  10. Ownership balance, supervisory efficiency of independent directors and the quality of management earnings forecasts

    Directory of Open Access Journals (Sweden)

    Yunling Song

    2013-06-01

    Full Text Available In the Chinese securities market, with its characteristics of influence through personal relationships (Guanxi and underdeveloped standards of law and enforcement, can independent directors play the supervisory role expected by securities regulators? In this study we use the degree of precision and accuracy in corporate earnings forecasts as proxies for the quality of information disclosure by listed companies and examine the supervisory efficiency of independent directors with respect to information disclosure. Using data from 2007 to 2009, we find that in the absence of ownership balance, independent directors have a significant positive effect on the accuracy of management forecasts. In addition, the personal backgrounds of independent directors have specific effects on management earnings forecasts. Directors with certified public accountant (CPA expertise significantly improve the precision of management forecasts. However, directors with industrial expertise significantly reduce the precision of management forecasts. In other words, having directors with CPA expertise improves the independence of boards, but having independent directors with industrial expertise has the opposite effect.

  11. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

    .... The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  12. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  13. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  14. Validation of the implementation of IMRT with three dosimetric methods of independent verification

    International Nuclear Information System (INIS)

    Tortosa Oliver, R. A.; Chinillach ferrando, N.; Alonso Arrizabalaga, S.; Campayo Esteban, J. M.; Morales Marco, J. C.; Soler Catalan, P.; Andreu Martinez, F. J.

    2013-01-01

    TG119 provides a simple and clear framework for verifying the implementation of the IMRT technique in a radiotherapy service. The verification of the tests recommended in this document, conducted with the three dosimetric methods listed above, allows us to affirm that our Center is within the margins of tolerance considered suitable in TG119 for the clinical implementation of IMRT. (Author)

  15. Design and verification of a self-timed RAM

    DEFF Research Database (Denmark)

    Nielsen, Lars Skovby; Staunstrup, Jørgen

    1995-01-01

    This paper describes a self-timed static RAM. A single bit RAM is described in the design language SYNCHRONIZED TRANSITIONS and using the verification tools supporting this language, it is shown that the design is speed-independent. Furthermore, a transistor level implementation of the design...

  16. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  17. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through independent verification of their performance.

  18. Characterizing speed-independence of high-level designs

    DEFF Research Database (Denmark)

    Kishinevsky, Michael; Staunstrup, Jørgen

    1994-01-01

    This paper characterizes the speed-independence of high-level designs. The characterization is a condition on the design description ensuring that the behavior of the design is independent of the speeds of its components. The behavior of a circuit is modeled as a transition system, that allows data...... types, and internal as well as external non-determinism. This makes it possible to verify the speed-independence of a design without providing an explicit realization of the environment. The verification can be done mechanically. A number of experimental designs have been verified including a speed-independent...

  19. Loss of Financial Management Independence After Brain Injury: Survivors' Experiences.

    Science.gov (United States)

    Koller, Kathryn; Woods, Lindsay; Engel, Lisa; Bottari, Carolina; Dawson, Deirdre R; Nalder, Emily

    2016-01-01

    This pilot study explored the experiences of brain injury survivors after a change in financial management (FM) independence. Using a qualitative descriptive design, 6 participants with acquired brain injury were recruited from a community brain injury organization and participated in semistructured interviews. Data were analyzed using thematic analysis. Three themes emerged from the interviews: (1) trajectory of FM change, involving family members as key change agents; (2) current FM situation, involving FM strategies such as automatic deposits and restricted budgets; and (3) the struggle for control, in which survivors desired control while also accepting supports for FM. This study identifies some of the challenges brain injury survivors face in managing their finances and the adjustment associated with a loss of FM independence. Occupational therapists should be aware of clients' experiences when supporting them through a change in independence. Copyright © 2016 by the American Occupational Therapy Association, Inc.

  20. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    Science.gov (United States)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention for Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely with these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with intent of providing a common understanding of all efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.

  1. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V ampersand V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V ampersand V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V ampersand V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V ampersand V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V ampersand V Guidelines is introduced. The paper concludes with a glossary and bibliography

  2. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V ampersand V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V ampersand V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V ampersand V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V ampersand V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V ampersand V Guidelines is introduced. The paper concludes with a glossary and bibliography

  3. 78 FR 28812 - Energy Efficiency Program for Industrial Equipment: Petition of UL Verification Services Inc. for...

    Science.gov (United States)

    2013-05-16

    ... are engineers. UL today is comprised of five businesses, Product Safety, Verification Services, Life..., Director--Global Technical Research, UL Verification Services. Subscribed and sworn to before me this 20... (431.447(c)(4)) General Personnel Overview UL is a global independent safety science company with more...

  4. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V ) process for safety software of Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of DRPS V and V process are preparation of software planning documentation, verification of Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated system test plan, software safety analysis, and software configuration management. Also, SDS V and V of RPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of software integration and system integration phase. Software safety analysis at SRS phase uses Hazard Operability (HAZOP) method, at SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at implementation phase it uses FTA. Finally, software configuration management is performed using Nu-SCM (Nuclear Software Configuration Management) tool developed by KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  5. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of its “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions as well as to ensure that the equations are programmed correctly.

  6. Global climate change mitigation and sustainable forest management--The challenge of monitoring and verification

    Energy Technology Data Exchange (ETDEWEB)

    Makundi, Willy R.

    1997-12-31

    In this paper, sustainable forest management is discussed within the historical and theoretical framework of the sustainable development debate. The various criteria and indicators for sustainable forest management put forth by different institutions are critically explored. Specific types of climate change mitigation policies/projects in the forest sector are identified and examined in the light of the general criteria for sustainable forest management. Areas of compatibility and contradiction between the climate mitigation objectives and the minimum criteria for sustainable forest management are identified and discussed. Emphasis is put on the problems of monitoring and verifying carbon benefits associated with such projects given their impacts on pre-existing policy objectives on sustainable forest management. The implications of such policy interactions on assignment of carbon credits from forest projects under Joint Implementation/Activities Implemented Jointly initiatives are discussed. The paper concludes that a comprehensive monitoring and verification regime must include an impact assessment on the criteria covered under other agreements such as the Biodiversity and/or Desertification Conventions. The actual carbon credit assigned to a specific project should at least take into account the negative impacts on the criteria for sustainable forest management. The value of the impacts and/or the procedure to evaluate them need to be established by interested parties such as the Councils of the respective Conventions.

  7. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
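
    As a purely illustrative sketch of the data-reduction idea described above (not one of the three algorithms studied by the authors), a routine behind an information barrier might reduce a radiograph to a single coarse attribute and release only a yes/no verdict, never the image itself. All names and numbers below are hypothetical:

        import numpy as np

        def attribute_check(image, threshold, lo, hi):
            """Reduce a radiograph to one non-sensitive attribute and return only yes/no.

            image     -- 2-D array of detector counts (stays behind the barrier)
            threshold -- count level separating 'object' from background
            lo, hi    -- pre-agreed, non-sensitive acceptance window for the attribute
            """
            # Fraction of pixels above threshold: a single scalar with far less
            # information content than the image itself.
            fraction = float(np.mean(image > threshold))
            # Only the boolean verdict leaves the information barrier; the image
            # and even the computed fraction are discarded.
            return lo <= fraction <= hi

        # Hypothetical usage with a synthetic image
        rng = np.random.default_rng(0)
        img = rng.poisson(lam=5.0, size=(256, 256))
        print(attribute_check(img, threshold=10, lo=0.0, hi=0.2))

    Because only the boolean leaves the barrier, no sensitive image information is stored, which is the design constraint this record emphasizes.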

  8. Preparation of a program for the independent verification of the brachytherapy planning systems calculations; Confeccion de un programa para la verificacion independiente de los calculos de los sistemas de planificacion en braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

    Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-07-01

    In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are loaded automatically in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
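
    The record does not give the program's equations, but an independent brachytherapy point-dose check of this kind typically sums a TG-43-style contribution over dwell positions. The following sketch uses a point-source approximation with a typical Ir-192 dose-rate constant, ignores the radial dose and anisotropy functions, and uses invented input values; it illustrates the shape of such a check, not the published program:

        import numpy as np

        def point_dose_cGy(dwells, point, sk, lam=1.109, r0=1.0):
            """Rough TG-43 point-source re-check of a dose point (illustrative only).

            dwells -- list of (x, y, z, dwell_time_s) source positions in cm / seconds
            point  -- (x, y, z) calculation point in cm
            sk     -- air-kerma strength in U (cGy cm^2 / h)
            lam    -- dose-rate constant in cGy / (h U); 1.109 is a typical Ir-192 value
            r0     -- reference distance in cm
            """
            p = np.asarray(point, dtype=float)
            dose = 0.0
            for x, y, z, t in dwells:
                r = np.linalg.norm(p - np.array([x, y, z]))
                # Point-source approximation: inverse-square only; radial dose
                # function and anisotropy are set to 1 in this sketch.
                dose_rate = sk * lam * (r0 / r) ** 2      # cGy per hour
                dose += dose_rate * (t / 3600.0)          # dwell time converted to hours
            return dose

        # Hypothetical single-dwell example: 300 s dwell, point 2 cm away
        print(point_dose_cGy([(0.0, 0.0, 0.0, 300.0)], (2.0, 0.0, 0.0), sk=40000.0))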

  9. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations

  10. The independent Baltic states: Maritime law and resource management implications

    International Nuclear Information System (INIS)

    Canfield, J.L.

    1993-01-01

    The achievement of independence by the Baltic states impacts nearly all aspects of the maritime law and resource management regimes appertaining to the Baltic Sea. The unique position of these states, given their maritime history and role as a bridge between East and West, warrants reconsideration. The Baltic Sea basin is among the most highly industrialized shorelines in the world, accounting for approximately 15% of world industrial output, and is relatively dense in population. Large quantities of pollutants enter its waters by way of industrial, agricultural, and municipal waste. A lack of adequate sewage treatment accounts for much of the waste. The Baltic is also especially sensitive to oil pollution as the relatively cold water inhibits bacteriological breakdown. Important issues of maritime border delimitation, treaty devolution, and the potential for reinstitution of exclusionary regimes reappeared with the attainment of independence. Further, the legacy of Soviet maritime environment and resource management has engendered fundamental political, social, and economic conflicts for which resources and effective management structures are lacking. The competing requirements of economic development, reintegration into Western markets, and management of critical marine resources highlight the need for comprehensive and regionally focused approaches to the problems identified

  11. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  12. Results of the independent radiological verification survey of the remedial action performed at the former Alba Craft Laboratory site, Oxford, Ohio, (OXO001)

    International Nuclear Information System (INIS)

    Kleinhans, K.R.; Murray, M.E.; Carrier, R.F.

    1996-04-01

    Between October 1952 and February 1957, National Lead of Ohio (NLO), a primary contractor for the Atomic Energy Commission (AEC), subcontracted certain uranium machining operations to Alba Craft Laboratory, Incorporated, located at 10-14 West Rose Avenue, Oxford, Ohio. In 1992, personnel from Oak Ridge National Laboratory (ORNL) confirmed the presence of residual radioactive materials from the AEC-related operations in and around the facility in amounts exceeding the applicable Department of Energy (DOE) guidelines. Although the amount of uranium found on the property posed little health hazard if left undisturbed, the levels were sufficient to require remediation to bring radiological conditions into compliance with current guidelines, thus ensuring that the public and the environment are protected. A team from ORNL conducted a radiological verification survey of the former Alba Craft Laboratory property between December 1994 and February 1995. The survey was conducted at the request of DOE and included directly measured radiation levels, the collection and analysis of soil samples to determine concentrations of uranium and certain other radionuclides, and comparison of these data to the guidelines. This document reports the findings of this survey. The results of the independent verification survey of the former Alba Craft Laboratory property demonstrate that all contaminated areas have been remediated to radionuclide concentrations and activity levels below the applicable guideline limits set by DOE

  13. Verification for excess reactivity on beginning equilibrium core of RSG GAS

    International Nuclear Information System (INIS)

    Daddy Setyawan; Budi Rohman

    2011-01-01

    BAPETEN is the institution authorized to control the use of nuclear energy in Indonesia. Control of the use of nuclear energy is carried out through three pillars: regulation, licensing, and inspection. In order to assure the safety of operating research reactors, the assessment unit of BAPETEN carries out independent assessments to verify safety-related parameters in the SAR, including the neutronic aspect. The work includes verification of the excess reactivity in the equilibrium silicide core of the RSG GAS reactor by a computational method using MCNP-ORIGEN. The verification calculation gives 9.4 %, while the RSG-GAS safety analysis report shows that the excess reactivity of the equilibrium core of RSG GAS is 9.7 %. The verification calculation thus shows good agreement with the report. (author)

  14. Independent Verification Survey of the Clean Coral Storage Pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project

    International Nuclear Information System (INIS)

    Wilson-Nichols, M.J.; Egidi, P.V.; Roemer, E.K.; Schlosser, R.M.

    2000-01-01

    The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m3 of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is less than or equal to the accepted guideline (500 Bq/kg or 13.5 pCi/g) total transuranic (TRU) activity
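
    The sample size of 99 is consistent with a standard nonparametric tolerance argument: to demonstrate with 95% confidence that at least 97% of the pile meets the guideline, one needs the smallest n for which a pile with more than 3% failing soil would, with probability greater than 95%, produce at least one failing sample. The arithmetic below is an inference from the stated numbers, not a quotation from the Sampling and Analysis Plan:

        import math

        # Smallest n such that, if more than 3% of the pile exceeded the guideline,
        # the chance of all n samples passing would still be below 5%:
        # (0.97)**n <= 0.05  ->  n >= ln(0.05) / ln(0.97)
        p_good = 0.97     # required proportion at or below the 500 Bq/kg guideline
        conf = 0.95       # required confidence

        n = math.ceil(math.log(1.0 - conf) / math.log(p_good))
        print(n)          # 99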

  15. Simulation of operator's actions during severe accident management

    International Nuclear Information System (INIS)

    Viktorov, A.

    2015-01-01

    Implementing accident management countermeasures or actions to mitigate the consequences of a severe accident is essential to reduce radiological risks to the public and the environment. Station-specific severe accident management guidelines (SAMGs) have been developed and implemented at all Canadian nuclear power plants. Following the Fukushima Daiichi nuclear accident, certain enhancements were introduced to the SAMG, namely consideration of multi-unit accidents, events involving spent fuel pools, incorporation of the capability offered by portable emergency mitigating equipment, and so on. To evaluate the adequacy and usability of the SAMGs, CNSC staff initiated a number of activities including a desktop review of SAMG documentation, evaluation of SAMG implementation through exercises and interviews with station staff, and independent verification of SAMG action effectiveness. This paper focuses on the verification of SAMG actions through analytical simulations. The objectives of the work are twofold: (a) to understand the effectiveness of SAMG-specified mitigation actions in addressing the safety challenges and (b) to check for potential negative effects of the actions. Some sensitivity calculations were performed to help understand the impact of actions that rely on partially effective equipment or limited material resources. The severe accident computer code MAAP4-CANDU is used as a tool in this verification. This paper will describe the methodology used in the verification of SAMG actions and some results obtained from simulations. (author)

  16. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  17. VERIFICATION TESTING OF AIR POLLUTION CONTROL TECHNOLOGY QUALITY MANAGEMENT PLAN

    Science.gov (United States)

    This document is the basis for quality assurance for the Air Pollution Control Technology Verification Center (APCT Center) operated under the U.S. Environmental Protection Agency (EPA). It describes the policies, organizational structure, responsibilities, procedures, and qualit...

  18. International Space Station Atmosphere Control and Supply, Atmosphere Revitalization, and Water Recovery and Management Subsystem - Verification for Node 1

    Science.gov (United States)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 ACS, AR, and WRM design and detailed Element Verification methodologies utilized during the Qualification phase for Node 1.

  19. Results of the independent radiological verification survey at the former Associate Aircraft Tool and Manufacturing Company site, Fairfield, Ohio (FOH001)

    International Nuclear Information System (INIS)

    Rice, D.E.; Murray, M.E.; Brown, K.S.

    1996-01-01

    The former Associate Aircraft Tool and Manufacturing Company site is located at 3550 Dixie Highway, Fairfield, Ohio. Associate Aircraft Tool and Manufacturing Company produced hollow uranium slugs in a machine shop at the site in 1956. The work was performed for National Lead of Ohio in a contract with the Atomic Energy Commission to augment the capacity of the Feed Materials Production Center at Fernald in the development of nuclear energy for defense-related projects. The current occupant of the building, Force Control, operates a multipurpose machine shop. At the request of the US Department of Energy (DOE), a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at the former Associate Aircraft Tool and Manufacturing Company Site, Fairfield, Ohio. The survey was performed from February to May of 1995. The purpose of the survey was to verify that radioactivity from residues of 238 U was remediated to a level below acceptable DOE guidelines levels

  20. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  1. BEval: A Plug-in to Extend Atelier B with Current Verification Technologies

    Directory of Open Access Journals (Sweden)

    Valério Medeiros Jr.

    2014-01-01

    This paper presents BEval, an extension of Atelier B to improve automation in the verification activities in the B method or Event-B. It combines a tool for managing and verifying software projects (Atelier B) and a model checker/animator (ProB) so that the verification conditions generated in the former are evaluated with the latter. In our experiments, the two main verification strategies (manual and automatic) showed significant improvement, as ProB's evaluator proves complementary to Atelier B's built-in provers. We conducted experiments with the B model of a micro-controller instruction set; several verification conditions that we were not able to discharge automatically or manually with Atelier B's provers were automatically verified using BEval.

  2. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  3. Using Reversed MFCC and IT-EM for Automatic Speaker Verification

    Directory of Open Access Journals (Sweden)

    Sheeraz Memon

    2012-01-01

    This paper proposes a text-independent automatic speaker verification system using IMFCC (Inverse/Reverse Mel Frequency Coefficients) and IT-EM (Information Theoretic Expectation Maximization). To perform speaker verification, feature extraction using the Mel scale has been widely applied and has established good results. The IMFCC is based on an inverse Mel scale. The IMFCC effectively captures information available at the high-frequency formants which is ignored by the MFCC. In this paper the fusion of MFCC and IMFCC at the input level is proposed. GMMs (Gaussian Mixture Models) based on EM (Expectation Maximization) have been widely used for classification in text-independent verification. However, EM suffers from convergence issues. In this paper we use our proposed IT-EM, which has faster convergence, to train speaker models. IT-EM uses information theory principles such as PDE (Parzen Density Estimation) and the KL (Kullback-Leibler) divergence measure. IT-EM adapts the weights, means and covariances, like EM. However, the IT-EM process is not performed on feature vector sets but on a set of centroids obtained using an IT (Information Theoretic) metric. The IT-EM process at once diminishes the divergence measure between PDE estimates of the feature distribution within a given class and the centroid distribution within the same class. The feature-level fusion and IT-EM are tested for the task of speaker verification using NIST2001 and NIST2004. The experimental evaluation validates that MFCC/IMFCC gives better results than the conventional delta/MFCC feature set. The MFCC/IMFCC feature vector size is also much smaller than delta MFCC, thus reducing the computational burden as well. The IT-EM method also showed faster convergence than the conventional EM method, and thus leads to higher speaker recognition scores.
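
    The sketch below illustrates only the general structure of such a system — input-level feature fusion followed by GMM log-likelihood scoring — using ordinary EM from scikit-learn in place of the paper's IT-EM and a crude spectrum flip in place of the paper's IMFCC definition. Function names and parameters are hypothetical:

        import numpy as np
        import librosa
        from sklearn.mixture import GaussianMixture

        def mfcc_feats(y, sr, n_coef=13):
            # Conventional Mel-scale cepstral features (frames x coefficients).
            return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_coef).T

        def imfcc_feats(y, sr, n_coef=13):
            # Placeholder for the inverse-Mel features of the paper: the power
            # spectrum is simply flipped so high frequencies are emphasised --
            # a rough stand-in, not the authors' exact formulation.
            S = np.abs(librosa.stft(y)) ** 2
            return librosa.feature.mfcc(S=librosa.power_to_db(S[::-1, :]), n_mfcc=n_coef).T

        def fused_feats(y, sr):
            # Input-level fusion: concatenate MFCC and pseudo-IMFCC per frame.
            return np.hstack([mfcc_feats(y, sr), imfcc_feats(y, sr)])

        def train_speaker_model(y, sr, n_mix=32):
            # Ordinary EM training stands in for the IT-EM procedure of the paper.
            gmm = GaussianMixture(n_components=n_mix, covariance_type='diag', max_iter=200)
            gmm.fit(fused_feats(y, sr))
            return gmm

        def verify(gmm_claimed, gmm_background, y, sr, threshold=0.0):
            X = fused_feats(y, sr)
            # Average log-likelihood ratio of claimed speaker vs. background model.
            llr = gmm_claimed.score(X) - gmm_background.score(X)
            return llr > threshold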

  4. Monte Carlo simulations to replace film dosimetry in IMRT verification

    International Nuclear Information System (INIS)

    Goetzfried, Thomas; Trautwein, Marius; Koelbl, Oliver; Bogner, Ludwig; Rickhey, Mark

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference) with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for MC dose calculation. The total time needed for both verification procedures is comparable, however, by far less labor intensive in the case of MC simulations. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to eclipse film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. (orig.)
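
    For readers unfamiliar with the gamma metric quoted above, the sketch below shows a deliberately naive, brute-force global gamma computation with 3%/3 mm criteria on two small 2-D dose grids. It is illustrative only and is not the evaluation software used in the study:

        import numpy as np

        def gamma_map(ref, eval_, spacing_mm, dd=0.03, dta_mm=3.0):
            """Naive global gamma (brute force) for two 2-D dose grids of equal shape.

            ref, eval_  -- reference and evaluated dose arrays (Gy)
            spacing_mm  -- pixel spacing in mm (isotropic assumed)
            dd          -- dose-difference criterion relative to the reference maximum
            dta_mm      -- distance-to-agreement criterion in mm
            """
            norm = dd * ref.max()
            ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]]
            gamma = np.empty_like(ref, dtype=float)
            for i in range(ref.shape[0]):
                for j in range(ref.shape[1]):
                    dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
                    dose2 = (eval_ - ref[i, j]) ** 2
                    gamma[i, j] = np.sqrt(dist2 / dta_mm ** 2 + dose2 / norm ** 2).min()
            return gamma

        # Hypothetical check: fraction of points passing gamma <= 1
        ref = np.random.default_rng(1).uniform(1.0, 2.0, (40, 40))
        ev = ref * 1.02                      # 2% uniform over-dose
        g = gamma_map(ref, ev, spacing_mm=2.0)
        print((g <= 1.0).mean())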

  5. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region
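
    The toolbox itself is not reproduced here, but the underlying idea — reading DICOM-RT objects produced by different calculation engines and comparing them on an independent platform — can be sketched with pydicom. The file names and the 3% agreement figure below are hypothetical, and the sketch assumes both dose grids share the same geometry:

        import numpy as np
        import pydicom

        def load_rtdose(path):
            """Return the dose grid of a DICOM RTDOSE file in Gy."""
            ds = pydicom.dcmread(path)
            return ds.pixel_array * float(ds.DoseGridScaling)

        def compare(path_a, path_b, tol=0.03):
            """Fraction of voxels agreeing within tol of the maximum dose.

            Assumes both files share the same grid geometry; a real tool would resample.
            """
            a, b = load_rtdose(path_a), load_rtdose(path_b)
            diff = np.abs(a - b) / a.max()
            return float((diff <= tol).mean())

        # Hypothetical file names exported from two different dose engines
        # print(compare("plan_tps.dcm", "plan_mc.dcm"))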

  6. Monitoring, reporting and verification for national REDD + programmes: two proposals

    International Nuclear Information System (INIS)

    Herold, Martin; Skutsch, Margaret

    2011-01-01

    Different options have been suggested by Parties to the UNFCCC (United Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  7. Monitoring, reporting and verification for national REDD + programmes: two proposals

    Energy Technology Data Exchange (ETDEWEB)

    Herold, Martin [Center for Geoinformation, Department of Environmental Science, Wageningen University, Droevendaalsesteeg 3, 6708 PB Wageningen (Netherlands); Skutsch, Margaret, E-mail: martin.herold@wur.nl [Centro de Investigaciones en GeografIa Ambiental, UNAM Campus Morelia (Mexico)

    2011-01-15

    Different options have been suggested by Parties to the UNFCCC (United Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  8. Development of Independent-type Optical CT

    Science.gov (United States)

    Yamaguchi, Tatsushi; Shiozawa, Daigoro; Rokunohe, Toshiaki; Kida, Junzo; Zhang, Wei

    Optical current transformers (optical CTs) can be made much smaller and lighter than conventional electromagnetic induction transformers because of their simple structure, and they contribute to improved equipment reliability because of their excellent surge resistance. The authors consider optical CTs to be next-generation transformers and are conducting research and development of optical CTs aimed at measurement and protection applications in electric power systems. Specifically, we developed an independent-type optical CT by utilizing basic optical CT data accumulated on large-current characteristics, temperature characteristics, vibration resistance characteristics, and so on. In the performance verification, type tests complying with IEC standards, such as short-time current tests, insulation tests, and accuracy tests, showed good results. This report describes the basic principle and configuration of optical CTs. It then describes, as basic characteristics of optical CTs, the conditions and results of verification tests for the dielectric breakdown characteristics of sensor fibers, large-current characteristics, temperature characteristics, and vibration resistance characteristics. Finally, the development outline of the independent-type optical CT intended for application to all-digital substations and its type test results are described.

  9. Methodology and tools for independent verification and validation of computerized I and C systems important to safety

    International Nuclear Information System (INIS)

    Lindner, A.; Miedl, H.

    1998-01-01

    Modular software based I and C systems are state-of-the-art in industrial automation. For I and C systems important to safety in nuclear power plants, software based systems are also more and more applied. According to existing national and international guidelines and standards, the assessment of these systems calls for appropriate test methods and tools. By use of tools quality of the assessment process should be improved and expense should be limited. The paper outlines the structure of the independent verification and validation (V and V) process of the Teleperm XS system and the lessons learnt from this process. Furthermore, tools are discussed used for V and V of the Teleperm XS software. The recently developed tool VALIDATOR, dedicated to V and V of the plant specific I and C functions is described in more detail. We consider V and V of the basic software components and the system software to be required only once, but the C source codes of the plant specific functional diagrams have to be checked for each application separately. The VALIDATOR is designed to perform this task. It gives evidence of compliance of the automatically generated C source codes with the graphical design of the functional diagrams in reasonable time and with acceptable costs. The working method, performance and results of the VALIDATOR are shown by means of an actual example. (author)

  10. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, the public, the environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase 1 and 2 Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  11. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification.

  12. Development of requirements tracking and verification technology for the NPP software

    International Nuclear Information System (INIS)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-01-01

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification

  13. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  14. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  15. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    ...for networks of large size due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted to track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way granting access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method for networks that are independent to some degree, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  16. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  17. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  18. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and practice associated with it is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
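
    Continued process verification ultimately rests on routine statistical monitoring of batch data. As a generic illustration of that kind of monitoring (not an excerpt from the BPOG case study), an individuals/moving-range control chart can be computed as follows; the batch values are invented:

        import numpy as np

        def imr_limits(x):
            """Individuals/moving-range control limits for a 1-D series of batch results."""
            x = np.asarray(x, dtype=float)
            mr_bar = np.mean(np.abs(np.diff(x)))      # average moving range
            center = x.mean()
            # 2.66 = 3 / d2 for subgroups of size 2 (standard I-MR chart constant)
            return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

        def out_of_control(x):
            lcl, _, ucl = imr_limits(x)
            return [i for i, v in enumerate(x) if v < lcl or v > ucl]

        # Hypothetical batch potency results; the ninth batch drifts high
        batches = [99.0, 99.2, 98.9, 99.1, 99.0, 98.8, 99.1, 99.0, 102.5, 99.2]
        print(imr_limits(batches), out_of_control(batches))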

  19. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF 6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  20. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF{sub 6} cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  1. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units consider sulfate attack on concrete and utilize degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  2. Approach to IAEA material-balance verification at the Portsmouth Gas Centrifuge Enrichment Plant

    International Nuclear Information System (INIS)

    Gordon, D.M.; Sanborn, J.B.; Younkin, J.M.; DeVito, V.J.

    1983-01-01

    This paper describes a potential approach by which the International Atomic Energy Agency (IAEA) might verify the nuclear-material balance at the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). The strategy makes use of the attributes and variables measurement verification approach, whereby the IAEA would perform independent measurements on a randomly selected subset of the items comprising the U-235 flows and inventories at the plant. In addition, the MUF-D statistic is used as the test statistic for the detection of diversion. The paper includes descriptions of the potential verification activities, as well as calculations of: (1) attributes and variables sample sizes for the various strata, (2) standard deviations of the relevant test statistics, and (3) the detection sensitivity which the IAEA might achieve by this verification strategy at GCEP
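
    The record does not reproduce the sample-size formulas, but attributes sampling plans of this kind are commonly sized with the textbook relation n ≈ N(1 − β^(1/m)), where β is the accepted non-detection probability and m is the number of items that would have to be falsified to divert a goal quantity. The sketch below applies that relation to invented stratum numbers, purely as an illustration of the calculation type, not of the GCEP-specific values:

        import math

        def attributes_sample_size(N, m, beta=0.05):
            """Approximate attributes sample size: smallest n with (1 - n/N)**m <= beta.

            N    -- number of items in the stratum
            m    -- items that must be falsified to divert a goal quantity
            beta -- accepted non-detection probability
            """
            # Commonly cited safeguards planning relation n = N * (1 - beta**(1/m)):
            # the probability that none of the m falsified items appears in the sample
            # is then held at or below beta.
            return math.ceil(N * (1.0 - beta ** (1.0 / m)))

        # Hypothetical stratum: 600 cylinders, 20 of which would need to be
        # falsified to hide one goal quantity; 95% detection probability target.
        print(attributes_sample_size(600, 20))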

  3. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
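
    The calculation-verification step rests on grid-convergence analysis; a minimal sketch of estimating the observed order of accuracy from three systematically refined grids (the solution values below are invented) is:

        import math

        def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
            # Richardson-style estimate of the convergence order p from three grids
            # refined by a constant ratio r (coarse -> medium -> fine).
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(refinement_ratio)

        # Invented solution values on grids with spacing h, h/2 and h/4.
        print(observed_order(1.0480, 1.0122, 1.0031, refinement_ratio=2.0))  # ~2, i.e. second order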

  4. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  5. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at individual level very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  6. The study of necessity of verification-methods for Depleted Uranium

    International Nuclear Information System (INIS)

    Park, J. B.; Ahn, S. H.; Ahn, G. H.; Chung, S. T.; Shin, J. S.

    2006-01-01

    The Republic of Korea (ROK) has worked to establish a management system for depleted uranium since 2004 and has achieved some results in this field, including management software, management expertise, and a list of companies using the nuclear material. However, apart from the studies carried out by KAERI, studies of depleted uranium remain insufficient. In terms of the SSAC, more study is needed on whether depleted uranium is in fact a dangerous material and on how it could be diverted to a nuclear weapon. Depleted uranium has been controlled by item counting in the national system for small quantities of nuclear material, and there is currently no dedicated technical method for characterizing depleted uranium during on-the-spot inspections rather than at laboratory scale. This paper therefore argues for the necessity of verification methods for depleted uranium and presents the verification methods applied to depleted uranium in the national system to date.

  7. TRU waste certification and TRUPACT-II payload verification

    International Nuclear Information System (INIS)

    Hunter, E.K.; Johnson, J.E.

    1990-01-01

    The Waste Isolation Pilot Plant (WIPP) established a policy (subsequently confirmed and required by DOE Order 5820.2A, Radioactive Waste Management, September 1988) that requires each waste shipper to verify that all waste shipments meet the requirements of the Waste Acceptance Criteria (WAC) prior to being shipped. This verification provides assurance that transuranic (TRU) wastes meet the criteria while still retained in a facility where discrepancies can be immediately corrected. In this manner, problems that would arise if WAC violations were discovered at the receiver, where corrective facilities are not available, are avoided. Each Department of Energy (DOE) TRU waste facility planning to ship waste to the Waste Isolation Pilot Plant (WIPP) is required to develop and implement a specific program including Quality Assurance (QA) provisions to verify that waste is in full compliance with WIPP's WAC. This program is audited by a composite DOE and contractor audit team prior to granting the facility permission to certify waste. During interaction with the Nuclear Regulatory Commission (NRC) on payload verification for shipping in TRUPACT-II, a similar system was established by DOE. The TRUPACT-II Safety Analysis Report (SAR) contains the technical requirements and physical and chemical limits that payloads must meet (like the WAC). All shippers must plan and implement a payload control program including independent QA provisions. A similar composite audit team will conduct preshipment audits, frequent subsequent audits, and operations inspections to verify that all TRU waste shipments in TRUPACT-II meet the requirements of the Certificate of Compliance (C of C) issued by the NRC which invokes the SAR requirements. 1 fig

  8. Moving the Journey Towards Independence: Adolescents Transitioning to Successful Diabetes Self-Management.

    Science.gov (United States)

    Babler, Elizabeth; Strickland, Carolyn June

    2015-01-01

    To gain a greater understanding of adolescents' experiences living with Type 1 diabetes mellitus (T1DM) and create a theoretical paradigm. Grounded theory as described by Glaser was used. Fifteen in-depth interviews were conducted with adolescents ages 11-15 with T1DM. Symbolic interactionism is the theoretical framework for grounded theory. Data were collected, transcribed, coded, and analyzed simultaneously using constant comparative analysis, and findings were grounded in the words of participants. A theoretical model was created with the concept of "normalizing". Normalizing was defined as the ability to integrate diabetes into one's daily life to make diabetes 'part of me'. Phase four of the model, and the focus of this manuscript, was "Moving the Journey towards Independence" and included: 1) taking over care, 2) experiencing conflict with parents, and 3) realizing diabetes is hard. The major task for adolescents in this phase was separating from parents to independently manage diabetes. The normalizing task for this phase was: "taking on the burden of care". Adolescents described challenges with independent care and increased parental conflict including: fearing needles, forgetting insulin, feeling embarrassed and believing that diabetes was a burden in their life. Additionally, juggling the multiple responsibilities of home, school and work along with managing a chronic illness during adolescence is challenging. Transitioning to diabetes self-management is a challenge for adolescents. This model advances understanding of the moving processes in adolescents transitioning; additionally, hypotheses are presented that may be used for developing interventions to promote success in self-management. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Effect of strategic planning education on attitudes and perceptions of independent community pharmacy owners/managers.

    Science.gov (United States)

    Harrison, Donald L

    2007-01-01

    To assess the impact of formal education program participation on the attitudes and perceptions of independent community pharmacy owners/managers toward strategic planning. Cross-sectional study. United States; June 4-July 30, 2004. Nationwide random sample of 1,250 owners/managers of independent community pharmacies. Mailed survey. Strategic planning formal education program participation. Comprehensiveness of strategic planning. Attitudes and perceptions of owners/managers of independent community pharmacies toward strategic planning. A total of 527 (42.1%) usable questionnaires were returned. Only 124 (23.5%) respondents indicated that they participated in a formal strategic planning education program. However, of the 141 (26.85%) respondents who indicated that they had conducted strategic planning for their community pharmacy, 111 (89.5%) had participated in a formal strategic planning education program. A significant association was detected between formal education program participation and the conducting of strategic planning, and respondents' attitudes and perceptions toward strategic planning differed significantly based on program participation. Respondents who had participated in an educational program also had a significantly higher comprehensiveness of strategic planning rating than those who had not. In conclusion, there is a significant association between strategic planning education program participation and the conducting of strategic planning by owners/managers of independent community pharmacies, and those participating in such programs have significantly different attitudes and perceptions toward the conducting of strategic planning and a significantly higher comprehensiveness of strategic planning rating.

  10. Sustaining a verification regime in a nuclear weapon-free world. VERTIC research report no. 4

    International Nuclear Information System (INIS)

    Moyland, S. van

    1999-01-01

    Sustaining high levels of commitment to and enthusiasm for the verification regime in a nuclear weapon-free world (NWFW) would be a considerable challenge, but the price of failure would be high. No verification system for a complete ban on a whole class of weapons of mass destruction (WMD) has been in existence long enough to provide a precedent or the requisite experience. Nevertheless, lessons from the International Atomic Energy Agency's (IAEA) nuclear safeguards system are instructive. A potential problem over the long haul is the gradual erosion of the deterrent effect of verification that may result from the continual overlooking of minor instances of non-compliance. Flaws in the verification system must be identified and dealt with early lest they also corrode the system. To achieve this, the verification organisation's inspectors and analytical staff will need sustained support, encouragement, resources and training. In drawing attention to weaknesses, they must be supported by management and at the political level. The leaking of sensitive information, either industrial or military, by staff of the verification regime is a potential problem. 'Managed access' techniques should be constantly examined and improved. The verification organisation and states parties will need to sustain close co-operation with the nuclear and related industries. Frequent review mechanisms must be established. States must invest time and effort to make them effective. Another potential problem is the withering of resources for sustained verification. Verification organisations tend to be pressured by states to cut or at least cap costs, even if the verification workload increases. The verification system must be as effective as knowledge and experience allow. The organisation will need continuously to update its scientific methods and technology. This requires in-house resources plus external research and development (R and D). Universities, laboratories and industry need incentives to

  11. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan
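
    A minimal sketch of the kind of point-by-point comparison described, checking predicted against measured doses along the spiral against a 5% clinical tolerance; the dose values are invented placeholders, not the study data.

        import numpy as np

        def dose_agreement(predicted, measured, prescription=1.8, tolerance=0.05):
            # Per-point dose differences normalised to the prescription dose,
            # and a pass/fail flag against the clinical tolerance.
            diff = (np.asarray(measured, float) - np.asarray(predicted, float)) / prescription
            return diff, np.abs(diff) <= tolerance

        predicted = [1.78, 1.62, 0.95, 0.42, 1.80]   # Gy, invented points along the spiral
        measured  = [1.80, 1.58, 0.99, 0.48, 1.76]   # Gy, invented film/TLD readings
        diff, ok = dose_agreement(predicted, measured)
        for d, passed in zip(diff, ok):
            print(f"{d:+.1%} of prescription -> {'pass' if passed else 'fail'}")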

  12. Leveraging Independent Management and Chief Engineer Hierarchy: Vertically and Horizontally-Derived Technical Authority Value

    Science.gov (United States)

    Barley, Bryan; Newhouse, Marilyn

    2012-01-01

    In the development of complex spacecraft missions, project management authority is usually extended hierarchically from NASA's highest agency levels down to the implementing institution's project team level, through both the center and the program. In parallel with management authority, NASA utilizes a complementary, but independent, hierarchy of technical authority (TA) that extends from the agency level to the project, again, through both the center and the program. The chief engineers (CEs) who serve in this technical authority capacity oversee and report on the technical status and ensure sound engineering practices, controls, and management of the projects and programs. At the lowest level, implementing institutions assign project CEs to technically engage projects, lead development teams, and ensure sound technical principles, processes, and issue resolution. At the middle level, programs and centers independently use CEs to ensure the technical success of their projects and programs. At the agency level, NASA's mission directorate CEs maintain technical cognizance over every program and project in their directorate and advise directorate management on the technical, cost, schedule, and programmatic health of each. As part of this vertically-extended CE team, a program level CE manages a continually varying balance between penetration depth and breadth across his or her assigned missions. Teamwork issues and information integration become critical for management at all levels to ensure value-added use of both the synergy available between CEs at the various agency levels, and the independence of the technical authority at each organization.

  13. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  14. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    International Nuclear Information System (INIS)

    Weaver, Phyllis C.

    2012-01-01

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse

  15. A saddle-point for data verification and materials accountancy to control nuclear material

    International Nuclear Information System (INIS)

    Beedgen, R.

    1983-01-01

    Materials accountancy is one of the main elements in international safeguards to determine whether or not nuclear material has been diverted in nuclear plants. The inspector makes independent measurements to verify the plant-operator's data before closing the materials balance with the operator's data. All inspection statements are in principle probability statements because of random errors in measuring the material and verification on a random sampling basis. Statistical test procedures help the inspector to decide under this uncertainty. In this paper a statistical test procedure representing a saddle-point is presented that leads to the highest guaranteed detection probability taking all concealing strategies into account. There are arguments favoring a separate statistical evaluation of data verification and materials accountancy. Following these considerations, a bivariate test procedure is explained that evaluates verification and accountancy separately. (orig.) [de
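
    To illustrate the idea of evaluating data verification and materials accountancy separately, the sketch below allocates an overall false-alarm probability between the two tests under normality assumptions; the standard deviations and diversion amount are invented, and this is only an illustration, not the saddle-point procedure of the paper itself.

        from scipy.stats import norm

        def thresholds(alpha_total, sigma_d, sigma_muf):
            # Split the overall false-alarm probability equally between the
            # data-verification (D) test and the materials-accountancy (MUF) test.
            alpha_each = 1.0 - (1.0 - alpha_total) ** 0.5
            k = norm.ppf(1.0 - alpha_each)
            return k * sigma_d, k * sigma_muf

        def detection_probability(shift, threshold, sigma):
            # Probability that a normally distributed statistic shifted by `shift`
            # exceeds its threshold.
            return 1.0 - norm.cdf((threshold - shift) / sigma)

        t_d, t_muf = thresholds(alpha_total=0.05, sigma_d=2.0, sigma_muf=3.0)   # invented sigmas (kg)
        print(t_d, t_muf, detection_probability(8.0, t_muf, 3.0))               # invented 8 kg diversion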

  16. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

    When a decommissioning plan for facilities such as nuclear fuel cycle facilities and small-scale research reactors is examined, it is necessary to select the technology and the work procedure, and to optimize the indices for dismantling the facility (such as radiation dose, cost, amount of waste, number of workers, and duration of the work). In our waste management section, development of the decommissioning management system DECMAN, which supports preparation of the decommissioning plan, is under way. DECMAN automatically calculates these indices from the facility data and the dismantling method. This paper describes the porting of the program to a personal computer and system verification against actual work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Added functions: 1) equipment arrangement mapping, 2) evaluation of radiation dose using the air dose rate, 3) data input/output using EXCEL (software). (2) Comparison of work amount between calculated and actual values: the calculated value was 222.67 man-hours against an actual value of 249.40 man-hours in the old JWTF evaluation. (3) The amount of accompanying work can be forecast by multiplying the calculated value by a coefficient. (4) A new approach to estimating the amount of work was constructed using the DECMAN calculation. (author)
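
    The coefficient mentioned in point (3) can be illustrated with a short sketch that derives it from the calculated and actual man-hours quoted above and applies it to another, hypothetical, job.

        calculated = 222.67    # DECMAN calculation for the old JWTF dissolver (man-hours)
        actual = 249.40        # recorded result for the same work (man-hours)
        coefficient = actual / calculated
        print(f"correction coefficient: {coefficient:.3f}")

        next_job = 180.0       # hypothetical DECMAN calculation for another job
        print(f"forecast including accompanying work: {coefficient * next_job:.1f} man-hours")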

  17. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Science.gov (United States)

    2011-09-02

    ... other technological collection techniques or other forms of information technology, e.g., permitting... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...

  18. Evaluation of verification methods for input-accountability measurements

    International Nuclear Information System (INIS)

    Maeck, W.J.

    1980-01-01

    As part of TASTEX-related programs, two independent methods have been evaluated for the purpose of providing verification of the amount of Pu charged to the head-end of a nuclear fuel processing plant. The first is the Pu/U (gravimetric) method, TASTEX Task-L, and the second is the Tracer Method, designated Task-M. Summaries of the basic technology, results of various studies under actual plant conditions, and future requirements are given for each of the tasks

  19. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  20. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  1. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  2. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper a prototype of Requirement Tracking and Verification System(RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for RTVS were developed. 4 refs., 3 figs. (Author)

  3. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of Requirement Tracking and Verification System(RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for RTVS were developed. 4 refs., 3 figs. (Author)

  4. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    Science.gov (United States)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  5. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    Science.gov (United States)

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874
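
    A minimal sketch of the kind of logit analysis described, fitted to simulated data rather than the study's store visits; the group labels, pass rates, and number of attempts below are invented, and the statsmodels package is assumed to be available.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        rows = []
        # Simulated pass rates per group and period (invented, not the study data).
        for group, p_post in [("control", 0.70), ("monthly", 0.85), ("twice_monthly", 0.80)]:
            for period, p in [("baseline", 0.65), ("post", p_post)]:
                for _ in range(150):
                    rows.append({"group": group, "period": period, "verified": int(rng.random() < p)})
        df = pd.DataFrame(rows)

        # The interaction term tests whether each treatment group improved relative to control.
        model = smf.logit("verified ~ C(group) * C(period)", data=df).fit(disp=False)
        print(model.summary())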

  6. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    Science.gov (United States)

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  7. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  8. How do we promote independent ostomy management for people with disability?

    Science.gov (United States)

    Zeigler, Mary

    2012-01-01

    To describe how people with ostomies became independent with their ostomy management. Two clinical ostomy consultations which incorporated traditional ostomy management along with state-of-the-art ostomy technology aimed at the needs of the disabled person. The newer technologies eliminated several of the traditional steps and made a one-handed approach possible. This article describes two clinical consultations which incorporated traditional ostomy management along with state-of-the-art ostomy technology aimed at the needs of the disabled person. Peristomal skin preparation and protection, barrier wafer preparation, proper pouch emptying, and newer technology are discussed. The consultations resulted in ostomy care autonomy and an improved quality of life in both patients. © 2012 Association of Rehabilitation Nurses.

  9. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V ampersand V) of nuclear power plant control system software: (a) use risk management to decide what and how much V ampersand V is needed; (b) classify each software application using a scheme that reflects what type and how much V ampersand V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  10. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
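
    A rough sketch of the one-at-a-time sensitivity analysis described, applied to a deliberately simplified and entirely hypothetical levelized-cost model rather than the actual pathway analysis.

        def levelized_cost(capital, capacity_factor, feedstock, efficiency):
            # Deliberately simplified $/kg figure: annualised capital plus feedstock per kg.
            annual_kg = 1.0e6 * capacity_factor
            return 0.1 * capital / annual_kg + feedstock / efficiency

        baseline = {"capital": 5.0e6, "capacity_factor": 0.85, "feedstock": 1.20, "efficiency": 0.60}
        base = levelized_cost(**baseline)

        for name in baseline:
            for factor in (0.8, 1.2):                          # +/-20 % perturbation
                perturbed = dict(baseline, **{name: baseline[name] * factor})
                print(f"{name} x{factor}: {levelized_cost(**perturbed) - base:+.3f} $/kg")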

  11. Taiwan Power Company's power distribution analysis and fuel thermal margin verification methods for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, P.H.

    1995-01-01

    Taiwan Power Company's (TPC's) power distribution analysis and fuel thermal margin verification methods for pressurized water reactors (PWRs) are examined. The TPC and the Institute of Nuclear Energy Research started a joint 5-yr project in 1989 to establish independent capabilities to perform reload design and transient analysis utilizing state-of-the-art computer programs. As part of the effort, these methods were developed to allow TPC to independently perform verifications of the local power density and departure from nucleate boiling design bases, which are required by the reload safety evaluation for the Maanshan PWR plant. The computer codes utilized were extensively validated for the intended applications. Sample calculations were performed for up to six reload cycles of the Maanshan plant, and the results were found to be quite consistent with the vendor's calculational results

  12. 77 FR 67737 - Proposed Information Collection (Student Verification of Enrollment) Activity: Comment Request

    Science.gov (United States)

    2012-11-13

    ...: Submit written comments on the collection of information through Federal Docket Management System (FDMS.... 3501--3521), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for... and verification of the student's continued enrollment in courses leading to a standard college degree...

  13. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  14. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  15. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H [University Medical Center Mannheim, University of Heidelberg, Mannheim, Baden-Wuerttemberg (Germany)

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and TPS. To investigate the sensitivity of the new device, different types of systematic and random errors for leaf positions and linac output were introduced in IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan verification
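
    For illustration, a brute-force global 2D gamma evaluation of the kind reported above can be sketched as follows; the dose planes are toy data and the 3%/3 mm criteria are an assumption for the example, not necessarily those used in the study.

        import numpy as np

        def gamma_pass_rate(reference, evaluated, spacing_mm, dose_crit=0.03, dist_mm=3.0):
            # Fraction of points with gamma <= 1, global normalisation to the maximum dose.
            ny, nx = reference.shape
            y, x = np.meshgrid(np.arange(ny) * spacing_mm, np.arange(nx) * spacing_mm, indexing="ij")
            d_max = reference.max()
            gammas = np.empty_like(evaluated, dtype=float)
            for i in range(ny):
                for j in range(nx):
                    dist2 = ((y - y[i, j]) ** 2 + (x - x[i, j]) ** 2) / dist_mm ** 2
                    dose2 = ((reference - evaluated[i, j]) / (dose_crit * d_max)) ** 2
                    gammas[i, j] = np.sqrt((dist2 + dose2).min())
            return float((gammas <= 1.0).mean())

        rng = np.random.default_rng(1)
        ref = np.outer(np.hanning(40), np.hanning(40)) * 2.0             # Gy, toy dose plane
        evl = ref * (1.0 + 0.01 * rng.standard_normal(ref.shape))        # toy "reconstructed" plane
        print(gamma_pass_rate(ref, evl, spacing_mm=2.0))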

  16. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    International Nuclear Information System (INIS)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H

    2015-01-01

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and TPS. To investigate the sensitivity of the new device, different types of systematic and random errors for leaf positions and linac output were introduced in IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan verification

  17. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
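
    A minimal sketch of likelihood-ratio scoring for fixed-length feature vectors, with diagonal-Gaussian user and background models and simulated data; the models, dimensions, and threshold are illustrative assumptions, not those of the paper.

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(2)
        dim = 8

        # Background (world) model and one enrolled-user model, both Gaussian.
        background = multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim))
        user = multivariate_normal(mean=rng.normal(0.0, 1.0, dim), cov=0.4 * np.eye(dim))

        def log_likelihood_ratio(x):
            # log p(x | claimed user) - log p(x | background)
            return float(user.logpdf(x) - background.logpdf(x))

        genuine = user.rvs(random_state=3)
        impostor = background.rvs(random_state=4)
        for label, x in [("genuine", genuine), ("impostor", impostor)]:
            score = log_likelihood_ratio(x)
            print(label, f"{score:+.2f}", "accept" if score > 0.0 else "reject")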

  18. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  19. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
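
    The requirement-to-event structure described above can be sketched in plain data classes; the class names and example entries are illustrative placeholders, not actual LSST artifacts or the SysML model itself.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class VerificationActivity:
            name: str
            method: str          # e.g. "Test", "Analysis", "Inspection", "Demonstration"

        @dataclass
        class VerificationPlan:
            requirement_id: str
            verification_requirement: str
            success_criteria: str
            level: str
            owner: str
            activities: List[VerificationActivity] = field(default_factory=list)

        @dataclass
        class VerificationEvent:
            name: str
            activities: List[VerificationActivity] = field(default_factory=list)

        # Invented example entries.
        plan = VerificationPlan("REQ-0042", "Verify tracking jitter under nominal wind load",
                                "Jitter below the specified budget in 95 % of samples",
                                "System", "Systems Engineering",
                                [VerificationActivity("Jitter test on sky", "Test")])
        event = VerificationEvent("Early commissioning run", plan.activities)
        print(plan.requirement_id, "->", event.name)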

  20. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user friendly system based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives, include the following: identify system boundaries or scope for drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  1. Device-Independent Certification of a Nonprojective Qubit Measurement

    Science.gov (United States)

    Gómez, Esteban S.; Gómez, Santiago; González, Pablo; Cañas, Gustavo; Barra, Johanna F.; Delgado, Aldo; Xavier, Guilherme B.; Cabello, Adán; Kleinmann, Matthias; Vértesi, Tamás; Lima, Gustavo

    2016-12-01

    Quantum measurements on a two-level system can have more than two independent outcomes, and in this case, the measurement cannot be projective. Measurements of this general type are essential to an operational approach to quantum theory, but so far, the nonprojective character of a measurement can only be verified experimentally by already assuming a specific quantum model of parts of the experimental setup. Here, we overcome this restriction by using a device-independent approach. In an experiment on pairs of polarization-entangled photonic qubits we violate by more than 8 standard deviations a Bell-like correlation inequality that is valid for all sets of two-outcome measurements in any dimension. We combine this with a device-independent verification that the system is best described by two qubits, which therefore constitutes the first device-independent certification of a nonprojective quantum measurement.
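
    To illustrate how such a violation is quantified in standard deviations, the sketch below uses the familiar CHSH combination purely as a stand-in for the dimension-independent inequality actually tested in the paper, with invented coincidence counts.

        import math

        def correlator(n_pp, n_pm, n_mp, n_mm):
            # E = (N++ + N-- - N+- - N-+) / N, with its standard error.
            n = n_pp + n_pm + n_mp + n_mm
            e = (n_pp + n_mm - n_pm - n_mp) / n
            return e, math.sqrt((1.0 - e ** 2) / n)

        # Invented coincidence counts for four measurement settings.
        settings = [(4300, 700, 650, 4350), (4200, 750, 800, 4250),
                    (4100, 820, 790, 4290), (760, 4180, 4230, 830)]
        values = [correlator(*c) for c in settings]
        s = values[0][0] + values[1][0] + values[2][0] - values[3][0]
        sigma = math.sqrt(sum(err ** 2 for _, err in values))
        print(f"S = {s:.3f}, exceeds the classical bound 2 by {(s - 2.0) / sigma:.0f} standard deviations")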

  2. Performance characteristics of an independent dose verification program for helical tomotherapy

    Directory of Open Access Journals (Sweden)

    Isaac C. F. Chang

    2017-01-01

    Helical tomotherapy with its advanced method of intensity-modulated radiation therapy delivery has been used clinically for over 20 years. The standard delivery quality assurance procedure to measure the accuracy of delivered radiation dose from each treatment plan to a phantom is time-consuming. RadCalc®, a radiotherapy dose verification software package, has released a module for tomotherapy plan dose calculations specifically for beta testing. RadCalc®'s accuracy for tomotherapy dose calculations was evaluated through examination of point doses in ten lung and ten prostate clinical plans. Doses calculated by the TomoHDA™ tomotherapy treatment planning system were used as the baseline. For lung cases, RadCalc® overestimated point doses in the lung by an average of 13%. Doses within the spinal cord and esophagus were overestimated by 10%. Prostate plans showed better agreement, with overestimations of 6% in the prostate, bladder, and rectum. The systematic overestimation likely resulted from limitations of the pencil beam dose calculation algorithm implemented by RadCalc®. Limitations were more severe in areas of greater inhomogeneity and less prominent in regions of homogeneity with densities closer to 1 g/cm3. Recommendations for RadCalc® dose calculation algorithms and anatomical representation were provided based on the results of the study.

  3. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  4. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98
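
    The Requirement Verification Matrix can be pictured as a simple mapping from requirement identifiers to test procedures together with a coverage check; the identifiers and procedure names below are invented placeholders, not entries from Appendix A.

        requirements = ["TRD-001", "TRD-002", "TRD-003", "TRD-004"]   # invented identifiers
        rvm = {
            "TRD-001": ["TP-01"],
            "TRD-002": ["TP-01", "TP-03"],
            "TRD-003": ["TP-02"],
            # TRD-004 intentionally unmapped to show the coverage report.
        }
        uncovered = [r for r in requirements if not rvm.get(r)]
        print(f"{len(requirements) - len(uncovered)} of {len(requirements)} requirements verified")
        for r in uncovered:
            print("not yet covered:", r)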

  5. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Science.gov (United States)

    2010-10-01

    ... mobile radio services (CMRS) providers shall be excluded from the verification requirements of this part... not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or the...

  6. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are a preparation of the software planning documentation, a verification of the software according to the software life cycle, a software safety analysis and a software configuration management. The verification works for the Software Requirement Specification (SRS) of the DRPS consist of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and preparing a test plan and procedure. Especially, the SRS is specified by the formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  7. Analysis of human scream and its impact on text-independent speaker verification.

    Science.gov (United States)

    Hansen, John H L; Nandwana, Mahesh Kumar; Shokouhi, Navid

    2017-04-01

    Scream is defined as sustained, high-energy vocalizations that lack phonological structure. Lack of phonological structure is how scream is identified from other forms of loud vocalization, such as "yell." This study investigates the acoustic aspects of screams and addresses those that are known to prevent standard speaker identification systems from recognizing the identity of screaming speakers. It is well established that speaker variability due to changes in vocal effort and Lombard effect contribute to degraded performance in automatic speech systems (i.e., speech recognition, speaker identification, diarization, etc.). However, previous research in the general area of speaker variability has concentrated on human speech production, whereas less is known about non-speech vocalizations. The UT-NonSpeech corpus is developed here to investigate speaker verification from scream samples. This study considers a detailed analysis in terms of fundamental frequency, spectral peak shift, frame energy distribution, and spectral tilt. It is shown that traditional speaker recognition based on the Gaussian mixture models-universal background model framework is unreliable when evaluated with screams.
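
    A minimal sketch of GMM-UBM style scoring of the kind referred to above, using random stand-ins for acoustic feature vectors and the scikit-learn GaussianMixture class; a real system would extract MFCC-like features and MAP-adapt the speaker model from the UBM, which is omitted here.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        dim = 20
        ubm_train = rng.normal(0.0, 1.0, size=(4000, dim))       # stand-in for background speech features
        spk_train = rng.normal(0.5, 1.0, size=(600, dim))        # stand-in for enrolment features

        ubm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(ubm_train)
        spk = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(spk_train)

        def llr(features):
            # Average per-frame log-likelihood ratio of speaker model vs. background model.
            return float(np.mean(spk.score_samples(features) - ubm.score_samples(features)))

        print("genuine:", llr(rng.normal(0.5, 1.0, size=(300, dim))))
        print("impostor:", llr(rng.normal(0.0, 1.0, size=(300, dim))))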

  8. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a ''Non-destructive Assay Fingerprint'' to preserve verification knowledge on the contents of each cylinder throughout its life in the facility. As IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  9. Robustness and device independence of verifiable blind quantum computing

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-01-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456–60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant. (paper)

  10. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas which are currently supported by a portable Java based GUI, a Microsoft Visual Studio GUI, and an Eclipse based GUI whose development is in progress.

  11. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. The In-house Excel Spreadsheet based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version-6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the In-house Spreadsheet based MUVC program and the Diamond SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the In-house Excel Spreadsheet based MUVC program and the Diamond SCS, respectively. The 26 clinically approved VMAT plans with isocentre at a region below -350 HU showed higher variations for both the In-house Spreadsheet based MUVC program and the Diamond SCS. It can be concluded that for patient specific quality assurance (QA), the In-house Excel Spreadsheet based MUVC program and the Diamond SCS can be used as a simple and fast accompaniment to measurement based verification for plans with isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
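    A secondary MU check of this kind is usually summarized as the percentage deviation of the independent calculation from the TPS value, reported per plan and as a mean with a standard deviation. The short sketch below illustrates that bookkeeping only; the plan values are hypothetical and are not data from the study.

```python
import numpy as np


def percent_deviation(mu_tps, mu_independent):
    """Per-plan percentage deviation of the independent MU check from the TPS.
    A positive value means the independent calculation is higher than the TPS.
    Inputs are arrays of monitor units (or point doses), one entry per plan."""
    mu_tps = np.asarray(mu_tps, dtype=float)
    mu_independent = np.asarray(mu_independent, dtype=float)
    return 100.0 * (mu_independent - mu_tps) / mu_tps


# Hypothetical example: three VMAT plans checked against the TPS values.
tps = [512.0, 498.0, 605.0]
check = [508.5, 503.1, 601.2]
dev = percent_deviation(tps, check)
print("per-plan deviation [%]:", np.round(dev, 2))
print("mean +/- SD [%]: %.2f +/- %.2f" % (dev.mean(), dev.std(ddof=1)))
```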

  12. Methodology and applicability of a safety and demonstration concept for a HAW final repository on clays. Safety concept and verification strategy

    International Nuclear Information System (INIS)

    Ruebel, Andre; Meleshyn, Artur

    2014-08-01

    The report describes the site-independent framework for a safety concept and verification strategy for a final repository for heat-generating wastes in clay rock. The safety concept summarizes the planning specifications and technical measures that are intended to allow a safe inclusion of the radionuclides in the host rock. The verification strategy defines the systematic procedures for developing the fundamentals and scenarios that serve as the basis for the demonstration of the safety case and allow a prognosis of its appropriateness. The report includes the boundary conditions, the safety concept for the post-closure phase and the verification strategy for the post-closure phase.

  13. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they have been considered for the acceptance framework to be provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), Formal Verification and Theorem Proving, and Automated Testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; a New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  14. Verification of operating software for cooperative monitoring applications

    International Nuclear Information System (INIS)

    Tolk, K.M.; Rembold, R.K.

    1997-01-01

    Monitoring agencies often use computer based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for this data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected in order to be assured that only data that is allowed under the agreement is being collected. A description of a method to provide this verification using keyed hash functions and how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks, is presented. The use of public key data authentication for this purpose is also discussed.
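    A minimal sketch of verifying deployed monitoring software with a keyed hash is shown below, using HMAC-SHA256 from the Python standard library. It illustrates the general idea described in the abstract rather than the authors' method; key handling and the public-key authentication variant mentioned at the end are out of scope.

```python
import hashlib
import hmac


def hmac_of_file(path, key, chunk=65536):
    """Compute an HMAC-SHA256 over a software image so a party holding the
    key can later confirm that the file has not been altered."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            mac.update(block)
    return mac.hexdigest()


def verify_software(path, key, expected_hexdigest):
    """Return True only if the file's keyed hash matches the expected value.
    compare_digest performs a constant-time comparison."""
    return hmac.compare_digest(hmac_of_file(path, key), expected_hexdigest)
```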

  15. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation and the hailstone size are included in the crowd-sourced data and are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), in use at MeteoSwiss, are compared with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
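    The categorical verification step described above amounts to filling a 2x2 contingency table from paired binary sequences and deriving scores such as the hit rate. The sketch below illustrates that calculation with made-up report/radar sequences, not the MeteoSwiss data.

```python
import numpy as np


def contingency_scores(observed, forecast):
    """Build the 2x2 contingency table from binary 'hail observed' (crowd reports)
    and 'hail detected' (radar algorithm) sequences, then derive common
    categorical verification scores."""
    observed = np.asarray(observed, dtype=bool)
    forecast = np.asarray(forecast, dtype=bool)
    hits = np.sum(forecast & observed)
    false_alarms = np.sum(forecast & ~observed)
    misses = np.sum(~forecast & observed)
    correct_negatives = np.sum(~forecast & ~observed)
    pod = hits / (hits + misses)                 # probability of detection (hit rate)
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return {"hits": int(hits), "misses": int(misses),
            "false_alarms": int(false_alarms),
            "correct_negatives": int(correct_negatives),
            "POD": pod, "FAR": far, "CSI": csi}


# Hypothetical example: 1 = hail, 0 = no hail, per grid cell / time window.
obs = [1, 0, 1, 1, 0, 0, 1, 0]
radar = [1, 0, 0, 1, 1, 0, 1, 0]
print(contingency_scores(obs, radar))
```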

  16. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    Science.gov (United States)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of the climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  17. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  18. Verification and validation of COBRA-SFS transient analysis capability

    International Nuclear Information System (INIS)

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 2.0 and 3.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0
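    Verification against analytical solutions, as in Section 3.0, generally means running the code on a problem with a closed-form answer and quantifying the discrepancy. The sketch below shows the pattern on a simple lumped-capacitance cooling transient; it is a generic illustration with arbitrary parameter values, not a COBRA-SFS model or test case.

```python
import numpy as np

# Compare a numerical time integration of a lumped-capacitance cooling problem
# against its analytical solution (illustrative values only).
tau = 120.0      # time constant, s
T_inf = 300.0    # coolant temperature, K
T0 = 600.0       # initial temperature, K
dt = 1.0         # time step, s
t_end = 600.0

times = np.arange(0.0, t_end + dt, dt)
analytical = T_inf + (T0 - T_inf) * np.exp(-times / tau)

# Explicit Euler integration of dT/dt = -(T - T_inf) / tau
numerical = np.empty_like(times)
numerical[0] = T0
for i in range(1, times.size):
    numerical[i] = numerical[i - 1] - dt * (numerical[i - 1] - T_inf) / tau

# Error relative to the initial temperature difference.
rel_err = np.max(np.abs(numerical - analytical)) / (T0 - T_inf)
print("max relative error over the transient: %.2e" % rel_err)
```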

  19. 78 FR 6852 - Agency Information Collection (Student Verification of Enrollment) Activity Under OMB Review

    Science.gov (United States)

    2013-01-31

    ... Management and Budget (OMB) for review and comment. The PRA submission describes the nature of the... verification of the student's continued enrollment in courses leading to a standard college degree or in non... Policy and Management, Office of General Counsel, Department of Veterans Affairs. [FR Doc. 2013-02027...

  20. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types, target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance, with respect to the conventional system.
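    The maximum-likelihood PBM selection mentioned above can be sketched as scoring the test utterance's features under each candidate model and keeping the one with the highest average log-likelihood. The snippet below assumes models with a scikit-learn-style score_samples method, which is an illustrative assumption; the paper's actual models and features are not specified here.

```python
import numpy as np


def select_best_pbm(test_features, pbms):
    """Select the pass-phrase dependent background model (PBM) with the highest
    average log-likelihood for the test utterance (the ML sense noted above).
    `pbms` maps pass-phrase names to fitted models exposing score_samples()."""
    scores = {name: float(np.mean(model.score_samples(test_features)))
              for name, model in pbms.items()}
    best = max(scores, key=scores.get)
    return best, scores
```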

  1. Independent Verification Survey Of The SPRU Lower Level Hillside Area At The Knolls Atomic Power Laboratory Niskayuna, New York

    International Nuclear Information System (INIS)

    Harpenau, Evan M.; Weaver, Phyllis C.

    2012-01-01

    During August 10, 2011 through August 19, 2011, and October 23, 2011 through November 4, 2011, ORAU/ORISE conducted verification survey activities at the Separations Process Research Unit (SPRU) site that included in-process inspections, surface scans, and soil sampling of the Lower Level Hillside Area. According to the Type-B Investigation Report, Sr-90 was the primary contributor to the majority of the activity (60 times greater than the Cs-137 activity). The evaluation of the scan data and sample results obtained during verification activities determined that the primary radionuclide of concern, Sr-90, was well below the agreed upon soil cleanup objective (SCO) of 30 pCi/g for the site. However, the concentration of Cs-137 in the four judgmental samples collected in final status survey (FSS) Units A and B was greater than the SCO. Both ORAU and aRc surveys identified higher Cs-137 concentrations in FSS Units A and B; the greatest concentrations were identified in FSS Unit A.

  2. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  4. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    ... developed CBSV as a user- friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  5. KNGR core protection calculator software verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) Guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. This document is intended for a verifier or reviewer who is involved in performing software verification and validation activities in KNGR projects. This document includes the basic philosophy, the performance of the V and V effort, software testing techniques, and the criteria for review and audit of the safety software V and V activity. Major review topics on safety software address three kinds of characteristics, based on Standard Review Plan (SRP) Chapter 7 and Branch Technical Position (BTP)-14, when reviewing the SVVP: management characteristics, implementation characteristics and resources characteristics. Based on the major topics of this document, we have produced an evaluation item list, such as the checklist in Appendix A.

  6. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
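    One of the grid-resolution checks implied above is the observed order of accuracy computed from solutions on three systematically refined grids. The sketch below shows that standard calculation with hypothetical drag-coefficient values, not results from the paper.

```python
import math


def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
    """Observed order of accuracy from solutions on three systematically refined
    grids with a constant refinement ratio r, as reported in grid-convergence
    (verification) studies."""
    return (math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
            / math.log(refinement_ratio))


# Hypothetical drag-coefficient values on coarse, medium and fine grids (r = 2).
p = observed_order(0.02715, 0.02652, 0.02637, 2.0)
print("observed order of accuracy: %.2f" % p)
```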

  7. Towards Formal Verification of a Separation Microkernel

    Science.gov (United States)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  8. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  9. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  10. Report on site-independent environmental impacts of radioactive waste storage and management

    International Nuclear Information System (INIS)

    1985-10-01

    The organisation responsible for radioactive wastes in the Netherlands is COVRA: Centrale Organisatie Voor Radioactief Afval. It deals especially with the storage and management of these wastes. For that purpose, COVRA will build a waste management and storage facility at a central site in the Netherlands. In this report, the environmental impacts of these activities that are independent of the location are studied. The report is written to be readable and useful for a broad audience. In the main report, the general features are outlined, starting from figures and tables on environmental effects. In a separate volume, detailed numerical data are presented. (G.J.P.)

  11. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  12. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  13. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  14. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  15. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  16. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one- time non-refundable...

  17. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  18. Community energy management in Sitka, Alaska: What strategies can help increase energy independence?

    Science.gov (United States)

    David Nicholls; Trista. Patterson

    2013-01-01

    This report summarizes practical energy management strategies that could help communities in southeast Alaska move closer to energy independence while utilizing local resources more effectively. Our analysis focuses primarily on Sitka, Alaska, yet could be relevant to other communities having similar energy structures that rely primarily on hydroelectric power...

  19. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  20. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling system. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around simulation lifecycle management system. •Verification and validation roadmap for digital modelling phase. -- Abstract: The work behind this paper takes place in the EFDA's European Goal Oriented Training programme on Remote Handling (RH) “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop efficient approach of using DMUs in the V and V process of ITER RH system design utilizing a System Engineering (SE) framework. Complex engineering systems such as ITER facilities lead to substantial rise of cost while manufacturing the full-scale prototype. In the V and V process for ITER RH equipment, physical tests are a requirement to ensure the compliance of the system according to the required operation. Therefore it is essential to virtually verify the developed system before starting the prototype manufacturing phase. This paper gives an overview of the current trends in using digital mock-up within product design processes. It suggests a simulation-based process design centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability

  1. An unattended verification station for UF6 cylinders: Field trial findings

    Science.gov (United States)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  2. Leading and Managing Today's Independent School: A Qualitative Analysis of the Skills and Practices of Experienced Heads of Independent Schools in the New York Metropolitan Area

    Science.gov (United States)

    Juhel, Jean-Marc

    2016-01-01

    This article presents the findings of a qualitative study conducted in 2014 with 16 experienced heads of school in the New York metropolitan area. The study was designed to better understand the skills and practices that they view as critical to leading and managing independent schools. The data collected speak to each head's ability to manage the…

  3. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
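    Fault-tolerant clock synchronization algorithms of the kind discussed typically combine exchanged clock readings with a convergence function that is insensitive to a bounded number of faulty values. The sketch below shows the generic fault-tolerant midpoint function often used for this purpose; it is illustrative only and is not the specific algorithm verified with EHDM in the paper.

```python
def fault_tolerant_midpoint(clock_readings, f):
    """Fault-tolerant convergence function: drop the f smallest and f largest
    readings (which may come from Byzantine-faulty clocks) and take the midpoint
    of the extremes of what remains.  Requires at least 2*f + 1 readings."""
    if len(clock_readings) < 2 * f + 1:
        raise ValueError("need at least 2f+1 readings to tolerate f faults")
    trimmed = sorted(clock_readings)[f:len(clock_readings) - f]
    return (trimmed[0] + trimmed[-1]) / 2.0


# Hypothetical example: five clock readings (ms), tolerating f = 1 faulty clock.
print(fault_tolerant_midpoint([10.2, 10.4, 10.3, 25.0, 10.1], 1))  # -> 10.3
```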

  4. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  5. Preparation of Input Deck to analyze the Nuclear Power Plant for the Use of Regulatory Verification

    International Nuclear Information System (INIS)

    Kang, Doo Hyuk; Kim, Hyung Seok; Suh, Jae Seung; Ahn, Seung Hoon

    2009-01-01

    The objectives of this paper are to prepare the input deck that analyzes a nuclear power plant for the use of regulatory verification and to produce its calculation note. We have maintained the input decks of the T/H safety codes used in existing domestic reactors to ensure independent and accurate regulatory verification of the thermal-hydraulic safety analysis of domestic NPPs. This paper is mainly divided into two steps: the first step is to compare the existing input deck with the calculation note in order to verify their consistency. The next step is to model the reactor pressure vessel in three dimensions using the MULTID component instead of the existing 1D input deck.

  6. Climate Change Risk Management Consulting: The opportunity for an independent business practice

    Science.gov (United States)

    Ciccozzi, R.

    2009-04-01

    The Paper outlines the main questions to be addressed with reference to the actual demand for climate change risk management consulting in the financial services. Moreover, the Project shall also try to investigate whether the Catastrophe Modelling Industry can start and manage a business practice specialised in climate change risk exposures. In this context, the Paper aims at testing the possibility of building a sound business case, based upon typical MBA course analysis tools, such as PEST(LE), SWOT, etc. Specific references to the tools to be used and to other contributions from the academic literature and general documentation are also discussed in the body of the Paper and listed at the end. The analysis shall also focus on the core competencies required for an independent climate change risk management consulting business practice, with the purpose of outlining a valid definition of how to achieve competitive advantage in climate change risk management consulting.

  7. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  8. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  9. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
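    Code verification benchmarks based on manufactured solutions, as recommended above, start from a chosen smooth analytic field and derive the source term that makes it an exact solution of the governing equation. A minimal sketch for 1-D steady diffusion, using sympy for the symbolic differentiation, is given below; the equation, conductivity and chosen solution are illustrative assumptions.

```python
import sympy as sp

# Method of manufactured solutions (MMS) sketch for 1-D steady diffusion:
#   d/dx ( k dT/dx ) + S(x) = 0
# Choose a smooth manufactured solution, then derive the source term S(x)
# that makes it exact; the code under test is run with this source and its
# result is compared against the manufactured solution on each grid.
x = sp.symbols("x")
k = 2.5                                       # constant conductivity (illustrative)
T_manufactured = sp.sin(sp.pi * x) + 0.1 * x**2

S = -sp.diff(k * sp.diff(T_manufactured, x), x)   # required source term
print("manufactured source S(x) =", sp.simplify(S))
```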

  10. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
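    The Weibull treatment mentioned in the guideline links coupon-level strength data to the failure probability of a full-scale part through the Weibull modulus and a size (volume) scaling. The sketch below shows the basic two-parameter, volume-scaled form with hypothetical values; the actual guideline also covers effects such as sub-critical crack growth and proof testing that are not captured here.

```python
import math


def weibull_failure_probability(stress, sigma0, m, volume, v0=1.0):
    """Two-parameter Weibull failure probability with a simple volume scaling,
    as used when transferring strength data from small coupons to a full-scale
    ceramic part.  sigma0 is the characteristic strength of the reference
    volume v0, and m is the Weibull modulus.  Illustrative values only."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)


# Hypothetical coupon data: sigma0 = 250 MPa for v0 = 100 mm^3, m = 10.
# Predicted failure probability of a 5000 mm^3 part loaded at 120 MPa:
pf = weibull_failure_probability(stress=120.0, sigma0=250.0, m=10,
                                 volume=5000.0, v0=100.0)
print("predicted failure probability: %.2e" % pf)
```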

  11. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements

  12. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
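    Among the widely known accountancy statistics alluded to above, the canonical quantity is the material balance for a balance period, often called MUF (material unaccounted for). The sketch below shows that bookkeeping with hypothetical figures; the comprehensive statistic sets discussed in the paper are built from combinations of such quantities across material balance areas.

```python
def material_unaccounted_for(beginning_inventory, receipts, shipments, ending_inventory):
    """MUF = (beginning inventory + receipts) - (shipments + ending inventory).
    A value beyond measurement uncertainty may indicate diversion or data
    falsification."""
    return (beginning_inventory + receipts) - (shipments + ending_inventory)


# Hypothetical balance-period data in kg of nuclear material:
print(material_unaccounted_for(120.0, 35.0, 30.0, 124.2))  # -> 0.8 kg
```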

  13. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  14. Working Group 3: Broader Perspectives on Non-proliferation and Nuclear Verification

    International Nuclear Information System (INIS)

    Dreicer, M.; Pregenzer, A.; Stein, G.

    2013-01-01

    This working group (WG) focused on the technical topics related to international security and stability in global nonproliferation and arms control regimes and asked how nonproliferation tools and culture might facilitate verification of future nuclear treaties. The review of existing and future nonproliferation and disarmament regimes (Comprehensive Test Ban Treaty - CTBT, UNSC Resolution 1540, UK/Norway/VERTIC exercise, Fissile Material Cut-off Treaty - FMCT) offered a view on challenges, possibilities, and limitations for future initiatives. The concepts that the WG considered, with potential use in implementing future nuclear verification treaties, are: Triple S Culture (Safety, Security, Safeguards), State-Level Approach, Safeguards-by-Design, risk-based approaches, managed access, inspections, and protection of sensitive information. Under these concepts, many existing tools considered by the WG could be used for nuclear verification. Export control works to control sensitive technology and expertise. Global implementation is complicated and multi-faceted and would benefit from greater consistency and efficiency. In most cases, international cooperation and the development of international capability would supplement these efforts. This document is composed of the slides and the paper of the presentation. (A.C.)

  15. Environmental Technology Verification: Biological Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Systems--American Ultraviolet Corporation, DC24-6-120 [EPA600etv08005

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center (APCT Center) is operated by RTI International (RTI), in cooperation with EPA's National Risk Management Research Laboratory. The APCT Center conducts verifications of technologies that clean air in ventilation systems, inc...

  16. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
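
    The chance-constraint idea can be illustrated with a toy linear program: a model-derived constraint bound is tightened by z·σ, where σ is a FOSM-style standard deviation of the simulated constraint value and z reflects the user-specified risk. The sketch below uses SciPy and invented numbers; it is not the PESTPP-OPT implementation.

    ```python
    # Toy chance-constrained LP in the spirit of the approach described above.
    # All coefficients, bounds and the uncertainty value are invented.
    import numpy as np
    from scipy.optimize import linprog
    from scipy.stats import norm

    # Maximize total pumping q1 + q2  ->  minimize -(q1 + q2)
    c = np.array([-1.0, -1.0])

    # Deterministic constraint: 0.6*q1 + 0.8*q2 <= 10.0 (e.g., allowable drawdown)
    A_ub = np.array([[0.6, 0.8]])
    b_det = np.array([10.0])

    sigma = 0.5        # FOSM-style std. dev. of the simulated constraint value
    risk = 0.95        # required probability of satisfying the constraint
    b_chance = b_det - norm.ppf(risk) * sigma   # tightened ("chance") bound

    res = linprog(c, A_ub=A_ub, b_ub=b_chance, bounds=[(0, 8), (0, 8)])
    print(res.x, -res.fun)   # optimal rates and total pumping at the chosen risk level
    ```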

  17. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  18. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self powered detector design to perform core design verification after a core reload before power operation. A Vanadium self powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. This program also eliminates the need for special rod maneuvers which are infrequently performed by plant operators during typical core design verification testing and allows for safer startup activities. (authors)

  19. Results of the radiological verification survey of the partial remediation at 90 Avenue C, Lodi, New Jersey (LJ079V)

    International Nuclear Information System (INIS)

    Foley, R.D.; Johnson, C.A.

    1994-02-01

    The property at 90 Avenue C, Lodi, New Jersey is one of the vicinity properties of the former Maywood Chemical Works, Maywood, New Jersey designated for remedial action by the US Department of Energy (DOE). In July 1991, Bechtel National, Inc. performed a partial remedial action on this property. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey in July, 1991 at this site. The purpose of the verification survey was to ensure the effectiveness of remedial actions performed within FUSRAP and to confirm the site's compliance with DOE guidelines. The radiological survey included surface gamma scans indoors and outdoors, ground-level beta-gamma measurements, and systematic and biased soil and material sampling. Results of the verification survey demonstrated that all radiological measurements on the portions of the property that had been remediated were within DOE guidelines. However, there still remains a portion of the property to be remediated that is not covered by this verification survey

  20. Verification survey of the 17th Street Drainage Area, Santa Susana Field Laboratory, The Boeing Company, Ventura County, California

    International Nuclear Information System (INIS)

    John R. Morton

    2000-01-01

    An independent (third-party) verification of contractor remedial actions of the subject site verifies that remedial actions have been effective in meeting established and site-specific guidelines and that the documentation accurately and adequately describes the radiological conditions at the site

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  2. Better Buildings Alliance, Advanced Rooftop Unit Campaign: Rooftop Unit Measurement and Verification (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2014-09-01

    This document provides facility managers and building owners with an introduction to measurement and verification (M&V) methods for estimating the energy and cost savings of rooftop unit replacement or retrofit projects, in order to estimate paybacks or to justify future projects.

  3. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of the mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement a comprehensive aging management programme. This should be applied to all safety relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support these tasks, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of experience from research activities and plant operation. The program provides the option to perform a plant-wide screening to identify system areas that are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff. (authors)

  4. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
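
    The derived decision rule is simple enough to state directly in code. The sketch below encodes the major/minor criteria and risk labels exactly as summarized in the abstract; the function and argument names are ours.

    ```python
    # Decision rule as summarized in the abstract above; names are illustrative.

    def fingerprint_verification_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
        """Return a qualitative risk of fingerprint verification failure."""
        major = dystrophy_area_pct >= 25                               # major criterion
        minors = sum([long_horizontal_lines, long_vertical_lines])    # minor criteria met
        if major:
            return "almost always fails verification"
        if minors == 2:
            return "high risk of verification failure"
        if minors == 1:
            return "low risk of verification failure"
        return "almost always passes verification"

    print(fingerprint_verification_risk(30, False, True))   # major criterion met
    print(fingerprint_verification_risk(10, True, True))    # both minor criteria met
    ```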

  5. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  6. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  7. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  8. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
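
    The dose/distance-to-agreement criterion mentioned in the conclusion is commonly evaluated with a gamma-type calculation. The following 1-D brute-force sketch of a 3%/3 mm pass-rate computation is a generic illustration of that criterion, not the authors' analysis software; the profiles and numbers are invented.

    ```python
    # 1-D dose-difference / distance-to-agreement ("gamma") pass rate, 3%/3 mm.
    import numpy as np

    def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dist_tol_mm=3.0):
        """Gamma value at each evaluation point (global dose normalization)."""
        d_norm = dose_tol * d_ref.max()
        gammas = []
        for xe, de in zip(x_eval, d_eval):
            dose_term = (de - d_ref) / d_norm        # dose difference to every reference point
            dist_term = (xe - x_ref) / dist_tol_mm   # distance to every reference point
            gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
        return np.array(gammas)

    x = np.linspace(0.0, 100.0, 501)                        # positions in mm
    reference = np.exp(-(((x - 50.0) / 15.0) ** 2))         # "measured" profile (a.u.)
    evaluated = 1.01 * np.exp(-(((x - 51.0) / 15.0) ** 2))  # "calculated" profile, shifted 1 mm

    gamma = gamma_1d(x, reference, x, evaluated)
    print(f"pass rate (gamma <= 1): {100.0 * np.mean(gamma <= 1):.1f}%")  # expected ~100%
    ```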

  9. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  10. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast such that the effect of motion blurring on the detected moving field edge position is limited. In the past, it was shown that the errors in the detected position of a moving field edge determined by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice. Furthermore, a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability for leaf trajectory verification during dynamic IMRT. We found the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s. Hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied for the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that was also found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s

  11. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
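
    The kinds of message-level properties reported above (duplicate and out-of-order messages) can be expressed as a small runtime monitor. The sketch below is a generic, stand-alone illustration in Python; it does not use the MESA/TraceContract DSL, and the message fields are invented.

    ```python
    # Generic monitor for duplicate and out-of-order sequence numbers per flight.

    class OrderMonitor:
        """Checks that per-flight message sequence numbers are unique and increasing."""

        def __init__(self):
            self.last_seq = {}      # flight id -> last sequence number seen
            self.violations = []

        def on_message(self, flight_id, seq):
            last = self.last_seq.get(flight_id)
            if last is not None:
                if seq == last:
                    self.violations.append(("duplicate", flight_id, seq))
                elif seq < last:
                    self.violations.append(("out-of-order", flight_id, seq))
            self.last_seq[flight_id] = max(seq, last) if last is not None else seq

    monitor = OrderMonitor()
    for fid, seq in [("UAL123", 1), ("UAL123", 2), ("UAL123", 2), ("UAL123", 1)]:
        monitor.on_message(fid, seq)
    print(monitor.violations)   # [('duplicate', 'UAL123', 2), ('out-of-order', 'UAL123', 1)]
    ```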

  12. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Science.gov (United States)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, semi-active RFID watt-hour meters are applied to automatic test lines and intelligent warehouse management. Spanning the transmission, test, auxiliary and monitoring systems, the approach realizes scheduling, binding, control and data exchange for watt-hour meters, among other functions, providing more accurate positioning, more efficient management, rapid data updates and at-a-glance information. It effectively improves the quality, efficiency and automation of verification, and realizes more efficient data and warehouse management.

  13. Revalidation and quality assurance: the application of the MUSIQ framework in independent verification visits to healthcare organisations.

    Science.gov (United States)

    Griffin, Ann; McKeown, Alex; Viney, Rowena; Rich, Antonia; Welland, Trevor; Gafson, Irene; Woolf, Katherine

    2017-02-14

    We present a national evaluation of the impact of independent verification visits (IVVs) performed by National Health Service (NHS) England as part of quality assuring medical revalidation. Organisational visits are central to NHS quality assurance. They are costly, yet little empirical research evidence exists concerning their impact, and what does exist is conflicting. The focus was on healthcare providers in the NHS (in secondary care) and private sector across England, who were designated bodies (DBs). DBs are healthcare organisations that have a statutory responsibility, via the lead clinician, the responsible officer (RO), to implement medical revalidation. All ROs who had undergone an IVV in England in 2014 and 2015 were invited to participate. 46 ROs were interviewed. Ethnographic data were gathered at 18 observations of the IVVs and 20 IVV post visit reports underwent documentary analysis. Primary outcomes were the findings pertaining to the effectiveness of the IVV system in supporting the revalidation processes at the DBs. Secondary outcomes were methodological, relating to the Model for Understanding Success in Quality (MUSIQ) and how its application to the IVV reveals the relevance of contextual factors described in the model. The impact of the IVVs varied by DB according to three major themes: the personal context of the RO; the organisational context of the DB; and the visit and its impact. ROs were largely satisfied with visits which raised the status of appraisal within their organisations. Inadequate or untimely feedback was associated with dissatisfaction. Influencing teams whose prime responsibility is establishing processes and evaluating progress was crucial for internal quality improvement. Visits acted as a nudge, generating internal quality review, which was reinforced by visit teams with relevant expertise. Diverse team membership, knowledge transfer and timely feedback made visits more impactful. Published by the BMJ Publishing Group

  14. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be more effectively achieved on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant, obtained using some specially designed software tools, are presented as well; the focus is on management rather than hardware problems. Some general conclusions are then presented as final results of the whole work

  15. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
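
    The external-check workflow described above amounts to comparing code results with independently calculated values and flagging discrepancies. A generic sketch of that comparison, with invented pathway doses and an assumed 1% tolerance, is shown below.

    ```python
    # Generic code-vs-hand-calculation comparison; all values are illustrative.

    def compare_results(code_results, hand_results, rel_tol=0.01):
        """Return pathways whose relative difference exceeds rel_tol."""
        flagged = {}
        for pathway, code_value in code_results.items():
            hand_value = hand_results[pathway]
            rel_diff = abs(code_value - hand_value) / abs(hand_value)
            if rel_diff > rel_tol:
                flagged[pathway] = rel_diff
        return flagged

    code = {"external": 1.23e-2, "inhalation": 4.51e-3, "radon": 7.80e-4}   # mrem/yr
    hand = {"external": 1.23e-2, "inhalation": 4.50e-3, "radon": 8.10e-4}

    print(compare_results(code, hand))   # flags the radon pathway (~3.7% difference)
    ```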

  16. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; for example, criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  17. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS functional safety and to evaluate the system performance. Three main contributions are presented in the paper: firstly, the paper proposes a formalized TTDMS model, and the model correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification. Further, the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional prototype TTDMS is estimated. It is found that the procedure can accompany system development, and both formal and simulation-based verifications are performed. Evaluating and verifying a system with this process is easier to follow and more reliable than doing so with executable code and mathematical methods alone. - Highlights: • A new Train to Train Distance Measurement System. • New approach verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.
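
    The state space analysis mentioned above exhaustively enumerates reachable states and checks that no unsafe state can be reached, as sketched below on a toy distance-keeping model; this illustration is ours and is not the TTDMS Colored Petri Net model.

    ```python
    # Exhaustive state-space exploration with a simple safety check; toy model only.
    from collections import deque

    def reachable_states(initial, transitions):
        """Breadth-first exploration of all reachable states."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            state = frontier.popleft()
            for nxt in transitions(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return seen

    # Toy model: estimated distance (m) between two trains, changing in 50 m steps.
    def transitions(distance):
        return {d for d in (distance - 50, distance + 50) if 0 <= d <= 500}

    states = reachable_states(300, transitions)
    unsafe = {d for d in states if d < 100}          # safety property: keep >= 100 m
    print(sorted(states), "violations:", sorted(unsafe))
    ```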

  18. Low-Intrusion Techniques and Sensitive Information Management for Warhead Counting and Verification: FY2011 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Jarman, Kenneth D.; Robinson, Sean M.; McDonald, Benjamin S.; Gilbert, Andrew J.; Misner, Alex C.; Pitts, W. Karl; White, Timothy A.; Seifert, Allen; Miller, Erin A.

    2011-09-01

    Future arms control treaties may push nuclear weapons limits to unprecedented low levels and may entail precise counting of warheads as well as distinguishing between strategic and tactical nuclear weapons. Such advances will require assessment of form and function to confidently verify the presence or absence of nuclear warheads and/or their components. Imaging with penetrating radiation can provide such an assessment and could thus play a unique role in inspection scenarios. Yet many imaging capabilities have been viewed as too intrusive from the perspective of revealing weapon design details, and the potential for the release of sensitive information poses challenges in verification settings. A widely held perception is that verification through radiography requires images of sufficient quality that an expert (e.g., a trained inspector or an image-matching algorithm) can verify the presence or absence of components of a device. The concept of information barriers (IBs) has been established to prevent access to relevant weapon-design information by inspectors (or algorithms), and has, to date, limited the usefulness of radiographic inspection. The challenge of this project is to demonstrate that radiographic information can be used behind an IB to improve the capabilities of treaty-verification weapons-inspection systems.

  19. Inspection and verification of waste packages for near surface disposal

    International Nuclear Information System (INIS)

    2000-01-01

    Extensive experience has been gained with various disposal options for low and intermediate level waste at or near surface disposal facilities. Near surface disposal is based on proven and well demonstrated technologies. To ensure the safety of near surface disposal facilities when available technologies are applied, it is necessary to control and assure the quality of the repository system's performance, which includes waste packages, engineered features and natural barriers, as well as siting, design, construction, operation, closure and institutional controls. Recognizing the importance of repository performance, the IAEA is producing a set of technical publications on quality assurance and quality control (QA/QC) for waste disposal to provide Member States with technical guidance and current information. These publications cover issues on the application of QA/QC programmes to waste disposal, long term record management, and specific QA/QC aspects of waste packaging, repository design and R and D. Waste package QA/QC is especially important because the package is the primary barrier to radionuclide release from a disposal facility. Waste packaging also involves interface issues between the waste generator and the disposal facility operator. Waste should be packaged by generators to meet waste acceptance requirements set for a repository or disposal system. However, it is essential that the disposal facility operator ensure that waste packages conform with disposal facility acceptance requirements. Demonstration of conformance with disposal facility acceptance requirements can be achieved through the systematic inspection and verification of waste packages at both the waste generator's site and at the disposal facility, based on a waste package QA/QC programme established by the waste generator and approved by the disposal operator. However, strategies, approaches and the scope of inspection and verification will be somewhat different from country to country

  20. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  1. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  2. 76 FR 29805 - Submission for Review: Verification of Full-Time School Attendance, RI 25-49

    Science.gov (United States)

    2011-05-23

    ... other technological collection techniques or other forms of information technology, e.g., permitting... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Full-Time School Attendance... Federal agencies the opportunity to comment on a revised information collection request (ICR) 3206-0215...

  3. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
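
    The statistical-sampling element of inventory verification is often sized with a hypergeometric argument: the probability that a random sample of n items out of N contains at least one defective (e.g., diverted or substituted) item. The sketch below computes that detection probability and the smallest sample achieving a target; the population and defect numbers are illustrative, not taken from the report.

    ```python
    # Attribute-sampling detection probability (hypergeometric); numbers are invented.
    from math import comb

    def detection_probability(N, defects, n):
        """P(at least one defect in a random sample of n items out of N)."""
        if n >= N - defects + 1:
            return 1.0
        return 1.0 - comb(N - defects, n) / comb(N, n)

    def required_sample_size(N, defects, target=0.95):
        """Smallest sample size giving at least the target detection probability."""
        for n in range(1, N + 1):
            if detection_probability(N, defects, n) >= target:
                return n
        return N

    print(detection_probability(400, 10, 60))   # ~0.81 when 60 of 400 items are verified
    print(required_sample_size(400, 10))        # 103 items for 95% detection probability
    ```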

  4. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  5. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    Science.gov (United States)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
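
    The over-plot and difference-plot comparison described above reduces, in its simplest form, to resampling trajectories onto a common time base and reporting the differences. The sketch below shows that comparison for two synthetic solutions of a flat-Earth ballistic check-case; the data and numbers are invented.

    ```python
    # Compare two independently produced trajectories on a common time base.
    import numpy as np

    def max_trajectory_difference(t_a, y_a, t_b, y_b):
        """Max absolute difference per state column after interpolating onto t_a."""
        diffs = []
        for col in range(y_a.shape[1]):
            y_b_on_a = np.interp(t_a, t_b, y_b[:, col])
            diffs.append(np.max(np.abs(y_a[:, col] - y_b_on_a)))
        return diffs

    def truth(t):
        # downrange x (m) and altitude z (m) of a simple ballistic check-case
        return np.column_stack([100.0 * t, 50.0 * t - 4.905 * t ** 2])

    # Two "tools" integrating the same case at different time steps; one has small noise.
    t1 = np.linspace(0.0, 10.0, 101)
    t2 = np.linspace(0.0, 10.0, 251)
    sim_a = truth(t1)
    sim_b = truth(t2) + 1e-3 * np.random.default_rng(0).standard_normal((t2.size, 2))

    print(max_trajectory_difference(t1, sim_a, t2, sim_b))   # both on the order of 1e-3 m
    ```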

  6. Computer-optimized γ-NDA geometries for uranium enrichment verification of gaseous UF6

    International Nuclear Information System (INIS)

    Wichers, V.A.; Aaldijk, J.K.; Betue, P.A.C. de; Harry, R.J.S.

    1993-05-01

    An improved collimator pair of novel design tailored for deposit-independent enrichment verification of gaseous UF6 at low pressures in cascade-to-header pipes of small diameters in centrifuge enrichment plants is presented. The designs are adapted for use in a dual-geometry arrangement for simultaneous measurements with both detection geometries. The average measurement time with the dual-geometry arrangement is approximately half an hour for deposit-to-gas activity ratios as high as 20. (orig.)
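
    The dual-geometry arrangement works because the two collimation geometries have different relative sensitivities to the 186 keV signal from the UF6 gas and from wall deposits, so the two simultaneous count rates can be unfolded into gas and deposit components. The sketch below illustrates that unfolding as a 2x2 linear solve; the sensitivity coefficients and count rates are invented, not taken from the paper.

    ```python
    # Unfold gas and deposit signal components from two detection geometries.
    import numpy as np

    # rate_k = a_k * gas_activity + b_k * deposit_activity   (counts/s per unit activity)
    sensitivity = np.array([[1.00, 0.10],    # geometry 1: favors the gas signal
                            [0.30, 0.60]])   # geometry 2: more sensitive to deposits

    measured_rates = np.array([12.4, 9.8])   # counts/s in the two geometries

    gas, deposit = np.linalg.solve(sensitivity, measured_rates)
    print(f"gas component: {gas:.2f}, deposit component: {deposit:.2f} (arbitrary units)")
    ```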

  7. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    done applying conventional methods where requirements and designs are described using natural language, diagrams and pseudo code, and the verification of requirements has been done by code inspection and non-exhaustive testing. These techniques are not sufficient, leading to errors and an ineffective...... for Strategic Research. The work is affiliated with a number of partners: DTU Compute, DTU Transport, DTU Management, DTU Fotonik, Bremen University, Banedanmark, Trafikstyrelsen, DSB, and DSB S-tog. More information about the RobustRails project is available at http://www.dtu.dk/subsites/robustrails/English.aspx...

  8. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  9. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  10. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  11. A formalism for independent checking of Gamma Knife dose calculations

    International Nuclear Information System (INIS)

    Tsai Jensan; Engler, Mark J.; Rivard, Mark J.; Mahajan, Anita; Borden, Jonathan A.; Zheng Zhen

    2001-01-01

    For stereotactic radiosurgery using the Leksell Gamma Knife system, it is important to perform a pre-treatment verification of the maximum dose calculated with the Leksell GammaPlan[reg] (D_LGP) stereotactic radiosurgery system. This verification can be incorporated as part of a routine quality assurance (QA) procedure to minimize the chance of a hazardous overdose. To implement this procedure, a formalism has been developed to calculate the dose D_CAL(X, Y, Z, d_av, t) using the following parameters: average target depth (d_av), coordinates (X, Y, Z) of the maximum dose location or any other dose point(s) to be verified, 3-dimensional (3-dim) beam profiles or off-center-ratios (OCR) of the four helmets, helmet size i, output factor O_i, plug factor P_i, each shot j coordinates (x, y, z)_ij, and shot treatment time t_ij. The average depth of the target d_av was obtained either from MRI/CT images or ruler measurements of the Gamma Knife Bubble Head Frame. D_CAL and D_LGP were then compared to evaluate the accuracy of this independent calculation. The proposed calculation for an independent check of D_LGP has been demonstrated to be accurate and reliable, and thus serves as a QA tool for Gamma Knife stereotactic radiosurgery
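
    A highly simplified, illustrative version of such an independent check accumulates the dose at a point shot by shot from the output factor, plug factor, treatment time and an off-center-ratio evaluated at the point's distance from the shot. In the sketch below the reference dose rate, the OCR shape and all shot data are invented, and the depth-dependent attenuation associated with d_av is omitted; it indicates the general form only, not the published formalism.

    ```python
    # Toy shot-by-shot dose accumulation; all values and the OCR shape are invented.
    import numpy as np

    def ocr(distance_mm, helmet_mm):
        """Toy off-center-ratio: flat inside ~half the helmet size, Gaussian falloff outside."""
        radius = helmet_mm / 2.0
        return np.where(distance_mm <= radius,
                        1.0,
                        np.exp(-((distance_mm - radius) / 3.0) ** 2))

    def dose_at_point(point, shots, dose_rate_cgy_per_min=300.0):
        """Sum shot contributions at a point (cGy); depth attenuation omitted."""
        total = 0.0
        for s in shots:
            d = np.linalg.norm(np.asarray(point) - np.asarray(s["xyz"]))
            total += (dose_rate_cgy_per_min * s["output_factor"] * s["plug_factor"]
                      * s["time_min"] * float(ocr(d, s["helmet_mm"])))
        return total

    shots = [
        {"xyz": (100.0, 100.0, 100.0), "helmet_mm": 14, "output_factor": 0.97,
         "plug_factor": 1.00, "time_min": 5.0},
        {"xyz": (104.0, 100.0, 100.0), "helmet_mm": 8, "output_factor": 0.92,
         "plug_factor": 0.98, "time_min": 3.0},
    ]
    print(f"independent check dose: {dose_at_point((101.0, 100.0, 100.0), shots):.1f} cGy")
    ```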

  12. SELF CARE MANAGEMENT-HOLISTIC PSYCHOSPIRITUAL CARE ON INDEPENDENCE, GLUCOSE LEVEL, AND HBA1C OF TYPE 2 DIABETES MELLITUS PATIENT

    Directory of Open Access Journals (Sweden)

    Kusnanto Kusnanto

    2017-04-01

    Full Text Available Introduction: Diabetes mellitus is an incurable chronic disease that is nevertheless manageable. Its global prevalence tends to increase due to poor self-management of the disease, and its impact is a decline in health condition physically, psychologically, socially, and spiritually. Many interventions have been implemented but have failed to produce positive improvement in the patient's holistic condition, which leads to complications. The purpose of this research was to improve patient independence in managing the disease and to explain changes in blood glucose and HbA1C levels through a self care management-holistic psychospiritual care model. Method: Patients newly diagnosed with type 2 diabetes mellitus at the Kebonsari Public Health Centre were selected with purposive sampling and divided into two groups of 25 patients each. The intervention group was given the self care management model together with a diabetes self-management module. The intervention was given five times in three months. Before and after the intervention, patients were observed for blood glucose levels 2 hours before and after a meal, as well as HbA1C level, and completed a questionnaire. The data were then analyzed using Wilcoxon, Mann-Whitney, and Student's t tests. Result: Patients with type 2 diabetes showed improved independence and lower blood glucose levels 2 hours before and after meals, as well as decreased HbA1C, after the intervention. Discussion: The Self Care Management-Holistic Psychospiritual Care Model improves patient independence in managing the disease, lowering blood glucose and HbA1C levels.

  13. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  14. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    Science.gov (United States)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As the loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks to the auditor. Moreover, users also need to pay extra fees for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves and for the auditor to arbitrate challenges between the user and cloud provider according to the MACs and ϕ values. The extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
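
    The MAC-based part of such a scheme can be sketched with a keyed hash: the user keeps a MAC per stored block and later recomputes it to detect loss or corruption. The example below uses HMAC-SHA256 and invented data; it illustrates only the basic integrity check, not the paper's full two-level protocol with ϕ values and auditor arbitration.

    ```python
    # Per-block integrity check with a keyed MAC; keys and data are illustrative.
    import hmac, hashlib, secrets

    key = secrets.token_bytes(32)                  # user's secret MAC key

    def tag(block: bytes) -> bytes:
        return hmac.new(key, block, hashlib.sha256).digest()

    def verify(block: bytes, stored_tag: bytes) -> bool:
        return hmac.compare_digest(tag(block), stored_tag)

    original = b"file block 0017"
    stored_tag = tag(original)                     # kept by the user (or handed to an auditor)

    print(verify(original, stored_tag))                        # True: block intact
    print(verify(b"file block 0017 corrupted", stored_tag))    # False: integrity violated
    ```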

  15. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  16. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  17. TRU waste certification and TRUPACT-2 payload verification

    International Nuclear Information System (INIS)

    Hunter, E.K.; Johnson, J.E.

    1990-01-01

    The Waste Isolation Pilot Plant (WIPP) established a policy that requires each waste shipper to verify that all waste shipments meet the requirements of the Waste Acceptance Criteria (WAC) prior to being shipped. This verification provides assurance that transuranic (TRU) wastes meet the criteria while still retained in a facility where discrepancies can be immediately corrected. Each Department of Energy (DOE) TRU waste facility planning to ship waste to the Waste Isolation Pilot Plant (WIPP) is required to develop and implement a specific program including Quality Assurance (QA) provisions to verify that waste is in full compliance with WIPP's WAC. This program is audited by a composite DOE and contractor audit team prior to granting the facility permission to certify waste. During interaction with the Nuclear Regulatory Commission (NRC) on payload verification for shipping in TRUPACT-II, a similar system was established by DOE. The TRUPACT-II Safety Analysis Report (SAR) contains the technical requirements and physical and chemical limits that payloads must meet (like the WAC). All shippers must plan and implement a payload control program including independent QA provisions. A similar composite audit team will conduct preshipment audits, frequent subsequent audits, and operations inspections to verify that all TRU waste shipments in TRUPACT-II meet the requirements of the Certificate of Compliance issued by the NRC which invokes the SAR requirements. 1 fig

  18. Free and Reduced-Price Meal Application and Income Verification Practices in School Nutrition Programs in the United States

    Science.gov (United States)

    Kwon, Junehee; Lee, Yee Ming; Park, Eunhye; Wang, Yujia; Rushing, Keith

    2017-01-01

    Purpose/Objectives: This study assessed current practices and attitudes of school nutrition program (SNP) management staff regarding free and reduced-price (F-RP) meal application and verification in SNPs. Methods: A stratified random sample of 1,500 SNP management staff in 14 states received a link to an online questionnaire and/or a printed…

  19. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the V and V guideline for the planning phase of the NPP safety system, together with critical safety items such as the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities and task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and the safety-critical software V and V methodology. (author). 11 refs.

  20. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the V and V guideline for the planning phase of the NPP safety system, together with critical safety items such as the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities and task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; production of the software verification and validation plan; and the safety-critical software V and V methodology. (author). 11 refs

  1. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modelling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional verification with that of the classical verification, showing a large difference…

  2. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  3. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  4. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  5. Radiological verification survey results at 14 Peck Ave., Pequannock, New Jersey (PJ001V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The U.S. Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W. R. Grace facility. The property at 14 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 14 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  6. Radiological verification survey results at 3 Peck Ave., Pequannock, New Jersey (PJ002V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 3 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 3 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  7. Radiological verification survey results at 15 Peck Ave., Pequannock, New Jersey (PJ005V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 15 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 15 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  8. Radiological verification survey results at 17 Peck Ave., Pequannock, New Jersey (PJ006V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 17 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 17 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  9. Radiological verification survey results at 7 Peck Ave., Pequannock, New Jersey (PJ003V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 7 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 7 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  10. Radiological verification survey results at 13 Peck Ave., Pequannock, New Jersey (PJ004V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 13 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 13 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  11. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, that is, views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  12. The Identification and Comparison of the Requirements Placed on Product Managers during the Recruitment Process

    Directory of Open Access Journals (Sweden)

    Wroblowská Zuzana

    2015-09-01

    Full Text Available The submitted paper focuses on personality traits and behavioural competencies of a key role bearer in product-oriented marketing management, generally referred to as product management. An interdisciplinary approach was applied while looking into this subject, since both research into theoretical bases and analysis of the current state of the topic and the tendencies of its development required work in several fields of study. Based on secondary research, the assumption was made that a product manager is an example of a knowledge worker of the 21st century and that business practice sees him/her as such, which has an effect on the requirements a candidate for this position is confronted with in the recruitment process. An independent research project was carried out and it confirmed that product managers are considered to be knowledge workers and that independence and analytical thinking skills were among the most common requirements for product managers both in 2007 and 2014. A comparison of results from 2007 and 2014 also showed some differences. The statistical verification confirmed a shift in requirements within the interpersonal competency group. The findings were used to formulate recommendations for the recruitment strategy and the selection process for positions in product management.

  13. Isotope correlation verification of analytical measurements for dissolver materials

    International Nuclear Information System (INIS)

    Satkowski, J.

    1988-01-01

    An independent verification of analytical results for accountability measurements of dissolver materials can be performed using the Isotope Correlation Technique (ICT). ICT is based on the relationships that exist between the initial and final elemental concentration and isotopic abundances of the nuclear fuel. Linear correlation functions between isotopic ratios and plutonium/uranium ratios have been developed for specific reactor fuels. The application of these correlations to already existing analytical data provides the laboratory with additional confidence in the reported results. Confirmation is done by a test of consistency with historical data. ICT is being utilized with dissolver accountability measurements at the Savannah River Plant Laboratory. The application, implementation, and operating experience of this technique are presented
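
    As an illustrative aid only, the sketch below shows how a consistency test against a historical linear correlation might look: fit an isotopic ratio against the Pu/U ratio for past batches and flag a new measurement that falls outside a chosen tolerance band. The numbers, tolerance and variable names are fabricated for the example and are not Savannah River data.

        # Hypothetical isotope-correlation consistency check: fit a linear relation
        # between an isotopic ratio x and the measured Pu/U ratio y from historical
        # dissolver batches, then test a new batch against the prediction.
        import numpy as np

        # historical data (fabricated values for demonstration)
        x_hist = np.array([0.210, 0.225, 0.240, 0.255, 0.270])
        y_hist = np.array([0.0081, 0.0086, 0.0092, 0.0097, 0.0103])

        slope, intercept = np.polyfit(x_hist, y_hist, 1)
        residual_sd = np.std(y_hist - (slope * x_hist + intercept), ddof=2)


        def consistent(x_new: float, y_measured: float, k: float = 3.0) -> bool:
            """Accept the analytical result if it lies within k standard deviations
            of the value predicted by the historical correlation."""
            y_predicted = slope * x_new + intercept
            return abs(y_measured - y_predicted) <= k * residual_sd


        print(consistent(0.248, 0.0094))   # True: consistent with history
        print(consistent(0.248, 0.0120))   # False: flag result for re-measurement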

  14. Data storage accounting and verification at LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C. H. [Fermilab; Lanciotti, E. [CERN; Magini, N. [CERN; Ratnikova, N. [Moscow, ITEP; Sanchez-Hernandez, A. [CINVESTAV, IPN; Serfon, C. [Munich U.; Wildish, T. [Princeton U.; Zhang, X. [Beijing, Inst. High Energy Phys.

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
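
    As a purely illustrative companion to the consistency checks described above (not any experiment's production tool), the following sketch compares a site storage dump with the central catalogue by file size and checksum and reports missing, orphaned and corrupted files. The file names and attribute format are assumptions for the example.

        # Minimal catalogue-versus-storage-dump consistency check.
        from typing import Dict, Tuple

        # map: logical file name -> (size_in_bytes, checksum)
        Catalogue = Dict[str, Tuple[int, str]]


        def compare(catalogue: Catalogue, site_dump: Catalogue) -> dict:
            missing = [f for f in catalogue if f not in site_dump]        # lost at the site
            orphaned = [f for f in site_dump if f not in catalogue]       # "dark" data
            corrupted = [
                f for f in catalogue
                if f in site_dump and site_dump[f] != catalogue[f]        # size/checksum mismatch
            ]
            return {"missing": missing, "orphaned": orphaned, "corrupted": corrupted}


        catalogue = {"/store/run1/a.root": (1024, "ad11"), "/store/run1/b.root": (2048, "9f3c")}
        site_dump = {"/store/run1/a.root": (1024, "ad11"), "/store/run1/c.root": (512, "0b7e")}
        print(compare(catalogue, site_dump))
        # {'missing': ['/store/run1/b.root'], 'orphaned': ['/store/run1/c.root'], 'corrupted': []}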

  15. The Effect of Independence Audit Committee on Earnings Management: The Case in French

    OpenAIRE

    Anis Ben Amar

    2014-01-01

    Corporate failures and accounting scandals (e.g. the Enron and Worldcom cases) have provided a strong incentive to regulators to consider the major role that audit committees can play. The current study extends the Piot and Janin (2007) study and examines the relationship between audit committee independence and the proceeded measures of earnings management. Using a sample consisting of 279 firm-year observations covering the years 2002 to 2005, the results of this study show ...

  16. Process Management Plans

    Directory of Open Access Journals (Sweden)

    Tomasz Miksa

    2014-07-01

    Full Text Available In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science. For that reason, we propose the introduction of the new concept of Process Management Plans, which focus on the identification, description, sharing and preservation of the entire scientific processes. They enable verification and later reuse of result data and processes of scientific experiments. In this paper we describe the structure and explain the novelty of Process Management Plans by showing in what way they complement existing Data Management Plans. We also highlight key differences, major advantages, as well as references to tools and solutions that can facilitate the introduction of Process Management Plans.

  17. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  18. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher level of security against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method. (paper)
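
    The record describes the verification decision only qualitatively. The following sketch implements textbook phase-only correlation with a peak-versus-threshold test as a rough illustration of that kind of decision; it is not the paper's dual-correlation algorithm, and the nonlinear encoding is omitted. The array sizes and threshold are invented for the example.

        # Generic phase-only correlation (POC) with a threshold test on the peak.
        import numpy as np


        def phase_only_correlation(a: np.ndarray, b: np.ndarray) -> np.ndarray:
            A, B = np.fft.fft2(a), np.fft.fft2(b)
            cross = A * np.conj(B)
            cross /= np.abs(cross) + 1e-12          # keep phase information only
            return np.abs(np.fft.ifft2(cross))


        def verify(key: np.ndarray, lock: np.ndarray, threshold: float = 0.5) -> bool:
            """Accept the identity key only if the correlation peak exceeds the threshold."""
            return phase_only_correlation(key, lock).max() >= threshold


        rng = np.random.default_rng(0)
        lock = rng.random((64, 64))
        authorized_key = lock.copy()                 # matching key -> sharp peak near 1.0
        forged_key = rng.random((64, 64))            # unrelated key -> low, flat correlation
        print(verify(authorized_key, lock), verify(forged_key, lock))   # True False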

  19. System design and verification process for LHC programmable trigger electronics

    CERN Document Server

    Crosetto, D

    1999-01-01

    The rapid evolution of electronics has made it essential to design systems in a technology-independent form that will permit their realization in any future technology. This article describes two practical projects that have been developed for fast, programmable, scalable, modular electronics for the first-level trigger of Large Hadron Collider (LHC) experiments at CERN, Geneva. In both projects, one for the front-end electronics and the second for executing first- level trigger algorithms, the whole system requirements were constrained to two types of replicated components. The overall problem is described, the 3D-Flow design is introduced as a novel solution, and current solutions to the problem are described and compared with the 3D-Flow solution. The design/verification methodology proposed allows the user's real-time system algorithm to be verified down to the gate-level simulation on a technology- independent platform, thus yielding the design for a system that can be implemented with any technology at ...

  20. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  1. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect • Digital I and C software is intended to be safe and reliable, and the project results help the software obtain a licence. The software verification techniques resulting from this project can be used for digital NPPs (nuclear power plants) in the future. • This research introduces many meaningful verification results for digital protection logic and suggests an I and C software testing strategy. These results can be applied to verify nuclear fusion devices, accelerators, nuclear waste management and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economic and industrial aspect • Since the safety of digital I and C software is highly important, it is essential for the software to be verified, but verification and licence acquisition for digital I and C software are costly. This project benefits the domestic economy by using the introduced verification and testing techniques instead of foreign ones. • The operation rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool. It is expected that such software will substitute for safety-critical software that is wholly dependent on foreign suppliers. Consequently, the results of this project have high commercial value, and recognition of the software development work can spread through industry. - Social and cultural aspect People expect nuclear power generation to help relieve environmental problems because it emits fewer harmful air pollutants than other forms of power generation. To earn society's trust in nuclear power generation, we should give people reason to believe that an NPP is a highly safe system. From that point of view, we can present highly reliable I and C, proven by intelligent V and V techniques, as evidence

  2. Audit Independence : Its Importance to the External Auditor's Role in Banking Regulation and Supervision

    OpenAIRE

    Ojo, Marianne

    2006-01-01

    The role of the external auditor in the supervisory process requires standards such as independence, objectivity and integrity to be achieved. Even though the regulator and external auditor perform similar functions, namely the verification of financial statements, they serve particular interests. The regulator works towards safeguarding financial stability and investor interests. On the other hand, the external auditor serves the private int...

  3. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  4. 16th edition IEE wiring regulations design and verification of electrical installations

    CERN Document Server

    Scaddan, Brian

    1995-01-01

    This book builds on the basic knowledge and techniques covered in 16th Edition IEE Wiring Regulations Explained and Illustrated, providing the information and revision materials needed for the City & Guilds 2400 (Design, Erection and Verification of Electrical Installations) exam. All Qualifying Managers will be required to gain this qualification, and Brian Scaddan's book is the ideal text for all students undertaking C&G 2400 courses.

  5. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    OpenAIRE

    Wei, Songjie; Li, Shuai; Liu, Peilong; Liu, Meilin

    2018-01-01

    LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively...

  6. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of such systems … the approach is demonstrated on a number of case studies, tackled using a prototypical implementation.

  7. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project “verification and validation (V and V) of ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design utilizing a system engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and overviews the current trends in the use of virtual prototyping in industry during the early design phase. Based on the survey of best industrial practices, this paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  8. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  9. Management Technology of Students’ Independent Work

    Directory of Open Access Journals (Sweden)

    Melis K. Asanaliev

    2012-09-01

    Full Text Available The researchers argue that intensifying the educational process in higher education institutions requires the development of essentially new approaches, forms and methods of social and pedagogical interaction that are adequate to new requirements and new pedagogical thinking. Among them are the methods of social and psychological training (SPT), which were developed in experimental psychology by synthesizing wide practical experience of educational, creative, administrative and other types of interaction between people. These methods are conventionally divided into discussion methods (group discussion, analysis of a situation of moral choice), game methods (didactic, creative and role-playing games) and sensitivity training (training of interpersonal sensitivity), which support the formation of students' independent cognitive activity on the basis of modern technologies as a mechanism for improving independent work. The research consists in seeking and finding sufficiently effective forms and means of activating the educational and cognitive process of preparing young teachers of vocational training who are theoretically and practically prepared in the field of independent cognitive activity, use modern training technology and go on to apply it in work with students of technical secondary schools. A model is offered for organizing and applying in the educational process a complex of these methods, embodied in training programmes and systems of tasks, as one way of forming the social and psychological culture of the future teacher.

  10. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  11. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  12. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  13. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
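
    UVM itself is a SystemVerilog methodology; the following plain-Python sketch only mirrors the coverage-driven flow described above (constrained-random stimulus, a scoreboard against a reference model, and coverage bins that define when verification is done). The toy adder and the bins are invented for the example.

        # Conceptual illustration of coverage-driven verification, not UVM code.
        import random

        def dut_add(a: int, b: int) -> int:          # stand-in "device under test"
            return (a + b) & 0xFF                    # 8-bit adder with wrap-around

        def reference_add(a: int, b: int) -> int:    # golden reference model
            return (a + b) % 256

        coverage_bins = {"no_overflow": False, "overflow": False}
        errors = 0

        random.seed(1)
        while not all(coverage_bins.values()):
            a, b = random.randrange(256), random.randrange(256)      # random stimulus
            if dut_add(a, b) != reference_add(a, b):                 # scoreboard check
                errors += 1
            coverage_bins["overflow" if a + b > 255 else "no_overflow"] = True

        print("coverage closed, mismatches:", errors)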

  14. Leadership, Management, and Organization for National Security Space: Report to Congress of the Independent Assessment Panel on the Organization and Management of National Security Space

    National Research Council Canada - National Science Library

    Young, A. T; Anderson, Edward; Bien, Lyle; Fogleman, Ronald R; Hall, Keith; Lyles, Lester; Mark, Hans

    2008-01-01

    The Independent Assessment Panel (IAP) was chartered to review and assess the DoD management and organization of National Security in Space and make appropriate recommendations to strengthen the U.S. position...

  15. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  16. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  17. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  18. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  19. Symposium on International Safeguards: Preparing for Future Verification Challenges

    International Nuclear Information System (INIS)

    2010-01-01

    The purpose of the symposium is to foster dialogue and information exchange involving Member States, the nuclear industry and members of the broader nuclear non-proliferation community to prepare for future verification challenges. Topics addressed during the 2010 symposium include the following: - Supporting the global nuclear non-proliferation regime: Building support for strengthening international safeguards; Enhancing confidence in compliance with safeguards obligations; Legal authority as a means to enhance effectiveness and efficiency; Verification roles in support of arms control and disarmament. - Building collaboration and partnerships with other international forums: Other verification and non-proliferation regimes; Synergies between safety, security and safeguards regimes. - Improving cooperation between IAEA and States for safeguards implementation: Strengthening State systems for meeting safeguards obligations; Enhancing safeguards effectiveness and efficiency through greater cooperation; Lessons learned: recommendations for enhancing integrated safeguards implementation. - Addressing safeguards challenges in an increasingly interconnected world: Non-State actors and covert trade networks; Globalization of nuclear information and technology. - Preparing for the global nuclear expansion and increasing safeguards workload: Furthering implementation of the State-level concept and integrated safeguards; Information-driven safeguards; Remote data-driven safeguards inspections; Safeguards in States without comprehensive safeguards agreements. - Safeguarding advanced nuclear facilities and innovative fuel cycles: Proliferation resistance; Safeguards by design; Safeguards approaches for advanced facilities. - Advanced technologies and methodologies: For verifying nuclear material and activities; For detecting undeclared nuclear material and activities; For information collection, analysis and integration. - Enhancing the development and use of safeguards

  20. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  1. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  2. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium is the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  3. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among them precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods for quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
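
    As an illustrative aid only (the record does not prescribe formulas), the sketch below shows two calculations that commonly appear in analyzer verification plans: within-run precision as a coefficient of variation, and a carryover estimate from three high samples followed by three low samples. The formulation and the replicate values are assumptions for the example, not a prescriptive standard.

        # Illustrative verification calculations with invented replicate values.
        import statistics


        def cv_percent(replicates: list[float]) -> float:
            """Within-run precision: coefficient of variation in percent."""
            return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)


        def carryover_percent(high: list[float], low: list[float]) -> float:
            """Carryover estimate: (L1 - L3) / (H3 - L3) * 100, with samples measured
            in the order H1, H2, H3, L1, L2, L3 (one commonly used formulation)."""
            return 100.0 * (low[0] - low[2]) / (high[2] - low[2])


        wbc_replicates = [6.1, 6.2, 6.0, 6.1, 6.2, 6.1]        # same sample, repeated runs
        print(f"precision CV = {cv_percent(wbc_replicates):.2f} %")

        high_runs = [250.0, 249.0, 251.0]                        # e.g. platelet counts
        low_runs = [12.4, 12.1, 12.0]
        print(f"carryover = {carryover_percent(high_runs, low_runs):.2f} %")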

  4. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium is the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  5. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  6. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to ease the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new and old output files. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. It indicates that a closer look at the output files is needed to determine the cause of the error
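
    A minimal sketch of the install-time check described above, assuming placeholder directory names and file extensions rather than the actual MCNP distribution layout: run the sample problems, then diff each new output against the shipped reference output and flag any differences for manual review.

        # Compare new sample-problem outputs against reference outputs.
        import difflib
        from pathlib import Path


        def compare_outputs(new_dir: Path, ref_dir: Path) -> dict[str, list[str]]:
            """Return a dict mapping each sample problem to its unified diff
            (an empty list means the outputs matched exactly)."""
            diffs = {}
            for ref_file in sorted(ref_dir.glob("*.out")):
                new_file = new_dir / ref_file.name
                diff = list(
                    difflib.unified_diff(
                        ref_file.read_text().splitlines(),
                        new_file.read_text().splitlines(),
                        fromfile=str(ref_file),
                        tofile=str(new_file),
                        lineterm="",
                    )
                )
                diffs[ref_file.name] = diff
            return diffs


        if __name__ == "__main__":
            results = compare_outputs(Path("new_outputs"), Path("reference_outputs"))
            for name, diff in results.items():
                status = "MATCH" if not diff else f"DIFFERS ({len(diff)} diff lines, review needed)"
                print(f"{name}: {status}")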

  7. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation TherapisTs (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are in operation for head and neck cancer patients across Europe currently. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques used currently in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  8. Independent monitor unit calculation for intensity modulated radiotherapy using the MIMiC multileaf collimator

    International Nuclear Information System (INIS)

    Chen Zhe; Xing Lei; Nath, Ravinder

    2002-01-01

    A self-consistent monitor unit (MU) and isocenter point-dose calculation method has been developed that provides an independent verification of the MU for intensity modulated radiotherapy (IMRT) using the MIMiC (Nomos Corporation) multileaf collimator. The method takes into account two unique features of IMRT using the MIMiC: namely the gantry-dynamic arc delivery of intensity modulated photon beams and the slice-by-slice dose delivery for large tumor volumes. The method converts the nonuniform beam intensity planned at discrete gantry angles of 5 deg. or 10 deg. into conventional nonmodulated beam intensity apertures of elemental arc segments of 1 deg. This approach more closely simulates the actual gantry-dynamic arc delivery by MIMiC. Because each elemental arc segment is of uniform intensity, the MU calculation for an IMRT arc is made equivalent to a conventional arc with gantry-angle dependent beam apertures. The dose to the isocenter from each 1 deg. elemental arc segment is calculated by using the Clarkson scatter summation technique based on measured tissue-maximum-ratio and output factors, independent of the dose calculation model used in the IMRT planning system. For treatments requiring multiple treatment slices, the MU for the arc at each treatment slice takes into account the MU, leakage and scatter doses from other slices. This is achieved by solving a set of coupled linear equations for the MUs of all involved treatment slices. All input dosimetry data for the independent MU/isocenter point-dose calculation are measured directly. Comparison of the MU and isocenter point dose calculated by the independent program to those calculated by the Corvus planning system and to direct measurements has shown good agreement with relative difference less than ±3%. The program can be used as an independent initial MU verification for IMRT plans using the MIMiC multileaf collimators
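
    For multi-slice treatments the record describes solving a set of coupled linear equations so that each slice's prescribed isocenter dose is met once leakage and scatter from the other slices are included. The sketch below illustrates only that numerical step; the coupling coefficients and prescriptions are invented, and the Clarkson-based dose model of the paper is not reproduced.

        # Generic sketch of the coupled-MU step described above: the dose to slice i
        # is sum_j K[i, j] * MU_j, with the diagonal giving the direct dose per MU
        # and off-diagonal terms the leakage/scatter from neighbouring slices.
        # K and the prescriptions are invented numbers, not data from the paper.
        import numpy as np

        K = np.array([
            [1.00, 0.03, 0.01],   # slice 1: direct dose plus contributions from slices 2, 3
            [0.03, 1.00, 0.03],   # slice 2
            [0.01, 0.03, 1.00],   # slice 3
        ])                        # cGy per MU at each slice isocenter
        prescribed = np.array([200.0, 200.0, 200.0])   # cGy per fraction, per slice

        mu = np.linalg.solve(K, prescribed)            # coupled MUs for all slices
        print("MU per slice:", np.round(mu, 1))
        print("check (delivered dose):", np.round(K @ mu, 2))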

  9. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  10. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  11. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  12. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
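
    UVM itself is a SystemVerilog methodology; as a language-neutral illustration of the coverage-driven loop described above (constrained-random stimulus, a scoreboard comparing the DUT against a reference model, and functional coverage bins), the toy Python sketch below may help. The "DUT" is a stand-in function, and nothing here is taken from the CLICpix2, C3PD or DAQ environments.

        # Toy coverage-driven verification loop (Python stand-in, not UVM code):
        # random stimulus -> DUT and reference model -> scoreboard check,
        # with coverage bins recording which situations have been exercised.
        import random

        def dut_saturating_add(a, b):
            """Pretend device under test: an 8-bit saturating adder."""
            return min(a + b, 255)

        def reference_model(a, b):
            return min(a + b, 255)

        coverage = {"no_overflow": 0, "overflow": 0, "max_operand": 0}
        mismatches = 0
        for _ in range(1000):
            a, b = random.randint(0, 255), random.randint(0, 255)   # random stimulus
            if dut_saturating_add(a, b) != reference_model(a, b):   # scoreboard check
                mismatches += 1
            coverage["overflow" if a + b > 255 else "no_overflow"] += 1
            if 255 in (a, b):
                coverage["max_operand"] += 1                        # corner-case bin

        print("scoreboard mismatches:", mismatches)
        print("coverage bins hit:", {k: v > 0 for k, v in coverage.items()})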

  13. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  14. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  15. Verification and validation of decision support software: Expert Choice™ and PCM™

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Q.H.; Martin, J.D.

    1994-11-04

    This report documents the verification and validation of two decision support programs: EXPERT CHOICE™ and PCM™. Both programs use the Analytic Hierarchy Process (AHP) -- or pairwise comparison technique -- developed by Dr. Thomas L. Saaty. In order to provide an independent method for validating the two programs, the pairwise comparison algorithm was developed in a standard mathematical program. A standard data set -- selecting a car to purchase -- was used with each of the three programs for validation. The results show that both commercial programs performed correctly.
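
    The pairwise-comparison (AHP) technique used by both programs derives priority weights from a reciprocal comparison matrix, conventionally via its principal eigenvector, together with a consistency check. The snippet below is an independent illustration of that calculation only; the matrix values are invented and this is not the validation code described in the report.

        # Independent illustration of the AHP calculation underlying both programs:
        # priority weights from the principal eigenvector of a reciprocal pairwise
        # comparison matrix, plus Saaty's consistency ratio. Values are invented.
        import numpy as np

        A = np.array([          # pairwise comparisons of three criteria (1-9 scale)
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                  # principal eigenvalue
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                     # normalised priorities

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
        ri = 0.58                                    # Saaty's random index for n = 3
        print("priorities:", np.round(weights, 3))
        print("consistency ratio:", round(ci / ri, 3))  # < 0.1 is conventionally acceptable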

  16. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  17. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the CORVUS 6.0 IMRT planning system and the MIMiC device (multilamellar intensity modulated collimator), and the overall process of verifying the created plan. The aim of the verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  18. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a Model Driven Engineering based process to automate the verification and validation of software on board satellites, and it has been applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
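
    The simplest form of the check described above is to compare timing evidence gathered at the instrumented points against the budgets assumed in the scheduling analysis. The sketch below illustrates that idea with plain bound checks; task names, budgets and samples are invented, and the paper's constraints are expressed as finite timed automata rather than the simple bounds used here.

        # Illustrative check of on-target timing evidence against scheduling-analysis
        # budgets. Task names, budgets and measured samples are invented; this does
        # not reproduce the finite-timed-automata constraints of the paper.
        from dataclasses import dataclass

        @dataclass
        class TimingConstraint:
            task: str
            wcet_budget_ms: float     # execution-time budget assumed in the analysis
            deadline_ms: float        # deadline assumed in the analysis

        def verify(constraints, evidence):
            """evidence: task name -> list of (execution_time_ms, response_time_ms)."""
            valid = True
            for c in constraints:
                for exec_ms, resp_ms in evidence.get(c.task, []):
                    if exec_ms > c.wcet_budget_ms or resp_ms > c.deadline_ms:
                        print(f"violation in {c.task}: exec={exec_ms} ms, resp={resp_ms} ms")
                        valid = False
            return valid

        constraints = [TimingConstraint("acquire_particles", 2.0, 10.0),
                       TimingConstraint("housekeeping", 1.0, 100.0)]
        evidence = {"acquire_particles": [(1.7, 8.2), (1.9, 9.5)],
                    "housekeeping": [(0.8, 40.0), (1.2, 45.0)]}   # second sample violates
        print("scheduling analysis valid:", verify(constraints, evidence))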

  19. Executive dysfunction is independently associated with reduced functional independence in heart failure.

    Science.gov (United States)

    Alosco, Michael L; Spitznagel, Mary Beth; Raz, Naftali; Cohen, Ronald; Sweet, Lawrence H; Colbert, Lisa H; Josephson, Richard; van Dulmen, Manfred; Hughes, Joel; Rosneck, Jim; Gunstad, John

    2014-03-01

    To examine the independent association between executive function with instrumental activities of daily living and health behaviours in older adults with heart failure. Executive function is an important contributor to functional independence as it consists of cognitive processes needed for decision-making, planning, organising and behavioural monitoring. Impairment in this domain is common in heart failure patients and associated with reduced performance of instrumental activities of daily living in many medical and neurological populations. However, the contribution of executive functions to functional independence and healthy lifestyle choices in heart failure patients has not been fully examined. Cross-sectional analyses. One hundred and seventy-five heart failure patients completed a neuropsychological battery and echocardiogram. Participants also completed the Lawton-Brody Instrumental Activities of Daily Living Scale and reported current cigarette use. Hierarchical regressions revealed that reduced executive function was independently associated with worse instrumental activity of daily living performance with a specific association for decreased ability to manage medications. Partial correlations showed that executive dysfunction was associated with current cigarette use. Our findings suggest that executive dysfunction is associated with poorer functional independence and contributes to unhealthy behaviours in heart failure. Future studies should examine whether heart failure patients benefit from formal organisation schema (i.e. pill organisers) to maintain independence. Screening of executive function in heart failure patients may provide key insight into their ability to perform daily tasks, including the management of treatment recommendations. © 2013 John Wiley & Sons Ltd.

  20. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  1. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  2. Independent verification of tank volume measurements by pressure-volume authentication

    International Nuclear Information System (INIS)

    Suda, S.C.; Keisch, B.

    1992-01-01

    Brookhaven National Laboratory has developed a portable pressure-volume authenticator as a standard and means of checking the functionality and quality of bubbler-probe volumetric devices. The pressure-volume authenticator (PVA) consists of an automated electromanometer system that is controlled by a laptop computer, and a transportable volumetric artifact. A portable pressure gage is connected, via a scanivalve, to the operator's bubbler-probe system and independently measures all bubbler probes. The transportable volumetric artifact is a one-meter-high vessel equipped with bubbler probes, computer-controlled air-purge rotameters, and a platinum resistance (RTD) thermometer. High quality measurements are obtained by use of a fast sampling technique and sophisticated software developed under this program. The computer software performs the following functions: (a) instrument control, (b) data acquisition, (c) on-line graphical and numerical display of measurement data, and (d) detailed data analysis. The device also may provide hands-on training for inspectors and plant operators in high quality volumetric data collection and analysis. A field demonstration of the automated electromanometer system was conducted on the PETRA input accountancy tank, JRC-Ispra, in November 1991.

  3. Weighted piecewise LDA for solving the small sample size problem in face verification.

    Science.gov (United States)

    Kyperountas, Marios; Tefas, Anastasios; Pitas, Ioannis

    2007-03-01

    A novel algorithm that can be used to boost the performance of face-verification methods that utilize Fisher's criterion is presented and evaluated. The algorithm is applied to similarity, or matching error, data and provides a general solution for overcoming the "small sample size" (SSS) problem, where the lack of sufficient training samples causes improper estimation of a linear separation hyperplane between the classes. Two independent phases constitute the proposed method. Initially, a set of weighted piecewise discriminant hyperplanes are used in order to provide a more accurate discriminant decision than the one produced by the traditional linear discriminant analysis (LDA) methodology. The expected classification ability of this method is investigated throughout a series of simulations. The second phase defines proper combinations for person-specific similarity scores and describes an outlier removal process that further enhances the classification ability. The proposed technique has been tested on the M2VTS and XM2VTS frontal face databases. Experimental results indicate that the proposed framework greatly improves the face-verification performance.
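
    As background for the record above, the sketch below shows a plain two-class Fisher LDA, the criterion that the weighted piecewise method builds on: the discriminant direction is obtained from the within-class scatter and the class means, and a small ridge term stands in (very crudely) for the more careful treatment of the small-sample-size problem proposed in the paper. The data are random, and this is not the paper's algorithm.

        # Plain two-class Fisher LDA on synthetic data, as background to the record
        # above. The ridge term is a crude stand-in for proper SSS handling; this is
        # not the weighted piecewise method of the paper.
        import numpy as np

        rng = np.random.default_rng(0)
        clients = rng.normal(0.0, 1.0, size=(50, 5))      # genuine-claim features
        impostors = rng.normal(1.5, 1.0, size=(50, 5))    # impostor features

        m1, m2 = clients.mean(axis=0), impostors.mean(axis=0)
        Sw = np.cov(clients, rowvar=False) + np.cov(impostors, rowvar=False)
        Sw += 1e-6 * np.eye(Sw.shape[0])                  # ridge against a singular Sw

        w = np.linalg.solve(Sw, m1 - m2)                  # Fisher discriminant direction
        threshold = 0.5 * w @ (m1 + m2)                   # midpoint decision threshold

        def accept(x):
            """Verify a claimed identity: project onto w, compare with threshold."""
            return w @ x > threshold

        print("client acceptance rate:", np.mean([accept(x) for x in clients]))
        print("impostor acceptance rate:", np.mean([accept(x) for x in impostors]))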

  4. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office', and carried out verification testing of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. In the verification test, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. It was thought that the Cs-134 and Cs-137 attached to the fine sludge scraped off from the decontamination object; the sludge was found to be separable from the abrasives by wet cyclone classification, the activity concentration of the abrasives being 1/30 or less of that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  5. V&V of Fault Management: Challenges and Successes

    Science.gov (United States)

    Fesq, Lorraine M.; Costello, Ken; Ohi, Don; Lu, Tiffany; Newhouse, Marilyn

    2013-01-01

    This paper describes the results of a special breakout session of the NASA Independent Verification and Validation (IV&V) Workshop held in the fall of 2012 entitled "V&V of Fault Management: Challenges and Successes." The NASA IV&V Program is in a unique position to interact with projects across all of the NASA development domains. Using this unique opportunity, the IV&V program convened a breakout session to enable IV&V teams to share their challenges and successes with respect to the V&V of Fault Management (FM) architectures and software. The presentations and discussions provided practical examples of pitfalls encountered while performing V&V of FM, including the lack of consistent designs for implementing fault monitors and the fact that FM information is not centralized but scattered among many diverse project artifacts. The discussions also solidified the need for an early commitment to developing FM in parallel with the spacecraft systems, as well as clearly defining FM terminology within a project.

  6. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  7. Independent technical review, handbook

    International Nuclear Information System (INIS)

    1994-02-01

    Purpose Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project

  8. Independent technical review, handbook

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    Purpose Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  9. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level, at which the visual verification was performed and documented

  10. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  11. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  12. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  13. Environmental Technology Verification Report for Abraxis Ecologenia® Ethynylestradiol (EE2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    Science.gov (United States)

    The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis Ecologenia Ethynylestradiol (EE2) ...

  14. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete 'design verification' strategy comprises: informing the Agency of any changes in the plant system which are defined as 'safeguards relevant'; and reverification by the Agency, upon receiving notice of any changes from the Operator, of the 'design information'. 13 refs

  15. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  16. Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations

    Energy Technology Data Exchange (ETDEWEB)

    Rachel Henderson

    2007-09-30

    The project is titled 'Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations'. The Interstate Oil and Gas Compact Commission (IOGCC), headquartered in Oklahoma City, Oklahoma, is the principal investigator and the IOGCC has partnered with ALL Consulting, Inc., headquartered in Tulsa, Oklahoma, in this project. State agencies that also have partnered in the project are the Wyoming Oil and Gas Conservation Commission, the Montana Board of Oil and Gas Conservation, the Kansas Oil and Gas Conservation Division, the Oklahoma Oil and Gas Conservation Division and the Alaska Oil and Gas Conservation Commission. The objective is to characterize produced water quality and management practices for the handling, treating, and disposing of produced water from conventional oil and gas operations throughout the industry nationwide. Water produced from these operations varies greatly in quality and quantity and is often the single largest barrier to the economic viability of wells. The lack of data, coupled with renewed emphasis on domestic oil and gas development, has prompted many experts to speculate that the number of wells drilled over the next 20 years will approach 3 million, or near the number of current wells. This level of exploration and development undoubtedly will draw the attention of environmental communities, focusing their concerns on produced water management based on perceived potential impacts to fresh water resources. Therefore, it is imperative that produced water management practices be performed in a manner that best minimizes environmental impacts. This is being accomplished by compiling current best management practices for produced water from conventional oil and gas operations and to develop an analysis tool based on a geographic information system (GIS) to assist in the understanding of watershed-issued permits. That would allow management costs to be kept in

  17. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  18. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  19. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  20. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  1. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  2. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
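
    The monitoring style described above can be pictured as a small forward-chaining engine: trace events become facts, Horn-clause rules fire until a fixpoint, and a designated fact signals a violation. The toy monitor below illustrates only that evaluation style; the rules and the trace are invented, and the FLTL semantics of the paper are not reproduced.

        # Toy forward-chaining monitor in the style described above. Rules are Horn
        # clauses in implication form (body set -> head fact); a "violation" fact
        # signals a property failure. Rules and trace are invented examples.
        RULES = [
            ({"request"}, "pending"),                # request          => pending
            ({"pending", "grant"}, "served"),        # pending, grant   => served
            ({"pending", "timeout"}, "violation"),   # pending, timeout => violation
        ]

        def forward_chain(facts):
            """Fire rules until no new fact can be derived (a fixpoint)."""
            changed = True
            while changed:
                changed = False
                for body, head in RULES:
                    if body <= facts and head not in facts:
                        facts.add(head)
                        changed = True
            return facts

        facts = set()
        for step, event in enumerate(["request", "tick", "timeout"]):   # expanding trace
            facts.add(event)
            forward_chain(facts)
            if "violation" in facts:
                print(f"property violated at step {step}")
                break
        else:
            print("no violation observed on this trace")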

  3. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of test autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them, with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.
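
    As an illustration of the kind of rule-based logic the survey asks about, the sketch below encodes two of the seven criteria mentioned, a verification-limit check and a delta check against the patient's previous result, as simple autoverification rules. All limits are invented placeholders, not the values reported by the Spanish laboratories.

        # Illustrative autoverification rules of the kind surveyed above: a result is
        # held for manual review if it falls outside verification limits or fails a
        # delta check. Limits are invented placeholders, not survey values.
        VERIFICATION_LIMITS = {"glucose": (2.0, 25.0),    # mmol/L, illustrative
                               "potassium": (2.5, 6.5)}   # mmol/L, illustrative
        DELTA_LIMITS = {"glucose": 7.0, "potassium": 1.5} # max change vs. previous result

        def autoverify(analyte, value, previous=None):
            """Return (auto_released, reasons); an empty reason list means release."""
            reasons = []
            low, high = VERIFICATION_LIMITS[analyte]
            if not low <= value <= high:
                reasons.append("outside verification limits")
            if previous is not None and abs(value - previous) > DELTA_LIMITS[analyte]:
                reasons.append("delta check failed")
            return (not reasons, reasons)

        print(autoverify("potassium", 4.1, previous=4.3))   # auto-released
        print(autoverify("potassium", 6.9, previous=4.3))   # held for manual review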

  4. Verification Survey of the Building 315 Zero Power Reactor-6 Facility, Argonne National Laboratory-East, Argonne, Illinois

    International Nuclear Information System (INIS)

    W. C. Adams

    2007-01-01

    Oak Ridge Institute for Science and Education (ORISE) conducted independent verification radiological survey activities at Argonne National Laboratory's Building 315, Zero Power Reactor-6 facility in Argonne, Illinois. Independent verification survey activities included document and data reviews, alpha plus beta and gamma surface scans, alpha and beta surface activity measurements, and instrumentation comparisons. An interim letter report and a draft report, documenting the verification survey findings, were submitted to the DOE on November 8, 2006 and February 22, 2007, respectively (ORISE 2006b and 2007). Argonne National Laboratory-East (ANL-E) is owned by the U.S. Department of Energy (DOE) and is operated under a contract with the University of Chicago. Fundamental and applied research in the physical, biomedical, and environmental sciences are conducted at ANL-E and the laboratory serves as a major center of energy research and development. Building 315, which was completed in 1962, contained two cells, Cells 5 and 4, for holding Zero Power Reactor (ZPR)-6 and ZPR-9, respectively. These reactors were built to increase the knowledge and understanding of fast reactor technology. ZPR-6 was also referred to as the Fast Critical Facility and focused on fast reactor studies for civilian power production. ZPR-9 was used for nuclear rocket and fast reactor studies. In 1967, the reactors were converted for plutonium use. The reactors operated from the mid-1960's until 1982 when they were both shut down. Low levels of radioactivity were expected to be present due to the operating power levels of the ZPR's being restricted to well below 1,000 watts. To evaluate the presence of radiological contamination, DOE characterized the ZPRs in 2001. Currently, the Melt Attack and Coolability Experiments (MACE) and Melt Coolability and Concrete Interaction (MCCI) Experiments are being conducted in Cell 4 where the ZPR-9 is located (ANL 2002 and 2006). ANL has performed final

  5. SU-F-T-463: Light-Field Based Dynalog Verification

    International Nuclear Information System (INIS)

    Atwal, P; Ramaseshan, R

    2016-01-01

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the image of the central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned location by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.

  6. SU-F-T-463: Light-Field Based Dynalog Verification

    Energy Technology Data Exchange (ETDEWEB)

    Atwal, P; Ramaseshan, R [BC Cancer Agency, Abbotsford, BC (Canada)

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the image of the central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned location by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.
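
    Both copies of this record summarise the verification as the mean and standard deviation of the differences between detected and planned leaf-end positions. The snippet below shows only that final comparison step on invented numbers; the multi-scale edge detection that produces the detected positions is not reproduced.

        # Final comparison step of the light-field/dynalog check described above:
        # detected leaf-end positions versus planned positions, summarised per bank
        # as mean +/- standard deviation. Positions and tolerance are invented.
        import numpy as np

        rng = np.random.default_rng(1)
        leaves = np.arange(15, 46)                              # leaves 15-45
        planned = np.linspace(-20.0, 20.0, leaves.size)         # mm, invented plan
        detected = planned + rng.normal(0.4, 0.3, leaves.size)  # mm, invented detections

        diff = detected - planned
        print(f"bank A, leaves 15-45: {diff.mean():.2f} +/- {diff.std(ddof=1):.2f} mm")
        tolerance = 1.0                                         # mm, placeholder
        for leaf, d in zip(leaves, diff):
            if abs(d) > tolerance:
                print(f"leaf {leaf} outside tolerance: {d:.2f} mm")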

  7. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses definition of the specification language and denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  8. Media independence and dividend policy

    DEFF Research Database (Denmark)

    Farooq, Omar; Dandoune, Salma

    2012-01-01

    Can media pressurize managers to disgorge excess cash to shareholders? Do firms in countries with more independent media follow different dividend policies than firms with less independent media? This paper seeks to answer these questions and aims to document the relationship between media independence and dividend policies in emerging markets. Using a dataset from twenty three emerging markets, we show a significantly negative relationship between dividend policies (payout ratio and decision to pay dividend) and media independence. We argue that independent media reduces information asymmetries for stock market participants. Consequently, stock market participants in emerging markets with more independent media do not demand as high and as much dividends as their counterparts in emerging markets with less independent media. We also show that press independence is more important in defining...

  9. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space based applications. Apart from its usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification and Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA
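
    The combination described above pairs SFMEA/SFTA results with fault injection: failure modes identified by the static analysis are injected into the running software, and the detection mechanisms derived from the same analysis are checked against them. The toy harness below illustrates that loop; the component, failure modes and checker are all invented for the sketch.

        # Toy fault-injection harness in the spirit of the combined approach above:
        # failure modes (as an SFMEA would list them) are injected into a component's
        # input, and a checker derived from the analysis decides whether each fault
        # is detected. Component, failure modes and checker are invented examples.
        def component(raw):
            """Pretend on-board conversion: raw sensor count to engineering units."""
            return 0.1 * raw + 2.0

        def range_checker(raw, converted):
            """Detection mechanism from the analysis: plausible-range assertion."""
            return 0.0 <= raw <= 1023.0 and 2.0 <= converted <= 104.3

        FAILURE_MODES = {                      # fault models applied to the input
            "stuck_at_max": lambda raw: 4095.0,
            "sign_flip": lambda raw: -raw,
            "small_bias": lambda raw: raw + 5.0,   # likely to escape the range check
        }

        nominal_inputs = [10.0, 250.0, 900.0]
        for name, inject in FAILURE_MODES.items():
            detected = all(not range_checker(inject(r), component(inject(r)))
                           for r in nominal_inputs)
            print(f"{name}: {'detected' if detected else 'NOT detected'}")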

  10. Investigation of novel spent fuel verification system for safeguard application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository; they calculated the dose to the public after the repository is closed, depending on their scenario. Because most spent fuel repositories are non-retrievable, research on the security or safeguards of spent fuel repositories has to be performed. Design-based security or safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository are suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished by using the Cerenkov radiation detector while spent fuel is being stored in a fuel pool. This research investigated an advanced spent fuel verification system using a system which converts spent fuel radiation into electricity. The system generates electricity while it is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab scale experiment using an 8.51 GBq Cs-137 gamma source.

  11. Investigation of novel spent fuel verification system for safeguard application

    International Nuclear Information System (INIS)

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository, calculating the dose to the public after the repository is closed for their particular scenarios. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA; these requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished using a Cerenkov radiation detector while the spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system based on a device which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source

  12. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Uses of subsurface barriers include surrounding and/or containing buried waste, providing secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and, depending on use, have few or no breaches. A breach may be formed through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification

  13. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  14. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  15. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  16. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background and

  17. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  18. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  19. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.
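
    The order-of-accuracy test at the heart of such a code-verification methodology can be illustrated with a minimal sketch. The solver below is a hypothetical stand-in (a second-order 1D diffusion discretization with a known solution), not the GBS code; it only shows how the observed order is computed under grid refinement:

        # Minimal sketch of an order-of-accuracy check under grid refinement.
        # `solve(n)` is a hypothetical stand-in for the code under verification.
        import numpy as np

        def solve(n):
            # Second-order finite differences for -u'' = pi^2 sin(pi x) on [0, 1],
            # whose exact (manufactured-style) solution is u(x) = sin(pi x).
            x = np.linspace(0.0, 1.0, n + 1)
            h = x[1] - x[0]
            A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            f = np.pi**2 * np.sin(np.pi * x[1:-1])
            u = np.zeros_like(x)
            u[1:-1] = np.linalg.solve(A, f)
            return x, u

        def error_norm(n):
            x, u = solve(n)
            return np.max(np.abs(u - np.sin(np.pi * x)))

        # Observed order from successive factor-2 refinements; it should approach
        # the formal order (2) if the discretization is implemented correctly.
        e32, e64, e128 = (error_norm(n) for n in (32, 64, 128))
        print("observed orders:", np.log2(e32 / e64), np.log2(e64 / e128))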

  20. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  1. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions led to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  2. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  3. LH2 on-orbit storage tank support trunnion design and verification

    Science.gov (United States)

    Bailey, W. J.; Fester, D. A.; Toth, J. M., Jr.

    1985-01-01

    A detailed fatigue analysis was conducted to provide verification of the trunnion design in the reusable Cryogenic Fluid Management Facility for Shuttle flights and to assess the performance capability of the trunnion E-glass/S-glass epoxy composite material. Basic material property data at ambient and liquid hydrogen temperatures support the adequacy of the epoxy composite for the seven-mission requirement. Testing of trunnions fabricated to the flight design has verified adequate strength and fatigue properties of the design to meet the requirements of seven Shuttle flights.

  4. Waste management - textbook for secondary schools

    International Nuclear Information System (INIS)

    Chmielewska, E.; Kuruc, J.

    2010-09-01

    This text-book consists of five parts: (I) Waste management; (II) Solid waste management; (III) Recovery and recycling of secondary raw materials; (IV) Radioactive waste management, with examples for verifying and testing secondary school students' knowledge through worksheets; and (V) Suggestions for leisure time activities. This text-book is intended for secondary school students.

  5. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focusses on the total costs of the process from investment to operation and finally retirement. In recent years an increasing interest in this concept has also arisen for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness, and thereby a decreased effort for method performance verification and post-approval changes, as well as a minimized risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  7. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  8. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  9. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    Energy Technology Data Exchange (ETDEWEB)

    McCarroll, R [UT MD Anderson Cancer Center, Houston, TX (United States); UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX (United States); Beadle, B; Yang, J; Zhang, L; Kisling, K; Balter, P; Stingo, F; Nelson, C; Followill, D; Court, L [UT MD Anderson Cancer Center, Houston, TX (United States); Mejia, M [University of Santo Tomas Hospital, Manila, Metro Manila (Philippines)

    2016-06-15

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse’s Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5cm (brain), 0.75cm (mandible, cord), 1cm (brainstem, cochlea), or 1.25cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to
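
    As a rough illustration of the verification step described above, the sketch below fits a logit model on two agreement metrics and flags a contour for review. The training data, feature values and decision threshold are hypothetical, and this is not the authors' implementation:

        # Hypothetical logit-based contour verification: features are mean distance
        # to agreement (mm) and true positive rate between primary and verification
        # contours; label 1 marks a deliberately shifted (incorrect) contour.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        X_train = np.array([[1.2, 0.95], [1.5, 0.93], [2.0, 0.90],   # acceptable pairs
                            [6.5, 0.60], [8.0, 0.55], [11.0, 0.40]]) # shifted pairs
        y_train = np.array([0, 0, 0, 1, 1, 1])
        model = LogisticRegression().fit(X_train, y_train)

        # Verify a new auto-contour against the independent verification contour.
        mda_mm, tpr = 7.2, 0.58
        p_error = model.predict_proba(np.array([[mda_mm, tpr]]))[0, 1]
        print(f"flag for physician review: {p_error > 0.5} (p = {p_error:.2f})")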

  10. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  11. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project "Verification of the thermal design of electronic equipment" studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the "Cool Electronics" research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  12. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Science.gov (United States)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET
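
    The range-comparison step in such a study can be sketched as follows; the profiles are synthetic and the 50% distal-falloff surrogate is an assumption for illustration, not the analysis code used in the paper:

        # Compare measured and simulated PET activity depth profiles by the depth
        # of the distal 50% falloff, a common surrogate for proton range agreement.
        import numpy as np

        def distal_falloff_depth(depth_mm, activity, level=0.5):
            # Depth at which the profile drops to `level` of its maximum,
            # searched from the distal side with linear interpolation.
            threshold = level * activity.max()
            for i in range(len(activity) - 1, 0, -1):
                if activity[i - 1] >= threshold > activity[i]:
                    frac = (activity[i - 1] - threshold) / (activity[i - 1] - activity[i])
                    return depth_mm[i - 1] + frac * (depth_mm[i] - depth_mm[i - 1])
            return np.nan

        depth = np.arange(0.0, 120.0, 1.0)                    # mm
        measured = np.exp(-((depth - 80.0) / 25.0) ** 2)      # synthetic stand-ins
        simulated = np.exp(-((depth - 81.5) / 25.0) ** 2)
        shift = distal_falloff_depth(depth, measured) - distal_falloff_depth(depth, simulated)
        print(f"measured-vs-simulated range difference: {shift:+.1f} mm")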

  13. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Knopf, A; Paganetti, H; Cascio, E; Bortfeld, T [Department of Radiation Oncology, MGH and Harvard Medical School, Boston, MA 02114 (United States); Parodi, K [Heidelberg Ion Therapy Center, Heidelberg (Germany); Bonab, A [Department of Radiology, MGH and Harvard Medical School, Boston, MA 02114 (United States)

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  14. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    Science.gov (United States)

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  15. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability and compatibility, etc. These methods vary both in how they operate and in the way they achieve their result. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. In this paper we present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code being verified with static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, and some kinds of tools that can be applied to the software when using dynamic analysis methods are considered. Based on this work a conclusion is drawn which describes the most relevant problems of these analysis techniques, methods of their solution, and
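
    As a toy illustration of the static dependency analysis mentioned in the abstract (not the article's tooling), the following sketch extracts a "depends-on" relation between assigned variables in a small code fragment:

        # For each assignment, record which names the right-hand side reads,
        # approximating a dependency relation between program variables.
        import ast

        source = """
        a = input_value * 2
        b = a + offset
        c = b / a
        """

        dependencies = {}
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Name):
                target = node.targets[0].id
                reads = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
                dependencies[target] = sorted(reads)

        for var, deps in dependencies.items():
            print(f"{var} depends on {deps}")
        # -> a depends on ['input_value'], b depends on ['a', 'offset'], c depends on ['a', 'b']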

  16. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  17. The role of board independence on R&D investment’ choice decided by committed managers: The cognitive management of executives’ discretion

    Directory of Open Access Journals (Sweden)

    Fadhila Hamza

    2015-11-01

    Full Text Available This study deals with appreciating the role of both the governance system and executives' cognitive and attitudinal aspects in innovation decision-making. After discussing the theoretical relationship between board independence and CEO attitude and behavior, we advance an empirical model testing the correlation between the manager's attitude and behavior towards innovation and his level of psychological commitment. The CEOs' commitment bias and attitude constituents were measured using a questionnaire. The data analysis was performed using the Bayesian network method on 220 Tunisian executives. The empirical results confirm the theoretical prediction and show that relying on persuasive mechanisms does not play an effective role in aligning the manager's attitude and behavior in key tasks such as innovation decisions. Authentic CEO behavior was related more to strong manager involvement in this behavior than to the persuasive effort committed by outside directors to make him undertake this action. CEO attitude and behavior towards innovation are shown to be related to the "manager-task" commitment link, suggesting that the board of directors plays no role in managing CEO discretion. We argue that the persuasive approach is not a sufficient path to behavior and interest alignment; rather, it should be applied together with the commitment approach for understanding manager decision-making.

  18. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  19. Independent management and financial review, Yucca Mountain Project, Nevada. Final report, Appendix

    International Nuclear Information System (INIS)

    1995-01-01

    The Nuclear Waste Policy Act of 1982 (Public Law 97-425), as amended by Public Law 100-203, December 22, 1987, established the Office of Civilian Radioactive Waste Management (OCRWM) within the Department of Energy (DOE), and directed the Office to investigate a site at Yucca Mountain, Nevada, to determine if this site is suitable for the construction of a repository for the disposal of high level nuclear waste. Work on site characterization has been under way for several years. Thus far, about $1.47 billion has been spent on Yucca Mountain programs. This work has been funded by Congressional appropriations from a Nuclear Waste Fund to which contributions have been made by electric utility ratepayers through electric utilities generating power from nuclear power stations. The Secretary of Energy and the Governor of the State of Nevada have appointed one person each to a panel to oversee an objective, independent financial and management evaluation of the Yucca Mountain Project. The requirements for the work include an analysis of (1) the Yucca Mountain financial and contract management techniques and controls; (2) Project schedules and credibility of the proposed milestones; (3) Project organizational effectiveness and internal planning processes; and (4) adequacy of funding levels and funding priorities, including the cost of infrastructure and scientific studies. The recipient will provide monthly progress reports, and the following reports/documents will be presented as deliverables under the contract: (1) Financial and Contract Management Preliminary Report; (2) Project Scheduling Preliminary Report; (3) Project Organizational Effectiveness Preliminary Report; (4) Project Funding Levels and Funding Priorities Preliminary Report; and (5) Final Report

  20. Independent management and financial review, Yucca Mountain Project, Nevada. Final report, Appendix

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-15

    The Nuclear Waste Policy Act of 1982 (Public Law 97-425), as amended by Public Law 100-203, December 22, 1987, established the Office of Civilian Radioactive Waste Management (OCRWM) within the Department of Energy (DOE), and directed the Office to investigate a site at Yucca Mountain, Nevada, to determine if this site is suitable for the construction of a repository for the disposal of high level nuclear waste. Work on site characterization has been under way for several years. Thus far, about $1.47 billion has been spent on Yucca Mountain programs. This work has been funded by Congressional appropriations from a Nuclear Waste Fund to which contributions have been made by electric utility ratepayers through electric utilities generating power from nuclear power stations. The Secretary of Energy and the Governor of the State of Nevada have appointed one person each to a panel to oversee an objective, independent financial and management evaluation of the Yucca Mountain Project. The requirements for the work include an analysis of (1) the Yucca Mountain financial and contract management techniques and controls; (2) Project schedules and credibility of the proposed milestones; (3) Project organizational effectiveness and internal planning processes; and (4) adequacy of funding levels and funding priorities, including the cost of infrastructure and scientific studies. The recipient will provide monthly progress reports, and the following reports/documents will be presented as deliverables under the contract: (1) Financial and Contract Management Preliminary Report; (2) Project Scheduling Preliminary Report; (3) Project Organizational Effectiveness Preliminary Report; (4) Project Funding Levels and Funding Priorities Preliminary Report; and (5) Final Report.

  1. Environmental Technology Verification Report for Abraxis 17β-Estradiol (E2) Magnetic Particle Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    Science.gov (United States)

    The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis 17(beta)-estradiol (E2) magnetic p...

  2. Radiological verification survey results at 898 Black Oak Ridge Rd., Wayne, New Jersey (WJ004V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 898 Black Oak Ridge Road, Wayne, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at one meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 898 Black Oak Ridge Road were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  3. Radiological verification survey results at the Pompton Plains Railroad Spur, Pequannock, New Jersey (PJ008V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains railroad spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at the Pompton Plains Railroad Spur, Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at the Pompton Plains railroad spur were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  4. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording design snapshots over design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.

  5. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.
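
    The kind of model-versus-observation statistics such a toolkit produces can be sketched in a few lines; the function and the soil-moisture values below are illustrative only and are not taken from the LVT source code:

        import numpy as np

        def verification_metrics(model, obs):
            # Standard evaluation metrics between a model series and observations.
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            valid = ~np.isnan(model) & ~np.isnan(obs)       # screen missing observations
            m, o = model[valid], obs[valid]
            return {"bias": float(np.mean(m - o)),
                    "rmse": float(np.sqrt(np.mean((m - o) ** 2))),
                    "corr": float(np.corrcoef(m, o)[0, 1])}

        # Hypothetical daily soil moisture (m3/m3) from a model run and a station.
        model_sm = [0.21, 0.22, 0.25, 0.24, 0.20, 0.19]
        obs_sm = [0.20, 0.24, 0.26, 0.23, np.nan, 0.18]
        print(verification_metrics(model_sm, obs_sm))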

  6. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  7. Office of Personnel Management (OPM)

    Data.gov (United States)

    Social Security Administration — The purpose of this agreement is for SSA to verify SSN information for the Office of Personnel Management. OPM will use the SSN verifications in its investigative...

  8. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Science.gov (United States)

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  9. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  10. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimentation results show that the proposed technique can effectively improve the verification effort up to 20% for the complex vision chip design while reducing the simulation and debugging overheads.
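
    A minimal, generic example of a property-style check of this kind is sketched below; the operation, the reference model and the stimuli are hypothetical and unrelated to the chip described in the paper:

        # An implementation-under-test is exercised against an independent
        # reference model over random stimuli; design properties are asserted.
        import numpy as np

        def reference_threshold(img, t):
            # Reference model: pixel-wise binarisation, written as plainly as possible.
            return (img >= t).astype(np.uint8)

        def dut_threshold(img, t):
            # Stand-in for the device-under-test (e.g. an RTL-simulation wrapper).
            return np.where(img >= t, 1, 0).astype(np.uint8)

        rng = np.random.default_rng(0)
        for _ in range(100):                                 # random stimulus generation
            img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
            t = int(rng.integers(0, 256))
            out_ref, out_dut = reference_threshold(img, t), dut_threshold(img, t)
            assert np.array_equal(out_ref, out_dut)          # property: outputs agree
            assert set(np.unique(out_dut)) <= {0, 1}         # property: strictly binary
        print("all property checks passed")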

  11. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we try to provide a convenient design verification method for SOVs to design engineers, who otherwise depend on their experience and experiments during the design and development process of an SOV. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor on the design attraction force. Lastly, experimental performance tests using several prototypes manufactured based on this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is novel and useful for saving time and expense during the development of SOVs, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
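
    A simple verification check of this kind can be sketched with the textbook magnetic-circuit approximation F ≈ (N·I)²·μ0·A / (2·g²); the parameter values and the safety-factor target below are assumed for illustration and may differ from the relationships used by the authors:

        from math import pi

        MU0 = 4e-7 * pi                                      # vacuum permeability [H/m]

        def attraction_force(turns, current_a, pole_area_m2, air_gap_m):
            # Textbook approximation of solenoid attraction force at a given air gap.
            mmf = turns * current_a                          # magnetomotive force [A-turns]
            return (mmf ** 2) * MU0 * pole_area_m2 / (2.0 * air_gap_m ** 2)

        required_force_n = 35.0                              # assumed spring + friction load
        force_n = attraction_force(turns=1200, current_a=1.0,
                                   pole_area_m2=1.2e-4, air_gap_m=1.0e-3)
        safety_factor = force_n / required_force_n
        print(f"attraction force = {force_n:.1f} N, safety factor = {safety_factor:.2f}")
        assert safety_factor >= 1.5, "design misses the assumed safety-factor target"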

  12. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  13. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  14. The legal points at issue concerning the Non-proliferation Treaty and the verification agreement of EURATOM with the IAEA

    International Nuclear Information System (INIS)

    Zieger, G.

    1975-01-01

    An excellent and comprehensive but very juridical contribution on: development and content of the Non-proliferation Treaty; the problems which this Treaty poses for EURATOM (a common market, common supply, joint enterprises, the Non-proliferation Treaty and France); the compatibility of the Non-proliferation Treaty with the EURATOM Treaty; the verification agreement EURATOM - IAEA (the IAEA as the supervisory authority, the control system of the IAEA, guidelines elaborated by the IAEA to be used as the basis for negotiating safeguard agreements, national systems of accounting for and control of nuclear material, IAEA supervision as a secondary control, principles of IAEA supervision, secrecy, conflict management); conclusion of the verification agreement; deviations from the guidelines elaborated by the IAEA to be used as the basis for negotiating safeguard agreements; legal reflexions (verification agreement and Non-proliferation Treaty, reservations concerning the Non-proliferation Treaty, questions of competence between EURATOM and member states without nuclear weapons, problems of equal treatment). (HP/LN) [de

  15. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure where the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine the reason why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes but, rarely (1.44% of the time) and under certain circumstances, it can fail. (authors)
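
    The detection mechanism can be illustrated with a toy mutation of a simple finite-difference stencil (hypothetical and unrelated to the AHOTN discretization): injecting a single-coefficient "semantic error" destroys the expected convergence order, which is exactly what the order-of-accuracy test looks for.

        import numpy as np

        def second_derivative(u, h, mutated=False):
            centre = 2.0 if not mutated else 1.9   # injected single-character-style mistake
            return (u[:-2] - centre * u[1:-1] + u[2:]) / h**2

        def observed_orders(mutated):
            # Error norms against the exact second derivative of sin(pi x) on three grids.
            errors = []
            for n in (64, 128, 256):
                x = np.linspace(0.0, 1.0, n + 1)
                h = x[1] - x[0]
                approx = second_derivative(np.sin(np.pi * x), h, mutated)
                exact = -np.pi**2 * np.sin(np.pi * x[1:-1])
                errors.append(np.max(np.abs(approx - exact)))
            return np.log2(errors[0] / errors[1]), np.log2(errors[1] / errors[2])

        print("correct stencil:", observed_orders(False))   # both ratios near the formal order 2
        print("mutated stencil:", observed_orders(True))    # order collapses; error grows on refinement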

  16. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
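
    A toy stand-in for one kind of rule-base check such a tool can perform is sketched below; the rules and the mutually exclusive facts are invented for illustration and this is not the tool described above:

        # Flag rule pairs that fire on identical conditions but draw conflicting
        # conclusions, and rules whose conditions can never be satisfied.
        rules = [
            {"name": "R1", "if": frozenset({"pressure_high", "valve_closed"}), "then": "open_relief"},
            {"name": "R2", "if": frozenset({"pressure_high", "valve_closed"}), "then": "keep_closed"},
            {"name": "R3", "if": frozenset({"pressure_high", "pressure_low"}), "then": "alarm"},
        ]
        mutually_exclusive = [frozenset({"pressure_high", "pressure_low"})]

        for i, a in enumerate(rules):
            for b in rules[i + 1:]:
                if a["if"] == b["if"] and a["then"] != b["then"]:
                    print(f"conflict: {a['name']} and {b['name']} fire together but disagree")
            if any(pair <= a["if"] for pair in mutually_exclusive):
                print(f"unsatisfiable condition: {a['name']} can never fire")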

  17. Verification of EPA's ''Preliminary Remediation Goals for radionuclides'' (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, Tim [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Stagich, Brooke [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-08-28

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their updated “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides PRGs for radionuclides that are used as a screening tool at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Resource Conservation and Recovery Act (RCRA) sites. These risk-based PRGs establish concentration limits under specific exposure scenarios. The purpose of this verification study is to determine that the calculator has no inherent numerical problems with obtaining solutions as well as to ensure that the equations are programmed correctly. There are 167 equations used in the calculator. To verify the calculator, all equations for each of seven receptor types (resident, construction worker, outdoor and indoor worker, recreator, farmer, and composite worker) were hand calculated using the default parameters. The same four radionuclides (Am-241, Co-60, H-3, and Pu-238) were used for each calculation for consistency throughout.
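
    The comparison step in such a study reduces to checking calculator output against independently hand-calculated values within a tolerance; the sketch below uses placeholder numbers, not actual PRG results:

        # Placeholder values for (radionuclide, receptor) pairs; a real study would
        # cover all 167 equations and all seven receptor types.
        calculator = {("Am-241", "resident"): 2.41, ("Co-60", "resident"): 0.038,
                      ("H-3", "outdoor worker"): 583.0, ("Pu-238", "farmer"): 1.92}
        hand_calc = {("Am-241", "resident"): 2.41, ("Co-60", "resident"): 0.038,
                     ("H-3", "outdoor worker"): 583.1, ("Pu-238", "farmer"): 1.92}

        TOLERANCE = 0.001   # acceptable relative difference between the two results

        for key, expected in hand_calc.items():
            rel_diff = abs(calculator[key] - expected) / expected
            status = "OK" if rel_diff <= TOLERANCE else "DISCREPANCY"
            print(f"{key[0]:>7} / {key[1]:<15} rel. diff = {rel_diff:.2e}  {status}")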

  18. The importance of independent research and evaluation in assessing nuclear fuel cycle and waste management facility safety

    International Nuclear Information System (INIS)

    Downing, Walter D.; Patrick, Wesley C.; Sagar, Budhi

    2009-01-01

    In 1987, the United States Nuclear Regulatory Commission (NRC) established at Southwest Research Institute (SwRI) a federally funded research and development center. Known as the Center for Nuclear Waste Regulatory Analyses (CNWRA), its overall mission is to provide NRC with an independent assessment capability on technical and regulatory issues related to a potential geologic repository for spent nuclear fuel and high-level radioactive waste, as well as interim storage and other nuclear fuel-cycle facilities. For more than 20 years, the CNWRA has supported NRC through an extensive pre-licensing period of establishing the framework of regulations and guidance documents, developing computer codes and other review tools, and conducting independent laboratory, field, and numerical analyses. In June 2008, the United States Department of Energy (DOE) submitted a license application and final environmental impact statement to NRC seeking authorization to construct the nation's first geologic repository at Yucca Mountain, Nevada. The CNWRA will assist NRC in conducting a detailed technical review to critically evaluate the DOE license application to assess whether the potential repository has been designed and can be constructed and operated to safely dispose of spent nuclear fuel and high-level radioactive waste. NRC access to independent, unbiased, technical advice from the CNWRA is an important aspect of the evaluation process. This paper discusses why an independent perspective is important when dealing with nuclear fuel cycle and waste management issues. It addresses practical considerations such as avoiding conflicts of interest while at the same time maintaining a world-class research program in technical areas related to the nuclear fuel cycle. It also describes an innovative approach for providing CNWRA scientists and engineers a creative outlet for professional development through an internally funded research program that is focused on future nuclear waste

  19. IGENPRO knowledge-based digital system for process transient diagnostics and management

    International Nuclear Information System (INIS)

    Morman, J.A.; Reifman, J.; Vitela, J.E.; Wei, T.Y.C.; Applequist, C.A.; Hippely, P.; Kuk, W.; Tsoukalas, L.H.

    1998-01-01

    Verification and validation issues have been perceived as important factors in the large scale deployment of knowledge-based digital systems for plant transient diagnostics and management. Research and development (R and D) is being performed on the IGENPRO package to resolve knowledge base issues. The IGENPRO approach is to structure the knowledge bases on generic thermal-hydraulic (T-H) first principles and not use the conventional event-basis structure. This allows for generic comprehensive knowledge, relatively small knowledge bases and above all the possibility of T-H system/plant independence. To demonstrate concept feasibility the knowledge structure has been implemented in the diagnostic module PRODIAG. Promising laboratory testing results have been obtained using data from the full scope Braidwood PWR operator training simulator. This knowledge structure is now being implemented in the transient management module PROMANA to treat unanticipated events and the PROTREN module is being developed to process actual plant data. Achievement of the IGENPRO R and D goals should contribute to the acceptance of knowledge-based digital systems for transient diagnostics and management. (author)

  20. IGENPRO knowledge-based digital system for process transient diagnostics and management

    International Nuclear Information System (INIS)

    Morman, J.A.; Reifman, J.; Wei, T.Y.C.

    1997-01-01

    Verification and validation issues have been perceived as important factors in the large scale deployment of knowledge-based digital systems for plant transient diagnostics and management. Research and development (R&D) is being performed on the IGENPRO package to resolve knowledge base issues. The IGENPRO approach is to structure the knowledge bases on generic thermal-hydraulic (T-H) first principles and not use the conventional event-basis structure. This allows for generic comprehensive knowledge, relatively small knowledge bases and above all the possibility of T-H system/plant independence. To demonstrate concept feasibility the knowledge structure has been implemented in the diagnostic module PRODIAG. Promising laboratory testing results have been obtained using data from the full scope Braidwood PWR operator training simulator. This knowledge structure is now being implemented in the transient management module PROMANA to treat unanticipated events and the PROTREN module is being developed to process actual plant data. Achievement of the IGENPRO R&D goals should contribute to the acceptance of knowledge-based digital systems for transient diagnostics and management

  1. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  2. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  3. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics has emerged as a solution to problems that arise with password-based data access, such as forgotten passwords and the difficulty of recalling many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access the data. Facial biometrics was chosen for its low implementation cost and reasonably accurate identification results. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce the dimensionality of the facial test image and, at the same time, encrypt it, with the image represented as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm. Two sparse coding algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signal is then compared, via the Euclidean norm, with the sparse signal of the user previously stored in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system accuracy obtained in this research is 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds; with an optimized sensing matrix, the accuracy is 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds.
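
    Of the two sparse coding algorithms named above, Orthogonal Matching Pursuit is the simpler to sketch. The fragment below is a minimal generic OMP implementation followed by the Euclidean-norm comparison against a stored template, as described in the abstract; the dictionary, sparsity level, and enrolled template are random placeholders, not the actual face data or sensing matrix of the study.

        import numpy as np

        def omp(A, y, sparsity):
            """Orthogonal Matching Pursuit: recover a sparse code x such that A @ x is close to y."""
            residual = y.copy()
            support = []
            x = np.zeros(A.shape[1])
            for _ in range(sparsity):
                # Pick the dictionary atom most correlated with the current residual
                idx = int(np.argmax(np.abs(A.T @ residual)))
                if idx not in support:
                    support.append(idx)
                # Least-squares fit restricted to the selected support
                coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                x[:] = 0.0
                x[support] = coef
                residual = y - A @ x
            return x

        rng = np.random.default_rng(0)
        A = rng.standard_normal((64, 256))               # placeholder sensing/dictionary matrix
        template = np.zeros(256)
        template[[3, 7, 42]] = [1.0, -0.5, 0.8]          # enrolled user's sparse template (placeholder)
        y = A @ template                                 # "compressed" facial test sample
        recovered = omp(A, y, sparsity=3)
        distance = np.linalg.norm(recovered - template)  # a verification decision would threshold this
        print(f"Euclidean distance to enrolled template: {distance:.3e}")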

  4. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  5. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

    This report describes a technical transfer under the title of ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for the information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying the satisfaction of those requirements. A formalized implementation plan is established for human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), is developed in a web environment so as to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web shows its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  6. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

    This report describes a technical transfer under the title of ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for the information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying the satisfaction of those requirements. A formalized implementation plan is established for human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), is developed in a web environment so as to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web shows its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  7. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security in today's growing demand towards digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patient and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  8. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  9. Burnup verification measurements on spent fuel assemblies at Arkansas Nuclear One

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1995-01-01

    Burnup verification measurements have been performed using the Fork system at Arkansas Nuclear One, Units 1 and 2, operated by Energy Operations, Inc. Passive neutron and gamma-ray measurements on individual spent fuel assemblies were correlated with the reactor records for burnup, cooling time, and initial enrichment. The correlation generates an internal calibration for the system in the form of a power law determined by a least squares fit to the neutron data. The values of the exponent in the power laws were 3.83 and 4.35 for Units 1 and 2, respectively. The average deviation of the reactor burnup records from the calibration determined from the measurements is a measure of the random error in the burnup records. The observed average deviations were 2.7% and 3.5% for assemblies at Units 1 and 2, respectively, indicating a high degree of consistency in the reactor records. Two non-standard assemblies containing neutron sources were studied at Unit 2. No anomalous measurements were observed among the standard assemblies at either Unit. The effectiveness of the Fork system for verification of reactor records is due to the sensitivity of the neutron yield to burnup, the self-calibration generated by a series of measurements, the redundancy provided by three independent detection systems, and the operational simplicity and flexibility of the design
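
    A minimal sketch of the internal calibration described above follows: a power law relating neutron count rate to declared burnup is fitted by least squares in log space, and per-assembly deviations from the fit are computed. The burnup and count-rate values are synthetic placeholders, not the Arkansas Nuclear One data.

        import numpy as np

        def fit_power_law(burnup, neutron_rate):
            """Least-squares fit of rate = a * burnup**n in log space; returns (a, n)."""
            n, log_a = np.polyfit(np.log(burnup), np.log(neutron_rate), 1)
            return np.exp(log_a), n

        # Synthetic declared burnups (GWd/MTU) and measured neutron count rates
        rng = np.random.default_rng(1)
        burnup = np.array([20.0, 28.0, 35.0, 42.0, 50.0])
        rate = 0.01 * burnup**3.9 * (1.0 + 0.02 * rng.standard_normal(burnup.size))

        a, n = fit_power_law(burnup, rate)
        deviation_pct = (rate / (a * burnup**n) - 1.0) * 100.0  # per-assembly deviation from the fit
        print(f"fitted exponent n = {n:.2f}, mean |deviation| = {np.abs(deviation_pct).mean():.2f}%")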

  10. International Space Station Temperature and Humidity Control Subsystem Verification for Node 1

    Science.gov (United States)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 THC subsystem design. The paper will also provide a discussion of the detailed Element Verification methodologies for nominal operation of the Node 1 THC subsystem operations utilized during the Qualification phase.

  11. Accident management strategy in Sweden - implementation and verification

    International Nuclear Information System (INIS)

    Loewenhielm, Gustaf; Engqvist, Alf; Espefaelt, Ralf

    1994-01-01

    A comprehensive program for severe accident mitigation was completed in Sweden by the end of 1988. As described in this paper, this program included plant modifications such as the introduction of filtered containment venting, and an accident management system comprising emergency operating strategies and procedures, training and emergency drills. The accident management system at Vattenfall has been further developed since 1988 and some results and experience from this development are reported in this paper. The main aspects covered concern the emergency organization and the supporting tools developed for use by the emergency response teams, the radiological implications such as accessibility to various locations and the long-term aspects of accident management. ((orig.))

  12. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  13. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. An evaluation on Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  14. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    Pabian, Frank Vincent

    2010-01-01

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  15. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty - whether the NPT or the CWC - one must make a technical comparison of existing information-gathering capabilities against the constraints in an agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture where verification approaches have traditionally diverged. Nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together these developments translate into a technological overlap of certain institutional verification measures, such as the NPT's safeguards requirements implemented by the IAEA and the CWC's verification provisions implemented by the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues. Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  16. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  17. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR Code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests
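
    As a rough sketch of the interpolation idea described above, the fragment below interpolates a library of benchmark detector responses over the two independent variables (anomaly size and moisture content) and searches for the combination that best matches a measured response. The grid, responses, and measured value are synthetic placeholders, and only a single detector is shown rather than the two or more used by TMAD.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Synthetic benchmark library: detector response tabulated on an
        # (anomaly size, moisture content) grid
        sizes = np.array([0.0, 5.0, 10.0, 20.0])          # anomaly size (placeholder units)
        moistures = np.array([0.0, 5.0, 10.0, 20.0])      # moisture content, wt% (placeholder)
        responses = np.outer(1.0 + 0.05 * sizes, 1.0 + 0.10 * moistures)

        library = RegularGridInterpolator((sizes, moistures), responses)

        def best_match(measured_response, candidates):
            """Return the (size, moisture) candidate whose interpolated response is closest."""
            errors = [abs(library([c]).item() - measured_response) for c in candidates]
            return candidates[int(np.argmin(errors))]

        candidates = [(s, m) for s in np.linspace(0.0, 20.0, 41) for m in np.linspace(0.0, 20.0, 41)]
        print(best_match(measured_response=1.75, candidates=candidates))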

  18. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    Once FPGA (Field-Programmable Gate Array) designers design Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs so that the designers can largely focus on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more considerate demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even if the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure the safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and the implemented translator. It

  19. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Once FPGA (Field-Programmable Gate Array) designers design Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs so that the designers can largely focus on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more considerate demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even if the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure the safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to show the efficiency of the proposed formal verification technique and the implemented translator. It

  20. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models

  1. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams) in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)

  2. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.

  3. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
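
    One common automatic check of the kind such a test collection can perform is to compare an analytic gradient against a central finite-difference estimate. The sketch below applies this to the Rosenbrock function as an illustrative benchmark; it is not code from the described C++ library.

        import numpy as np

        def numerical_gradient(f, x, h=1e-6):
            """Central finite-difference estimate of the gradient of f at x."""
            g = np.zeros_like(x)
            for i in range(x.size):
                e = np.zeros_like(x)
                e[i] = h
                g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
            return g

        def rosenbrock(x):
            return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

        def rosenbrock_grad(x):
            return np.array([
                -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                200.0 * (x[1] - x[0] ** 2),
            ])

        x0 = np.array([-1.2, 1.0])
        err = np.max(np.abs(rosenbrock_grad(x0) - numerical_gradient(rosenbrock, x0)))
        print(f"max |analytic - finite difference| = {err:.2e}")  # small value => gradients agree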

  4. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage where error discovery can be quite costly and the inability to test for all potential interactions leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  5. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  6. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  7. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, MODUS (Method and supporting toolset advancing embedded systems quality) is a project that aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, MODUS performs model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use, and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  9. Independent calculation of dose distributions for helical tomotherapy using a conventional treatment planning system

    Energy Technology Data Exchange (ETDEWEB)

    Klüter, Sebastian, E-mail: sebastian.klueter@med.uni-heidelberg.de; Schubert, Kai; Lissner, Steffen; Sterzing, Florian; Oetzel, Dieter; Debus, Jürgen [Department of Radiation Oncology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany, and Heidelberg Institute for Radiation Oncology (HIRO), Im Neuenheimer Feld 400, 69120 Heidelberg, Germany, and German Consortium for Translational Cancer Research (DKTK), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Schlegel, Wolfgang [German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Oelfke, Uwe [German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg, Germany and Joint Department of Physics at The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London SM2 5NG (United Kingdom); Nill, Simeon [Joint Department of Physics at The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London SM2 5NG (United Kingdom)

    2014-08-15

    Purpose: The dosimetric verification of treatment plans in helical tomotherapy usually is carried out via verification measurements. In this study, a method for independent dose calculation of tomotherapy treatment plans is presented, that uses a conventional treatment planning system with a pencil kernel dose calculation algorithm for generation of verification dose distributions based on patient CT data. Methods: A pencil beam algorithm that directly uses measured beam data was configured for dose calculation for a tomotherapy machine. Tomotherapy treatment plans were converted into a format readable by an in-house treatment planning system by assigning each projection to one static treatment field and shifting the calculation isocenter for each field in order to account for the couch movement. The modulation of the fluence for each projection is read out of the delivery sinogram, and with the kernel-based dose calculation, this information can directly be used for dose calculation without the need for decomposition of the sinogram. The sinogram values are only corrected for leaf output and leaf latency. Using the converted treatment plans, dose was recalculated with the independent treatment planning system. Multiple treatment plans ranging from simple static fields to real patient treatment plans were calculated using the new approach and either compared to actual measurements or the 3D dose distribution calculated by the tomotherapy treatment planning system. In addition, dose–volume histograms were calculated for the patient plans. Results: Except for minor deviations at the maximum field size, the pencil beam dose calculation for static beams agreed with measurements in a water tank within 2%/2 mm. A mean deviation to point dose measurements in the cheese phantom of 0.89% ± 0.81% was found for unmodulated helical plans. A mean voxel-based deviation of −0.67% ± 1.11% for all voxels in the respective high dose region (dose values >80%), and a mean local

  10. Independent calculation of dose distributions for helical tomotherapy using a conventional treatment planning system

    International Nuclear Information System (INIS)

    Klüter, Sebastian; Schubert, Kai; Lissner, Steffen; Sterzing, Florian; Oetzel, Dieter; Debus, Jürgen; Schlegel, Wolfgang; Oelfke, Uwe; Nill, Simeon

    2014-01-01

    Purpose: The dosimetric verification of treatment plans in helical tomotherapy usually is carried out via verification measurements. In this study, a method for independent dose calculation of tomotherapy treatment plans is presented, that uses a conventional treatment planning system with a pencil kernel dose calculation algorithm for generation of verification dose distributions based on patient CT data. Methods: A pencil beam algorithm that directly uses measured beam data was configured for dose calculation for a tomotherapy machine. Tomotherapy treatment plans were converted into a format readable by an in-house treatment planning system by assigning each projection to one static treatment field and shifting the calculation isocenter for each field in order to account for the couch movement. The modulation of the fluence for each projection is read out of the delivery sinogram, and with the kernel-based dose calculation, this information can directly be used for dose calculation without the need for decomposition of the sinogram. The sinogram values are only corrected for leaf output and leaf latency. Using the converted treatment plans, dose was recalculated with the independent treatment planning system. Multiple treatment plans ranging from simple static fields to real patient treatment plans were calculated using the new approach and either compared to actual measurements or the 3D dose distribution calculated by the tomotherapy treatment planning system. In addition, dose–volume histograms were calculated for the patient plans. Results: Except for minor deviations at the maximum field size, the pencil beam dose calculation for static beams agreed with measurements in a water tank within 2%/2 mm. A mean deviation to point dose measurements in the cheese phantom of 0.89% ± 0.81% was found for unmodulated helical plans. A mean voxel-based deviation of −0.67% ± 1.11% for all voxels in the respective high dose region (dose values >80%), and a mean local

  11. IAEA inspectors complete verification of nuclear material in Iraq

    International Nuclear Information System (INIS)

    2004-01-01

    Full text: At the request of the Government of Iraq and pursuant to the NPT Safeguards Agreement with Iraq, a team of IAEA safeguards inspectors has completed the annual Physical Inventory Verification of declared nuclear material in Iraq, and is returning to Vienna. The material - natural or low-enriched uranium - is not sensitive from a proliferation perspective and is consolidated at a storage facility near the Tuwaitha complex, south of Baghdad. This inspection was conducted with the logistical and security assistance of the Multinational Force and the Office of the UN Security Coordinator. Inspections such as this are required by safeguards agreements with every non-nuclear-weapon state party to the NPT that has declared holdings of nuclear material, to verify the correctness of the declaration, and that material has not been diverted to any undeclared activity. Such inspections have been performed in Iraq on a continuing basis. The most recent took place in June 2003, following reports of looting of nuclear material at the Tuwaitha complex; IAEA inspectors recovered, repackaged and resealed all but a minute amount of material. NPT safeguards inspections are limited in scope and coverage as compared to the verification activities carried out in 1991-98 and 2002-03 by the IAEA under Security Council resolution 687 and related resolutions. 'This week's mission was a good first step,' IAEA Director General Mohamed ElBaradei said. 'Now we hope to be in a position to complete the mandate entrusted to us by the Security Council, to enable the Council over time to remove all sanctions and restrictions imposed on Iraq - so that Iraq's rights as a full-fledged member of the international community can be restored.' The removal of remaining sanctions is dependent on completion of the verification process by the IAEA and the UN Monitoring, Verification and Inspection Commission (UNMOVIC). It should be noted that IAEA technical assistance to Iraq has been resumed over

  12. 12 CFR 1261.6 - Nominations for member and independent directorships.

    Science.gov (United States)

    2010-01-01

    ... areas: auditing and accounting, derivatives, financial management, organizational management, project... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Nominations for member and independent... § 1261.6 Nominations for member and independent directorships. (a) Election announcement. Within a...

  13. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  14. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  15. Management of radioisotope, radiation generator and fuel materials for independent administrative corporations of national university

    International Nuclear Information System (INIS)

    2003-03-01

    This report describes the situation, problems, and proposals concerning the management of radioisotopes, radiation generators and fuel materials by the independent administrative corporations of national universities. Four proposals are stated: 1) In order to improve the management of radioisotopes, radiation generators, fuel materials and X-ray equipment in the universities, the organization and responsibilities of the control department in each university, as well as accident countermeasures, have to be defined; medium-term objectives and plans are needed, and appropriate management for ongoing research should be discussed through closer cooperation among the universities of the country. 2) The budget for safety control has to be identified when distributing the budget of each national university corporation, and an insurance scheme needs to be discussed. 3) The department in MEXT (Ministry of Education, Culture, Sports, Science and Technology) should be strengthened to support research and the safety control of staff and students. 4) A system that carries out the treatment and disposal of disused materials and keeps them under the responsibility of the nation is necessary. (S.Y.)

  16. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    In Paradox, human players are never given small optimization problems (for example, toggling the values of 50 ...). Four games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify ...

  17. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeom (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to the product is necessary. To this end, this study presents a new method for the verification of product designs using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
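
    A toy illustration of rule-based design verification of this kind is sketched below; the product attributes and regulation rules are invented for illustration and are not taken from the paper's knowledge base or Web service.

        # Hypothetical product design attributes
        design = {"rated_voltage_v": 230, "insulation_class": "II", "cable_cross_section_mm2": 0.5}

        # Hypothetical regulation rules: (description, predicate over the design)
        rules = [
            ("cable cross-section at least 0.75 mm2", lambda d: d["cable_cross_section_mm2"] >= 0.75),
            ("insulation class II or better", lambda d: d["insulation_class"] in ("II", "III")),
        ]

        violations = [name for name, check in rules if not check(design)]
        print("design verified" if not violations else f"rule violations: {violations}")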

  18. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to the product is necessary. To this end, this study presents a new method for the verification of product designs using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  19. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  20. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C.; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simply and safely interpretable. Essential for the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using the instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
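
    A simplified Python sketch of the guideline value count rate calculation outlined above: the count rate is formed as a product of the quantities listed in the abstract. The formula is a plain simplification and the numerical inputs are assumed values, not figures from the Swiss ordinance or the ISO 8769-2 calibration sources.

    # Simplified guideline value count rate: counts per second expected from a
    # surface contaminated at exactly one guideline value. All numbers are assumed.
    def guideline_value_count_rate(instrument_efficiency: float,
                                   emission_probability: float,
                                   source_efficiency: float,
                                   guideline_value_bq_per_cm2: float,
                                   averaging_area_cm2: float = 100.0) -> float:
        """instrument_efficiency  -- counts registered per particle emitted from the surface
        emission_probability   -- particles emitted per decay (decay data)
        source_efficiency      -- fraction of particles leaving the contaminated surface
        guideline_value_bq_per_cm2 -- radionuclide-specific guideline value (Bq/cm^2)
        averaging_area_cm2     -- area over which the guideline value is averaged
        """
        surface_emission_rate = (guideline_value_bq_per_cm2 * averaging_area_cm2
                                 * emission_probability * source_efficiency)
        return instrument_efficiency * surface_emission_rate

    # Example with assumed values for a beta emitter:
    print(round(guideline_value_count_rate(0.25, 1.0, 0.5, 3.0), 1))  # counts per second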

  1. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates a document analysis method for knowledge acquisition and an ECPN matrix analysis method for knowledge verification. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  2. Intelligent energy management control for independent microgrid

    Indian Academy of Sciences (India)

    Energy management control; multi-agent system; microgrid; energy forecast; hybrid power ... power to the local load most of the time in this energy management strategy. ...

  3. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). The report outlines the verification methodology, the code input, and the calculation results.
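
    The verification is essentially a code-to-code comparison. Below is a hedged Python sketch of such a check; the temperature values and the 1% acceptance criterion are invented placeholders standing in for actual FRAPCON and ABAQUS results.

    # Code-to-code comparison sketch: radial fuel temperatures from one code are
    # compared against an independent solution at the same nodes. Values assumed.
    frapcon_temps_k = [1250.0, 1180.0, 1050.0, 900.0, 750.0]   # e.g. centerline to surface
    abaqus_temps_k  = [1243.0, 1176.0, 1047.0, 898.0, 751.0]

    def max_relative_difference(a, b):
        return max(abs(x - y) / y for x, y in zip(a, b))

    criterion = 0.01  # assumed 1% relative-difference acceptance limit
    diff = max_relative_difference(frapcon_temps_k, abaqus_temps_k)
    print(f"max relative difference: {diff:.3%}",
          "PASS" if diff <= criterion else "FAIL")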

  4. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  5. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...
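
    As an illustration of the kind of safety property encoded in control tables, the Python sketch below checks a toy table for routes that demand the same point in conflicting positions. The route and point names are invented, and real interlocking verification relies on formal models and model checkers rather than this brute-force enumeration.

    # Toy control-table check: two routes requiring the same point in different
    # positions must never be set simultaneously. Names and positions are assumed.
    from itertools import combinations

    CONTROL_TABLE = {           # route -> required point positions
        "R1": {"P101": "normal", "P102": "reverse"},
        "R2": {"P101": "reverse"},
        "R3": {"P103": "normal"},
    }

    def conflicting_routes(table):
        """Yield pairs of routes that demand the same point in different positions."""
        for (r1, req1), (r2, req2) in combinations(table.items(), 2):
            shared = set(req1) & set(req2)
            if any(req1[p] != req2[p] for p in shared):
                yield r1, r2

    for a, b in conflicting_routes(CONTROL_TABLE):
        print(f"routes {a} and {b} must be mutually exclusive")  # -> R1 and R2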

  6. Report on the Trilateral Initiative. IAEA verification of weapon-origin material in the Russian Federation and the United States

    International Nuclear Information System (INIS)

    Shea, Thomas E.

    2001-01-01

    Just over five years ago, the Trilateral Initiative was launched to investigate the technical, legal and financial issues associated with IAEA verification of weapon-origin fissile material in the Russian Federation and the United States. Since then, the Joint Working Group has developed concepts and equipment suitable for such a verification mission, anticipating that the States would submit classified forms of fissile material to IAEA verification under new agreements developed for this purpose. This article summarizes the accomplishments to date and identifies the future steps foreseen under the Trilateral Initiative. As there is no legal commitment on the Parties to this Initiative as yet, the issues considered are still changing. Since it was launched, the Initiative has been given a sense of importance and weight, raising the expectations of the international community. The Final Document of the 2000 Conference on the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), for example, under the review of Article VI of the Treaty, includes the statement to 'complete and implement the Trilateral Initiative'. It was launched following independent statements by the President of the United States beginning in 1993, and by the President of the Russian Federation in 1996. It is an Initiative between the IAEA, the Russian Federation and the United States that is in the context of Article VI of the NPT. The intention is to examine the technical, legal and financial issues associated with IAEA verification of weapon origin and other fissile material released from defense programmes in those two countries

  7. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
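
    The sketch below illustrates, in Python rather than the paper's C++, the idea of obtaining both a point value and an interval enclosure from a single expression description. It covers only addition and squaring and omits the gradient machinery; the class names are invented for this sketch.

    # One expression description, two evaluation modes: point value and interval enclosure.
    class Interval:
        def __init__(self, lo, hi): self.lo, self.hi = lo, hi
        def __add__(self, other): return Interval(self.lo + other.lo, self.hi + other.hi)
        def sqr(self):
            cands = [self.lo * self.lo, self.hi * self.hi]
            lo = 0.0 if self.lo <= 0.0 <= self.hi else min(cands)
            return Interval(lo, max(cands))
        def __repr__(self): return f"[{self.lo}, {self.hi}]"

    class Var:
        def value(self, x): return x["x"]
        def enclosure(self, box): return box["x"]

    class Sqr:
        def __init__(self, arg): self.arg = arg
        def value(self, x): return self.arg.value(x) ** 2
        def enclosure(self, box): return self.arg.enclosure(box).sqr()

    class Add:
        def __init__(self, a, b): self.a, self.b = a, b
        def value(self, x): return self.a.value(x) + self.b.value(x)
        def enclosure(self, box): return self.a.enclosure(box) + self.b.enclosure(box)

    # f(x) = x^2 + x, described once, evaluated two ways:
    f = Add(Sqr(Var()), Var())
    print(f.value({"x": 2.0}))                       # 6.0
    print(f.enclosure({"x": Interval(-1.0, 2.0)}))   # [-1.0, 6.0]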

  8. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
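
    A toy Python sketch of the data-flow idea follows: instructions are executed abstractly over a stack of type names, and code whose operand types do not match is rejected. The three-instruction language is invented for illustration and omits branches, registers, and merge points handled by a real Java bytecode verifier.

    # Abstract execution over type names for a toy stack machine (assumed instruction set).
    def verify(code):
        stack = []  # abstract stack of type names
        for op, *args in code:
            if op == "push_int":
                stack.append("int")
            elif op == "push_ref":
                stack.append("ref")
            elif op == "iadd":  # requires two ints, produces one int
                if len(stack) < 2 or stack[-1] != "int" or stack[-2] != "int":
                    return False
                stack.pop(); stack.pop(); stack.append("int")
            else:
                return False
        return True

    print(verify([("push_int",), ("push_int",), ("iadd",)]))  # True
    print(verify([("push_int",), ("push_ref",), ("iadd",)]))  # False: type mismatch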

  9. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  10. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  11. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flow and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with quality metrics of DOF and MEEF is examined.
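
    One of the verification metrics mentioned, MEEF (mask error enhancement factor), can be illustrated by a finite-difference estimate of the change in wafer CD per unit change in mask CD at wafer scale. The Python sketch below uses a placeholder optical model and invented numbers in place of a real lithography simulator.

    # Finite-difference MEEF estimate with an assumed placeholder model.
    def simulate_wafer_cd(mask_cd_nm: float) -> float:
        # Placeholder: real SMO verification would call a lithography simulator here.
        return 0.6 * mask_cd_nm + 0.004 * (mask_cd_nm - 45.0) ** 2 + 18.0

    def meef(mask_cd_nm: float, delta_nm: float = 0.5) -> float:
        cd_plus = simulate_wafer_cd(mask_cd_nm + delta_nm)
        cd_minus = simulate_wafer_cd(mask_cd_nm - delta_nm)
        return (cd_plus - cd_minus) / (2.0 * delta_nm)

    print(f"MEEF at 45 nm mask CD: {meef(45.0):.2f}")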

  12. Viability Study for an Unattended UF_6 Cylinder Verification Station: Phase I Final Report

    International Nuclear Information System (INIS)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.; Branney, Sean; McDonald, Benjamin S.; Webster, Jennifer B.; Zalavadia, Mital A.; Todd, Lindsay C.; Kulisek, Jonathan A.; Nordquist, Heather; Deshmukh, Nikhil S.; Stewart, Scott

    2016-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF_6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 'typical' Type 30B cylinders, and the viability of an 'NDA Fingerprint' concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument
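
    A simplified Python sketch of the declaration-verification step such a station would automate: a cylinder's declared enrichment is compared with an NDA measurement and flagged if the difference exceeds an acceptance band based on the measurement uncertainty. The numbers and the 3-sigma criterion are illustrative assumptions, not UCVS requirements.

    # Declared vs measured enrichment consistency check (all figures assumed).
    from dataclasses import dataclass

    @dataclass
    class CylinderRecord:
        cylinder_id: str
        declared_enrichment_pct: float
        measured_enrichment_pct: float
        measurement_sigma_pct: float   # 1-sigma uncertainty of the NDA result

    def verify_cylinder(rec: CylinderRecord, n_sigma: float = 3.0) -> bool:
        """Return True if the measured enrichment is consistent with the declaration."""
        diff = abs(rec.measured_enrichment_pct - rec.declared_enrichment_pct)
        return diff <= n_sigma * rec.measurement_sigma_pct

    rec = CylinderRecord("30B-12345", declared_enrichment_pct=4.40,
                         measured_enrichment_pct=4.60, measurement_sigma_pct=0.05)
    print("consistent" if verify_cylinder(rec) else "flag for follow-up")  # flag for follow-up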

  13. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
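
    The 2x2 contingency-table verification mentioned above can be illustrated with standard categorical scores for yes/no events such as "CME arrives within the forecast window". In the Python sketch below the event counts are invented; POD, FAR, and the Heidke skill score are computed from them.

    # Categorical verification scores from a 2x2 contingency table (counts assumed).
    def contingency_scores(hits, misses, false_alarms, correct_negatives):
        n = hits + misses + false_alarms + correct_negatives
        pod = hits / (hits + misses)                      # probability of detection
        far = false_alarms / (hits + false_alarms)        # false alarm ratio
        expected = ((hits + misses) * (hits + false_alarms)
                    + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
        hss = (hits + correct_negatives - expected) / (n - expected)  # Heidke skill score
        return pod, far, hss

    pod, far, hss = contingency_scores(hits=20, misses=8, false_alarms=6, correct_negatives=40)
    print(f"POD={pod:.2f} FAR={far:.2f} HSS={hss:.2f}")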

  14. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group

  15. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms used to guarantee the security/privacy of the fingerprint data transmitted in the smart-card-based client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements for implementing fingerprint verification in the smart-card-based client-server environment.
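
    A back-of-the-envelope Python sketch of the kind of estimate described above: each module-distribution scenario's execution time is approximated as instruction count divided by processor throughput, plus the cost of encrypting the transmitted fingerprint data. All instruction counts and throughput figures are invented placeholders, not the paper's measurements.

    # Execution-time estimate per module-distribution scenario (all figures assumed).
    CARD_MIPS = 20.0      # assumed 32-bit smart card throughput
    READER_MIPS = 400.0   # assumed card reader / host throughput

    STEPS = {                      # millions of instructions per step (assumed)
        "feature_extraction": 120.0,
        "matching": 15.0,
        "aes_encrypt_template": 2.0,
    }

    def scenario_time(card_steps, reader_steps):
        """Seconds to run the given steps on the card plus those on the reader."""
        card = sum(STEPS[s] for s in card_steps) / CARD_MIPS
        reader = sum(STEPS[s] for s in reader_steps) / READER_MIPS
        return card + reader

    print("match-on-card :", scenario_time(["matching"], ["feature_extraction", "aes_encrypt_template"]))
    print("all-on-card   :", scenario_time(["feature_extraction", "matching"], []))
    print("all-on-reader :", scenario_time(["aes_encrypt_template"], ["feature_extraction", "matching"]))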

  16. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms used to guarantee the security/privacy of the fingerprint data transmitted in the smart-card-based client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements for implementing fingerprint verification in the smart-card-based client-server environment.

  17. Verification of a quality management theory: using a Delphi study.

    Science.gov (United States)

    Mosadeghrad, Ali Mohammad

    2013-11-01

    A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence.

  18. Fault Management Architectures and the Challenges of Providing Software Assurance

    Science.gov (United States)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    Satellite system Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most is system complexity due to the need to establish a multi-dimensional structure across hardware, software and operations. This structure is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. These architecture, implementation and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop, titled 'V&V of Fault Management: Challenges and Successes', exposed these issues in terms of V&V for a representative set of architectures. NASA IV&V is funded by NASA's Software Assurance Research Program (SARP), in partnership with NASA's Jet Propulsion Laboratory (JPL), to extend the work performed at the Workshop session. NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set for robustness, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This work focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures, visibility, and associated V&V/IV&V techniques provides a data set that can enable higher assurance that a satellite system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook, providing dissemination across NASA, other agencies and the satellite community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the

  19. Information handbook on independent spent fuel storage installations

    International Nuclear Information System (INIS)

    Raddatz, M.G.; Waters, M.D.

    1996-12-01

    In this information handbook, the staff of the U.S. Nuclear Regulatory Commission describes (1) background information regarding the licensing and history of independent spent fuel storage installations (ISFSIs), (2) a discussion of the licensing process, (3) a description of all currently approved or certified models of dry cask storage systems (DCSSs), and (4) a description of sites currently storing spent fuel in an ISFSI. Storage of spent fuel at ISFSIs must be in accordance with the provisions of 10 CFR Part 72. The staff has provided this handbook for information purposes only. The accuracy of any information herein is not guaranteed. For verification or for more details, the reader should refer to the respective docket files for each DCSS and ISFSI site. The information in this handbook is current as of September 1, 1996

  20. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare (2010-10-01) ... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...