WorldWideScience

Sample records for verification test dvt

  1. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    Science.gov (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

The planned orbit of the AXAF-I spacecraft will subject the spacecraft to both short eclipses (less than 30 minutes for solar and less than 2 hours for lunar) and long Earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to an eclipse may cause loss of mission. To avoid this problem for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The DVT panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing will be presented.

  2. In too deep: understanding, detecting and managing DVT.

    Science.gov (United States)

    Meetoo, Danny

    Venous thromboembolism (VTE), which includes deep vein thrombosis (DVT) and pulmonary embolism (PE), is a serious health and social care problem of the developed world, affecting 1 in 1000 adults every year, and with an annual financial overhead of approximately £640 million. The nature of DVT means that often the condition can go unrecognized until the thrombus becomes an embolus. The pathogenesis of DVT continues to be based on Virchow's triad, which attributes VTE to 'hypercoagulability', 'stasis' and 'intimal injury'. The diagnosis of DVT is often the result of a number of tests performed either sequentially or in combination before mechanical and/or chemical treatment is embarked on. Creating public awareness of DVT and PE is the best way to prevent this condition. Nurses are in an ideal position to discuss the importance of lifestyle changes and other related measures to prevent DVT.

  3. Analysis of 1,338 Patients with Acute Lower Limb Deep Venous Thrombosis (DVT) Supports the Inadequacy of the Term "Proximal DVT".

    Science.gov (United States)

    De Maeseneer, M G R; Bochanen, N; van Rooijen, G; Neglén, P

    2016-03-01

For decades acute lower limb deep venous thrombosis (DVT) has been subdivided into distal DVT (isolated to the calf veins) and proximal DVT (extending above calf vein level). The aim of this study was to analyse the anatomical site and extent of thrombus in a large cohort of patients with acute DVT. A retrospective analysis was performed of all patients aged >18 years presenting with unilateral DVT on duplex ultrasound investigation at the University Hospital of Antwerp, Belgium (1994-2012). The anatomical site and extent of thrombus were registered and subdivided into five segments: calf veins (segment 1), popliteal vein (segment 2), femoral vein (segment 3), common femoral vein (segment 4), and iliac veins, with or without inferior vena cava (segment 5). The median age of the 1,338 patients included (50% male) was 62 years (range 18-98 years). Left sided DVT was predominant (57%). DVT was limited to one segment in 443 patients, of whom 370 had DVT isolated to the calf veins (28% of the total cohort). In the 968 patients with what was previously called "proximal DVT", the median number of affected segments was three (range 1-5 segments). In this group iliofemoral DVT (involving at least segment 4 and/or 5) was present in 506 patients (38% of the total cohort), whereas the remaining patients had femoropopliteal DVT (at least in segment 2 and/or 3, but not in 4 or 5). Iliofemoral DVT without thrombus in segments 1 and 2 was present in 160 patients (12% of the total cohort). This study illustrates the large diversity of thrombus distribution in patients previously described as having "proximal DVT". Therefore, this term should be abandoned and replaced with iliofemoral and femoropopliteal DVT. Patients with iliofemoral DVT (38%) could be considered for early clot removal; 12% of all patients with DVT would be ideal candidates for such intervention. Copyright © 2015 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
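The five-segment scheme and the proposed iliofemoral/femoropopliteal terminology amount to a simple classification rule. A minimal sketch of that rule (function name and structure are mine, not the authors'):

```python
# Segments per the Antwerp study: 1 = calf veins, 2 = popliteal vein,
# 3 = femoral vein, 4 = common femoral vein, 5 = iliac veins +/- IVC.
def classify_dvt(segments):
    """Classify a DVT by its set of affected segments (subset of {1..5})."""
    s = set(segments)
    if not s or not s <= {1, 2, 3, 4, 5}:
        raise ValueError("segments must be a non-empty subset of {1..5}")
    if s & {4, 5}:
        return "iliofemoral"        # involves segment 4 and/or 5
    if s & {2, 3}:
        return "femoropopliteal"    # segment 2 and/or 3, but not 4 or 5
    return "isolated calf"          # segment 1 only

print(classify_dvt({1, 2, 3, 4}))  # iliofemoral
print(classify_dvt({2, 3}))        # femoropopliteal
print(classify_dvt({1}))           # isolated calf
```

Note that the rule is checked in priority order: any involvement of segment 4 or 5 makes the DVT iliofemoral regardless of distal extent.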

  4. Where is the most common site of DVT? Evaluation by CT venography

    International Nuclear Information System (INIS)

    Yoshimura, Norihiko; Hori, Yoshiro; Horii, Yosuke; Takano, Toru; Ishikawa, Hiroyuki; Aoyama, Hidefumi

    2012-01-01

Our aim was to clarify the common site of deep venous thrombosis (DVT) in patients suspected of having pulmonary embolism, using computed tomography pulmonary angiography with computed tomography venography (CTV). We evaluated 215 patients. For all studies, 100 ml of 370 mg I/ml nonionic contrast material was administered. CTV was acquired helically starting 3 min (four-slice multidetector-row computed tomography, MDCT) or 5 min (64-slice MDCT) after the start of contrast material injection. The site of DVT was classified as iliac vein, femoral vein, popliteal vein, or calf vein. Calf veins were subdivided into muscular (soleal and gastrocnemius) and nonmuscular (anterior/posterior tibial and peroneal) veins. The 2 x 2 chi-square test was used. One hundred and thirty-seven patients showed DVT; DVT was more prevalent in the muscular calf veins than in the other veins (P<0.01). Our study showed that the most common site of DVT was the muscular calf vein. (author)
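The 2 x 2 chi-square test the authors name reduces to a single closed-form statistic. A minimal sketch (without Yates continuity correction; the counts below are illustrative, not the study's data):

```python
# chi2 = N * (a*d - b*c)^2 / ((a+b)(c+d)(a+c)(b+d)) for a 2x2 table
# [[a, b], [c, d]] with N = a + b + c + d observations.
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        raise ValueError("a marginal total is zero")
    return n * (a * d - b * c) ** 2 / denom

# Hypothetical counts: rows = DVT-positive / DVT-negative,
# columns = muscular / nonmuscular calf vein involvement.
chi2 = chi_square_2x2(90, 47, 30, 48)
print(round(chi2, 2))
```

With 1 degree of freedom, chi2 > 3.84 corresponds to P < 0.05 and chi2 > 6.63 to P < 0.01.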

  5. A profile of lower-limb deep-vein thrombosis: the hidden menace of below-knee DVT

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, G.W. [Department of Clinical Radiology, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom); Reid, J.H. [Department of Clinical Radiology, Borders General Hospital, Melrose (United Kingdom); Simpson, A.J. [Department of Respiratory Medicine, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom); Murchison, J.T. [Department of Clinical Radiology, Royal Infirmary of Edinburgh, Edinburgh (United Kingdom)]. E-mail: john.murchison@luht.scot.nhs.uk

    2007-09-15

    Aims: To describe the anatomical site and laterality of deep-vein thrombosis (DVT) in symptomatic patients using contrast venography (CV), and to assess age, sex distribution, and accuracy of pre-test clinical suspicion of DVT. Methods: One thousand, five hundred and seventy-two patients undergoing CV because of a clinical suspicion of DVT at a large teaching hospital from October 1995 to March 2003 were prospectively studied. Results: Thrombi were demonstrated in 511 (32.5%) of all CV studies. Isolated, below-knee thrombi were identified in 29.4% of positive studies. There was a left-sided predominance of DVT (ratio 1.24:1) that was most evident in the elderly and in more proximal veins. Conclusion: Almost a third of positive cases were shown to be isolated, below-knee thrombi. These are thrombi that are more difficult to detect by non-invasive means. A left-sided predominance of DVT is evident.

  6. A profile of lower-limb deep-vein thrombosis: the hidden menace of below-knee DVT

    International Nuclear Information System (INIS)

    Cowell, G.W.; Reid, J.H.; Simpson, A.J.; Murchison, J.T.

    2007-01-01

    Aims: To describe the anatomical site and laterality of deep-vein thrombosis (DVT) in symptomatic patients using contrast venography (CV), and to assess age, sex distribution, and accuracy of pre-test clinical suspicion of DVT. Methods: One thousand, five hundred and seventy-two patients undergoing CV because of a clinical suspicion of DVT at a large teaching hospital from October 1995 to March 2003 were prospectively studied. Results: Thrombi were demonstrated in 511 (32.5%) of all CV studies. Isolated, below-knee thrombi were identified in 29.4% of positive studies. There was a left-sided predominance of DVT (ratio 1.24:1) that was most evident in the elderly and in more proximal veins. Conclusion: Almost a third of positive cases were shown to be isolated, below-knee thrombi. These are thrombi that are more difficult to detect by non-invasive means. A left-sided predominance of DVT is evident

  7. DVT presentations to an emergency department: a study of guideline based care and decision making

    LENUS (Irish Health Repository)

    Lillis, D

    2016-02-01

Pre-test probability scoring and blood tests for deep venous thrombosis (DVT) assessment are sensitive but not specific, leading to increased demands on radiology services. Three hundred and eighty-five patients presenting to an Emergency Department (ED) with suspected DVT were studied to explore the actual work-up of patients with possible DVT in terms of risk stratification, further investigation, and follow-up. Of the 205 patients with an initially negative scan, 36 (17.6%) were brought for review to the ED consultant clinic, and 34 (16.6%) underwent repeat compression ultrasound, with 5 (2.4%) demonstrating a DVT on the second scan. This selective approach achieved essentially the same diagnostic yield as other, larger studies in which 100% of such patients had repeat scanning. Where there is ongoing concern, repeat above-knee compression ultrasound within one week will pick up a small number of deep venous thromboses.
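The repeat-scan yield reported in the abstract can be recomputed directly from the quoted counts, as a quick cross-check (numbers come straight from the text):

```python
# Counts from the abstract of the ED DVT work-up study.
n_negative_first_scan = 205   # initially negative compression ultrasounds
n_repeat_scans = 34           # patients selectively brought back for a second scan
n_positive_on_repeat = 5      # DVTs found on the second scan

yield_per_repeat = n_positive_on_repeat / n_repeat_scans          # fraction of repeats positive
yield_per_negative = n_positive_on_repeat / n_negative_first_scan # fraction of all negatives
print(f"{yield_per_repeat:.1%} of repeat scans, {yield_per_negative:.1%} of initial negatives")
```

The second figure (about 2.4% of all initially negative scans) matches the abstract, supporting the claim that selective rescanning matches the yield of rescanning everyone.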

  8. DVT surveillance program in the ICU: analysis of cost-effectiveness.

    Directory of Open Access Journals (Sweden)

    Ajai K Malhotra

Full Text Available BACKGROUND: Venous thromboembolism (VTE), i.e. deep venous thrombosis (DVT) and/or pulmonary embolism (PE), in traumatized patients causes significant morbidity and mortality. The current study evaluates the effectiveness of DVT surveillance in reducing PE, and performs a cost-effectiveness analysis. METHODS: All traumatized patients admitted to the adult ICU underwent twice-weekly DVT surveillance by bilateral lower extremity venous duplex examination (48-month surveillance period, SP). The rates of DVT and PE were recorded and compared to the rates observed in the 36-month pre-surveillance period (PSP). All patients in both periods received mechanical and pharmacologic prophylaxis unless contraindicated. Total costs (diagnostic, therapeutic, and surveillance) for both periods were recorded and the incremental cost for each quality-adjusted life year (QALY) gained was calculated. RESULTS: 4234 patients were eligible (PSP: 1422; SP: 2812). The rate of DVT in the SP (2.8%) was significantly higher than in the PSP (1.3%, p<0.05), and the rate of PE in the SP (0.7%) was significantly lower than that in the PSP (1.5%, p<0.05). Logistic regression demonstrated that surveillance was an independent predictor of increased DVT detection (OR: 2.53; CI: 1.462-4.378) and decreased PE incidence (OR: 0.487; CI: 0.262-0.904). The incremental cost was $509,091 per life saved in the base case, translating to $29,102 per QALY gained. A sensitivity analysis over four of the parameters used in the model indicated that the incremental cost ranged from $18,661 to $48,821 per QALY gained. CONCLUSIONS: Surveillance of traumatized ICU patients increases DVT detection and reduces PE incidence. Cost in terms of QALY gained compares favorably with other interventions accepted by society.
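The two headline cost-effectiveness figures are related by the number of discounted QALYs credited per PE death averted. A back-of-envelope sketch (the implied QALYs-per-life-saved is my inference from the ratio, not a value the abstract reports):

```python
# Base-case figures from the abstract (USD).
cost_per_life_saved = 509_091
cost_per_qaly = 29_102

# cost/QALY = (cost/life saved) / (QALYs/life saved), so the abstract's
# model implicitly credits roughly this many discounted QALYs per life:
implied_qalys = cost_per_life_saved / cost_per_qaly
print(round(implied_qalys, 1))
```

The same division applied to the sensitivity-analysis bounds shows how the $/QALY range scales with the assumed QALY gain per life saved.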

  9. Risk of deep venous thrombosis (DVT) in bedridden or wheelchair-bound multiple sclerosis patients: a prospective study.

    Science.gov (United States)

    Arpaia, G; Bavera, P M; Caputo, D; Mendozzi, L; Cavarretta, R; Agus, G B; Milani, M; Ippolito, E; Cimminiello, C

    2010-04-01

Multiple sclerosis (MS) often causes progressive loss of mobility, leading to limb paralysis. Venous and lymphatic stasis is a risk condition for venous thromboembolism (VTE). There are, however, no data on the frequency of VTE complicating the progression of MS. The aim of this study was to assess the frequency of deep vein thrombosis (DVT) in patients with late-stage MS attending a neurology center for rehabilitation. A total of 132 patients with MS were enrolled, 87 women and 45 men, mean age 58+/-11 years. The disease had started on average 18.7 years before; patients reported spending 9.6 hours per day bedridden or 14.3 hours wheelchair-bound. Only 25 patients reported a residual ability to walk alone or with help. Lower limb edema was present in 113 patients, bilateral in 41 cases. At admission all patients underwent extended compression ultrasonography (CUS) and their plasma D-dimer levels were measured. No antithrombotic prophylaxis was given. DVT was found in 58 patients (43.9%); 32 had a history of VTE. Forty of these patients (69%) had chronic lower limb edema, in 19 cases bilateral. D-dimer levels in the DVT patients were significantly higher than in patients without DVT (553+/-678 vs. 261+/-152 ng/mL, p=0.0112, Mann-Whitney test). Nearly half the DVT patients (26, 45%) had high D-dimer levels (701+/-684 ng/mL). Of the 74 patients without DVT, 48 had normal D-dimer levels (193.37+/-67.28 ng/mL) and 26 had high levels (387.61+/-187.42 ng/mL). The frequency of DVT in late-stage MS may be over 40%. The long history of the disease means the onset of each episode cannot be established with certainty. A number of patients with positive CUS findings had negative D-dimer values, suggesting a VTE event in the past. However, the level of DVT risk in this series should lead physicians to consider the systematic application of long-term preventive measures. (c) 2009 Elsevier Ltd. All rights reserved.
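The Mann-Whitney test the authors used compares two samples by rank rather than by mean, which suits skewed D-dimer distributions. A hand-rolled sketch of the U statistic by direct pair counting (the toy values below are illustrative, not the study's raw data):

```python
def mann_whitney_u(x, y):
    """U statistic for sample x vs. y: count pairs with x > y, half-credit for ties."""
    gt = sum(1 for xi in x for yj in y if xi > yj)
    eq = sum(1 for xi in x for yj in y if xi == yj)
    return gt + 0.5 * eq

# Toy D-dimer values (ng/mL), loosely inspired by the reported group means:
dvt = [553, 701, 480, 620]
no_dvt = [261, 193, 300, 387]
u = mann_whitney_u(dvt, no_dvt)
print(u, "of a maximum of", len(dvt) * len(no_dvt))
```

U near its maximum of n1*n2 (as here) indicates the first sample is shifted upward; for real sample sizes the p-value is obtained from tables or a normal approximation.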

  10. Duplex Ultrasonography Has Limited Utility in Detection of Postoperative DVT After Primary Total Joint Arthroplasty.

    Science.gov (United States)

    Vira, Shaleen; Ramme, Austin J; Alaia, Michael J; Steiger, David; Vigdorchik, Jonathan M; Jaffe, Frederick

    2016-07-01

Duplex ultrasound is routinely used to evaluate suspected deep venous thrombosis (DVT) after total joint arthroplasty. When there is a clinical suspicion of a pulmonary embolism, a computed tomography angiogram of the chest (chest CTA) is concomitantly obtained. Several questions were addressed: First, for the population of patients who receive duplex ultrasound after total joint arthroplasty, what is the rate of positive results? Second, how many of these patients also undergo chest CTA for clinical suspicion of pulmonary embolus, and how many of these tests are positive? Furthermore, what is the correlation between duplex ultrasound results and chest CTA results? A retrospective chart review was conducted of total joint replacement patients in 2011 at a single institution. Inclusion criteria were adult patients who underwent postoperative duplex ultrasonography for clinical suspicion of DVT. Demographic data, duplex scan results, clinical indications for obtaining the duplex scan, and DVT prophylaxis used were recorded. Additionally, if a chest CTA was obtained for clinical suspicion of pulmonary embolus, the results and clinical indication for obtaining the test were recorded. The rate of positive results for duplex ultrasonography and chest CTA was computed and correlated based on clinical indications. Two hundred ninety-five patients underwent duplex ultrasonography, of which only 0.7% were positive for a DVT. One hundred three patients underwent a chest CTA for clinical suspicion of a pulmonary embolism (PE), of which 26 revealed a pulmonary embolus; none of these had a positive duplex ultrasound. Postoperative duplex scans have a low rate of positive results. A substantial number of patients with negative duplex results subsequently underwent chest CTA, on which a pulmonary embolus was found, presumably resulting from a DVT despite the negative duplex ultrasound result. A negative duplex ultrasonography result should not rule out the presence of a pulmonary embolism.

  11. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  12. Epidemiologic study of patients with DVT in Birjand Vali-e-asr hospital- (2009-2014: Short Communication

    Directory of Open Access Journals (Sweden)

    Toba Kazemi

    2016-04-01

Full Text Available Background and Aim: Deep vein thrombosis (DVT) is a condition that, if diagnosis and treatment are delayed, can lead to serious complications such as pulmonary embolism. Given the importance of assessing and identifying diseases in every community, the current study aimed at assessing the epidemiology of DVT patients in Birjand. Materials and Methods: The present descriptive-analytical study was conducted on all DVT patients admitted to Birjand Vali-e-asr hospital between 2009 and 2014. A trained medical student completed each researcher-designed questionnaire based on the intern's history recording, the physician's orders, and the nurse's notes. The patients were then called to ascertain their status and any disease complications, readmission, or death. Finally, the obtained data were encoded and analyzed with SPSS (v. 18) at the significance level of P<0.05. Results: During the study period, 263 patients with DVT had been hospitalized in Birjand Vali-e-asr hospital. Of these patients, 50.2% were male. Mean age of the subjects was 55.84 ± 18.45 years. In 98.1% of the cases the lower extremity was involved. The most prevalent risk factor was immobilization, and the least prevalent was a family history of DVT. Regarding the relationship between DVT risk factors and sex, only cigarette smoking showed a significant difference. Over the 5 years, 3.8% of the population died of DVT complications. Recurrent DVT was diagnosed in 6% of the patients and pulmonary embolism in 3.4%. Conclusion: Given that the most common risk factor for DVT in this study was immobilization, prophylaxis is necessary in high-risk patients in order to decrease the possibility of DVT occurrence.

  13. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
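Benchmarks with exact solutions, as in key point (5), enable quantitative convergence checks rather than eyeball comparison. A minimal, generic sketch (not from the report) of computing the observed order of accuracy from errors at two grid resolutions:

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio):
    """Observed order of accuracy: p = log(E_coarse / E_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Illustrative errors vs. an analytic benchmark for a nominally 2nd-order
# scheme: halving the mesh spacing should cut the error roughly 4x.
p = observed_order(e_coarse=1.0e-3, e_fine=2.6e-4, refinement_ratio=2)
print(round(p, 2))
```

An acceptance criterion can then be phrased as the observed order falling within a tolerance of the scheme's formal order, which is the kind of pass/fail statement a strong-sense benchmark requires.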

  14. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium, and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows have been established, along with crew procedures and crew training videos for plant activities on orbit. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  15. Measurements of the local dose with regard to digital volume tomography (DVT) for the purpose of assessing deviations between phantom and human anatomy; Ortsdosismessungen an einer digitalen Volumen-Tomographieeinrichtung (DVT) hinsichtlich der Unterschiede zwischen einem Phantom und der menschlichen Anatomie

    Energy Technology Data Exchange (ETDEWEB)

    Neuwirth, J.; Hefner, A.; Ernst, G. [Austrian Research Centers (ARC), Radiation Safety and Applications (Austria)

    2009-07-01

In dental radiography, digital volume tomography (DVT) is gaining more and more importance owing to its three-dimensional imaging of teeth and jaw and its reduced radiation dose in comparison with conventional computed tomography (CT). Contrary to other, well-documented radiographic procedures such as dental panoramic X-ray imaging, there are no national or international guidelines or recommendations relating to DVT that regulate the designation of areas and standardize risk assessment. This study aims to assess the parameters necessary for local radiation protection in dental practices. This paper describes the results of measurements carried out in dental practices to evaluate the local dose at varied distances and for different rotation times of DVT devices. (orig.)

  16. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  17. Skater Tara Lipinski Speaks Out About DVT | NIH MedlinePlus the Magazine

    Science.gov (United States)

    ... don’t exercise or who are sick and bedridden.” I just never thought that a teenager—especially ... DVT is not just a condition of the elderly. It can strike anyone in any physical condition ...

  18. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes the test results and test methods used in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows: fuel bundle cross-flow test; endurance fretting/vibration test; Freon CHF test; production of the technical document. (author). 25 refs., 45 tabs., 46 figs.

  19. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  20. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU-hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operating conditions; (b) strength test during reload under static conditions; (c) impact test during reload under impact load conditions; (d) endurance test for verification of fuel integrity during lifetime. (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis; (b) CANDU-6 channel flow analysis; (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  1. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    International Nuclear Information System (INIS)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh

    1995-07-01

This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU-hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operating conditions; (b) strength test during reload under static conditions; (c) impact test during reload under impact load conditions; (d) endurance test for verification of fuel integrity during lifetime. (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis; (b) CANDU-6 channel flow analysis; (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author)

  2. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    NARCIS (Netherlands)

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting

  3. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose; Le, San; Littlewood, David John; Merewether, Mark Thomas; Mosby, Matthew David; Pierson, Kendall H.; Porter, Vicki L.; Shelton, Timothy; Thomas, Jesse David; Tupek, Michael R.; Veilleux, Michael; Xavier, Patrick G.

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra / SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified, or can be referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note that many other verification tests exist in the Sierra / SM test suite but have not yet been included in this manual.
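    The nightly pass/fail check described above (code result vs. analytic solution, within a tolerance) can be sketched as follows; the cantilever formula and all numbers here are illustrative stand-ins, not taken from the Sierra / SM suite:

```python
import math

def verify_against_analytic(numeric, analytic, rel_tol):
    """True if every computed value matches the analytic solution
    to within the given relative tolerance."""
    return all(math.isclose(n, a, rel_tol=rel_tol)
               for n, a in zip(numeric, analytic))

# Illustrative analytic solution: cantilever tip deflection
# delta = P*L^3 / (3*E*I), sampled at three load fractions.
P, L, E, I = 100.0, 2.0, 200e9, 1e-6
analytic = [P * s * L**3 / (3 * E * I) for s in (0.5, 0.75, 1.0)]
numeric = [6.672e-4, 1.0005e-3, 1.3338e-3]  # stand-in solver output

print(verify_against_analytic(numeric, analytic, rel_tol=1e-2))  # → True
```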

  4. DVT prophylaxis: better living through chemistry: affirms.

    Science.gov (United States)

    Pellegrini, Vincent D

    2010-09-07

    Venous thromboembolism remains the most common cause of hospital readmission and death after total joint arthroplasty. The 2008 American College of Chest Physicians (ACCP) guidelines, based on prospective randomized clinical trials with a venography endpoint, endorse the use of low-molecular-weight heparin, fondaparinux, or adjusted dose warfarin (target international normalized ratio, 2.5; range, 2-3) for up to 35 days after total hip arthroplasty (THA) and total knee arthroplasty (TKA). In the past, the ACCP has recommended against the use of aspirin, graduated compression stockings, or venous compression devices as the sole means of prophylaxis, but in 2008 they first recommended the "optimal use of mechanical thromboprophylaxis with venous foot pumps or intermittent pneumatic compression devices" in patients undergoing total joint arthroplasty who "have a high risk of bleeding." When the high risk subsides, pharmacologic thromboprophylaxis is substituted for, or added to, mechanical methods. Fractionated heparins and pentasaccharide are the most effective agents in reducing venographic deep venous thrombosis (DVT) after total joint arthroplasty with residual clot rates. Low-intensity warfarin (target international normalized ratio, 2.0) combines safety (bleeding rates exchange for a lower bleeding rate; genetic testing will likely simplify warfarin use and reduce outlier responders. Copyright 2010, SLACK Incorporated.

  5. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance and efficiency, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  6. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie

    2008-01-01

    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low...... preoperative DVT in colorectal cancer patients admitted for surgery. Preoperative D-dimer test and compression ultrasonography for DVT were performed in 193 consecutive patients with newly diagnosed colorectal cancer. Diagnostic accuracy indices of the D-dimer test were assessed according to the PTP score...... in ruling out preoperative DVT in colorectal cancer patients admitted for surgery....

  7. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases

  8. Verification tests for remote controlled inspection system in nuclear power plants

    International Nuclear Information System (INIS)

    Kohno, Tadaaki

    1986-01-01

    Following the increase in the number of nuclear power plants, the total radiation exposure dose accompanying inspection and maintenance work has tended to increase. Japan Power Engineering and Inspection Corp. carried out verification tests of a practical power reactor automatic inspection system from November 1981 to March 1986, and this report describes how these verification tests were carried out. The objects of the verification tests were equipment that is urgently required for reducing radiation exposure dose, has a high possibility of realization, and is important for ensuring the safety and reliability of plants, that is, an automatic ultrasonic flaw detector for the welded parts of bend pipes, an automatic disassembling and inspection system for the control rod driving mechanism, an automatic fuel inspection system, and automatic decontamination equipment for steam generator water chambers, primary system crud and radioactive gas in coolant. The results of the verification tests of this equipment were judged satisfactory; therefore, application to actual plants is possible. (Kako, I.)

  9. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.
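    A common companion to these exact-solution comparisons is the observed order of accuracy, computed from the errors on two mesh resolutions under the assumption e ≈ C·h^p. A minimal sketch of that computation (not part of ExactPack itself):

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy p from errors on two meshes,
    where `refinement` is h_coarse / h_fine.  Assumes e ~ C*h^p,
    so p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine)."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Errors from a hypothetical second-order scheme: halving h
# should cut the error by roughly a factor of four.
print(round(observed_order(4.0e-3, 1.0e-3), 2))  # → 2.0
```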

  10. The Healy Clean Coal Project: Design verification tests

    International Nuclear Information System (INIS)

    Guidetti, R.H.; Sheppard, D.B.; Ubhayakar, S.K.; Weede, J.J.; McCrohan, D.V.; Rosendahl, S.M.

    1993-01-01

    As part of the Healy Clean Coal Project, TRW Inc., the supplier of the advanced slagging coal combustors, has successfully completed design verification tests on the major components of the combustion system at its Southern California test facility. These tests, which included the firing of a full-scale precombustor with a new non-storage direct coal feed system, supported the design of the Healy combustion system and its auxiliaries performed under Phase 1 of the project. Two 350 million BTU/hr combustion systems have been designed and are now ready for fabrication and erection, as part of Phase 2 of the project. These systems, along with a back-end Spray Dryer Absorber system, designed and supplied by Joy Technologies, will be integrated with a Foster Wheeler boiler for the 50 MWe power plant at Healy, Alaska. This paper describes the design verification tests and the current status of the project

  11. Length of stay and economic consequences with rivaroxaban vs enoxaparin/vitamin K antagonist in patients with DVT and PE: findings from the North American EINSTEIN clinical trial program.

    Science.gov (United States)

    Bookhart, Brahim K; Haskell, Lloyd; Bamber, Luke; Wang, Maria; Schein, Jeff; Mody, Samir H

    2014-10-01

    Venous thromboembolism (VTE) (deep vein thrombosis [DVT] and pulmonary embolism [PE]) represents a substantial economic burden to the healthcare system. Using data from the randomized EINSTEIN DVT and PE trials, this North American sub-group analysis investigated the potential of rivaroxaban to reduce the length of initial hospitalization in patients with acute symptomatic DVT or PE. A post-hoc analysis of hospitalization and length-of-stay (LOS) data was conducted in the North American sub-set of patients from the randomized, open-label EINSTEIN trial program. Patients received either rivaroxaban (15 mg twice daily for 3 weeks followed by 20 mg once daily; n = 405) or dose-adjusted subcutaneous enoxaparin overlapping with (guideline-recommended 'bridging' therapy) and followed by a vitamin K antagonist (VKA) (international normalized ratio = 2.0-3.0; n = 401). The open-label study design allowed for the comparison of LOS between treatment arms under conditions reflecting normal clinical practice. LOS was evaluated using investigator records of dates of admission and discharge. Analyses were carried out in the intention-to-treat population using parametric tests. Costs were applied to the LOS based on the weighted mean cost per day for DVT and PE diagnoses obtained from the Healthcare Cost and Utilization Project dataset. Of 382 patients hospitalized, 321 (84%) had acute symptomatic PE; few DVT patients required hospitalization. Similar proportions of VTE patients were hospitalized in the rivaroxaban and enoxaparin/VKA treatment groups: 189/405 (47%) and 193/401 (48%), respectively. In hospitalized VTE patients, rivaroxaban treatment produced a 1.6-day mean reduction in LOS (median = 1 day) compared with enoxaparin/VKA (mean = 4.5 vs 6.1; median = 3 vs 4), translating to total costs that were $3419 lower in rivaroxaban-treated patients. In hospitalized North American patients with VTE, treatment with rivaroxaban produced a statistically significant reduction in length of stay.
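    The cost figure reported above is simply the LOS difference multiplied by a weighted mean cost per day. A sketch of that arithmetic, using the trial's mean LOS values but an assumed round per-day cost (the actual HCUP-derived figure is not given in the abstract):

```python
def cost_saving(mean_los_a, mean_los_b, cost_per_day):
    """Cost difference implied by a difference in mean length of stay."""
    return (mean_los_b - mean_los_a) * cost_per_day

# Mean LOS from the abstract (rivaroxaban 4.5 vs enoxaparin/VKA 6.1 days);
# the $2000/day figure is an assumption, not the HCUP weighted mean.
print(round(cost_saving(4.5, 6.1, 2000.0), 2))  # → 3200.0
```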

  12. Design verification testing for fuel element type CAREM

    International Nuclear Information System (INIS)

    Martin Ghiselli, A.; Bonifacio Pulido, K.; Villabrille, G.; Rozembaum, I.

    2013-01-01

    The hydraulic and hydrodynamic characterization tests are part of the design verification process for a nuclear fuel element prototype and its components. These tests are performed in a low-pressure, low-temperature facility. The tests require defining the simulation parameters used to set the test conditions, evaluating the results to feed back into the mathematical models, extrapolating the results to reactor conditions, and finally deciding the acceptability of the tested prototype. (author)

  13. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    Science.gov (United States)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite are presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO satellite for Earth observation that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes centre-of-gravity and moment-of-inertia measurements for both models, and other verification activities carried out in order to ensure the satellite's health during launch and its in-orbit performance.

  14. Comparison of the local dose of scattered radiation of a special dental phantom and a real human head using Digital Volume Tomography (DVT)

    International Nuclear Information System (INIS)

    Neuwirth, J.; Hefner, A.

    2008-01-01

    In dental radiography, Digital Volume Tomography (DVT) is gaining importance due to its three-dimensional imaging of teeth, jaw and viscerocranium and its reduced radiation dose in comparison to conventional Computed Tomography (CT). Contrary to other, well documented radiographic procedures such as dental panorama X-ray imaging, there are no national or international guidelines or recommendations relating to DVT which regulate the designation of areas and standardize risk assessment. This study aims to assess the parameters necessary for local radiation protection in dental practices. Measurements were carried out in dental practices in order to evaluate the local dose resulting from different DVT devices. A special dental phantom and a real human head were used in the irradiations in order to determine the local dose of scattered radiation at nominal voltage. The dental phantom was designed for conventional dental panorama X-ray devices, which use lower nominal voltages. This poses the question of whether the scatter performance of the special dental phantom is comparable to a real human head and therefore applicable to the estimation of the radiation quality of a DVT when using 120 kV. The existing guidelines for dental panorama X-ray are analyzed, and suggestions for future recommendations concerning the designation of areas and risk assessment for DVT are then deduced by comparing both sets of measurements. The results show that the special dental phantom is well suited for determining the local dose resulting from the scattered radiation of a DVT. (author)

  15. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

    ... corrective action does not resolve the deficiency, you may request to use the contaminated system as an... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION...

  16. A digital volumetric tomography (DVT) study in the mandibular molar region for miniscrew placement during mixed dentition

    Directory of Open Access Journals (Sweden)

    Mayur S. Bhattad

    2015-04-01

    Full Text Available OBJECTIVE: To assess bone thickness for miniscrew placement in the mandible during mixed dentition by using digital volumetric tomography (DVT). MATERIAL AND METHODS: A total of 15 healthy patients aged 8-10 years old, with an early exfoliated mandibular second deciduous molar, were included. DVT images of one quadrant of the mandible were obtained using Kodak extraoral imaging systems and analyzed with Kodak dental imaging software. The error of the method (EM) was calculated using Dahlberg's formula. Mean and standard deviation were calculated at 6 and 8 mm from the cementoenamel junction (CEJ). Paired t-test was used to analyze the measurements. RESULTS: Buccal cortical bone thickness, mesiodistal width and buccolingual bone depth at 6 mm were found to be 1.73 ± 0.41, 2.15 ± 0.49 and 13.18 ± 1.22 mm, respectively; while at 8 mm the measurements were 2.42 ± 0.34, 2.48 ± 0.33 and 13.65 ± 1.25 mm, respectively. EM for buccal cortical bone thickness, mesiodistal width and buccolingual bone depth was 0.58, 0.40 and 0.48, respectively. The difference in measurement at 6 and 8 mm for buccal cortical plate thickness (P 0.05. CONCLUSION: Bone thickness measurement has shown promising evidence for safe placement of miniscrews in the mandible during mixed dentition. The use of miniscrews is the best alternative, even in younger patients.
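    Dahlberg's formula, used above for the error of the method, is EM = sqrt(Σd² / 2n), where d is the difference between duplicate measurements of the same quantity. A minimal sketch with invented duplicate readings (not the study's data):

```python
import math

def dahlberg_em(first, second):
    """Dahlberg's error of the method for paired repeat measurements:
    EM = sqrt(sum(d^2) / (2n)), d = difference between the two readings."""
    n = len(first)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(first, second)) / (2 * n))

# Hypothetical duplicate bone-thickness readings (mm).
run1 = [1.7, 2.4, 2.1, 13.2]
run2 = [1.8, 2.3, 2.2, 13.1]
print(round(dahlberg_em(run1, run2), 3))  # → 0.071
```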

  17. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    Directory of Open Access Journals (Sweden)

    Coomarasamy Aravinthan

    2004-05-01

    Full Text Available Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 had verification delayed > 24 hrs after testing. The effect of delay in verification of diagnosis on estimates of accuracy was evaluated using meta-regression with diagnostic odds ratio (dOR) as the accuracy measure. This analysis was adjusted for study quality and type of test (miniature endometrial biopsy or endometrial ultrasound). Results Compared to studies with immediate verification of diagnosis (dOR 67.2, 95% CI 21.7–208.8), those with delayed verification (dOR 16.2, 95% CI 8.6–30.5) underestimated the diagnostic accuracy by 74% (95% CI 7%–99%; P value = 0.048). Conclusion Among studies of miniature endometrial biopsy and endometrial ultrasound, diagnostic accuracy is considerably underestimated if there is a delay in histological verification of diagnosis.
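    The diagnostic odds ratio used as the accuracy measure above is dOR = (TP × TN) / (FP × FN), with confidence intervals formed on the log-odds scale. A sketch with hypothetical 2×2 counts (not data from the review):

```python
import math

def diagnostic_odds_ratio(tp, fp, fn, tn):
    """Diagnostic odds ratio with an approximate 95% CI computed
    on the log-odds scale (standard error of ln dOR)."""
    dor = (tp * tn) / (fp * fn)
    se = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    lo = math.exp(math.log(dor) - 1.96 * se)
    hi = math.exp(math.log(dor) + 1.96 * se)
    return dor, lo, hi

# Hypothetical counts from a single accuracy study.
dor, lo, hi = diagnostic_odds_ratio(tp=45, fp=10, fn=5, tn=140)
print(round(dor, 1))  # → 126.0
```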

  18. Support of Construction and Verification of Out-of-Pile Fuel Assembly Test Facilities

    International Nuclear Information System (INIS)

    Park, Nam Gyu; Kim, K. T.; Park, J. K.

    2006-12-01

    Fuel assemblies and components should be verified in out-of-pile test facilities before the developed fuel can be loaded into a reactor. Although most of the component-wise tests have been performed using domestic facilities, the assembly-wise tests have depended on overseas facilities because none were available locally. KAERI started to construct assembly-wise mechanical/hydraulic test facilities, and KNF, as an end user, is supporting the construction using the technologies developed through its fuel development programs. The works performed are as follows: - Test assembly shipping container design and manufacturing support - Fuel handling tool design: gripper, upper and lower core simulators for the assembly mechanical test facility, internals for the assembly hydraulic test facility - Manufacture of test specimens: a skeleton and an assembly for preliminary functional verification of the assembly mechanical/hydraulic test facilities, two assemblies for the verification of the assembly mechanical/hydraulic test facilities, instrumented rod design and integrity evaluation - Verification of the assembly mechanical/hydraulic test facilities: test data evaluation

  19. Support of Construction and Verification of Out-of-Pile Fuel Assembly Test Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam Gyu; Kim, K. T.; Park, J. K. [KNF, Daejeon (Korea, Republic of)] (and others)

    2006-12-15

    Fuel assemblies and components should be verified in out-of-pile test facilities before the developed fuel can be loaded into a reactor. Although most of the component-wise tests have been performed using domestic facilities, the assembly-wise tests have depended on overseas facilities because none were available locally. KAERI started to construct assembly-wise mechanical/hydraulic test facilities, and KNF, as an end user, is supporting the construction using the technologies developed through its fuel development programs. The works performed are as follows: - Test assembly shipping container design and manufacturing support - Fuel handling tool design: gripper, upper and lower core simulators for the assembly mechanical test facility, internals for the assembly hydraulic test facility - Manufacture of test specimens: a skeleton and an assembly for preliminary functional verification of the assembly mechanical/hydraulic test facilities, two assemblies for the verification of the assembly mechanical/hydraulic test facilities, instrumented rod design and integrity evaluation - Verification of the assembly mechanical/hydraulic test facilities: test data evaluation.

  20. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run both in Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with the optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16λ rms (λ = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  1. EQ3/6 software test and verification report 9/94

    International Nuclear Information System (INIS)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ''V and V'' report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. Physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each of the principal software packages, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n matrix by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was carried out on the first code in the software package, EQPT.
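    The Newton-Raphson procedure emphasized above solves J·Δx = −f for an n × n Jacobian at each iteration. A toy sketch at n = 2 using Cramer's rule (purely illustrative; not EQ3/6's actual geochemical solver, which works at much larger n):

```python
def newton_2x2(f, jac, x, tol=1e-12, max_iter=50):
    """Newton-Raphson for a 2x2 nonlinear system: at each step,
    solve J * dx = -f by Cramer's rule and update x."""
    for _ in range(max_iter):
        f1, f2 = f(x)
        if abs(f1) < tol and abs(f2) < tol:
            return x
        (a, b), (c, d) = jac(x)
        det = a * d - b * c
        dx1 = (-f1 * d + f2 * b) / det   # Cramer's rule, column 1
        dx2 = (f1 * c - f2 * a) / det    # Cramer's rule, column 2
        x = (x[0] + dx1, x[1] + dx2)
    return x

# Toy system: x^2 + y^2 = 4, x*y = 1.
f = lambda p: (p[0] ** 2 + p[1] ** 2 - 4, p[0] * p[1] - 1)
jac = lambda p: ((2 * p[0], 2 * p[1]), (p[1], p[0]))
x, y = newton_2x2(f, jac, (2.0, 0.5))
print(round(x * y, 6), round(x * x + y * y, 6))  # → 1.0 4.0
```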

  2. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ``V and V`` report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. Physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each of the principal software packages, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n matrix by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was carried out on the first code in the software package, EQPT.

  3. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    Science.gov (United States)

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide ReagentStrip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  4. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  5. Test/QA Plan for Verification of Cavity Ringdown Spectroscopy Systems for Ammonia Monitoring in Stack Gas

    Science.gov (United States)

    The purpose of the cavity ringdown spectroscopy (CRDS) technology test and quality assurance plan is to specify procedures for a verification test applicable to commercial cavity ringdown spectroscopy technologies. The purpose of the verification test is to evaluate the performa...

  6. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task, so an efficient way to perform verification with less effort in a shorter time is needed. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in IP-XACT format. For easier creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
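    The spreadsheet-to-IP-XACT translation described above can be sketched as a simple row-to-element mapping. The element names follow the IEEE 1685 (IP-XACT) schema, but the rows and the flat formatting here are invented for illustration, not the paper's actual template:

```python
# Each spreadsheet row (name, offset, width, access) becomes one
# ipxact:register element of an IP-XACT register description.
rows = [
    ("CTRL",   0x00, 32, "read-write"),
    ("STATUS", 0x04, 32, "read-only"),
]

def row_to_ipxact(name, offset, width, access):
    return (
        f"<ipxact:register>"
        f"<ipxact:name>{name}</ipxact:name>"
        f"<ipxact:addressOffset>0x{offset:02X}</ipxact:addressOffset>"
        f"<ipxact:size>{width}</ipxact:size>"
        f"<ipxact:access>{access}</ipxact:access>"
        f"</ipxact:register>"
    )

for row in rows:
    print(row_to_ipxact(*row))
```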

  7. Verification of FPGA-Signal using the test board which is applied to Safety-related controller

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Youn-Hu; Yoo, Kwanwoo; Lee, Myeongkyun; Yun, Donghwa [SOOSAN ENS, Seoul (Korea, Republic of)

    2016-10-15

    This article aims to provide a verification method for the BGA-type FPGA of a Programmable Logic Controller (PLC) developed as Safety Class. The FPGA logic in the Safety Class control device is the circuit that controls the overall logic of the PLC. A safety-related PLC must meet international standard specifications. For this reason, we apply V and V according to international standards in order to secure high reliability and safety, proceeding through a variety of verification stages for additional reliability and safety analysis. For efficient verification of test results, this paper proposes a test using a newly modified BGA socket that resolves the problems of the conventional socket. Verification is divided into hardware and firmware verification, carried out during unit testing and integration testing. The proposed test method is simple and reduces cost through batch processing. In addition, the socket's short pin length and surrounding copper plating make it advantageous for measuring signals from high-speed ICs, and it prevents abrasion of the IC balls because they make no direct contact with the PCB. Therefore, it can be applied in practice to BGA package testing, and the designed logic and its operation can be verified easily.

  8. Rivaroxaban for the treatment of symptomatic deep-vein thrombosis and pulmonary embolism in Chinese patients: a subgroup analysis of the EINSTEIN DVT and PE studies.

    Science.gov (United States)

    Wang, Yuqi; Wang, Chen; Chen, Zhong; Zhang, Jiwei; Liu, Zhihong; Jin, Bi; Ying, Kejing; Liu, Changwei; Shao, Yuxia; Jing, Zhicheng; Meng, Isabelle Ling; Prins, Martin H; Pap, Akos F; Müller, Katharina; Lensing, Anthonie Wa

    2013-12-16

    The worldwide EINSTEIN DVT and EINSTEIN PE studies randomized 8282 patients with acute symptomatic deep-vein thrombosis (DVT) and/or pulmonary embolism (PE) and, for the first time in trials in this setting, included patients in China. This analysis evaluates the results of these studies in this subgroup of patients. A total of 439 Chinese patients who had acute symptomatic DVT (n=211), or PE with or without DVT (n=228), were randomized to receive rivaroxaban (15 mg twice daily for 21 days, followed by 20 mg once daily) or standard therapy of enoxaparin overlapping with and followed by an adjusted-dose vitamin K antagonist, for 3, 6, or 12 months. The primary efficacy outcome was symptomatic recurrent venous thromboembolism. The principal safety outcome was major or non-major clinically relevant bleeding. The primary efficacy outcome occurred in seven (3.2%) of the 220 patients in the rivaroxaban group and in seven (3.2%) of the 219 patients in the standard-therapy group (hazard ratio, 1.04; 95% confidence interval 0.36-3.0; p=0.94). The principal safety outcome occurred in 13 (5.9%) patients in the rivaroxaban group and in 20 (9.2%) patients in the standard-therapy group (hazard ratio, 0.63; 95% confidence interval 0.31-1.26; p=0.19). Major bleeding was observed in no patients in the rivaroxaban group and in five (2.3%) patients in the standard-therapy group. In fragile patients (defined as age >75 years, creatinine clearance EINSTEIN PE, ClinicalTrials.gov NCT00439777; EINSTEIN DVT, ClinicalTrials.gov NCT00440193.

  9. The US Natural Resources Defense Council/Soviet Academy of Sciences Nuclear Test Ban Verification Project

    International Nuclear Information System (INIS)

    Cochran, T.B.

    1989-01-01

    The first week in September 1987 was an extraordinary one for arms control verification. As part of the co-operative Test Ban Verification Project of the Natural Resources Defense Council (NRDC) and the Soviet Academy of Sciences, fourteen American scientists from the Scripps Institution of Oceanography (at the University of California, San Diego), the University of Nevada-Reno and the University of Colorado went to the region of the Soviet Union's principal nuclear test site near Semipalatinsk. Together with their Soviet counterparts from the Institute of Physics of the Earth (IPE) in Moscow, they fired off three large chemical explosions. The purpose of these explosions was to demonstrate the sensitivity of the three seismic stations surrounding the test site, to study the efficiency with which high-frequency seismic waves propagate in the region, and to study differences between chemical explosions, nuclear explosions and earthquakes in order to establish more firmly the procedures for verification of a nuclear test ban. This paper presents a review of the results of these experiments, an update on the status of the joint project, and a review of the significance of high-frequency seismic data to test ban verification.

  10. Verification test of control rod system for HTR-10

    International Nuclear Information System (INIS)

    Zhou Huizhong; Diao Xingzhong; Huang Zhiyong; Cao Li; Yang Nianzu

    2002-01-01

    There are 10 sets of control rods and driving devices in the 10 MW High Temperature Gas-cooled Test Reactor (HTR-10). The control rod system is the controlling and shutdown system of HTR-10, designed for reactor criticality, operation, and shutdown. To guarantee technical feasibility, a series of verification tests was performed, including a room temperature test, a thermal test, a test after the control rod system was installed in HTR-10, and a test of the control rod system before HTR-10 first criticality. All the test data showed that the driving devices worked well, the control rods ran smoothly up and down, random position setting was reliable, and position indication was exact.

  11. Verification Testing: Meet User Needs Figure of Merit

    Science.gov (United States)

    Kelly, Bryan W.; Welch, Bryan W.

    2017-01-01

    Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that the software accurately models and represents the real-world system. Credibility assesses the development and testing effort that the software has gone through, as well as how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand why the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; among other issues, however, the data necessary to make an informed decision was not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA issued its Standard for Models and Simulations, currently in version NASA-STD-7009A, detailing its recommendations, requirements, and rationale for the different components of VV&C. The intention was that people receiving M&S software would clearly understand, and have data from, the past development effort, allowing those who had not worked with the software before to move forward with greater confidence and efficiency. This particular project performs Verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each script.
This is intended to prevent the code from attempting to make incorrect or impossible
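The kind of input check described above, defining limits of operation, units, and expected datatypes for each script, might be sketched as follows. The helper name, parameter, and limits here are hypothetical, not taken from the project itself.

```python
def validate_input(value, name, expected_type, low, high):
    """Reject an input outside its documented limits of operation
    before it reaches the model code (hypothetical helper)."""
    if not isinstance(value, expected_type):
        raise TypeError(f"{name}: expected {expected_type.__name__}, "
                        f"got {type(value).__name__}")
    if not (low <= value <= high):
        raise ValueError(f"{name}: {value} outside allowed range [{low}, {high}]")
    return value

# hypothetical script input: an orbital altitude in kilometres
alt = validate_input(550.0, "altitude_km", float, 160.0, 36000.0)
print(alt)
```

Checks like these catch incorrect or impossible calculations at the interface instead of deep inside the scripts.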

  12. Integrating software testing and run-time checking in an assertion verification framework

    OpenAIRE

    Mera, E.; López García, Pedro; Hermenegildo, Manuel V.

    2009-01-01

    We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling runtime checks for (parts of) assertions which cannot be verified at compile-time via program transformation. This transformation allows checking preconditions and postconditions, including conditional...
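The core idea, turning assertions that cannot be verified at compile time into run-time checks of preconditions and postconditions, can be illustrated outside the authors' framework (which targets logic programs and a unified assertion language). Below is a minimal Python analogue using a hypothetical `contract` decorator; it is a sketch of the idea, not the framework's actual mechanism.

```python
import functools

def contract(pre=None, post=None):
    """Turn pre/postcondition assertions into run-time checks
    (a minimal sketch of run-time assertion checking)."""
    def wrap(f):
        @functools.wraps(f)
        def inner(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition failed for {f.__name__}"
            result = f(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition failed for {f.__name__}"
            return result
        return inner
    return wrap

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def isqrt(x):
    # truncated integer square root; both conditions checked at run time
    return int(x ** 0.5)

print(isqrt(16))  # passes both checks
```

The same annotated conditions can double as unit-test oracles, which is the unification the paper argues for.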

  13. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine whether the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to check the correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. The testing followed a structured evaluation protocol consisting of blind testing, independent applications, and test cases of graduated difficulty. Both quantitative and qualitative testing were performed by evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions, and the validation tests showed good qualitative agreement with the experimental data. Based on these results, it was concluded that FLASH is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit-gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
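The relative root-mean-square metric used for the quantitative comparisons can be sketched as below; the sample profile data are invented for illustration, not taken from the FLASH test cases.

```python
import math

def relative_rms(numerical, analytical):
    """Relative root-mean-square difference: a scalar measure of
    agreement between code output and an analytical solution."""
    num = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    den = sum(a ** 2 for a in analytical)
    return math.sqrt(num / den)

# toy 1-D profile from a code run vs. the exact solution (hypothetical values)
code_out = [1.00, 0.81, 0.63, 0.48]
exact    = [1.00, 0.80, 0.64, 0.48]
print(f"relative RMS = {relative_rms(code_out, exact):.4f}")
```

A value near zero indicates the "excellent quantitative agreement" the verification tests reported.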

  14. D-dimer as an applicable test for detection of posttraumatic deep vein thrombosis in lower limb fracture.

    Science.gov (United States)

    Bakhshi, Hooman; Alavi-Moghaddam, Mostafa; Wu, Karin C; Imami, Mohammad; Banasiri, Mohammad

    2012-06-01

    Measuring plasma levels of D-dimer is an accurate and easy way to detect deep vein thrombosis (DVT) in nontraumatic settings. However, the diagnostic reliability of D-dimer assays in detecting posttraumatic DVT among patients with lower limb fracture undergoing orthopedic surgery has not been validated. In this study, 141 patients with lower limb fracture admitted through the emergency department and undergoing orthopedic surgery were enrolled. Postoperative venous blood samples for D-dimer assay were taken on the 1st, 7th, and 28th postoperative days. Color Doppler sonography of both lower limbs, performed at the same times, served as the reference test. Eight of the 141 patients (6%) had acute DVT based on Color Doppler sonography. Mean D-dimer was 2160 ng/mL in DVT-positive patients and 864 ng/mL in DVT-negative patients. D-dimer levels greater than 1000 ng/mL were 100% sensitive and 71% specific for detecting postoperative DVT. The D-dimer assay is a useful and sensitive test for detecting posttraumatic DVT.
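The reported accuracy figures follow from a standard 2x2 confusion matrix. In the sketch below, TP and FN come directly from the abstract (all 8 DVT-positive patients were above the threshold); the TN/FP split among the 133 DVT-negative patients is derived here from the 71% specificity for illustration, since the abstract does not state it.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic accuracy of a threshold against the reference test."""
    return tp / (tp + fn), tn / (tn + fp)

# Reported: 8 DVT-positive patients, all with D-dimer > 1000 ng/mL
# (TP=8, FN=0); 71% specificity among 133 DVT-negative patients
# implies roughly TN=94, FP=39 (derived, not reported).
sens, spec = sensitivity_specificity(tp=8, fn=0, tn=94, fp=39)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```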

  15. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect: Digital I and C software is intended to be safe and reliable, and the project results help such software acquire a license. The software verification techniques resulting from this project can be applied to digital NPP (nuclear power plant) systems in the future. This research presents many meaningful verification results for digital protection logic and suggests an I and C software testing strategy. These results apply to the verification of nuclear fusion devices, accelerators, nuclear waste management systems, and nuclear medical devices that require dependable software and highly reliable controllers; they can also be used for military, medical, or aerospace-related software. - Economic and industrial aspect: Since the safety of digital I and C software is highly important, it is essential that the software be verified, but verification and license acquisition for digital I and C software are costly. This project benefits the domestic economy by substituting the verification and testing techniques introduced here for foreign ones. The operating rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool, and this software is expected to substitute for safety-critical software that currently depends wholly on foreign suppliers. Consequently, the results of this project have high commercial value, and recognition of the software development work can spread through industry. - Social and cultural aspect: People expect nuclear power generation to help relieve environmental problems because it emits fewer harmful air pollutants than other forms of power generation. To earn society's trust in and expectations of nuclear power generation, we must show that NPPs are highly safe systems. From that point of view, highly reliable I and C proven by intelligent V and V techniques can be presented as evidence.

  16. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    OpenAIRE

    Clark, T Justin; ter Riet, Gerben; Coomarasamy, Aravinthan; Khan, Khalid S

    2004-01-01

    Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 ha...

  17. Verification test of an engineering-scale multi-purpose radwaste incinerator

    International Nuclear Information System (INIS)

    Wang Peiyi; Zhou Lianquan; Ma Mingxie; Qiu Mingcai; Yang Liguo; Li Xiaohai; Zhang Xiaobin; Lu Xiaowu; Dong Jingling; Wang Xujin; Li Chuanlian; Yang Baomin

    2002-01-01

    The verification test of an engineering-scale multi-purpose radwaste incinerator was implemented. The test items included performance determination for the system while solid wastes (including resins) or spent oil were incinerated and off-gas was cleaned, a tracer test for determining the decontamination factor, and a 72-hour continuous running test. The 500 hours of tests verified the reliability and feasibility of the designs of the technological process, main structure, instrument control, and system safety. The incineration system ran smoothly, and devices and instruments worked stably. Specifications such as capacity, volume reduction factor, carbon remainder in ash, and decontamination factor all met the design requirements.

  18. The concept verification testing of materials science payloads

    Science.gov (United States)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle Payload Projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other agency experiment and support systems concepts that may be used in Shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test designated CVT Test IVA was also held. The purpose of this test was to repeat the Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  19. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb₃Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb₃Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb₃Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  20. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo

    2010-06-01

    Full Text Available The growth of the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  1. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    [Garbled report documentation page; recoverable details: report AFRL-RV-PS-TR-2018-0008, "Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation"; contract FA9453-15-1-0315; author Norman Fitz-Coy.]

  2. Verification Test of Hydraulic Performance for Reactor Coolant Pump

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jun; Kim, Jae Shin; Ryu, In Wan; Ko, Bok Seong; Song, Keun Myung [Samjin Ind. Co., Seoul (Korea, Republic of)

    2010-01-15

    In this project, the basic design of the prototype pump and model pump of the reactor coolant pump and of the test facilities was completed. The basic design of the prototype pump established its structure, dimensions, and hydraulic performance, and a preliminary flow analysis by computational fluid dynamics (CFD) established its flow characteristics and hydraulic performance. This pump was designed as a mixed-flow pump with the following design requirements: specific speed (Ns): 1080.9 (rpm·m³/m·m); capacity: 3115 m³/h; total head: 26.3 m; pump speed: 1710 rpm; pump efficiency: 77.0%; impeller outer diameter: 349 mm; motor output: 360 kW; design pressure: 17 MPaG. The features of the pump are leakage-free operation, since there is no mechanical seal on the pump shaft, which ensures reactor safety, and low noise and low vibration, since there is no cooling fan on the motor, which makes it an eco-friendly product. The model pump was reduced to 44% of the prototype pump size for the verification test of the hydraulic performance of the reactor coolant pump and was designed as a mixed-flow pump with a canned motor and the following design requirements: specific speed (Ns): 1060.9 (rpm·m³/m·m); capacity: 539.4 m³/h; total head: 21.0 m; pump speed: 3476 rpm; pump efficiency: 72.9%; impeller outer diameter: 154 mm; motor output: 55 kW; design pressure: 1.0 MPaG. The test facilities were designed for verification of hydraulic performance, suitable for the pump performance test, homologous test, NPSH (cavitation) test, coast-down test, and pressure pulsation test of the inlet and outlet ports. The test tank was designed with a testing capacity of up to 2000 m³/h and a design pressure of 1.0 MPaG. The auxiliary pump was designed as a centrifugal pump with capacity 1100 m³/h, total head 42.0 m, and motor output 190 kW.

  3. Constrained structural dynamic model verification using free vehicle suspension testing methods

    Science.gov (United States)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.

  4. Verification and testing of the RTOS for safety-critical embedded systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Na Young [Seoul National University, Seoul (Korea, Republic of); Kim, Jin Hyun; Choi, Jin Young [Korea University, Seoul (Korea, Republic of); Sung, Ah Young; Choi, Byung Ju [Ewha Womans University, Seoul (Korea, Republic of); Lee, Jang Soo [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for the nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the verified model, software in the C language is automatically generated. Tests are performed for two purposes: one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect, and test data are generated separately to satisfy each purpose. After testing the RTOS software, we implemented a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to determine whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when it is embedded in the system.

  5. Verification and testing of the RTOS for safety-critical embedded systems

    International Nuclear Information System (INIS)

    Lee, Na Young; Kim, Jin Hyun; Choi, Jin Young; Sung, Ah Young; Choi, Byung Ju; Lee, Jang Soo

    2003-01-01

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for the nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the verified model, software in the C language is automatically generated. Tests are performed for two purposes: one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect, and test data are generated separately to satisfy each purpose. After testing the RTOS software, we implemented a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to determine whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when it is embedded in the system.

  6. Environmental Technology Verification Report for Abraxis Ecologenia® 17β-Estradiol (E2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    Science.gov (United States)

    This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...

  7. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing, using a standardized FSW testing methodology to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  8. FY 1983 report on the results of the verification test on the methanol conversion for oil-fired power plant. Part 1. Verification test on the environmental safety; 1983 nendo sekiyu karyoku hatsudensho metanoru tenkan tou jissho shiken seika hokokusho. Kankyo anzensei jissho shiken (Sono 1)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-01

    Regarding the verification test on environmental safety in the use of methanol as a power generation fuel, the following were summarized: a review of the verification test and the interim evaluation, the state of implementation of the FY 1983 verification test, study/evaluation of the FY 1983 test results, a survey of research trends, the plan for the FY 1984 verification test, records of the committee, etc. In the interim evaluation, high marks were given, as described below: the testing facilities were constructed as originally planned, making the various tests possible; the tests proceeded smoothly, and among the acute test using monkeys, the test on mock flue gas using monkeys/rats, the test on mutagenicity, and the test on effects on aquatic animals, the tests using Oryzias latipes and abalone on fatal concentration, avoidance behavior, and chronic effects were finished by the end of FY 1983 almost as planned; and the long-term inhalation tests using monkeys and rats/mice have been progressing smoothly. The survey of research trends introduced literature outlining the methanol metabolism of monkeys and changes in methanol concentration in blood/urine after accidental ingestion of methanol. (NEDO)

  9. Test/QA Plan For Verification Of Anaerobic Digester For Energy Production And Pollution Prevention

    Science.gov (United States)

    The ETV-ESTE Program conducts third-party verification testing of commercially available technologies that improve the environmental conditions in the U.S. A stakeholder committee of buyers and users of such technologies guided the development of this test on anaerobic digesters...

  10. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
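The Friedman statistic underlying this method ranks the turbines within each measurement condition and tests whether the rank sums differ more than chance would allow. The sketch below is a minimal pure-Python version without tie correction (production code would typically use `scipy.stats.friedmanchisquare`); the power values are invented for illustration.

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic. Each block is one measurement
    condition (e.g. a wind-speed bin); each column is one turbine,
    with the guaranteed power curve included as an extra 'turbine'.
    Simplification: ties are not corrected for."""
    n = len(blocks)          # number of blocks
    k = len(blocks[0])       # number of treatments (turbines)
    rank_sums = [0.0] * k
    for block in blocks:
        order = sorted(range(k), key=lambda j: block[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return 12.0 * sum(r * r for r in rank_sums) / (n * k * (k + 1)) - 3.0 * n * (k + 1)

# three wind-speed bins; power output (kW) of two turbines plus the guaranteed curve
bins = [
    [510, 495, 500],
    [820, 790, 800],
    [1150, 1100, 1120],
]
print(f"Friedman statistic = {friedman_statistic(bins):.2f}")
```

A large statistic (relative to the chi-square distribution with k−1 degrees of freedom) signals that at least one turbine's power performance differs from the rest, after which pairwise multiple comparisons locate the deviant units.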

  11. Field test for treatment verification of an in-situ enhanced bioremediation study

    International Nuclear Information System (INIS)

    Taur, C.K.; Chang, S.C.

    1995-01-01

    Due to a leakage from a 12-inch pressurized diesel steel pipe four years ago, an area of approximately 30,000 square meters was contaminated. A pilot study applying the technology of in-situ enhanced bioremediation was conducted. In the study, a field test kit and on-site monitoring equipment were applied for site characterization and treatment verification. Physically, the enhanced bioremediation study consisted of an air extraction and air supply system and a nutrient supply network. A consistent sampling methodology was employed, and progress was verified by daily monitoring and monthly verification. The objective of this study was to evaluate the capability of indigenous microorganisms to biodegrade the petroleum hydrocarbons when provided with oxygen and nutrients. Nine extraction wells and eight air sparging wells were installed; the air sparging wells injected air into the geoformation, and the extraction wells provided the underground air circulation. Soil samples were obtained monthly for treatment verification by a Minuteman drilling machine with 2.5-foot-long hollow-stem augers. The samples were analyzed on site for TPH-diesel concentration with a field test kit manufactured by HNU-Hanby, Houston, Texas, and the analytical results were compared with results from an environmental laboratory. The TVPH concentrations of the air extracted from the vadose zone by a vacuum blower and the extraction wells were routinely monitored by a Foxboro FID and a Cosmos XP-311A combustible air detector. The daily monitoring of TVPH concentrations provided reliable data for assessing remedial progress.

  12. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  13. Testing and verification of a novel single-channel IGBT driver circuit

    OpenAIRE

    Lukić, Milan; Ninković, Predrag

    2016-01-01

    This paper presents a novel single-channel IGBT driver circuit together with a procedure for testing and verification. It is based on a specialized integrated circuit with complete range of protective functions. Experiments are performed to test and verify its behaviour. Experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power converter demands and that it can be applied in new d...

  14. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
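A fine-grained "unit" test of the kind advocated here needs no full climate run. The sketch below is hypothetical (the kernel, its constants, and the checks are illustrative, not from any actual model): it verifies a small diagnostic function against properties known in closed form.

```python
R_OVER_CP = 287.0 / 1004.0   # dry-air gas constant over heat capacity
P0 = 100000.0                # reference pressure, Pa

def potential_temperature(t_kelvin, p_pascal):
    """Potential temperature: theta = T * (p0 / p)^(R/cp)."""
    return t_kelvin * (P0 / p_pascal) ** R_OVER_CP

# Fine-grained checks, each isolating one property of the kernel:
assert potential_temperature(300.0, P0) == 300.0        # identity at p = p0
assert potential_temperature(250.0, 50000.0) > 250.0    # theta grows aloft
low = potential_temperature(300.0, 90000.0)
high = potential_temperature(300.0, 80000.0)
assert high > low                                       # monotone in 1/p
```

Tests like these run in milliseconds and localize a defect to one routine, in contrast to the full-simulation regression tests the abstract describes.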

  15. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    Science.gov (United States)

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  16. Technique for unit testing of safety software verification and validation

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2008-01-01

    The key issue arising from digitalization of the reactor protection system for nuclear power plant is how to carry out verification and validation (V and V), to demonstrate and confirm the software that performs reactor safety functions is safe and reliable. One of the most important processes for software V and V is unit testing, which verifies and validates the software coding based on concept design for consistency, correctness and completeness during software development. The paper shows a preliminary study on the technique for unit testing of safety software V and V, focusing on such aspects as how to confirm test completeness, how to establish test platform, how to develop test cases and how to carry out unit testing. The technique discussed here was successfully used in the work of unit testing on safety software of a digital reactor protection system. (authors)

  17. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  18. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
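Verification analysis of this kind often reduces to measuring an observed order of convergence and comparing it with the scheme's formal order. A minimal sketch (the stencil and tolerances are illustrative, not taken from the tri-laboratory suite):

```python
import math

def central_diff(f, x, h):
    """Second-order central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)                  # d/dx sin(x) at x = 1
e1 = abs(central_diff(math.sin, 1.0, 0.10) - exact)
e2 = abs(central_diff(math.sin, 1.0, 0.05) - exact)

# Halving h should cut the error ~4x for a second-order scheme:
observed_order = math.log2(e1 / e2)    # expect ~2.0
```

An observed order far from the formal order signals a discretization defect, which is exactly the kind of evidence a verification test suite is built to produce.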

  19. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  1. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  2. Testing and verification of a novel single-channel IGBT driver circuit

    Directory of Open Access Journals (Sweden)

    Lukić Milan

    2016-01-01

    Full Text Available This paper presents a novel single-channel IGBT driver circuit together with a procedure for testing and verification. It is based on a specialized integrated circuit with complete range of protective functions. Experiments are performed to test and verify its behaviour. Experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power converter demands and that it can be applied in new designs. It is a part of new 20kW industrial-grade boost converter.

  3. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628
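The Friedman statistic underlying the method can be computed in a few lines. The sketch below hand-rolls the no-ties formula and uses hypothetical per-wind-bin power data (not data from the paper), treating the guaranteed power curve as one more "turbine" as the abstract describes:

```python
def friedman_statistic(samples):
    """Friedman chi-square statistic for k related samples (no ties).

    samples: list of k equal-length lists; index i across the lists
    forms one block (here, one wind-speed bin)."""
    k = len(samples)
    n = len(samples[0])
    rank_sums = [0.0] * k
    for block in range(n):
        values = [samples[j][block] for j in range(k)]
        order = sorted(range(k), key=lambda j: values[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

# Hypothetical per-wind-bin power (kW) for two turbines, with the
# guaranteed power curve treated as a third "turbine":
guaranteed = [150, 320, 540, 760, 980, 1180, 1340, 1450]
turbine_a = [g * 0.95 for g in guaranteed]
turbine_b = [g * 0.98 for g in guaranteed]
chi2 = friedman_statistic([turbine_a, turbine_b, guaranteed])
# Compare chi2 with the chi-square critical value for k-1 = 2 degrees of
# freedom (5.99 at the 5% level); here 16.0 > 5.99, so the power
# performance of the units differs significantly.
```

In practice the multiple-comparison step the paper describes would follow this omnibus test to locate which turbine pairs differ.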

  4. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.

  5. Structural and optical properties of WTe2 single crystals synthesized by DVT technique

    Science.gov (United States)

    Dixit, Vijay; Vyas, Chirag; Pathak, V. M.; Soalanki, G. K.; Patel, K. D.

    2018-05-01

    Layered transition metal dichalcogenide (LTMDC) crystals have recently attracted much attention for their potential in optoelectronic device applications, following the realization of monolayer-based structures. In the present investigation we report the growth of WTe2 single crystals by the direct vapor transport (DVT) technique. These crystals were characterized by energy dispersive analysis of x-rays (EDAX) to study the stoichiometric composition after growth. The structural properties were studied by x-ray diffraction (XRD), and selected area electron diffraction (SAED) was used to confirm the orthorhombic structure of the grown WTe2 crystals. Surface morphological properties of the crystals were also studied by scanning electron microscopy (SEM). The optical properties of the grown crystals were studied by UV-Visible spectroscopy, which gives a direct band gap of 1.44 eV for the grown WTe2 single crystals.

  6. Two important safety-related verification tests in the design of Qinshan NPP 600 MWe reactor

    International Nuclear Information System (INIS)

    Li Pengzhou; Li Tianyong; Yu Danping; Sun Lei

    2005-01-01

    This paper summarizes two most important verification tests performed in the design of reactor of Qinshan NPP Phase II: seismic qualification test of control rod drive line (CRDL), flow-induced vibration test of reactor internals both in 1:5 scaled model and on-site measurement during heat function testing (HFT). Both qualification tests proved that the structural design of the reactor has large safety margin. (authors)

  7. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V and V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report the results of the software qualification.

  8. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  9. SASSYS-1 computer code verification with EBR-II test data

    International Nuclear Information System (INIS)

    Warinner, D.K.; Dunn, F.E.

    1985-01-01

    The EBR-II natural circulation experiment, XX08 Test 8A, is simulated with the SASSYS-1 computer code and the results for the latter are compared with published data taken during the transient at selected points in the core. The SASSYS-1 results provide transient temperature and flow responses for all points of interest simultaneously during one run, once such basic parameters as pipe sizes, initial core flows, and elevations are specified. The SASSYS-1 simulation results for the EBR-II experiment XX08 Test 8A, conducted in March 1979, are within the published plant data uncertainties and, thereby, serve as a partial verification/validation of the SASSYS-1 code

  10. Design verification of the CANFLEX fuel bundle - quality assurance requirements for mechanical flow testing

    International Nuclear Information System (INIS)

    Alavi, P.; Oldaker, I.E.; Chung, C.H.; Suk, H.C.

    1997-01-01

    As part of the design verification program for the new fuel bundle, a series of out-reactor tests was conducted on the CANFLEX 43-element fuel bundle design. These tests simulated current CANDU 6 reactor normal operating conditions of flow, temperature and pressure. This paper describes the Quality Assurance (QA) Program implemented for the tests that were run at the testing laboratories of Atomic Energy of Canada Limited (AECL) and the Korea Atomic Energy Research Institute (KAERI). (author)

  11. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation
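A benchmark-style verification against an analytical solution, as described in the record, can be sketched on a simpler stand-in problem (1D diffusion rather than the Richards or advection-dispersion equations VS2DT solves; the grid sizes and tolerances below are illustrative, not VS2DT's):

```python
import math

def analytic(x, t, d=1.0):
    # Semi-infinite column held at unit concentration at x = 0:
    # c(x, t) = erfc(x / (2 sqrt(D t)))
    return math.erfc(x / (2.0 * math.sqrt(d * t)))

def solve_fd(length=3.0, nx=31, d=1.0, t_end=0.25, dt=0.0025):
    """Explicit finite-difference solution of c_t = D c_xx."""
    dx = length / (nx - 1)
    r = d * dt / dx ** 2          # stability requires r <= 0.5 (here 0.25)
    c = [0.0] * nx
    c[0] = 1.0                    # fixed-concentration boundary
    for _ in range(round(t_end / dt)):
        nxt = c[:]
        for i in range(1, nx - 1):
            nxt[i] = c[i] + r * (c[i + 1] - 2.0 * c[i] + c[i - 1])
        c = nxt
    return dx, c

dx, c = solve_fd()
max_err = max(abs(c[i] - analytic(i * dx, 0.25)) for i in range(31))
# Small max_err against the closed-form solution is the kind of
# independent verification evidence the record describes.
```

Mass-balance and profile-shape checks (e.g., monotonicity of the front) complement the pointwise error in such comparisons.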

  12. Testing Conducted for Lithium-Ion Cell and Battery Verification

    Science.gov (United States)

    Reid, Concha M.; Miller, Thomas B.; Manzo, Michelle A.

    2004-01-01

    The NASA Glenn Research Center has been conducting in-house testing in support of NASA's Lithium-Ion Cell Verification Test Program, which is evaluating the performance of lithium-ion cells and batteries for NASA mission operations. The test program is supported by NASA's Office of Aerospace Technology under the NASA Aerospace Flight Battery Systems Program, which serves to bridge the gap between the development of technology advances and the realization of these advances into mission applications. During fiscal year 2003, much of the in-house testing effort focused on the evaluation of a flight battery originally intended for use on the Mars Surveyor Program 2001 Lander. Results of this testing will be compared with the results for similar batteries being tested at the Jet Propulsion Laboratory, the Air Force Research Laboratory, and the Naval Research Laboratory. Ultimately, this work will be used to validate lithium-ion battery technology for future space missions. The Mars Surveyor Program 2001 Lander battery was characterized at several different voltages and temperatures before life-cycle testing was begun. During characterization, the battery displayed excellent capacity and efficiency characteristics across a range of temperatures and charge/discharge conditions. Currently, the battery is undergoing lifecycle testing at 0 C and 40-percent depth of discharge under low-Earth-orbit (LEO) conditions.

  13. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Science.gov (United States)

    2010-01-01

    9 CFR, Animals and Animal Products: § 381.94 Contamination with Microorganisms; process control verification criteria and testing; pathogen... maintaining process controls sufficient to prevent fecal contamination. FSIS shall take further action as...

  14. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A Vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. It also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  15. Thrombelastography detects dabigatran at therapeutic concentrations in vitro to the same extent as gold-standard tests

    DEFF Research Database (Denmark)

    Solbeck, Sacha; Ostrowski, Sisse R; Stensballe, Jakob

    2016-01-01

    BACKGROUND/OBJECTIVES: Dabigatran is an oral anticoagulant approved for treatment of non-valvular atrial fibrillation, deep venous thrombosis (DVT), pulmonary embolism and prevention of DVT following orthopedic surgery. Monitoring of the dabigatran level is essential in trauma and bleeding patien...... to the current gold-standard tests Hemoclot and ECT, for assessing dabigatran. TEG R is applicable as a rapid and precise whole blood monitoring test for dabigatran treated patients in the emergency setting....

  16. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...
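The core arithmetic of such a displacement verification is an indication-error calculation against a reference calibration device. A minimal sketch (the readings and the 0.5% acceptance limit are hypothetical, not values taken from the practice):

```python
def verification_error_percent(indicated, reference):
    """Indication error as a percentage of the reference displacement."""
    return (indicated - reference) / reference * 100.0

# Hypothetical calibration points: (system reading, calibrator value), mm
points = [(1.003, 1.000), (5.012, 5.000), (10.020, 10.000)]
errors = [verification_error_percent(i, r) for i, r in points]

tolerance = 0.5   # example acceptance limit, percent of reading
passed = all(abs(e) <= tolerance for e in errors)
```

A real verification would repeat this over the full range of the designated displacement-measuring system and report the worst-case error.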

  17. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs

  18. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  19. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test was started in 1995 as a 7-year national program sponsored by the Ministry of Economy, Trade and Industry (METI), to be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an In-Pile Test managed by AEAT plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an Out-of-Pile Test using a film adding unit was also performed to obtain supplemental data for the In-Pile Test at the Takasago Engineering Laboratory of NUPEC. The material integrity test was planned to perform a constant load test, a constant strain test and a corrosion test at the same time, using a large-scale loop and slow strain rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the Zinc injection verification test are discussed. (authors)

  20. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO₂ as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel.

  1. Lower limb edema after arterial reconstruction, a comparison with lymph, reconstruction and DVT edema by RI scintigram

    International Nuclear Information System (INIS)

    Ojiro, M.; Takenosita, M.; Toshinaga, R.; Shimazu, H.; Nakajo, M.; Iwasita, S.

    1991-01-01

    Postoperative lower limb edema after arterial reconstruction is a common complication; however, the precise mechanism of this process is not fully understood. To investigate its pathogenesis, we retrospectively studied whether the postoperative edema was affected by the type of reconstruction, the materials, the degree of preoperative ischemia and the grade of improvement of the ankle pressure index (API) after reconstruction. Furthermore, using pertechnetate anion, differences in the scintigraphic pattern of the lower limb were studied, and postoperative edema was compared with lymphedema and acute deep vein thrombosis (DVT) with limb swelling. (author). 4 refs.; 2 figs.

  2. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    Science.gov (United States)

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  3. Formal Verification

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  4. Oral rivaroxaban versus standard therapy for the treatment of symptomatic venous thromboembolism: a pooled analysis of the EINSTEIN-DVT and PE randomized studies.

    Science.gov (United States)

    Prins, Martin H; Lensing, Anthonie Wa; Bauersachs, Rupert; van Bellen, Bonno; Bounameaux, Henri; Brighton, Timothy A; Cohen, Alexander T; Davidson, Bruce L; Decousus, Hervé; Raskob, Gary E; Berkowitz, Scott D; Wells, Philip S

    2013-09-20

    Standard treatment for venous thromboembolism (VTE) consists of a heparin combined with vitamin K antagonists. Direct oral anticoagulants have been investigated for acute and extended treatment of symptomatic VTE; their use could avoid parenteral treatment and/or laboratory monitoring of anticoagulant effects. A prespecified pooled analysis of the EINSTEIN-DVT and EINSTEIN-PE studies compared the efficacy and safety of rivaroxaban (15 mg twice-daily for 21 days, followed by 20 mg once-daily) with standard-therapy (enoxaparin 1.0 mg/kg twice-daily and warfarin or acenocoumarol). Patients were treated for 3, 6, or 12 months and followed for suspected recurrent VTE and bleeding. The prespecified noninferiority margin was 1.75. A total of 8282 patients were enrolled; 4151 received rivaroxaban and 4131 received standard-therapy. The primary efficacy outcome occurred in 86 (2.1%) rivaroxaban-treated patients compared with 95 (2.3%) standard-therapy-treated patients (hazard ratio, 0.89; 95% confidence interval [CI], 0.66-1.19; noninferiority criterion met). Registration: EINSTEIN-DVT, ClinicalTrials.gov NCT00440193.
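The noninferiority logic in the abstract reduces to a one-line check: the upper bound of the hazard-ratio confidence interval must fall below the prespecified margin. Using the figures quoted above:

```python
def noninferior(hazard_ratio_upper_ci, margin):
    """Noninferiority is shown when the CI upper bound stays below the margin."""
    return hazard_ratio_upper_ci < margin

# Values reported in the pooled EINSTEIN analysis above:
upper_ci = 1.19   # upper bound of the 95% CI for the hazard ratio
margin = 1.75     # prespecified noninferiority margin
assert noninferior(upper_ci, margin)   # rivaroxaban noninferior to standard therapy
```

Had the CI upper bound crossed 1.75, noninferiority of rivaroxaban could not have been claimed regardless of the point estimate.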

  5. Recent trends on Software Verification and Validation Testing

    International Nuclear Information System (INIS)

    Kim, Hyungtae; Jeong, Choongheui

    2013-01-01

    Verification and Validation (V and V) includes the analysis, evaluation, review, inspection, assessment, and testing of products. Testing in particular is an important method to verify and validate software, and software V and V testing covers test planning through execution. IEEE Std. 1012 is a standard on software V and V. Recently, IEEE Std. 1012-2012 was published. This standard is a major revision of IEEE Std. 1012-2004, which covered only software V and V: it expands the scope of the V and V processes to include system and hardware as well as software, and it describes the scope of V and V testing according to integrity level. In addition, the independent V and V requirements related to software V and V testing in IEEE 7-4.3.2-2010 have been revised. This paper surveys recent trends in software V and V testing through a review of IEEE Std. 1012-2012 and IEEE 7-4.3.2-2010. There are no major changes to the software V and V testing activities and tasks in IEEE 1012-2012 compared with IEEE 1012-2004, but the positions on the responsibility to perform software V and V testing have changed, and IEEE 7-4.3.2-2010 newly describes positions on this responsibility. However, the positions of the two standards on V and V testing differ. For integrity levels 3 and 4, IEEE 1012-2012 requires that the V and V organization conduct all V and V testing tasks, such as the test plan, test design, test cases, and test procedures, except test execution. If V and V testing is conducted by an organization other than the V and V organization, the results of that testing shall be analyzed by the V and V organization. For safety-related software, IEEE 7-4.3.2-2010 requires that test procedures and reports be independently verified by an alternate organization, regardless of who writes the procedures and/or conducts the tests.

  6. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
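The core idea — mechanically mapping each structural element of a design description to a logic definition — can be illustrated with a toy netlist-to-formula translator. This is a sketch only: the gate templates, the netlist syntax, and the function names below are invented, and the actual tool translated designers' HDL models into HOL definitions.

```python
# Map each gate type to a logic-definition template (illustrative syntax)
TEMPLATES = {
    "AND": "{out} = ({a} /\\ {b})",
    "OR":  "{out} = ({a} \\/ {b})",
    "NOT": "{out} = ~{a}",
}

def translate(netlist: str) -> list:
    """Translate 'GATE out in1 [in2]' netlist lines into logic definitions."""
    defs = []
    for line in netlist.strip().splitlines():
        parts = line.split()
        gate, out, ins = parts[0], parts[1], parts[2:]
        fields = {"out": out, "a": ins[0]}
        if len(ins) > 1:
            fields["b"] = ins[1]
        defs.append(TEMPLATES[gate].format(**fields))
    return defs
```

Each structural gate instance becomes one definitional equation; the collected definitions then serve as the low-level model against which behavioral specifications are proved.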

  7. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  8. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and carried out verification tests of wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Tokyo Electric Power Company's Fukushima Daiichi Nuclear Power Plant. In the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks, and dense-graded asphalt pavement when applied to roads. When applied to rubble, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontaminated object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. This result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  9. General-Purpose Heat Source Safety Verification Test program: Edge-on flyer plate tests

    International Nuclear Information System (INIS)

    George, T.G.

    1987-03-01

The radioisotope thermoelectric generator (RTG) that will supply power for the Galileo and Ulysses space missions contains 18 General-Purpose Heat Source (GPHS) modules. The GPHS modules provide power by transmitting the heat of 238Pu α-decay to an array of thermoelectric elements. Each module contains four 238PuO2-fueled clads and generates 250 W(t). Because the possibility of a launch vehicle explosion always exists, and because such an explosion could generate a field of high-energy fragments, the fueled clads within each GPHS module must survive fragment impact. The edge-on flyer plate tests were included in the Safety Verification Test series to provide information on the module/clad response to the impact of high-energy plate fragments. The test results indicate that the edge-on impact of a 3.2-mm-thick, aluminum-alloy (2219-T87) plate traveling at 915 m/s causes the complete release of fuel from capsules contained within a bare GPHS module, and that the threshold velocity sufficient to cause the breach of a bare, simulant-fueled clad impacted by a 3.5-mm-thick, aluminum-alloy (5052-T0) plate is approximately 140 m/s

  10. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four 238PuO2-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.
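For orientation, the reported 555 m/s threshold for an 18-g fragment corresponds to roughly 2.8 kJ of translational kinetic energy. That back-of-the-envelope check can be scripted (illustrative only, not part of the original test analysis):

```python
def kinetic_energy(mass_kg: float, velocity_m_s: float) -> float:
    """Translational kinetic energy E = 1/2 * m * v^2, in joules."""
    return 0.5 * mass_kg * velocity_m_s ** 2

# 18-g aluminum-alloy bullet at the ~555 m/s threshold velocity
threshold_energy = kinetic_energy(0.018, 555.0)   # ~2772 J
```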

  11. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

Out-pile tests with a full-scale fuel assembly verify the design and evaluate the performance of the final products. HTL for hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform bending and vibration tests. Verification of the developed facilities was carried out by comparison against reference data for a fuel assembly obtained at the Westinghouse Co.; the compared data showed good agreement within uncertainties. FRETONUS, a simulator for high-temperature, high-pressure fretting-wear and performance testing, was also developed; a 500-hour performance test was conducted to check the integrity, endurance, and data acquisition capability of the simulator. Technology for computational turbulent flow analysis and finite element analysis was developed as well. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  12. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

Fingerprint changes associated with hand dermatitis are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model-derivation and a model-validation group. The predictive model was derived using multiple logistic regression, and validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of one major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts high and low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
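The published decision rule — one major criterion dominating, then counting minor criteria — can be sketched as a small rule-based classifier. Function and label names here are illustrative, not from the paper:

```python
def verification_risk(dystrophy_area_pct: float,
                      long_horizontal_lines: bool,
                      long_vertical_lines: bool) -> str:
    """Map the major/minor criteria to a qualitative failure-risk category."""
    if dystrophy_area_pct >= 25:                  # major criterion
        return "almost always fails"
    minors = int(long_horizontal_lines) + int(long_vertical_lines)
    if minors == 2:                               # both minor criteria
        return "high risk"
    if minors == 1:                               # one minor criterion
        return "low risk"
    return "almost always passes"                 # no criteria met
```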

  13. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    Science.gov (United States)

    To Duc, Khanh

    2017-11-18

Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for the correction of verification bias in estimation of the ROC surface has also been developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/ .
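For the fully verified case, the quantity the package bias-corrects — the volume under the ROC surface (VUS) — has a simple empirical estimator: the fraction of triples (one marker value per ordered class) ranked in the correct order. A plain Python sketch of that uncorrected estimator (the package itself is in R and additionally handles the missing-status correction):

```python
from itertools import product

def empirical_vus(x1, x2, x3):
    """Empirical VUS for three ordered classes: the fraction of triples
    (a from class 1, b from class 2, c from class 3) with a < b < c."""
    triples = list(product(x1, x2, x3))
    correct = sum(1 for a, b, c in triples if a < b < c)
    return correct / len(triples)
```

A perfectly separating marker gives VUS = 1, while a marker that reverses the ordering gives 0 (chance level for three classes is 1/6).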

  14. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  15. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  16. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  17. Finite Element Verification of Non-Homogeneous Strain and Stress Fields during Composite Material Testing

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    2015-01-01

Uni-directional glass fiber reinforced polymers play a central role in increasing the length of wind turbine blades and thereby lowering the cost of energy from wind turbine installations. In this effort, optimizing the mechanical performance regarding material stiffness, compression strength and fatigue performance is essential. Nevertheless, testing composites involves challenges regarding stiffness determination using conventional strain gauges and achieving correct material failure unaffected by the gripping region during fatigue testing. These challenges have in the present study been addressed using the finite element method. In doing so, verification of experimental observations, a deeper understanding of the test coupon loading, and thereby improved test methods have been achieved.

  18. Verification test for three WindCube WLS7 LiDARs at the Høvsøre test site

    DEFF Research Database (Denmark)

    Gottschall, Julia; Courtney, Michael

The report describes the procedure for testing ground-based WindCube lidars (manufactured by the French company Leosphere) at the Høvsøre test site in comparison to reference sensors mounted on a meteorological mast. Results are presented for three tested units – in detail for unit WLS7-0062, and in summary for units WLS7-0064 and WLS7-0066. The verification test covers the evaluation of measured mean wind speeds, wind directions and wind speed standard deviations. The data analysis is basically performed in terms of different kinds of regression analyses.
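The regression analysis underlying such a verification test can be sketched as an ordinary least-squares fit of lidar against mast wind speeds on paired 10-minute means. The numbers below are synthetic, not Høvsøre data:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Paired 10-minute mean wind speeds (m/s): mast cup anemometer vs lidar
mast  = [4.0, 6.0, 8.0, 10.0, 12.0]
lidar = [4.1, 6.0, 8.1, 9.9, 12.1]
slope, intercept = ols_fit(mast, lidar)
```

A slope close to 1 and an intercept close to 0 indicate good agreement between the lidar and the mast reference.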

  19. Verification testing of the PKI collector at Sandia National Laboratories, Albuquerque, New Mexico

    Science.gov (United States)

    Hauger, J. S.; Pond, S. L.

    1982-07-01

    Verification testing of a solar collector was undertaken prior to its operation as part of an industrial process heat plant at Capitol Concrete Products in Topeka, Kansas. Testing was performed at a control plant installed at Sandia National Laboratory, Albuquerque, New Mexico (SNLA). Early results show that plant performance is even better than anticipated and far in excess of test criteria. Overall plant efficiencies of 65 to 80 percent were typical during hours of good insolation. A number of flaws and imperfections were detected during operability testing, the most important being a problem in elevation drive alignment due to a manufacturing error. All problems were corrected as they occurred and the plant, with over 40 hours of operation, is currently continuing operability testing in a wholly-automatic mode.

  20. Verification Test for Ultra-Light Deployment Mechanism for Sectioned Deployable Antenna Reflectors

    Science.gov (United States)

    Zajac, Kai; Schmidt, Tilo; Schiller, Marko; Seifart, Klaus; Schmalbach, Matthias; Scolamiero, Lucio

    2013-09-01

The ultra-light deployment mechanism (UDM) is based on three carbon fibre reinforced plastic (CFRP) curved tape springs made of carbon fibre / cyanate ester prepregs. Within this activity, the mechanism's suitability for deploying solid reflector antenna sections in space applications was investigated. A projected full-reflector diameter of 4 m to 7 m and a specific mass on the order of 2.6 kg/m2 were taken as the basis for requirements derivation. Extensive verification tests, including health checks and environmental and functional tests, were carried out with an engineering model to enable representative characterization of the UDM unit. This paper presents the design and a technical description of the UDM as well as a summary of the achieved development status with respect to test results and possible design improvements.

  1. 40 CFR 86.1847-01 - Manufacturer in-use verification and in-use confirmatory testing; submittal of information and...

    Science.gov (United States)

    2010-07-01

    ... laboratory equipment calibrations and verifications as prescribed by subpart B of this part or by good... in-use confirmatory testing; submittal of information and maintenance of records. 86.1847-01 Section... confirmatory testing; submittal of information and maintenance of records. (a) The manufacturer who conducts or...

  2. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    Science.gov (United States)

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  3. Testing Dialog-Verification of SIP Phones with Single-Message Denial-of-Service Attacks

    Science.gov (United States)

    Seedorf, Jan; Beckers, Kristian; Huici, Felipe

    The Session Initiation Protocol (SIP) is widely used for signaling in multimedia communications. However, many SIP implementations are still in their infancy and vulnerable to malicious messages. We investigate flaws in the SIP implementations of eight phones, showing that the deficient verification of SIP dialogs further aggravates the problem by making it easier for attacks to succeed. Our results show that the majority of the phones we tested are susceptible to these attacks.

  4. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...
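For a model small enough to enumerate, the verification idea reduces to checking a requirement over every input combination rather than over a hand-picked test set. A toy sketch of that exhaustive check (the paper uses proper model checking on a mechanically constructed model; the rung and requirement below are invented):

```python
from itertools import product

def holds_for_all(prop, n_inputs: int) -> bool:
    """Exhaustively check a boolean property over all input combinations."""
    return all(prop(*bits) for bits in product([False, True], repeat=n_inputs))

# Toy ladder rung: motor energized when start is pressed and stop is not
rung = lambda start, stop: start and not stop

# Safety requirement: whenever stop is asserted, the motor must be off
requirement = lambda start, stop: (not stop) or (not rung(start, stop))
```

Unlike a finite test suite, `holds_for_all` covers the whole (here tiny) input space, which is exactly what testing alone cannot guarantee.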

  5. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.
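UVM itself is a SystemVerilog library, but the CDV loop it organizes — constrained-random stimulus, a self-checking scoreboard against a reference model, and running until coverage closes — can be sketched in Python against a toy DUT. All names and the coverage bins below are illustrative:

```python
import random

def dut_add(a: int, b: int) -> int:
    """Toy device under test: 4-bit adder with wrap-around."""
    return (a + b) & 0xF

def reference_add(a: int, b: int) -> int:
    """Golden reference model used by the scoreboard."""
    return (a + b) % 16

def run_cdv(seed: int = 1) -> dict:
    """Generate random legal stimuli until every coverage bin is hit."""
    rng = random.Random(seed)
    coverage = {"no_carry": 0, "carry": 0}           # coverage bins
    while not all(coverage.values()):                # run to coverage closure
        a, b = rng.randrange(16), rng.randrange(16)  # random legal stimulus
        assert dut_add(a, b) == reference_add(a, b)  # scoreboard check
        coverage["carry" if a + b > 15 else "no_carry"] += 1
    return coverage
```

The loop terminates not after a fixed test list, as in directed testing, but when the coverage monitors report that every targeted behaviour has been exercised.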

  6. FY 1981 report on the results of the verification test on the methanol conversion for oil-fired power plant. Verification test on the environmental safety; 1981 nendo sekiyu karyoku hatsudensho metanoru tenkan tou jissho shiken seika hokokusho. Kankyo anzensei jissho shiken

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-08-01

Assuming the use of methanol, which is expected to be a promising petroleum-substituting fluid fuel, an investigation of its environmental safety was conducted, and the FY 1981 results were summarized. For the study/evaluation of the verification test, a survey of existing studies on the toxicity of methanol was conducted and a plan for the verification test on the environmental safety of methanol was worked out. Moreover, to assess the effects of methanol and methanol combustion gas on living organisms, the following were carried out: design and partial construction of facilities for keeping monkeys and aquatic animals in a methanol environment, tests of the effects on aquatic animals, and purchase of part of the equipment used for tests of the effects on rats/mice. The following tests were in the planning stage: toxicity tests using macaques for high-concentration (acute) and low-concentration (chronic) inhalation of methanol gas, a toxicity test on inhalation of formaldehyde as mock combustion flue gas, and tests of the effects of methanol on fish/shellfish in terms of fatal concentration, repellent behavior, chronic influence, hindrance of multiplication, etc. (NEDO)

  7. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
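The external-verification approach described — recomputing a pathway result independently (by hand or spreadsheet) and comparing it with the code output — can be sketched with a simplified steady-state one-compartment air model. This is not the actual RESRAD-BUILD air concentration model; the equation, names, and numbers are illustrative of the method only:

```python
def steady_state_air_concentration(source_bq_per_h: float,
                                   air_exchange_per_h: float,
                                   room_volume_m3: float) -> float:
    """Simplified one-box model: C = S / (lambda * V), in Bq/m^3."""
    return source_bq_per_h / (air_exchange_per_h * room_volume_m3)

def verify(code_result: float, hand_calc: float, tolerance: float = 1e-6) -> bool:
    """Hand-verification check: relative difference within tolerance."""
    return abs(code_result - hand_calc) / abs(hand_calc) <= tolerance

# Spreadsheet-style hand calculation for one test case
hand_value = steady_state_air_concentration(1000.0, 0.8, 100.0)  # 12.5 Bq/m^3
```

Running such a check case by case, pathway by pathway, is how step-by-step verification of a code release is organized.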

  8. Examples of verification knowledge and testing of the secondary students through the worksheet. Suggestions for leisure time activities

    International Nuclear Information System (INIS)

    Chmielewska, E.; Kuruc, J.

    2010-01-01

This chapter presents examples of verifying and testing secondary school students' knowledge through worksheets, together with suggestions for leisure-time activities. Used and recommended literature is included.

  9. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O' Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.

    2017-03-01

The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; cover more portions of the system; and can be performed at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources on other equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
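A common OLM technique for calibration verification is to estimate the process parameter as the average of redundant sensors and flag any sensor whose deviation from that average exceeds an acceptance limit. A minimal sketch of that idea (sensor names and the limit are illustrative, not the ATR implementation):

```python
def drift_check(readings: dict, limit: float) -> dict:
    """Flag sensors deviating from the redundant-group average by more
    than a calibration acceptance limit. Returns {sensor: flagged?}."""
    avg = sum(readings.values()) / len(readings)   # process estimate
    return {name: abs(value - avg) > limit for name, value in readings.items()}

# Three redundant pressure transmitters reading the same process (bar)
flags = drift_check({"PT-1": 100.1, "PT-2": 99.9, "PT-3": 102.5}, limit=1.0)
```

Because the check uses data already flowing during operation, it can run at almost any time without taking the instrument channel out of service.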

  10. Project W-030 flammable gas verification monitoring test

    International Nuclear Information System (INIS)

    BARKER, S.A.

    1999-01-01

    This document describes the verification monitoring campaign used to document the ability of the new ventilation system to mitigate flammable gas accumulation under steady state tank conditions. This document reports the results of the monitoring campaign. The ventilation system configuration, process data, and data analysis are presented

  11. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    Science.gov (United States)

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  12. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  13. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies’ performance.

  14. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. From a single description, the library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
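One automatic check of the kind described is cross-validating a benchmark's stated gradient against a finite-difference estimate of it. A sketch in Python (the actual library and suite are in C++; the sphere function below is a standard benchmark used here for illustration):

```python
def numerical_gradient(f, x, h: float = 1e-6) -> list:
    """Central-difference estimate of the gradient of f at point x."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def check_benchmark(f, analytic_grad, x, tol: float = 1e-4) -> bool:
    """Cross-check a benchmark's stated gradient against finite differences."""
    numeric = numerical_gradient(f, x)
    return all(abs(a - n) <= tol for a, n in zip(analytic_grad(x), numeric))

# Sphere benchmark: f(x) = sum(x_i^2), with gradient 2*x
sphere = lambda x: sum(v * v for v in x)
sphere_grad = lambda x: [2 * v for v in x]
```

A benchmark whose published gradient disagrees with the numerical estimate at sampled points is exactly the kind of description mistake such a suite catches.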

  15. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded [...], which is suitable for the study of piston ring tribology.

  16. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
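As background for the accountancy statistics this record discusses, the basic material-balance quantity such comprehensive sets build on can be written down in a few lines. The figures below are made-up example data, not from the paper.

```python
# Illustrative material-balance accounting: MUF ("material unaccounted for")
# is the elementary safeguards accountancy statistic. All figures are
# invented example data, in kg.
def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """MUF = BI + R - S - EI; a value far from zero flags possible
    diversion or measurement/falsification effects."""
    return beginning_inventory + receipts - shipments - ending_inventory

m = muf(beginning_inventory=120.0, receipts=40.0,
        shipments=35.0, ending_inventory=124.5)   # 0.5 kg unaccounted for
```

The paper's point is that some comprehensive sets of such statistics can be formed without comparing operator and inspector data for the receipts and shipments terms, i.e. without physical flow verification.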

  17. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  18. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  19. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. From a single description, the library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that the source literature contains mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  20. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    for the review and any actions that were taken when these items were missing are documented in Section 5 of this report. The availability and use of user experience were limited to extensive experience in performing RESRAD-BUILD calculations by the verification project manager and by participation in the RESRAD-BUILD workshop offered by the code developers on May 11, 2001. The level of a posteriori verification that was implemented is defined in Sections 2 through 4 of this report. In general, a rigorous verification review plan addresses program requirements, design, coding, documentation, test coverage, and evaluation of test results. The scope of the RESRAD-BUILD verification is to focus primarily on program requirements, documentation, testing and evaluation. Detailed program design and source code review would be warranted only in those cases when the evaluation of test results and user experience revealed possible problems in these areas. The verification tasks were conducted in three parts and were applied to version 3.1 of the RESRAD-BUILD code and the final version of the user's manual, issued in November 2001 (Yu and others, 2001). These parts include the verification of the deterministic models used in RESRAD-BUILD (Section 2), the verification of the uncertainty analysis model included in RESRAD-BUILD (Section 3), and recommendations for improvement of the RESRAD-BUILD user interface, including evaluations of the user's manual, code design, and calculation methodology (Section 4). Any verification issues that were identified were promptly communicated to the RESRAD-BUILD development team, in particular those that arose from the database and parameter verification tasks. This allowed the developers to start implementing necessary database or coding changes well before this final report was issued.

  1. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  2. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type we record the stage of the verification procedure where the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine the reason why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time) and under certain circumstances it can fail. (authors)
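The order-of-accuracy test at the core of this procedure reduces to a short calculation: solve on two mesh sizes, compute errors against the manufactured solution, and check that the observed convergence rate matches the theoretical order of the discretization. The error values and tolerance below are hypothetical, for illustration only.

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Estimate the convergence order p from errors on two meshes,
    assuming err ~ C * h**p with mesh sizes h and h/refinement."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

def passes(p_observed, p_theoretical, tol=0.1):
    # Accept if the observed order matches theory within a tolerance;
    # a coding mistake typically degrades p well below the theoretical value
    return abs(p_observed - p_theoretical) <= tol

# Hypothetical errors from a second-order scheme on meshes h and h/2
p = observed_order(1.0e-2, 2.5e-3)
```

A seeded semantic error that still lets the code run usually shows up here as a drop in the observed order (e.g. from 2 toward 1 or 0), which is how the procedure flags the mistake.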

  3. Verification test for on-line diagnosis algorithm based on noise analysis

    International Nuclear Information System (INIS)

    Tamaoki, T.; Naito, N.; Tsunoda, T.; Sato, M.; Kameda, A.

    1980-01-01

    An on-line diagnosis algorithm was developed and its verification test was performed using a minicomputer. This algorithm identifies the plant state by analyzing various system noise patterns, such as power spectral densities, coherence functions etc., in three procedure steps. Each obtained noise pattern is examined by using the distances from its reference patterns prepared for various plant states. Then, the plant state is identified by synthesizing each result with an evaluation weight. This weight is determined automatically from the reference noise patterns prior to on-line diagnosis. The test was performed with 50 MW (th) Steam Generator noise data recorded under various controller parameter values. The algorithm performance was evaluated based on a newly devised index. The results obtained with one kind of weight showed the algorithm efficiency under the proper selection of noise patterns. Results for another kind of weight showed the robustness of the algorithm to this selection. (orig.)
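The identification scheme described above (distances to reference patterns, synthesized with evaluation weights) can be sketched as follows. The pattern names, vectors, and weights are invented placeholders; the real algorithm works on power spectral densities, coherence functions, etc.

```python
# Sketch (assumed form) of the weighted-distance identification step:
# compare each observed noise pattern to reference patterns for every
# plant state, then pick the state with the smallest weighted distance.
def distance(pattern, reference):
    return sum((p - r) ** 2 for p, r in zip(pattern, reference)) ** 0.5

def identify(observed, references, weights):
    """observed: {pattern_name: vector};
    references: {state: {pattern_name: vector}};
    weights: {pattern_name: evaluation weight}."""
    scores = {}
    for state, refs in references.items():
        scores[state] = sum(
            weights[name] * distance(observed[name], refs[name])
            for name in observed
        )
    return min(scores, key=scores.get)

refs = {
    "normal":  {"psd": [1.0, 0.5], "coherence": [0.9, 0.8]},
    "anomaly": {"psd": [2.0, 1.5], "coherence": [0.3, 0.2]},
}
obs = {"psd": [1.1, 0.6], "coherence": [0.85, 0.75]}
w = {"psd": 1.0, "coherence": 2.0}
state = identify(obs, refs, w)   # -> "normal"
```

In the paper the weights are derived automatically from the reference noise patterns before on-line operation; here they are fixed by hand to keep the sketch self-contained.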

  4. TRAC-PF1 code verification with data from the OTIS test facility

    International Nuclear Information System (INIS)

    Childerson, M.T.; Fujita, R.K.

    1985-01-01

    A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater initiated boiler-condenser mode heat transfer

  5. TRAC-PF1 code verification with data from the OTIS test facility

    International Nuclear Information System (INIS)

    Childerson, M.T.; Fujita, R.K.

    1985-01-01

    A computer code (TRAC-PF1/MOD1; denoted as TRAC) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and saturation, intermittent reactor coolant system circulation, boiler-condenser mode and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool- and auxiliary-feedwater initiated boiler-condenser mode heat transfer

  6. The joint verification experiments as a global non-proliferation exercise

    International Nuclear Information System (INIS)

    Shaner, J.W.

    1998-01-01

    This conference commemorates the 10th anniversary of the second of two Joint Verification Experiments conducted by the Soviet Union and the US. These two experiments, one at the Nevada test site in the US, and the second here at the Semipalatinsk test site were designed to test the verification of a nuclear testing treaty limiting the size underground explosions to 150 kilotons. By building trust and technical respect between the weapons scientists of the two most powerful adversaries, the Joint Verification Experiment (JVE) had the unanticipated result of initiating a suite of cooperative projects and programs aimed at reducing the Cold War threats and preventing the proliferation of weapons of mass destruction

  7. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT; (b) the changes in US disarmament policy, especially in the nuclear field; and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is ...

  8. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  9. Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Al-Ayat, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walter, W. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-02-23

    The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the Laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have helped provide capabilities, people and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.

  10. Verification Test Report for CFAST 3.1.6

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2002-01-01

    Fire is a significant hazard in most facilities that handle radioactive materials. The severity of fire varies with room arrangement, combustible loading, ventilation and protective system response. The complexity of even simple situations can be unwieldy to solve by hand calculations. Thus, computer simulation of the fire severity has become an important tool in characterizing fire risk. The Savannah River Site (SRS), a Department of Energy facility, has been using the Consolidated Model of Fire Growth and Smoke Transport (CFAST) software to complete such deterministic evaluations to better characterize the nuclear facility fire severity. To fully utilize CFAST at SRS it is necessary to demonstrate that CFAST produces valid analytic solutions over its range of use. This report describes the primary verification exercise that is required to establish that CFAST, and its user interface program FAST, produce valid analytic solutions. This verification exercise may be used to check the functionality of FAST and as a training tool to familiarize users with the software. In addition, the report consolidates the lessons learned by the SRS staff in using FAST and CFAST as fire modeling tools

  11. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and now is in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configurations. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved

  12. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, which are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. The test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about components hierarchy and dependencies, and allowing the operator to verify the fun...

  13. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  14. Surface coatings as xenon diffusion barriers on plastic scintillators : Improving Nuclear-Test-Ban Treaty verification

    OpenAIRE

    Bläckberg, Lisa

    2011-01-01

    This thesis investigates the ability of transparent surface coatings to reduce xenon diffusion into plastic scintillators. The motivation for the work is improved radioxenon monitoring equipment, used within the framework of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty. A large part of the equipment used in this context incorporates plastic scintillators which are in direct contact with the radioactive gas to be detected. One problem with such a setup is that radioxenon...

  15. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL) in collaboration with the University of Florida is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)
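The template-then-parse workflow described above can be sketched in a few lines of Python (the language the abstract itself names). The deck field names and the k-eff output line format below are assumptions for illustration, not the LANL scripts' actual formats.

```python
import re
from string import Template

# Standardized input-deck template; field names are invented for the sketch
DECK = Template(
    "case $name\n"
    "material $material\n"
    "precision $precision\n"
)

def make_deck(case):
    """Fill the template from a per-benchmark parameter dictionary."""
    return DECK.substitute(case)

def parse_keff(output_text):
    """Pull k-eff out of transport-code output; the line format is assumed."""
    m = re.search(r"k-eff\s*=\s*([0-9.]+)", output_text)
    return float(m.group(1)) if m else None

deck = make_deck({"name": "icsbep-001", "material": "heu-met-fast",
                  "precision": "1e-4"})
keff = parse_keff("final k-eff = 0.9987 +/- 0.0003")
```

In the real system the generated decks are submitted as jobs for multiple transport codes, and parsed k-eff values are stored in a database for comparison against the ICSBEP benchmark values.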

  16. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure quality of safety critical software, software should be developed in accordance with software development procedures and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and was applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project will be described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  17. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  18. Return to normal of 99mTc-plasmin test after deep venous thrombosis and its relationship to vessel wall fibrinolysis

    Energy Technology Data Exchange (ETDEWEB)

    Edenbrandt, C.M.; Hedner, U.; Tengborn, L.; Nilsson, J.; Ohlin, P.

    1986-08-01

    Fourteen patients with deep venous thrombosis (DVT) and a positive 99mTc-plasmin test were followed up to determine how soon a negative test was obtained. Localization and extension of the thrombi were determined by phlebography. Plasminogen activator activity in vein walls and local fibrinolytic activity after venous occlusion were measured in order to find out what the prerequisites for impaired thrombolysis are. The time required to obtain a negative 99mTc-plasmin test showed considerable variation, ranging from less than 1 week to more than 6 months. The 99mTc-plasmin test had returned to normal in 64% of the patients after 6 months. No relationship was found between vessel wall fibrinolysis and time to normalization. Instead, we found an association between the time to normalization of the 99mTc-plasmin test and the size of the thrombus, according to phlebography, as well as between the time to normalization of the 99mTc-plasmin test and the extension of leg points with a positive 99mTc-plasmin test at admission. The finding of abnormal 99mTc-plasmin test results more than 6 months after acute DVT is of practical importance and warrants caution when evaluating patients with symptoms and signs suggestive of acute recurrent DVT.

  19. Production controls (PC) and technical verification testing (TVT). A methodology for the control and tracking of LILW waste package conditioning

    International Nuclear Information System (INIS)

    Leon, A.M.; Nieto, J.L.L.; Garrido, J.G.

    2003-01-01

    As part of its low and intermediate level radioactive waste (LILW) characterisation and acceptance activities, ENRESA has set up a quality control programme that covers the different phases of radioactive waste package production and implies different levels of tracking in generation, assessment of activity and control of the documentation associated therewith. Furthermore, ENRESA has made available the mechanisms required for verification, depending on the results of periodic sampling, of the quality of the end product delivered by the waste producers. Both processes are included within the framework of two programmes of complementary activities: production controls (PC) and technical verification testing (TVT). (orig.)

  20. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
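The sampling-plus-NDA verification strategy above rests on a standard calculation: the probability that a random sample of vault items catches at least one falsified item. A minimal sketch, with illustrative numbers only (no figures from the report):

```python
from math import comb

def detection_probability(N, d, n):
    """P(a random sample of n items from an inventory of N, of which d are
    falsified, contains at least one falsified item) - one minus the
    hypergeometric probability of drawing no defects."""
    if n > N - d:
        return 1.0            # the sample cannot avoid every defect
    return 1.0 - comb(N - d, n) / comb(N, n)

# Illustrative numbers: 100 stored items, 5 falsified, inspector samples 20
p = detection_probability(N=100, d=5, n=20)
```

An inspector sizes n against the assumed number of falsified items d to reach a target detection probability, which is why combining sampling with integral reactivity measurements (sensitive to the total fissile inventory) is more effective than either technique alone.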

  1. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification, paying particular attention to symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described, the pros and cons of each method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependency that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified with static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic-analysis methods such as testing, monitoring and profiling are presented and analyzed, together with some of the tools that can be applied when using them.
Based on this work a conclusion is drawn describing the most relevant problems of these analysis techniques, methods for their solution and
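The contrast the survey draws between testing's optimistic incompleteness and exhaustive techniques can be sketched as follows (all names and the planted fault are invented for illustration):

```python
import random

def buggy_abs(x: int) -> int:
    """Toy function with a deliberately planted fault at one rare input."""
    if x == 1023:
        return -1
    return x if x >= 0 else -x

def random_test(prop, trials=100, lo=-10_000, hi=10_000, seed=0):
    """Dynamic analysis: sample inputs at random; rare faults may be missed."""
    rng = random.Random(seed)
    return all(prop(rng.randint(lo, hi)) for _ in range(trials))

def bounded_check(prop, lo=-10_000, hi=10_000):
    """Exhaustive bounded check (model-checking style): examine every
    input in the domain and return a counterexample if one exists."""
    for x in range(lo, hi + 1):
        if not prop(x):
            return x
    return None

prop = lambda x: buggy_abs(x) >= 0
```

`bounded_check(prop)` is guaranteed to return the counterexample 1023, while a hundred random trials will usually report success; this is exactly the incompleteness of testing that the article discusses.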

  2. [Practical Use of Doppler Ultrasonography of the Cardiovascular System and Clinical Laboratory Tests for the Management of Pulmonary Embolism].

    Science.gov (United States)

    Mizukami, Naoko

    2015-08-01

    Acute or chronic pulmonary embolism (PE) is a serious disease, and the risk of mortality is increased if it is untreated. In 90% of cases the embolus originates from deep vein thrombosis (DVT) of the lower limbs or pelvic cavity; it is therefore necessary to consider DVT and PE together as venous thromboembolism (VTE). I suggest that Doppler ultrasonography of the cardiovascular system and clinical laboratory tests provide very valuable medical support for the management of VTE. Specifically, for the early diagnosis of VTE and the prevention of fatal PE, Doppler ultrasonography (cardiac and vascular) can provide very useful information. On the other hand, blood coagulation and thrombophilia tests are important for determining the risk of VTE and for evaluating the effect of anticoagulant therapy. In this paper, I explain the main points of each examination for VTE by describing representative cases. I also present the results of investigating cases in our hospital involving diseases related to VTE and the onset site of DVT. In addition, I describe how we convey the results of the analyses to the clinical side.

  3. Verification of industrial x-ray machine: MINTs experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    Radiation and electrical safety of industrial x-ray equipment is required to meet Atomic Energy Licensing Board (AELB) guidelines (LEM/TEK/42) at the time of installation, and periodic verification must subsequently be ensured. The purpose of the guide is to explain the requirements for conducting tests on industrial x-ray apparatus and for certifying compliance with local legislation and regulation. Verification aims to provide safety assurance on electrical requirements and to minimize radiation exposure to the operator. This regulation applies to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by the AELB to provide verification services to private companies, government, and corporate bodies throughout Malaysia. In early January 1997, the AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) fulfil certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of their equipment. MINT's experience in measuring the performance of intermittent- and continuous-duty-rating single-phase industrial x-ray machines in 2004 indicated that all irradiating apparatus tested passed and met the requirements of the guideline. From MINT records for 1997 to 2005, three x-ray models did not meet the requirements and were thus not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. These verification procedures for the electrical and radiation safety of industrial x-ray equipment have significantly improved maintenance culture and safety awareness in the use of x-ray apparatus in the industrial environment. (Author)

  4. VERIFICATION TESTING OF AIR POLLUTION CONTROL TECHNOLOGY QUALITY MANAGEMENT PLAN

    Science.gov (United States)

    This document is the basis for quality assurance for the Air Pollution Control Technology Verification Center (APCT Center) operated under the U.S. Environmental Protection Agency (EPA). It describes the policies, organizational structure, responsibilities, procedures, and qualit...

  5. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of test autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them, with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.
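A minimal sketch of the kind of autoverification rule chain surveyed above, combining a verification-limit check with a Delta Check; the rule structure and all limits are invented for illustration and are not the survey's values:

```python
def delta_check(previous: float, current: float,
                abs_limit: float, rel_limit: float) -> str:
    """Flag a result whose change from the patient's previous value
    exceeds either an absolute or a relative (fractional) limit."""
    delta = abs(current - previous)
    if delta > abs_limit:
        return "review"
    if previous != 0 and delta / abs(previous) > rel_limit:
        return "review"
    return "autoverify"

def autoverify(result: float, ref_low: float, ref_high: float,
               previous=None, abs_limit=None, rel_limit=None) -> str:
    """Minimal rule chain: verification-limit check, then Delta Check."""
    if not (ref_low <= result <= ref_high):
        return "review"
    if previous is not None:
        return delta_check(previous, result, abs_limit, rel_limit)
    return "autoverify"
```

A result is released automatically ("autoverify") only if it lies within the verification limits and, when a previous value exists, its delta stays within both limits; anything else goes to technical or medical review.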

  6. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project `Verification of the thermal design of electronic equipment` studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the `Cool Electronics` research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  7. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient method of design verification of SOVs for design engineers who rely on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method for verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
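As an illustration of the kind of theoretical relationship that can back up a safety-factor check on the design attraction force, the classic electromagnet pull-force estimate F = B²A/(2μ₀) can be compared with the force the valve must develop (a generic sketch under that textbook formula; the paper's actual design relations are more detailed):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def pull_force(b_tesla: float, area_m2: float) -> float:
    """Electromagnet pull-force estimate: F = B^2 * A / (2 * mu0)."""
    return b_tesla ** 2 * area_m2 / (2 * MU0)

def attraction_safety_factor(b: float, area: float, required_n: float) -> float:
    """Design attraction force divided by the force the valve must develop."""
    return pull_force(b, area) / required_n
```

A safety factor comfortably above 1 indicates margin in the design attraction force; the verification step then confirms this margin against measured prototype data.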

  8. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm

    Directory of Open Access Journals (Sweden)

    Levi Kitchen

    2016-06-01

    Full Text Available Introduction: Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. Discussion: The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaits further evidence. Conclusion: When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient's risk factors for both thrombus propagation and complications of anticoagulation. [West J Emerg Med. 2016;17(4):384-390.]

  9. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm.

    Science.gov (United States)

    Kitchen, Levi; Lawrence, Matthew; Speicher, Matthew; Frumkin, Kenneth

    2016-07-01

    Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaits further evidence. When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient's risk factors for both thrombus propagation and complications of anticoagulation.
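The decision flow described in the two records above can be sketched as straight-line code (an illustrative sketch only, not clinically validated; the branch labels are invented, and the published Figure should be consulted for actual care):

```python
def suspected_dvt_pathway(proximal_dvt_present: bool,
                          wlus_available: bool,
                          d_dimer_positive: bool) -> str:
    """Decision sketch for ED management of suspected calf-vein DVT."""
    if proximal_dvt_present:
        # Proximal DVT can be diagnosed and treated at the first visit.
        return "diagnose and treat proximal DVT at initial encounter"
    if wlus_available:
        # Definitive testing settles the IC-DVT question directly.
        return "whole-leg duplex ultrasonography (WLUS)"
    if not d_dimer_positive:
        # A negative D-dimer makes IC-DVT unlikely.
        return "IC-DVT unlikely; no further imaging"
    # D-dimer positive, no WLUS: fall back to the deferred strategy.
    return "serial compression US or single deferred whole-leg US"
```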

  10. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code, we use a propagating underdriven detonation wave in 1-D. This is one of the only test cases for which an accurate solution can be determined from the theoretical structure of the solution. The solution consists of a steady ZND reaction-zone profile joined with a scale-invariant rarefaction (Taylor wave) and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock-detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
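The quantitative error measure used here, the discrete L2 norm of the difference between the numerical and exact pressure profiles, can be sketched as follows (a generic implementation, not xRage code):

```python
import math

def l2_error(numerical, exact, dx: float) -> float:
    """Discrete L2 norm of (numerical - exact) sampled on a uniform
    grid of spacing dx: sqrt(dx * sum((n_i - e_i)^2))."""
    return math.sqrt(dx * sum((n - e) ** 2 for n, e in zip(numerical, exact)))
```

On a uniform grid this approximates the continuous L2 norm of the error, giving a single scalar with which uniform-grid and adaptive-grid runs can be compared.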

  11. Verification of experimental modal modeling using HDR (Heissdampfreaktor) dynamic test data

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1983-01-01

    Experimental modal modeling involves the determination of the modal parameters of a structural model from recorded input-output data from dynamic tests. Though commercial modal analysis algorithms are widely used in many industries, their ability to identify a set of reliable modal parameters for an as-built nuclear power plant structure has not been systematically verified. This paper describes an effort to verify MODAL-PLUS, a widely used modal analysis code, using recorded data from the dynamic tests performed on the reactor building of the Heissdampfreaktor (HDR), situated near Frankfurt, Federal Republic of Germany. In the series of dynamic tests on the HDR in 1979, the reactor building was subjected to forced vibrations from different types and levels of dynamic excitation. Two sets of HDR containment building input-output data were chosen for MODAL-PLUS analyses. To reduce the influence of nonlinear behavior on the results, these sets were chosen so that the levels of excitation were relatively low and about the same in both sets. The attempted verification was only partially successful, in that only one modal model, with a limited range of validity, could be synthesized, and the goodness of fit could be verified only within this limited range.

  12. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, the design of the aerosol generation and measurement systems is explained, present circumstances are described, and the aerosol test plan is shown. The Containment Filtered Venting System (CFVS) is one of the safety features intended to reduce the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries, such as France, Germany, and Sweden, have demanded the installation of a CFVS, and a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting an integrated performance verification test of the CFVS. As part of this test, aerosol generation and measurement systems were designed to simulate fission-product behavior and were then manufactured. The component operating conditions were determined in consideration of severe accident conditions. The test will be performed under normal conditions at first, and then under severe conditions of high pressure and high temperature. Difficulties that could disturb the tests are expected, such as thermophoresis on the pipes and vapor condensation on the aerosol.

  13. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  14. FEFTRA™ verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analysis of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of its verification. In 2006 a project was launched with the objective of reorganising all the material related to the existing verification cases and placing it in the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added to the new testing system. The documentation of each case was rewritten with the LATEX document preparation system and added to the testing system in such a way that the whole test documentation (this report) can easily be generated in postscript or pdf format. The current report is an updated version of the verification report published in 2007. At the moment the report mainly includes cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semianalytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems.
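The automated testing system described, which recalculates cases, compares new results against stored approved results, and builds a summary of the test run, can be sketched as follows (case names, results, and the tolerance are illustrative, not FEFTRA's):

```python
def run_regression(case_names, compute, approved, tol=1e-9):
    """Compute each verification case, check it against the stored
    approved result, and return a pass/fail summary per case."""
    return {name: abs(compute(name) - approved[name]) <= tol
            for name in case_names}

# Toy usage: hypothetical case names with stored approved values.
approved = {"head_1d": 10.0, "transport_1d": 0.25}
summary = run_regression(approved, lambda name: approved[name], approved)
```

A version-controlled harness of this shape lets every code change be checked against the full case library before results are accepted.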

  15. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is the de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment-generation and abstraction approach can be utilized for efficient fault detection.

  16. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; these were grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
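The "divide and conquer" idea behind compositional verification can be illustrated with a toy contract check: each component is verified against its own pre/postcondition, and the composed system is then justified without end-to-end testing (all names and contracts below are invented for illustration):

```python
def check_contract(func, domain, pre, post):
    """Verify one component against its (pre, post) contract
    over a finite domain of inputs."""
    return all(post(x, func(x)) for x in domain if pre(x))

# Toy components with contracts.
def double(x):     return 2 * x    # postcondition: result is even
def half_even(x):  return x // 2   # precondition: x even; post: exact halving

domain = range(-50, 51)
f_ok = check_contract(double, domain,
                      pre=lambda x: True,
                      post=lambda x, y: y % 2 == 0)
g_ok = check_contract(half_even, domain,
                      pre=lambda x: x % 2 == 0,
                      post=lambda x, y: 2 * y == x)
# Because double's postcondition (even output) establishes half_even's
# precondition, half_even(double(x)) == x follows from the two local
# checks without verifying the composed system directly.
```

Scaled up, this is the assume-guarantee pattern: interface contracts let each component be checked in isolation, avoiding the state-space blow-up of whole-system analysis.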

  17. Verification Test Report for CFAST 3.1.6; TOPICAL

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2002-01-01

    Fire is a significant hazard in most facilities that handle radioactive materials. The severity of a fire varies with room arrangement, combustible loading, ventilation, and protective-system response, and the complexity of even simple situations can be unwieldy to solve by hand calculation. Computer simulation of fire severity has therefore become an important tool in characterizing fire risk. The Savannah River Site (SRS), a Department of Energy facility, has been using the Consolidated Model of Fire Growth and Smoke Transport (CFAST) software to complete such deterministic evaluations and better characterize nuclear facility fire severity. To fully utilize CFAST at SRS it is necessary to demonstrate that CFAST produces valid analytic solutions over its range of use. This report describes the primary verification exercise required to establish that CFAST, and its user interface program FAST, produce valid analytic solutions. This verification exercise may be used to check the functionality of FAST and as a training tool to familiarize users with the software. In addition, the report consolidates the lessons learned by the SRS staff in using FAST and CFAST as fire modeling tools.

  18. APPLICATION OF STEEL PIPE PILE LOADING TESTS TO DESIGN VERIFICATION OF FOUNDATION OF THE TOKYO GATE BRIDGE

    Science.gov (United States)

    Saitou, Yutaka; Kikuchi, Yoshiaki; Kusakabe, Osamu; Kiyomiya, Osamu; Yoneyama, Haruo; Kawakami, Taiji

    Steel pipe sheet pile foundations with large-diameter steel pipe sheet piles were used for the foundation of the main pier of the Tokyo Gate Bridge. For large-diameter steel pipe piles, however, the bearing mechanism, including the pile-tip plugging effect, is still unclear due to a lack of practical examinations, even though loading tests were performed for the Trans-Tokyo Bay Highway. In light of these problems, static pile loading tests in both the vertical and horizontal directions, a dynamic loading test, and cone penetration tests were conducted to determine proper design parameters of the ground for the foundations. Design parameters were determined rationally based on the test results, and rational design verification was obtained from this research.

  19. SORO post-simulations of Bruce A Unit 4 in-core flux detector verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Braverman, E.; Nainer, O. [Bruce Power, Nuclear Safety Analysis and Support Dept., Toronto, Ontario (Canada)]. E-mail: Evgeny.Braverman@brucepower.com; Ovidiu.Nainer@brucepower.com

    2004-07-01

    During the plant equipment assessment prior to requesting approval for the restart of Bruce A Units 3 and 4, it was determined that all in-core flux detectors needed to be replaced. Flux detector verification tests were performed to confirm that the newly installed detectors had been positioned according to design specifications and that their responses closely followed the calculated flux-shape changes caused by selected reactivity-mechanism movements. By comparing the measured and post-simulated RRS and NOP detector responses to various perturbations, it was confirmed that the new detectors are wired and positioned correctly. (author)

  20. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is the task of checking whether a given quantum state is close to an ideal state. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not assume that identically and independently distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among the copies in any artificial way. As applications, we consider the verification of quantum computational supremacy demonstrations with IQP models, and verifiable blind quantum computing.

  1. On the diagnosis of deep vein thrombosis

    International Nuclear Information System (INIS)

    Olsson, C.-G.

    1979-01-01

    Clinical and laboratory diagnostic methods were studied in 301 consecutive patients with suspected deep vein thrombosis (DVT). Unexpectedly, phlebography (the reference method) was found to cause DVT in an estimated 48% of patients without initial DVT; using a new type of contrast medium, however, no thrombotic complications were found. Neither clinical examination nor plethysmography was found to give reliable results. Using a modified technique for radioisotope detection, high sensitivity to DVT was found with the 125I-fibrinogen uptake test (within 2 days) and a newly developed 99mTc-plasmin test (within one hour). Since both tests showed low specificity, they are reliable as screening tests to exclude DVT, but not as independent diagnostic methods. (author)
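The trade-off noted here, where high sensitivity but low specificity makes a test suitable for exclusion rather than confirmation, rests on the standard definitions (the counts below are illustrative, not the study's data):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN): fraction of true DVT cases detected.
    Specificity = TN/(TN+FP): fraction of DVT-free patients correctly cleared."""
    return tp / (tp + fn), tn / (tn + fp)
```

With, say, 95 of 100 thromboses detected but half of the DVT-free patients flagged, a negative result reliably excludes DVT while a positive result still requires confirmatory testing.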

  2. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  3. Fluence complexity for IMRT field and simplification of IMRT verification

    International Nuclear Information System (INIS)

    Hanushova, Tereza; Vondarchek, Vladimir

    2013-01-01

    Intensity Modulated Radiation Therapy (IMRT) requires dosimetric verification of each patient's plan, which is time-consuming. This work explores minimizing the number of fields checked, or even replacing plan verification with machine quality assurance (QA). We propose methods for estimating the fluence complexity of an IMRT field from dose gradients and investigate the relation between this quantity and the results of gamma analysis. If such a relation exists, it might suffice to verify only the most complex field of a plan. We determine the average fluence complexity in clinical fields and design a test fluence of corresponding complexity that might be used in daily QA and potentially replace patient-related verification. Its applicability is assessed in clinical practice. The relation between fluence complexity and the results of gamma analysis has been confirmed for plans but not for single fields. The suggested test fluence agrees with clinical fields in the average gamma parameter. A critical value of average gamma has been specified for the test fluence as a criterion for distinguishing between poorly and well deliverable plans. Verifying only the most complex field of a plan will not be possible, but verification of individual plans could be replaced by a morning check of the suggested test fluence, together with a well-established set of QA tests. (Author)
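
    The abstract's idea of scoring a field's fluence complexity from dose gradients can be sketched as follows; the metric below (mean gradient magnitude of the fluence map) is a hypothetical stand-in for the authors' measure, not the paper's actual definition:

```python
import numpy as np

def fluence_complexity(fluence):
    """Score a 2D fluence map by its mean gradient magnitude
    (hypothetical complexity metric in the spirit of the abstract)."""
    gy, gx = np.gradient(fluence.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

flat = np.ones((10, 10))                          # unmodulated field
modulated = np.indices((10, 10)).sum(axis=0) % 2  # checkerboard pattern
assert fluence_complexity(flat) == 0.0
assert fluence_complexity(modulated) > fluence_complexity(flat)
```

    A flat field scores zero while a heavily modulated field scores higher, which is the ordering such a metric would need to single out the most complex field of a plan.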

  4. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  5. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing the transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was done of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
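
    The kind of differential equation system Tensit solves can be sketched with a minimal two-member decay chain, integrated numerically and checked against the analytic solution for the parent nuclide (the decay constants below are illustrative, not taken from the report):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Decay chain A -> B -> stable, with hypothetical decay constants [1/yr].
lam_a, lam_b = 0.5, 0.1

def decay_rhs(t, n):
    """Right-hand side of the decay/transport ODE system."""
    na, nb = n
    return [-lam_a * na, lam_a * na - lam_b * nb]

sol = solve_ivp(decay_rhs, (0.0, 10.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)

# The parent must follow the analytic solution N_A(t) = N0 * exp(-lam_a * t).
assert abs(sol.y[0, -1] - np.exp(-lam_a * 10.0)) < 1e-6
```

    Verifying the numerical solver against a known analytic case mirrors the report's strategy of checking Tensit against documented reference results.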

  6. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing the transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was done of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of

  7. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
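
    The technique the report recommends, acceptance sampling by variables, can be sketched with the one-sided s-method decision rule; the acceptability constant k and the specification limit below are illustrative values, not those of any NASA plan:

```python
import numpy as np

def accept_lot(sample, usl, k):
    """One-sided acceptance sampling by variables (s-method):
    accept the lot if xbar + k*s <= USL, where k is the plan's
    acceptability constant (illustrative value used below)."""
    xbar = np.mean(sample)
    s = np.std(sample, ddof=1)  # sample standard deviation
    return xbar + k * s <= usl

rng = np.random.default_rng(0)
good_lot = rng.normal(0.0, 1.0, size=30)  # comfortably inside the limit
bad_lot = rng.normal(9.5, 1.0, size=30)   # mean too close to the limit
assert accept_lot(good_lot, usl=10.0, k=1.9)
assert not accept_lot(bad_lot, usl=10.0, k=1.9)
```

    Because the rule uses the measured mean and standard deviation rather than a pass/fail count, it typically needs far fewer samples than the attributes method the report describes as resource-intensive.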

  8. Testing the algorithms for automatic identification of errors on the measured quantities of the nuclear power plant. Verification tests

    International Nuclear Information System (INIS)

    Svatek, J.

    1999-12-01

    During the development and implementation of supporting software for the control room and emergency control centre at the Dukovany nuclear power plant, it appeared necessary to validate the input quantities in order to assure the operating reliability of the software tools. Therefore, the development of software for validation of the measured quantities from the plant data sources was initiated, and the software had to be debugged and verified. The report contains the proposal for, and description of, the verification tests for testing the algorithms for automatic identification of errors on the observed quantities of the NPP by means of homemade validation software. In particular, the algorithms treated serve to validate the hot leg temperature at primary circuit loop no. 2 or 4 of the Dukovany-2 reactor unit, using data from the URAN and VK3 information systems recorded during 3 different days. (author)

  9. Z-2 Architecture Description and Requirements Verification Results

    Science.gov (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag

  10. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    Science.gov (United States)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  11. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    International Nuclear Information System (INIS)

    Nakanishi, K.K.

    1993-01-01

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology, with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories

  12. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define what tool to embed in XJML to execute FSPV.

  13. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    International Nuclear Information System (INIS)

    Weaver, Phyllis C.

    2012-01-01

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activities were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse

  14. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears as one of the solutions to the problems that occur with password-based data access: passwords may be forgotten, and recalling many different passwords is hard. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data. Facial biometrics is chosen for its low-cost implementation and its quite accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce dimensionality as well as encrypt the facial test image by representing the image as sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, and 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using an optimized sensing matrix.
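
    The OMP-plus-Euclidean-norm pipeline described in the abstract can be sketched as follows. This is a simplified illustration, not the authors' implementation: the dictionary is made orthonormal for a clean demonstration, and the sparse codes stand in for enrolled facial features.

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedily build a sparse code x
    with D @ x ~ y (simplified sketch of the OMP reconstruction step)."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(n_nonzero):
        # pick the dictionary atom most correlated with the residual
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
D, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # orthonormal "dictionary"
x_true = np.zeros(50)
x_true[[3, 17]] = [1.5, -2.0]                   # enrolled sparse code
y = D @ x_true                                  # stands in for the test image
x_hat = omp(D, y, n_nonzero=2)

# Verification step: accept when the Euclidean distance between the
# reconstructed code and the enrolled code is small.
assert np.linalg.norm(x_hat - x_true) < 1e-8
```

    With an orthonormal dictionary the recovery is exact; with the overcomplete sensing matrices of the paper, the distance threshold would be tuned to trade off the accuracies and response times reported in the abstract.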

  15. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: PHASE 1-ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®

    Science.gov (United States)

    Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...

  17. In-flight calibration and verification of the Planck-LFI instrument

    OpenAIRE

    Gregorio, Anna; Cuttaia, Francesco; Mennella, Aniello; Bersanelli, Marco; Maris, Michele; Meinhold, Peter; Sandri, Maura; Terenzi, Luca; Tomasi, Maurizio; Villa, Fabrizio; Frailis, Marco; Morgante, Gianluca; Pearson, Dave; Zacchei, Andrea; Battaglia, Paola

    2013-01-01

    In this paper we discuss the Planck-LFI in-flight calibration campaign. After a brief overview of the ground test campaigns, we describe in detail the calibration and performance verification (CPV) phase, carried out in space during and just after the cool-down of LFI. We discuss in detail the functionality verification, the tuning of the front-end and warm electronics, the preliminary performance assessment and the thermal susceptibility tests. The logic, sequence, goals and results of the i...

  18. VERIFICATION OF THE SENTINEL-4 FOCAL PLANE SUBSYSTEM

    Directory of Open Access Journals (Sweden)

    C. Williges

    2017-05-01

    Full Text Available The Sentinel-4 payload is a multi-spectral camera system which is designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for NIR (750 nm … 775 nm). In this publication, we present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs have to be operated at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors, as well as a reference detector, homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results are presented, showing that the Sentinel-4 FPS meets specifications.

  19. Verification of the Sentinel-4 focal plane subsystem

    Science.gov (United States)

    Williges, Christian; Uhlig, Mathias; Hilbert, Stefan; Rossmann, Hannes; Buchwinkler, Kevin; Babben, Steffen; Sebastian, Ilse; Hohn, Rüdiger; Reulke, Ralf

    2017-09-01

    The Sentinel-4 payload is a multi-spectral camera system, designed to monitor atmospheric conditions over Europe from a geostationary orbit. The German Aerospace Center, DLR Berlin, conducted the verification campaign of the Focal Plane Subsystem (FPS) during the second half of 2016. The FPS consists of two Focal Plane Assemblies (FPAs), two Front End Electronics (FEEs), one Front End Support Electronic (FSE) and one Instrument Control Unit (ICU). The FPAs are designed for two spectral ranges: UV-VIS (305 nm - 500 nm) and NIR (750 nm - 775 nm). In this publication, we present in detail the set-up of the verification campaign of the Sentinel-4 Qualification Model (QM). This set-up will also be used for the upcoming Flight Model (FM) verification, planned for early 2018. The FPAs have to be operated at 215 K +/- 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. The test campaign consists mainly of radiometric tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors, as well as a reference detector, homogeneously over a distance of approximately 1 m from outside the TVC. Selected test analyses and results are presented.

  20. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
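
    The lexical component of an ontology matcher like ASMOV can be illustrated with a toy example; the label normalization and string-similarity ratio below are illustrative stand-ins, not ASMOV's actual similarity calculation:

```python
from difflib import SequenceMatcher

def lexical_similarity(a, b):
    """Toy lexical measure: normalize concept labels, then compare
    them with a string-similarity ratio (illustrative only)."""
    norm = lambda s: s.lower().replace("_", " ").replace("-", " ")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def align(onto_a, onto_b, threshold=0.8):
    """Derive an alignment: for each concept in onto_a, keep the best
    match in onto_b if its similarity clears the threshold."""
    pairs = []
    for ca in onto_a:
        best = max(onto_b, key=lambda cb: lexical_similarity(ca, cb))
        if lexical_similarity(ca, best) >= threshold:
            pairs.append((ca, best))
    return pairs

matches = align(["Heart_Disease", "Blood-Pressure"],
                ["heart disease", "BloodPressure", "Lung"])
assert ("Heart_Disease", "heart disease") in matches
```

    ASMOV then combines such lexical scores with structural and extensional evidence and, crucially, verifies the candidate alignment for semantic inconsistencies before accepting it.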

  1. Inverse transport for the verification of the Comprehensive Nuclear Test Ban Treaty

    Directory of Open Access Journals (Sweden)

    J.-P. Issartel

    2003-01-01

    Full Text Available An international monitoring system is being built as a verification tool for the Comprehensive Test Ban Treaty. Forty stations will measure on a worldwide daily basis the concentration of radioactive noble gases. The paper introduces, by handling preliminary real data, a new approach of backtracking for the identification of sources of passive tracers after positive measurements. When several measurements are available the ambiguity about possible sources is reduced significantly. The approach is validated against ETEX data. A distinction is made between adjoint and inverse transport shown to be, indeed, different though equivalent ideas. As an interesting side result it is shown that, in the passive tracer dispersion equation, the diffusion stemming from a time symmetric turbulence is necessarily a self-adjoint operator, a result easily verified for the usual gradient closure, but more general.

  2. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  3. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  4. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a proposition of a practical procedure for reducing some negatives of the specification of requirements. The main problem considered is how to create a complete description of the system requirements without any negatives. Verification of the initially defined requirements is based on coloured Petri nets. These nets are useful for testing some properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  5. Electric and hybrid vehicle self-certification and verification procedures: Market Demonstration Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-01

    The process by which a manufacturer of an electric or hybrid vehicle certifies that his vehicle meets the DOE Performance Standards for Demonstration is described. Such certification is required for any vehicles to be purchased under the Market Demonstration Program. It also explains the verification testing process followed by DOE for testing to verify compliance. Finally, the document outlines manufacturer responsibilities and presents procedures for recertification of vehicles that have failed verification testing.

  6. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  7. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  8. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took
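
    The verification step described in this abstract reduces each region of interest to two scalars: the mean dose and the near-maximum dose D2 (the dose exceeded by only 2% of the voxels). A minimal sketch of that comparison follows; the function names and the 5% tolerance are illustrative assumptions, not values from the paper:

```python
import numpy as np

def dose_metrics(dose, mask):
    """Mean dose and near-maximum dose D2 (98th-percentile dose)
    inside a region-of-interest mask."""
    d = dose[mask]
    return d.mean(), np.percentile(d, 98.0)

def verify(planned, reconstructed, mask, tol=0.05):
    """Flag a delivery error when reconstructed metrics deviate from
    the planned ones by more than a relative tolerance (assumed 5%)."""
    pm, pd2 = dose_metrics(planned, mask)
    rm, rd2 = dose_metrics(reconstructed, mask)
    return abs(rm - pm) <= tol * pm and abs(rd2 - pd2) <= tol * pd2

# Toy 3D dose grids: a simulated 10% overdose should trip the check.
planned = np.full((20, 20, 20), 2.0)   # 2 Gy per fraction, uniform
mask = np.zeros_like(planned, dtype=bool)
mask[5:15, 5:15, 5:15] = True          # "target volume"
assert verify(planned, planned, mask)
assert not verify(planned, planned * 1.10, mask)
```

An error-free reconstruction passes both checks, while the 10% overdose fails both the mean-dose and the D2 comparison.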

  9. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  10. Verification of the HDR-test V44 using the computer program RALOC-MOD1/83

    International Nuclear Information System (INIS)

    Jahn, H.; Pham, T. v.; Weber, G.; Pham, B.T.

    1985-01-01

    RALOC-MOD1/83 was extended by a drainage and sump level module and several component models to serve as a containment systems code for various LWR types. One such application is to simulate the blowdown in a full pressure containment, which is important for the short- and long-term hydrogen distribution. The post-test calculation of the containment standard problem experiment HDR-V44 shows good agreement with the test data. The code may be used for short- and long-term predictions, but it was learned that double containments require the gap between the inner and outer shells to be represented by several zones to achieve a good long-term temperature prediction. The present work completes the development, verification and documentation of RALOC-MOD1. (orig.) [de

  11. A two-dimensional liquid-filled ionization chamber array prototype for small-field verification: characterization and first clinical tests

    International Nuclear Information System (INIS)

    Brualla-González, Luis; Vicedo, Aurora; Roselló, Joan V; Gómez, Faustino; González-Castaño, Diego M; Gago-Arias, Araceli; Pazos, Antonio; Zapata, Martín; Pardo-Montero, Juan

    2012-01-01

    In this work we present the design, characterization and first clinical tests of an in-house developed two-dimensional liquid-filled ionization chamber prototype for the verification of small radiotherapy fields and treatments containing such small fields as in radiosurgery, which consists of 2 mm × 2 mm pixels arranged on a 16×8 rectangular grid. The ionization medium is isooctane. The characterization of the device included the study of depth, field-size and dose-rate dependences, which are sufficiently moderate for a good operation at therapy radiation levels. However, the detector presents an important anisotropic response, up to ≃ 12% for front versus near-lateral incidence, which can impact the verification of full treatments with different incidences. In such a case, an anisotropy correction factor can be applied. Output factors of small square fields measured with the device show a small systematic over-response, less than 1%, when compared to unshielded diode measurements. An IMRT radiosurgery treatment has been acquired with the liquid-filled ionization chamber device and compared with film dosimetry by using the gamma method, showing good agreement: over 99% passing rates for 1.2% and 1.2 mm for an incidence-per-incidence analysis; 100% passing rates for tolerances 1.8% and 1.8 mm when the whole treatment is analysed and the anisotropy correction factor is applied. The point dose verification for each incidence of the treatment performed with the liquid-filled ionization chamber agrees within 1% with a CC01 ionization chamber. This prototype has shown the utility of this kind of technology for the verification of small fields/treatments. Currently, a larger device covering a 5 cm × 5 cm area is under development. (paper)
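
    The film comparison in this abstract uses the gamma method, which combines a dose-difference tolerance with a distance-to-agreement tolerance. A toy one-dimensional version illustrates the idea; the 1.2%/1.2 mm tolerances mirror the values quoted above, but the profile and implementation details are purely illustrative:

```python
import numpy as np

def gamma_pass_rate(ref, meas, xs, dose_tol, dist_tol):
    """1D global gamma analysis: a measured point passes if some
    reference point lies within the combined dose/distance ellipsoid
    (gamma <= 1)."""
    norm = dose_tol * ref.max()      # global dose normalization
    passed = 0
    for xm, dm in zip(xs, meas):
        gamma2 = ((ref - dm) / norm) ** 2 + ((xs - xm) / dist_tol) ** 2
        if gamma2.min() <= 1.0:
            passed += 1
    return passed / len(meas)

xs = np.linspace(0.0, 10.0, 101)          # positions in mm
ref = np.exp(-((xs - 5.0) / 2.0) ** 2)    # illustrative dose profile
assert gamma_pass_rate(ref, ref, xs, dose_tol=0.012, dist_tol=1.2) == 1.0
```

A measurement identical to the reference passes everywhere; scaling the measured profile up by 50% drops the passing rate below 100%, since high-dose points no longer find any reference point within tolerance.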

  12. Field test of short-notice random inspections for inventory-change verification at a low-enriched-uranium fuel-fabrication plant

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Moussalli, G.; Naegele, G.

    1995-01-01

    An approach of short-notice random inspections (SNRIs) for inventory-change verification can enhance the effectiveness and efficiency of international safeguards at natural or low-enriched uranium (LEU) fuel fabrication plants. According to this approach, the plant operator declares the contents of nuclear material items before knowing if an inspection will occur to verify them. Additionally, items about which declarations are newly made should remain available for verification for an agreed time. Then a statistical inference can be made from verification results for items verified during SNRIs to the entire populations, i.e. the entire strata, even if inspectors were not present when many items were received or produced. A six-month field test of the feasibility of such SNRIs took place at the Westinghouse Electric Corporation Commercial Nuclear Fuel Division during 1993. Westinghouse personnel made daily declarations about both feed and product items, uranium hexafluoride cylinders and finished fuel assemblies, using a custom-designed computer ''mailbox''. Safeguards inspectors from the IAEA conducted eight SNRIs to verify these declarations. They arrived unannounced at the plant, in most cases immediately after travel from Canada, where the IAEA maintains a regional office. Items from both strata were verified during the SNRIs by means of nondestructive assay equipment
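
    The statistical inference behind such random verification can be illustrated with the standard attribute-sampling result: the probability that a random sample of n verified items catches at least one of d falsified items in a stratum of N. This is textbook sampling statistics, not a formula taken from the field-test report:

```python
from math import comb

def detection_probability(N, n, d):
    """Probability that verifying a random sample of n items out of N,
    of which d are falsified, catches at least one falsified item:
    1 - C(N-d, n) / C(N, n)."""
    if d <= 0 or n <= 0:
        return 0.0
    if n > N - d:
        return 1.0          # sample must include a falsified item
    return 1.0 - comb(N - d, n) / comb(N, n)

# Verifying a random 20 of 100 items against one falsified item:
p = detection_probability(100, 20, 1)
assert abs(p - 0.20) < 1e-12
```

A single inspection of 20 out of 100 items gives a 20% chance of catching one falsified item; repeated SNRIs over the year raise the cumulative detection probability.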

  13. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Choi, Junho

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray-tracing algorithm, statistical analysis, tests on real-time system operation, and other technical evaluation processes...

  14. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  15. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  16. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the adoption of integrated modular avionics (IMA). While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications sharing processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate; the critical problem for IMA verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but completely testing a large, highly integrated avionics system is hardly feasible. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the test process and improving its efficiency, and consequently lowering the cost of IMA system integration.
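
    The core idea of compositional verification is to check each application once against its own assume-guarantee contract and then reason about the integrated system from the contracts alone, instead of re-testing every combination end to end. A hypothetical two-stage sketch (the components, contracts, and input ranges are invented for illustration):

```python
def check_component(behavior, assumption, guarantee, inputs):
    """Verify that for every input satisfying the assumption,
    the component's output satisfies its guarantee."""
    return all(guarantee(behavior(x)) for x in inputs if assumption(x))

# Hypothetical two-stage pipeline sharing one platform resource.
stage_a = lambda x: x * 2            # guarantees an even result
stage_b = lambda y: y + 1            # assumes an even input

a_ok = check_component(stage_a, lambda x: isinstance(x, int),
                       lambda y: y % 2 == 0, range(100))
b_ok = check_component(stage_b, lambda y: y % 2 == 0,
                       lambda z: z % 2 == 1, range(0, 200, 2))

# Contracts compose: stage A's guarantee implies stage B's assumption,
# so the pipeline is verified without exhaustive end-to-end testing.
assert a_ok and b_ok
```

Each component is checked in isolation against its contract; integration testing then reduces to checking that the guarantees of upstream components discharge the assumptions of downstream ones.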

  17. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF 6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  18. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  19. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
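
    Run-time monitoring combined with a fault-tolerant recovery path, as mentioned at the end of this survey, can be sketched as a wrapper that checks an invariant on each result and diverts violations to a fallback. The quaternion-normalization invariant and fallback value below are hypothetical examples, not taken from any flight code:

```python
def monitored(invariant, fallback):
    """Wrap a function so each result is checked at run time;
    invariant violations are routed to a recovery path instead of
    letting a (e.g. radiation-induced) error propagate."""
    def wrap(fn):
        def inner(*args):
            result = fn(*args)
            if not invariant(result):
                return fallback(*args)   # fault-tolerant recovery
            return result
        return inner
    return wrap

@monitored(invariant=lambda q: abs(sum(c * c for c in q) - 1.0) < 1e-6,
           fallback=lambda *_: (1.0, 0.0, 0.0, 0.0))  # identity attitude
def attitude_quaternion(raw):
    # Hypothetical estimator; a bit flip could de-normalize the state.
    return tuple(raw)

assert attitude_quaternion((0.0, 1.0, 0.0, 0.0)) == (0.0, 1.0, 0.0, 0.0)
assert attitude_quaternion((9.0, 1.0, 0.0, 0.0)) == (1.0, 0.0, 0.0, 0.0)
```

A healthy unit quaternion passes through unchanged, while a corrupted state is replaced by the safe fallback, which is the kind of error the survey notes can escape pre-launch verification.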

  20. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at their own installations, that the calibration of their instruments has been maintained, avoiding the use of equipment with an inadequate response to the neutron beam

  1. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    Science.gov (United States)

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  2. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
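
    Code verification in this sense rests on convergence analysis: refining the discretization and checking that the error shrinks at the rate the scheme promises. A common check is the observed order of convergence computed from errors on two grids; the error values below are invented for a nominally second-order scheme:

```python
import math

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    """Observed order of convergence p from errors e on two grid
    spacings h, using e ~ C * h**p."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Hypothetical errors against an exact solution on grids h and h/2:
p = observed_order(e_coarse=4.0e-4, e_fine=1.0e-4,
                   h_coarse=0.1, h_fine=0.05)
assert abs(p - 2.0) < 1e-12   # error drops 4x when h halves
```

If the observed order matches the design order of the discretization, that is evidence the models and algorithms are implemented correctly; a mismatch signals a coding or consistency error.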

  3. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations that provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  4. Some New Verification Test Problems for Multimaterial Diffusion on Meshes that are Non-Aligned with Material Boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Dawes, Alan Sidney [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Malone, Christopher M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shashkov, Mikhail Jurievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-07

    In this report a number of new verification test problems for multimaterial diffusion will be shown. Using them we will show that homogenization of multimaterial cells in either Arbitrary Lagrangian Eulerian (ALE) or Eulerian simulations can lead to errors in the energy flow at the interfaces. Results will be presented that show that significant improvements and predictive capability can be gained by using either a surrogate supermesh, such as Thin Mesh in FLAG, or the emerging method based on Static Condensation.
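
    The energy-flow errors at material interfaces discussed above are exposed by comparing against problems with exact solutions. For steady one-dimensional diffusion through two materials, flux continuity fixes the interface value in closed form; this textbook setup (not one of the report's specific test problems) shows the kind of analytic reference such verification relies on:

```python
def interface_value(k1, k2, u_left, u_right, L1, L2):
    """Exact interface value for steady 1D diffusion through two
    slabs of conductivity k1, k2 and thickness L1, L2, from flux
    continuity: k1*(u_i - u_left)/L1 = k2*(u_right - u_i)/L2."""
    a, b = k1 / L1, k2 / L2
    return (a * u_left + b * u_right) / (a + b)

# Two equal slabs with a 10:1 conductivity jump at the interface:
u_i = interface_value(k1=1.0, k2=10.0, u_left=0.0, u_right=1.0,
                      L1=0.5, L2=0.5)
assert abs(u_i - 10.0 / 11.0) < 1e-12
```

A scheme that homogenizes the two materials into one mixed cell will generally miss this interface value and the associated flux, which is exactly the error mode the new test problems are designed to reveal.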

  5. Evaluation of a new computerized psychometric test battery: Effects of zolpidem and caffeine.

    Science.gov (United States)

    Pilli, Raveendranadh; Naidu, Mur; Pingali, Usharani; Shobha, Jc

    2013-10-01

    To evaluate the effects of centrally active drugs using a new, indigenously developed automated psychometric test system and compare the results with those obtained using pencil-and-paper-based techniques. The tests were standardized in 24 healthy participants. Reproducibility of the test procedure was evaluated by having a single experimenter perform the tests on two occasions (interday reproducibility). To evaluate the sensitivity of the tests, the effects of zolpidem (5 mg) and caffeine (500 mg) versus placebo were studied in 24 healthy participants in a randomized, double-blind, three-way crossover design. Psychometric tests were performed at baseline and at 1, 2, and 3 h after administration of study medication. The effects of zolpidem and caffeine on psychomotor performance were most pronounced 1 h after administration. At this time, a significant impairment of performance in the simple reaction test (SRT), choice discrimination test (CDT), digit symbol substitution test (DSST), digit vigilance test (DVT), and card sorting test (CST) was observed with zolpidem. In contrast, caffeine showed a significant improvement in performance in the CDT and DVT only. The results suggest that the tests of the computerized system are more sensitive and reliable than the pencil-and-paper tests in detecting the effects of centrally acting agents and are suitable for use in clinical areas to conduct studies with patients.

  6. Charpy impact test results on five materials and NIST verification specimens using instrumented 2-mm and 8-mm strikers

    International Nuclear Information System (INIS)

    Nanstad, R.K.; Sokolov, M.A.

    1995-01-01

    The Heavy-Section Steel Irradiation Program at Oak Ridge National Laboratory is involved in two cooperative projects, with international participants, both of which involve Charpy V-notch impact tests with instrumented strikers of 2mm and 8mm radii. Two heats of A 533 grade B class I pressure vessel steel and a low upper-shelf (LUS) submerged-arc (SA) weld were tested on the same Charpy machine, while one heat of a Russian Cr-Mo-V forging steel and a high upper-shelf (HUS) SA weld were tested on two different machines. The number of replicate tests at any one temperature ranged from 2 to 46 specimens. Prior to testing with each striker, verification specimens at the low, high, and super high energy levels from the National Institute of Standards and Technology (NIST) were tested. In the two series of verification tests, the tests with the 2mm striker met the requirements at the low and high energy levels but not at the super high energy. For one plate, the 2mm striker showed somewhat higher average absorbed energies than those for the 8-mm striker at all three test temperatures. For the second plate and the LUS weld, however, the 2mm striker showed somewhat lower energies at both test temperatures. For the Russian forging steel and the HUS weld, tests were conducted over a range of temperatures with tests at one laboratory using the 8mm striker and tests at a second laboratory using the 2mm striker. Lateral expansion was measured for all specimens and the results are compared with the absorbed energy results. The overall results showed generally good agreement (within one standard deviation) in energy measurements by the two strikers. Load-time traces from the instrumented strikers were also compared and used to estimate shear fracture percentage. Four different formulas from the European Structural Integrity Society draft standard for instrumented Charpy test are compared and a new formula is proposed for estimation of percent shear from the force-time trace

  7. Design verification and acceptance tests of the ASST-A helium refrigeration system

    International Nuclear Information System (INIS)

    Ganni, V.; Apparao, T.V.V.R.

    1993-07-01

    Three similar helium refrigerator systems have been installed at the Superconducting Super Collider Laboratory (SSCL) N15 site: the ASST-A system, which will be used for the accelerator system's full cell string test; the N15-B system, which will be used for string testing in the tunnel; and a third plant, dedicated to magnet testing at the Magnet Testing Laboratory. The ASST-A and N15-B systems will ultimately be a part of the collider's N15 sector station equipment. Each of these three systems has many subsystems, but the design basis for the main refrigerator is the same. Each system has a guaranteed capacity of 2000 W of refrigeration and 20 g/s liquefaction at 4.5 K. The testing and design verification of the ASST-A refrigeration system consisted of parametric tests on the compressors and the total system. A summary of the initial performance test data is given in this paper. The tests were conducted for two cases: in the first, all four compressors were operating; in the second, only one compressor in each stage was operating. In each case, tests were conducted in three modes of operation described later on. The process design basis supplied by the manufacturers and used in the design of the main components (the compressor, and the expanders and heat exchangers for the coldbox) was used to reduce the actual test data using process simulation methodology. In addition, the test results and the process design submitted by the manufacturer were analyzed using exergy analysis. This paper presents both the process and the exergy analyses of the manufacturer's design and the actual test data for Case 1. The process analyses are presented in the form of T-S diagrams. The results of the exergy analyses comparing the exergy losses of each component and the total system for the manufacturer's design and the test data are presented in the tables

  8. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident
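
    The validation strategy described here, comparing module outputs against a previously verified version, is essentially a regression check with a numerical tolerance. A minimal sketch; the output values and the tolerance are illustrative, not from the RADMODL report:

```python
import math

def regression_check(reference, candidate, rel_tol=1e-6):
    """Compare a module's outputs element-by-element against a
    previously verified reference within a relative tolerance."""
    return len(reference) == len(candidate) and all(
        math.isclose(r, c, rel_tol=rel_tol)
        for r, c in zip(reference, candidate))

verified_v0 = [1.00000000, 2.50000000, 0.12500000]   # prior version
candidate_v1 = [1.00000001, 2.50000002, 0.12500000]  # new build
assert regression_check(verified_v0, candidate_v1)
assert not regression_check(verified_v0, [1.1, 2.5, 0.125])
```

Small round-off-level differences between builds pass, while a genuine change in computed doses is flagged for investigation.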

  9. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  10. Expose : procedure and results of the joint experiment verification tests

    Science.gov (United States)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005 and it is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The either vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. These results are essential for the success of the EXPOSE mission, and the tests have been carried out in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of the organic chemistry processes in space, the biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.

  11. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of an FMCT verification provision. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three states that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Some safeguards measures, such as environmental sampling, might eventually be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  12. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular-patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole workpiece-measurement process is simulated by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing an ideal new platform for testing complex workpiece-measuring software without calibrated artifacts.
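
    A virtual workpiece of this kind works because its nominal geometry is known exactly, so every simulated measurement has an analytic reference. For an involute profile the parametric form is closed-form; the base-circle radius and sampling below are illustrative, not taken from the paper:

```python
import math

def involute_profile(r_base, t_max, n=50):
    """Sample points on the involute of a base circle of radius
    r_base: x = r(cos t + t sin t), y = r(sin t - t cos t)."""
    pts = []
    for i in range(n + 1):
        t = t_max * i / n
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

pts = involute_profile(r_base=40.0, t_max=0.8)
# Closed-form check: every involute point lies at distance
# r_base * sqrt(1 + t^2) from the gear centre.
for i, (x, y) in enumerate(pts):
    t = 0.8 * i / 50
    assert abs(math.hypot(x, y) - 40.0 * math.sqrt(1 + t * t)) < 1e-9
```

Because the nominal profile is exact, any deviation reported by the measuring software against this virtual workpiece is attributable to the software itself, which is the separation the VGMI approach is after.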

  13. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of measurement-system performance. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
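
    A minimal sketch of the remeasurement idea, assuming a simple paired-difference model and a 3-standard-error significance criterion (both are illustrative assumptions, not the inspectorate's actual procedure):

```python
import math
import statistics

def remeasurement_check(operator, inspector, k=3.0):
    # Paired differences between operator declarations and inspector
    # remeasurements of the same items
    d = [o - i for o, i in zip(operator, inspector)]
    mean_d = statistics.mean(d)        # estimate of systematic error (bias)
    sd = statistics.stdev(d)           # spread due to random errors
    se = sd / math.sqrt(len(d))        # standard error of the mean
    # Flag the measurement system if the apparent bias is significant
    return mean_d, sd, abs(mean_d) > k * se
```

A consistent operator/inspector pair leaves the flag clear; a constant offset between the two data sets raises it, pointing at a systematic error to be resolved before closing the balance.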

  14. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  15. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  16. Field test of short-notice random inspections for inventory-change verification at a low-enriched-uranium fuel-fabrication plant: Preliminary summary

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Moussalli, G.; Naegele, G.; Ikonomou, P.; Hosoya, M.; Scott, P.; Fager, J.; Sanders, C.; Colwell, D.; Joyner, C.J.

    1994-01-01

    An approach of short-notice random inspections (SNRIs) for inventory-change verification can enhance the effectiveness and efficiency of international safeguards at natural or low-enriched uranium (LEU) fuel fabrication plants. Under this approach, the plant operator declares the contents of nuclear material items before knowing whether an inspection will occur to verify them; items about which declarations are newly made must remain available for verification for an agreed time. This report details a six-month field test of the feasibility of such SNRIs, which took place at the Westinghouse Electric Corporation Commercial Nuclear Fuel Division. Westinghouse personnel made daily declarations about both feed and product items, uranium hexafluoride cylinders and finished fuel assemblies, using a custom-designed computer ''mailbox''. Safeguards inspectors from the IAEA conducted eight SNRIs to verify these declarations, and items from both strata were verified during the SNRIs by means of nondestructive assay equipment. The field test demonstrated the feasibility and practicality of key elements of the SNRI approach for a large LEU fuel fabrication plant.
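
    The randomness that gives SNRIs their deterrent value can be sketched as an unpredictable draw of inspection days over the campaign (a hypothetical sketch; the campaign length, inspection count, and seed are illustrative, not the field-test parameters):

```python
import random

def plan_snris(campaign_days, n_inspections, seed=7):
    # Inspection days drawn uniformly without replacement, so the operator
    # cannot predict which day's mailbox declarations will be verified
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, campaign_days + 1), n_inspections))
```

Because every daily declaration faces the same chance of being selected, falsifying any one of them carries a known detection risk, which is the core of the approach's effectiveness.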

  17. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif
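
    A miniature of what a model checker does can be sketched with a toy pump/valve interlock (the model and the safety property are invented for illustration and say nothing about PLCverif internals): exhaustively explore every reachable state and search for an invariant violation.

```python
from collections import deque

def check_invariant(initial, step, invariant):
    # Exhaustive breadth-first search of the reachable state space;
    # returns a counterexample trace if the invariant can be violated
    frontier = deque([(initial, (initial,))])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not invariant(state):
            return trace
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + (nxt,)))
    return None

# Toy interlock, state = (pump_on, valve_open): the valve may only be
# toggled while the pump is off, and the pump only while the valve is open
def step(state):
    pump, valve = state
    if not pump:
        yield (pump, not valve)
    if valve:
        yield (not pump, valve)

# Safety property: the pump never runs against a closed valve
result = check_invariant((False, False), step,
                         lambda s: not (s[0] and not s[1]))
```

Unlike testing, which samples behaviours, this search covers every reachable state, which is why model checking complements testing for critical logic, and also why it demands so much computing power on realistic models.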

  18. Verification of consumers' experiences and perceptions of genetic discrimination and its impact on utilization of genetic testing.

    Science.gov (United States)

    Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret

    2009-03-01

    To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and the impact of fear of such treatment, was determined, with consent, through interview, document analysis and where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), applications for worker's compensation (1) and early release from prison (1) and in two cases of fear of discrimination impacting on access to genetic testing. Relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to standard rate after intervention with insurers by genetics health professionals was verified. The mismatch between consumer and third party accounts in three life insurance incidents involved miscommunication or lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.

  19. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    Science.gov (United States)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  20. National Accounts Energy Alliance : Field test and verification of CHP components and systems

    Energy Technology Data Exchange (ETDEWEB)

    Sweetser, R. [Exergy Partners Corporation, Herndon, VA (United States)

    2003-07-01

    Exergy is a consulting firm that specializes in capitalizing on opportunities arising from the nexus of utility deregulation and global climate change in both the construction and energy industries. The firm offers assistance in technical, business, and market planning, product development, and high-impact marketing and technology transfer programs. The author discussed the National Accounts Energy Alliance (NAEA) program on distributed energy resources (DER) and identified some advantageous areas such as homeland security (fewer possible terrorist targets to be protected), food safety (protection of the food supply and delivery system), reliability, power quality, energy density, grid congestion, and energy price. In the future, an essential role in moderating energy prices for commercial buildings will probably be played by distributed generation (DG) and combined heat and power (CHP). The technical merits of these technologies are being investigated by national accounts and utilities partnering with non-profit organizations, the United States Department of Energy (US DOE), state governments, and industry. In that light, an Alliance program was developed in 2001 that allows investors to broaden their knowledge from the application and verification of advanced energy technologies. This program was the result of a synergy between the American Gas Foundation and the Gas Technology Institute (GTI), and it assists investors with their strategic planning. It was proven that a customer-led Energy Technology Test and Verification Program (TA and VP) could be cost-effective and successful. The NAEA activities at five locations were reviewed and discussed: (1) Russell Development, Portland, Oregon; (2) A and P-Waldbaums, Hauppage, New York; (3) HEB, Southern Texas; (4) Cinemark, Plano, Texas; and (5) McDonald's, Tampa, Florida. 4 tabs., figs.

  1. General-purpose heat source safety verification test series: SVT-11 through SVT-13

    International Nuclear Information System (INIS)

    George, T.G.; Pavone, D.

    1986-05-01

    The General-Purpose Heat Source (GPHS) is a modular component of the radioisotope thermoelectric generator that will provide power for the Galileo and Ulysses (formerly ISPM) space missions. The GPHS provides power by transmitting the heat of 238Pu α-decay to an array of thermoelectric elements. Because the possibility of an orbital abort always exists, the heat source was designed and constructed to minimize plutonia release in any accident environment. The Safety Verification Test (SVT) series was formulated to evaluate the effectiveness of GPHS plutonia containment after atmospheric reentry and Earth impact. The first two reports (covering SVT-1 through SVT-10) described the results of flat, side-on, and angular module impacts against steel targets at 54 m/s. This report describes flat-on module impacts against concrete and granite targets, at velocities equivalent to or higher than previous SVTs.

  2. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience across the software development life cycle. The main activities of the DRPS V and V process are preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS), and code; and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V activities for the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. SDS V and V activities for the DRPS are the same, except that an integrated software test plan is prepared. Code V and V activities for the DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis uses the Hazard and Operability (HAZOP) method at the SRS phase, HAZOP and Fault Tree Analysis (FTA) at the SDS phase, and FTA at the implementation phase. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe the functionality, performance, reliability, and safety required of the safety software can be achieved.
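
    The traceability analysis performed at each phase can be sketched as a gap report over requirement-to-design and design-to-test mappings (the identifiers such as SRS-1 are hypothetical):

```python
def traceability_gaps(req_to_design, design_to_test):
    # Requirements with no covering design element, and design elements
    # with no covering test case
    orphan_reqs = sorted(r for r, ds in req_to_design.items() if not ds)
    untested = sorted({d for ds in req_to_design.values() for d in ds
                       if not design_to_test.get(d)})
    return orphan_reqs, untested
```

Running such a report at the SRS, SDS, and code phases is what keeps every requirement linked forward to a design element and a test.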

  3. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues of software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches that involve verification activities at the register-transfer level (RTL), gate level, and place-and-route stages. Writing test benches is considerably time consuming and requires much effort to achieve satisfactory results, and performing verification separately at each stage is a major bottleneck, demanding many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that verifies all design modules simultaneously using MATLAB/Simulink HDL co-simulation models, and we discuss how this technique can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.
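
    Stripped to its essence, co-simulation drives a reference ("golden") model and the implementation under test with the same stimuli and compares outputs. A minimal sketch, assuming a 2-out-of-3 voter as the design (an invented example, not the paper's system, and plain Python standing in for the MATLAB/Simulink-HDL pairing):

```python
from itertools import product

def golden_voter(a, b, c):
    # Reference ("golden") model: 2-out-of-3 voter, a common safety
    # I and C primitive
    return (a and b) or (a and c) or (b and c)

def dut_voter(a, b, c):
    # Implementation under test, written gate-style as it might appear
    # after HDL synthesis
    return (a & b) | (a & c) | (b & c)

def cosimulate():
    # Drive both models with every stimulus and collect mismatches
    return [(a, b, c) for a, b, c in product((0, 1), repeat=3)
            if bool(golden_voter(a, b, c)) != bool(dut_voter(a, b, c))]
```

An empty mismatch list over the full stimulus set is the co-simulation verdict that the design modules agree with the reference behaviour.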

  4. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues of software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches that involve verification activities at the register-transfer level (RTL), gate level, and place-and-route stages. Writing test benches is considerably time consuming and requires much effort to achieve satisfactory results, and performing verification separately at each stage is a major bottleneck, demanding many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that verifies all design modules simultaneously using MATLAB/Simulink HDL co-simulation models, and we discuss how this technique can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  5. Sierra/Aria 4.48 Verification Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Thermal Fluid Development Team

    2018-04-01

    Presented in this document is a portion of the tests in the Sierra Thermal/Fluids verification test suite. Each test is run nightly with the Sierra/TF code suite, and the results are checked under mesh refinement against the known analytic result. For each test presented in this document, the test setup, the derivation of the analytic solution, and the comparison of code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified, or it can be referenced as a compilation of example problems.
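
    The mesh-refinement check described here is commonly summarized as an observed order of accuracy. A minimal sketch, assuming errors measured against the analytic solution on a sequence of uniformly refined meshes:

```python
import math

def observed_orders(errors, refinement_ratio=2.0):
    # Observed order of accuracy p from successive errors e_h, e_{h/r}:
    #   p = log(e_h / e_{h/r}) / log(r)
    # where r is the mesh refinement ratio between adjacent levels
    return [math.log(e1 / e2) / math.log(refinement_ratio)
            for e1, e2 in zip(errors, errors[1:])]
```

For a scheme whose error drops by 4x with each halving of the mesh spacing, the observed order converges to 2, confirming the nominal second-order accuracy.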

  6. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements showed a very substantial benefit in coding efficiency for the SCC extensions and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration and colour sampling format.
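
    The Bjøntegaard-delta (BD) bit-rate metric used in these results can be sketched as follows, assuming the usual cubic fit of log-rate versus PSNR and integration over the overlapping quality range (a generic sketch of the metric, not the evaluation scripts used for the verification test):

```python
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    # Bjontegaard delta bit-rate: average percent rate difference between
    # two codecs at equal PSNR, from cubic fits of log-rate vs. quality
    fit_ref = np.polyfit(psnr_ref, np.log(rates_ref), 3)
    fit_test = np.polyfit(psnr_test, np.log(rates_test), 3)
    lo = max(min(psnr_ref), min(psnr_test))   # overlapping PSNR range
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref, int_test = np.polyint(fit_ref), np.polyint(fit_test)
    avg = (np.polyval(int_test, hi) - np.polyval(int_test, lo)
           - np.polyval(int_ref, hi) + np.polyval(int_ref, lo)) / (hi - lo)
    return float((np.exp(avg) - 1.0) * 100.0)
```

A codec that needs 20% less rate than the reference at every quality level reports a BD-rate of -20%, matching the sign convention of the savings quoted above.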

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  8. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers who employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes, including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners, and other Fluidity users an easy-to-use platform to collaborate, and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing the code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions.
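
    The regression-test part of such a system can be sketched as a tolerance check of a new run against stored reference ("gold") output (the tolerances are illustrative assumptions, not Fluidity's settings):

```python
def regression_failures(current, reference, rtol=1e-8, atol=1e-12):
    # Indices where a new run's output drifts from the stored gold output
    # beyond a combined absolute/relative tolerance
    return [i for i, (c, r) in enumerate(zip(current, reference))
            if abs(c - r) > atol + rtol * abs(r)]
```

Run nightly against every stored result, a check like this is what turns verification into a continuous process rather than a one-off event.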

  9. Final Report for 'Verification and Validation of Radiation Hydrodynamics for Astrophysical Applications'

    International Nuclear Information System (INIS)

    Zingale, M.; Howell, L.H.

    2010-01-01

    The motivation for this work is to gain experience in the methodology of verification and validation (V and V) of astrophysical radiation hydrodynamics codes. In the first period of this work, we focused on building the infrastructure to test a single astrophysical application code, Castro, developed in collaboration between Lawrence Livermore National Laboratory (LLNL) and Lawrence Berkeley Laboratory (LBL). We delivered several hydrodynamic test problems, in the form of coded initial conditions and documentation for verification, routines to perform data analysis, and a generalized regression test suite to allow for continued automated testing. Astrophysical simulation codes aim to model phenomena that elude direct experimentation. Our only direct information about these systems comes from what we observe, and may be transient. Simulation can help further our understanding by allowing virtual experimentation of these systems. However, to have confidence in our simulations requires us to have confidence in the tools we use. Verification and Validation is a process by which we work to build confidence that a simulation code is accurately representing reality. V and V is a multistep process, and is never really complete. Once a single test problem is working as desired (i.e. that problem is verified), one wants to ensure that subsequent code changes do not break that test. At the same time, one must also search for new verification problems that test the code in a new way. It can be rather tedious to manually retest each of the problems, so before going too far with V and V, it is desirable to have an automated test suite. Our project aims to provide these basic tools for astrophysical radiation hydrodynamics codes.

  10. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  11. Engineering Trade-off Considerations Regarding Design-for-Security, Design-for-Verification, and Design-for-Test

    Science.gov (United States)

    Berg, Melanie; Label, Kenneth

    2018-01-01

    The United States government has identified that application specific integrated circuit (ASIC) and field programmable gate array (FPGA) hardware are at risk from a variety of adversary attacks. This finding affects system security and trust. Consequently, processes are being developed for system mitigation and countermeasure application. The scope of this tutorial pertains to potential vulnerabilities and countermeasures within the ASIC/FPGA design cycle. The presentation demonstrates how design practices can affect the risk that an adversary could change circuitry, steal intellectual property, or listen to data operations. An important portion of the design cycle is assuring that the design works as specified or as expected, which is accomplished by exhaustive testing of the target design. However, it has been shown that well-established schemes for test coverage enhancement (design-for-verification (DFV) and design-for-test (DFT)) can create conduits for adversary accessibility. As a result, it is essential to perform a trade-off between robust test coverage and reliable design implementation. The goal of this tutorial is to explain the evolution of design practices; review adversary accessibility points due to DFV and DFT circuitry insertion (back-door circuitry); and describe common engineering trade-off considerations for test versus adversary threats.

  12. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the verification and validation (V and V) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The software V and V (SVV) criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028, and BTP-14, and they form the acceptance framework provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; a New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA), and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  13. Safety of a DVT chemoprophylaxis protocol following traumatic brain injury: a single center quality improvement initiative.

    Science.gov (United States)

    Nickele, Christopher M; Kamps, Timothy K; Medow, Joshua E

    2013-04-01

    The rate of significant deep venous thrombosis (DVT) was 6.9 % (6 of 87). Three protocol patients (3.45 %) went to the operating room for surgery after the initiation of PTP; none of these patients had a measurable change in hemorrhage size on head CT. The protocol significantly increased the percentage of patients receiving PTP; although the average days to first PTP dose trended down with institution of the protocol, this change was not statistically significant. A PTP protocol in the NSICU is useful in controlling the number of complications from DVT and pulmonary embolism while avoiding additional IH. This protocol, based on a published body of literature, allowed for VTE rates similar to published rates, while having no PTP-related hemorrhage expansion. The protocol significantly changed physician behavior, increasing the percentage of patients receiving PTP during their hospitalization; whether long-term patient outcomes are affected is a potential goal for future study.

  14. A saddle-point for data verification and materials accountancy to control nuclear material

    International Nuclear Information System (INIS)

    Beedgen, R.

    1983-01-01

    Materials accountancy is one of the main elements in international safeguards to determine whether or not nuclear material has been diverted in nuclear plants. The inspector makes independent measurements to verify the plant-operator's data before closing the materials balance with the operator's data. All inspection statements are in principle probability statements because of random errors in measuring the material and verification on a random sampling basis. Statistical test procedures help the inspector to decide under this uncertainty. In this paper a statistical test procedure representing a saddle-point is presented that leads to the highest guaranteed detection probability taking all concealing strategies into account. There are arguments favoring a separate statistical evaluation of data verification and materials accountancy. Following these considerations, a bivariate test procedure is explained that evaluates verification and accountancy separately. (orig.)
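
    For the random-sampling component of data verification, the guaranteed detection probability has a simple closed form: if the inspector verifies n of N items and r are falsified, the chance of catching at least one is 1 - C(N-r, n)/C(N, n). A sketch (illustrative of the sampling statistics only, not the paper's saddle-point procedure):

```python
from math import comb

def detection_probability(N, r, n):
    # Probability that verifying a simple random sample of n items out of
    # N detects at least one of r falsified items
    if r == 0:
        return 0.0
    if n > N - r:
        return 1.0  # sample cannot avoid every falsified item
    return 1.0 - comb(N - r, n) / comb(N, n)
```

The operator's best concealment strategy is to falsify as few items as possible by as much as possible, which is why guaranteed-detection arguments must consider all concealing strategies, as the paper does.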

  15. Battery algorithm verification and development using hardware-in-the-loop testing

    Science.gov (United States)

    He, Yongsheng; Liu, Wei; Koch, Brain J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs.

  16. Battery algorithm verification and development using hardware-in-the-loop testing

    Energy Technology Data Exchange (ETDEWEB)

    He, Yongsheng [General Motors Global Research and Development, 30500 Mound Road, MC 480-106-252, Warren, MI 48090 (United States); Liu, Wei; Koch, Brain J. [General Motors Global Vehicle Engineering, Warren, MI 48090 (United States)

    2010-05-01

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate future battery algorithm development and improvement, and to reduce hybrid/electric vehicle system development time and costs. (author)

  17. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  18. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C system design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  19. NbTi Strands Verification for ITER PF CICC Process Qualification of CNDA

    Science.gov (United States)

    Liu, F.; Liu, H.; Liu, S.; Liu, B.; Lei, L.; Wu, Y.

    2014-05-01

    China is in charge of most of the Poloidal Field (PF) conductor production for the International Thermonuclear Experimental Reactor (ITER). The execution of the PF conductors proceeds in three main phases. According to the ITER Procurement Arrangement (PA), the Domestic Agency (DA) is required to verify the room- and low-temperature acceptance tests carried out by the strand suppliers. As the reference laboratory of the Chinese DA (CNDA), the superconducting strands test laboratory of the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP) undertook the task of strand verification for the ITER conductors. The verification tests include: diameter, nickel plating thickness, copper-to-non-copper volume ratio, twist pitch direction and length, standard critical current (IC) and resistive transition index (n), residual resistance ratio (RRR), and hysteresis loss. 48 NbTi strands from 7 billets were supplied for the PF Cable-In-Conduit Conductor (CICC) process qualification. In total, 54 samples were measured. The verification level for the PF CICC process qualification was 100%. The test method, facility and results for each item are described in detail in this publication.
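    The critical current and n-index measurements listed above rest on the standard power-law transition model E = Ec (I/Ic)^n. A sketch of extracting Ic and n from a measured E-I curve follows; the 0.1 uV/cm criterion value and all names are our assumptions, not details from this record.

```python
import math

EC = 1.0e-4  # electric-field criterion in V/m (0.1 uV/cm, assumed here)

def ic_and_n(currents_a, efields_v_m):
    """Estimate the critical current Ic at the Ec criterion and the
    resistive transition index n from a measured E-I curve, by log-log
    interpolation on the two points bracketing Ec, under the power-law
    model E = Ec * (I / Ic)**n."""
    points = list(zip(currents_a, efields_v_m))
    for (i1, e1), (i2, e2) in zip(points, points[1:]):
        if e1 <= EC <= e2:
            n = math.log(e2 / e1) / math.log(i2 / i1)
            ic = i1 * (EC / e1) ** (1.0 / n)
            return ic, n
    raise ValueError("Ec criterion not bracketed by the measurements")
```

    Feeding the function synthetic data generated from the same power law recovers the underlying Ic and n, which is a quick self-check before applying it to real voltage-tap readings.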

  20. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed-setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire set of design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues of software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition/decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed
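    The MC/DC test named in the verification stage requires showing that each condition independently affects the decision outcome. This can be sketched for a hypothetical trip expression as follows; the expression and all names are our own illustration, not the paper's RPS logic.

```python
from itertools import product

def trip(t_high, p_high, enabled):
    """Hypothetical bistable trip logic: trip when the channel is
    enabled and either temperature or pressure exceeds its setpoint."""
    return enabled and (t_high or p_high)

def mcdc_pairs(decision, n_conditions):
    """For each condition index, list the pairs of input vectors that
    differ only in that condition and produce different decision
    outcomes: the evidence MC/DC requires for that condition."""
    vectors = list(product([False, True], repeat=n_conditions))
    pairs = {}
    for idx in range(n_conditions):
        pairs[idx] = [
            (a, b) for a in vectors for b in vectors
            if a[idx] != b[idx]
            and all(a[j] == b[j] for j in range(n_conditions) if j != idx)
            and decision(*a) != decision(*b)
        ]
    return pairs
```

    A condition with an empty pair list cannot be shown to independently affect the decision, signalling either dead logic or a missing test vector.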

  1. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed-setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire set of design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues of software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments remain important issues to be resolved, and have become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition/decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  2. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  3. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
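    The likelihood-ratio decision rule described in these two records can be sketched with simple one-dimensional Gaussian score models; every parameter below is invented purely for illustration, and real systems estimate the genuine and impostor densities from enrollment data.

```python
from statistics import NormalDist

# Illustrative score models (parameters are assumptions for the sketch).
GENUINE = NormalDist(mu=0.8, sigma=0.1)
IMPOSTOR = NormalDist(mu=0.3, sigma=0.15)

def likelihood_ratio(score, genuine=GENUINE, impostor=IMPOSTOR):
    """Likelihood ratio of a match score under the two score models."""
    return genuine.pdf(score) / impostor.pdf(score)

def verify(score, threshold=1.0):
    """Accept the claimed identity when the likelihood ratio exceeds the
    threshold; the threshold trades false accepts against false rejects."""
    return likelihood_ratio(score) >= threshold
```

    By the Neyman-Pearson lemma, thresholding this ratio maximizes the detection rate at any fixed false-accept rate, which is the sense in which the papers call the likelihood ratio optimal.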

  4. Do people embrace praise even when they feel unworthy? A review of critical tests of self-enhancement versus self-verification.

    Science.gov (United States)

    Kwang, Tracy; Swann, William B

    2010-08-01

    Some contemporary theorists contend that the desire for self-enhancement is prepotent and more powerful than rival motives such as self-verification. If so, then even people with negative self-views will embrace positive evaluations. The authors tested this proposition by conducting a meta-analytic review of the relevant literature. The data provided ample evidence of self-enhancement strivings but little evidence of its prepotency. Instead, the evidence suggested that both motives are influential but control different response classes. In addition, other motives may sometimes come into play. For example, when rejection risk is high, people seem to abandon self-verification strivings, apparently in an effort to gratify their desire for communion. However, when rejection risk is low, as is the case in many secure marital relationships, people prefer self-verifying evaluations. The authors conclude that future researchers should broaden the bandwidth of their explanatory frameworks to include motives other than self-enhancement.

  5. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    Full Text Available In this paper, the authors present a novel personal verification system based on the likelihood ratio test for the fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike Moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of the match scores for personal verification. The approach is also compared with well-known methods such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves excellent verification performance, with higher accuracy than these methods, and thus can be utilized in further applications related to person verification.
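    The GMM-based fusion step can be sketched under the common assumption that matcher scores are conditionally independent, so per-matcher log likelihood ratios simply add. The mixture parameters below are fixed by hand purely for illustration; in practice they would be fitted, e.g. by expectation-maximization.

```python
import math
from statistics import NormalDist

class TinyGmm:
    """Fixed-parameter Gaussian mixture used only for density evaluation."""
    def __init__(self, weights, components):
        self.weights, self.components = weights, components

    def pdf(self, x):
        return sum(w * c.pdf(x) for w, c in zip(self.weights, self.components))

def fused_log_lr(scores, genuine_models, impostor_models):
    """Sum of per-matcher log likelihood ratios; positive values favor
    the genuine hypothesis (conditional independence assumed)."""
    return sum(math.log(g.pdf(s) / i.pdf(s))
               for s, g, i in zip(scores, genuine_models, impostor_models))
```

    Thresholding the fused log likelihood ratio at zero corresponds to equal priors; other operating points move the threshold.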

  6. The verification tests of residual radioactivity measurement and assessment techniques for buildings and soils

    International Nuclear Information System (INIS)

    Onozawa, T.; Ishikura, T.; Yoshimura, Yukio; Nakazawa, M.; Makino, S.; Urayama, K.; Kawasaki, S.

    1996-01-01

    According to the standard procedure for decommissioning a commercial nuclear power plant (CNPP) in Japan, controlled areas will be released for unrestricted use before the dismantling of a reactor building. If manual survey and sampling techniques were applied to measurement for unrestricted release on and in the extensive surface of the building, much time and much specialized labor would be required to assess the appropriateness of the release. Therefore the authors selected the following three techniques for demonstrating their reliability and applicability for CNPPs: (1) technique of assessing radioactive concentration distribution on the surface of buildings (ADB); (2) technique of assessing radioactive permeation distribution in the concrete structure of buildings (APB); (3) technique of assessing radioactive concentration distribution in soil (ADS). These tests include the techniques of measuring and assessing very low radioactive concentration distributions on the extensive surfaces of buildings and in the soil surrounding a plant with automatic devices. Technical investigation and preliminary study of the verification tests were started in 1990. In the study, preconditions were clarified for each technique and the performance requirements were set up. Moreover, simulation models have been constructed for several feasible measurement methods to assess their performance in terms of both measurement tests and simulation analysis. Fundamental tests have been under way using small-scale apparatuses since 1994.

  7. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and a discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented, along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  8. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and a discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented, along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: PROTOCOL FOR THE VERIFICATION OF GROUTING MATERIALS FOR INFRASTRUCTURE REHABILITATION AT THE UNIVERSITY OF HOUSTON - CIGMAT

    Science.gov (United States)

    This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...

  10. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  11. Quantity of residual thrombus after successful catheter-directed thrombolysis for iliofemoral deep venous thrombosis correlates with recurrence.

    Science.gov (United States)

    Aziz, F; Comerota, A J

    2012-08-01

    Iliofemoral deep venous thrombosis (IFDVT) is an independent risk factor for recurrent DVT. It has been observed that recurrent DVT correlates with residual thrombus. This study evaluates whether the risk of recurrence is related to the amount of residual thrombus following catheter-directed thrombolysis (CDT) for IFDVT. Patients who underwent CDT for IFDVT had their degree of lysis quantified by a reader blind to the patients' long-term clinical outcome. Patients were classified into two groups, ≥50% and <50% residual thrombus. Recurrence was defined as a symptomatic presentation with image verification of new or additional thrombus. A total of 75 patients underwent CDT for IFDVT. Median follow-up was 35.9 months. Sixty-eight patients (91%) had no evidence of recurrence and seven (9%) developed recurrence. Of the patients who had ≥50% (mean 80%) residual thrombus, 50% (4/8) experienced recurrence, but of those with <50% residual thrombus, only 5% (3/67) had recurrent DVT (P = 0.0014). The burden of residual thrombus at completion of CDT correlates with the risk of DVT recurrence. Patients having CDT for IFDVT had a lower risk of recurrence than expected. Successful clearing of acute clot in IFDVT patients significantly reduces the recurrence risk compared to patients with a large residual thrombus burden. Copyright © 2012 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
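    The reported group comparison can be checked by recomputing a two-sided Fisher exact test from the counts in the abstract (4/8 vs. 3/67); the result is of the same order as the published P = 0.0014, with small differences possibly reflecting the exact test variant the authors used.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more probable than the observed one."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def prob(k):
        return comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))
```

    For the abstract's table (recurrence vs. no recurrence in the ≥50% and <50% residual-thrombus groups) this gives a p-value well below 0.01, consistent with the reported significance.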

  12. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
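    The conditional-rule idea behind such systems can be sketched as a minimal trace monitor: rules map a (state, event) pair to a next state, and an event with no applicable rule is a property violation. The toy property and names below are our own and are not RuleR or LogScope syntax.

```python
def monitor(trace, rules, start="idle"):
    """Run an event trace through a rule table mapping
    (state, event) -> next state. Returns (ok, state, event); an event
    with no applicable rule is reported as a violation with the state
    and event at which it occurred."""
    state = start
    for event in trace:
        if (state, event) not in rules:
            return False, state, event  # violation: where and on what
        state = rules[(state, event)]
    return True, state, None

# Toy property: every 'open' must be closed before the next 'open'.
RULES = {
    ("idle", "open"): "opened",
    ("opened", "write"): "opened",
    ("opened", "close"): "idle",
}
```

    The same loop works post-mortem over a log file, which is essentially LogScope's operating mode: the trace is read from disk rather than intercepted at runtime.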

  13. HTA of genetic testing for susceptibility to venous thromboembolism in Italy

    Directory of Open Access Journals (Sweden)

    Betti Silvia

    2012-06-01

    Full Text Available Venous thromboembolism (VTE) is a condition in which a thrombus (a solid mass of blood constituents) forms in a vein. VTE represents an extremely common medical problem, manifested as either deep venous thrombosis (DVT) or pulmonary embolism (PE), affecting apparently healthy as well as hospitalized patients. Often PE is the physiopathological consequence of DVT of lower-extremity vessels, in particular of the calf...

  14. Functions of social support and self-verification in association with loneliness, depression, and stress.

    Science.gov (United States)

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  15. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  16. Verification and validation of the PLTEMP/ANL code for thermal hydraulic analysis of experimental and test reactors

    International Nuclear Information System (INIS)

    Kalimullah, M.; Olson, A.O.; Feldman, E.E.; Hanan, N.; Dionne, B.

    2012-01-01

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  17. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  18. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion and release of aerosols and radioactive materials. As most of the different phenomena are linked together, their feedback has to be taken into account within the calculation of severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified and applied in GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code have been proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena has been checked. The calculation of integrated containment experiments involving natural convection flow, structure heating and coolant condensation, as well as parallel calculation of results obtained with another code, gives detailed information on the applicability of CONTAIN. The actual verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results which have the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  19. Approach to IAEA material-balance verification at the Portsmouth Gas Centrifuge Enrichment Plant

    International Nuclear Information System (INIS)

    Gordon, D.M.; Sanborn, J.B.; Younkin, J.M.; DeVito, V.J.

    1983-01-01

    This paper describes a potential approach by which the International Atomic Energy Agency (IAEA) might verify the nuclear-material balance at the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). The strategy makes use of the attributes and variables measurement verification approach, whereby the IAEA would perform independent measurements on a randomly selected subset of the items comprising the U-235 flows and inventories at the plant. In addition, the MUF-D statistic is used as the test statistic for the detection of diversion. The paper includes descriptions of the potential verification activities, as well as calculations of: (1) attributes and variables sample sizes for the various strata, (2) standard deviations of the relevant test statistics, and (3) the detection sensitivity which the IAEA might achieve by this verification strategy at GCEP
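    The attributes sampling underlying such verification plans can be sketched as follows: pick the smallest sample size whose probability of missing every falsified item stays below the accepted non-detection risk beta. The function below is an illustrative stand-in (simple random sampling without replacement assumed), not the specific GCEP plan.

```python
from math import comb

def attribute_sample_size(population, defectives, beta=0.05):
    """Smallest sample size n (simple random sampling without
    replacement from `population` items) for which the probability of
    drawing none of the `defectives` falsified items is at most beta,
    i.e. the detection probability is at least 1 - beta."""
    for n in range(population + 1):
        miss = comb(population - defectives, n) / comb(population, n)
        if miss <= beta:
            return n
    return population
```

    The number of "defectives" is usually derived from the goal quantity: the minimum number of items that would have to be falsified to conceal a significant diversion, so a larger assumed falsification per item yields a smaller required sample.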

  20. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    Science.gov (United States)

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...

  2. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  3. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    Science.gov (United States)

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  4. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
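
    The core of such a tool is a simple hydrostatic balance. As a hedged illustration (the densities, column heights, and reference pressure below are invented, not SPR values), the wellhead pressure is the pressure at the bottom of the stacked fluid columns minus the hydrostatic head of each column above it:

```python
# Hedged sketch of a hydrostatic column calculation. The densities, column
# heights, and reference pressure below are invented for illustration and
# are not SPR values.

G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_ref_pa, columns):
    """Wellhead pressure given the pressure at the bottom of the stacked
    fluid columns and a list of (density_kg_m3, height_m) pairs ordered
    from bottom to top."""
    p = p_ref_pa
    for rho, h in columns:
        p -= rho * G * h  # subtract the hydrostatic head of each column
    return p

# Example: 600 m of crude oil below 200 m of nitrogen, 12 MPa at column base
p_well = wellhead_pressure(12e6, [(850.0, 600.0), (180.0, 200.0)])
```

    In such a model, a nitrogen interface movement appears as a change in the relative column heights at fixed total depth, which is what lets predicted wellhead pressure track interface position.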

  5. Transparencies used in describing the CTBT verification regime and its four monitoring technologies

    International Nuclear Information System (INIS)

    Basham, P.

    1999-01-01

    This presentation describes the CTBT verification regime and its four monitoring technologies (seismic, hydroacoustic, infrasound, and radionuclide monitoring), the CTBT global verification system, and the sequence of steps needed to install an International Monitoring System station: site survey, site preparation and construction, equipment procurement and installation, and final tests and certification.

  6. Nuclear power plant C and I design verification by simulation

    International Nuclear Information System (INIS)

    Storm, Joachim; Yu, Kim; Lee, D.Y.

    2003-01-01

    An important part of the Advanced Boiling Water Reactor (ABWR) project at the Taiwan NPP Lungmen Units no.1 and no.2 is the Full Scope Simulator (FSS). The simulator was built from design data and therefore, apart from its training role, a major part of the development was to apply a simulation-based test bed for the verification, validation and improvement of plant design in the control and instrumentation (C and I) areas of unit control room equipment, operator Man Machine Interface (MMI), process computer functions and plant procedures. The Full Scope Simulator will subsequently be used to train the plant operators, beginning two years before Unit no.1 fuel load. The article describes the scope, methods and results of the advanced verification and validation process and highlights the advantages of test bed simulation for real power plant design and implementation. Application of advanced simulation software tools such as instrumentation and control translators, graphical model builders, process models, graphical on-line test tools and screen-based or projected soft panels allowed the team to complete the C and I verification in time, before implementation of the Distributed Control and Information System (DCIS) started. An additional area of activity was Human Factors Engineering (HFE) for the operator MMI. Because the ABWR design operates most plant components from displays, a dedicated verification and validation process is required by NUREG-0711. To support this activity, an engineering test system was installed for the necessary HFE investigations. All detected improvements were documented and fed back into the plant design documentation through a defined process. The Full Scope Simulator (FSS), with hard panels and a stimulated digital control and information system, is in the final acceptance test process with the end customer, Taiwan Power Company.

  7. Verification of Kaplan turbine cam curves realization accuracy at power plant

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane

    2016-01-01

    Sustainability of an approximately constant value of Kaplan turbine efficiency over relatively large net head changes is a result of the turbine runner's variable geometry. The dependence of runner blade position on guide vane opening represents the turbine cam curve. The accuracy of cam curve realization is of great importance for efficient and proper exploitation of the turbines and, consequently, of the complete units. For these reasons, special attention has been given to the tests designed for cam curve verification. The goal of this paper is to describe the methodology and results of the tests performed in the process of Kaplan turbine cam curve verification.

  8. Production of plastic scintillation survey meter for clearance verification measurement

    International Nuclear Information System (INIS)

    Tachibana, Mitsuo; Shiraishi, Kunio; Ishigami, Tsutomu; Tomii, Hiroyuki

    2008-03-01

    In the Nuclear Science Research Institute, the decommissioning of various nuclear facilities is carried out according to the plan for meeting the midterm goal of the Japan Atomic Energy Agency (JAEA). An increase in clearance verification measurements of concrete in buildings and in radiation measurements for releasing controlled areas is expected as nuclear facilities are dismantled in the future. Radiation measurements for releasing controlled areas have been carried out in small-scale nuclear facilities including the JPDR (Japan Power Demonstration Reactor). However, radiation measurement with an existing measuring device was difficult because of radiation from radioactive materials remaining in buried piping. On the other hand, the JAEA has no prior experience in performing clearance verification measurement, and a large amount of clearance objects is expected to be generated as nuclear facilities are decommissioned. The plastic scintillation survey meter (hereafter, 'PL measuring device') was produced for clearance verification measurement and for radiation measurement for releasing controlled areas. Basic characteristic tests and practical tests were performed using the PL measuring device. These tests showed that radioactivity evaluated with the PL measuring device was as accurate as with the existing measuring device. The PL measuring device offers the capabilities of the existing measuring device while being lightweight and easy to operate, and it can also correct for gamma rays. The PL measuring device is effective for clearance verification measurement of concrete in buildings and for radiation measurement for releasing controlled areas. (author)

  9. Computerized strain-gauge plethysmography - An alternative method for the detection of lower limb deep venous thrombosis?

    Energy Technology Data Exchange (ETDEWEB)

    Elford, Julian; Wells, Irving; Cowie, Jim; Hurlock, Carol; Sanders, Hilary

    2000-01-01

    AIM: To test the ability of computerized strain-gauge plethysmography to act as a screening test for lower limb deep venous thrombosis (DVT). MATERIALS AND METHODS: Over an 8-month period, all patients referred to our Medical Assessment Unit with suspected lower limb DVT were considered for inclusion in the study. Each patient underwent both plethysmography and ascending venography within 24 h, and the presence or absence of thrombus in the popliteal, superficial femoral or iliac veins was noted. The results of the two tests were then used to determine the accuracy of computerized strain-gauge plethysmography in detecting above knee DVT. RESULTS: The screening tests and venograms of 239 patients referred with clinically suspected lower limb DVT were compared. The false negative rate of plethysmography was 15.4%, which is significantly different from the 4.8% claimed by the manufacturers of this device (P = 0.00003). CONCLUSIONS: In a population of acute admissions with suspected lower limb DVT, computerized strain-gauge plethysmography is not suitable for use as a screening test due to an unacceptably high proportion of false negative screens.
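
    The comparison of an observed false-negative rate against a manufacturer's claimed rate can be illustrated with an exact one-sided binomial test. The counts below are hypothetical (chosen to give roughly the reported 15.4% rate); the record does not state the actual denominators, so this is a sketch of the method rather than a reproduction of the paper's P-value:

```python
# Exact one-sided binomial test: is the observed false-negative count
# consistent with the manufacturer's claimed 4.8% rate? The counts below
# (12 false negatives among 78 venogram-positive cases, ~15.4%) are
# hypothetical; the record does not state the actual denominators.
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): one-sided exact tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_value = binom_sf(12, 78, 0.048)  # small value => reject the claimed 4.8% rate
```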

  10. Relationship between deep venous thrombosis and inflammatory cytokines in postoperative patients with malignant abdominal tumors

    Energy Technology Data Exchange (ETDEWEB)

    Du, T.; Tan, Z. [National Wuhan University, Zhongnan Hospital, School of Medicine, Department of General Surgery, Wuhan, Hubei Province (China)

    2014-08-22

    Deep venous thrombosis (DVT) is a common surgical complication in cancer patients, and evidence that inflammation plays a role in the occurrence of DVT is increasing. We studied a population of cancer patients with abdominal malignancies with the aim of investigating whether the levels of circulating inflammatory cytokines were associated with postoperative DVT and whether those levels are useful in diagnosing DVT. The serum levels of C-reactive protein (CRP), interleukins (IL)-6 and IL-10, nuclear transcription factor-κB (NF-κB) and E-selectin (E-Sel) were determined in 120 individuals, who were divided into 3 groups: healthy controls, patients with and patients without DVT after surgery for an abdominal malignancy. Data were analyzed by ANOVA, Dunnett's T3 test, chi-square test, and univariate and multivariate logistic regression as needed. The CRP, IL-6, NF-κB, and E-Sel levels in patients with DVT were significantly higher than those in the other groups (P<0.05). The IL-10 level was higher in patients with DVT than in controls but lower than in patients without DVT. Univariate analysis revealed that CRP, IL-6, NF-κB, and E-Sel were statistically associated with the risk of DVT (OR=1.98, P=0.002; OR=1.17, P=0.000; OR=1.03, P=0.042; and OR=1.38, P=0.003; respectively), whereas IL-10 had a protective effect (OR=0.94, P=0.011). Multivariate analysis showed that E-Sel was an independent risk factor (OR=1.41, P=0.000). Thus, this study indicated that an increased serum level of E-Sel was associated with increased DVT risk in postoperative patients with abdominal malignancy, suggesting that E-Sel may be a useful predictor in the diagnosis of DVT.
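
    A univariate odds ratio of the kind reported above can be computed from a 2x2 exposure table with a Woolf-type confidence interval. The counts in this sketch are invented for illustration and do not come from the study:

```python
# Sketch of a univariate odds ratio with a Woolf 95% confidence interval.
# The 2x2 counts are invented for illustration and are not from the study.
from math import exp, sqrt

def odds_ratio(a, b, c, d):
    """a: exposed cases, b: exposed controls, c: unexposed cases, d: unexposed controls."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """95% CI from the standard error of the log odds ratio."""
    or_ = odds_ratio(a, b, c, d)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_ * exp(-1.96 * se), or_ * exp(1.96 * se)

# Hypothetical: elevated E-selectin in 25/40 DVT patients vs 15/40 controls
or_est = odds_ratio(25, 15, 15, 25)
ci_lo, ci_hi = or_ci95(25, 15, 15, 25)
```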

  11. Timing of deep vein thrombosis formation after aneurysmal subarachnoid hemorrhage

    Science.gov (United States)

    Liang, Conrad W.; Su, Kimmy; Liu, Jesse J.; Dogan, Aclan; Hinson, Holly E.

    2015-01-01

    OBJECT Deep vein thrombosis (DVT) is a common complication of aneurysmal subarachnoid hemorrhage (aSAH). The time period of greatest risk for developing DVT after aSAH is not currently known. aSAH induces a prothrombotic state, which may contribute to DVT formation. Using repeated ultrasound screening, the hypothesis that patients would be at greatest risk for developing DVT in the subacute post-rupture period was tested. METHODS One hundred ninety-eight patients with aSAH admitted to the Oregon Health & Science University Neurosciences Intensive Care Unit between April 2008 and March 2012 were included in a retrospective analysis. Ultrasound screening was performed every 5.2 ± 3.3 days between admission and discharge. The chi-square test was used to compare DVT incidence during different time periods of interest. Patient baseline characteristics as well as stroke severity and hospital complications were evaluated in univariate and multivariate analyses. RESULTS Forty-two (21%) of 198 patients were diagnosed with DVT, and 3 (2%) of 198 patients were symptomatic. Twenty-nine (69%) of the 42 cases of DVT were first detected between Days 3 and 14, compared with 3 cases (7%) detected between Days 0 and 3 and 10 cases (24%) detected after Day 14 (p < 0.05). The postrupture 5-day window of highest risk for DVT development was between Days 5 and 9 (40%, p < 0.05). In the multivariate analysis, length of hospital stay and use of mechanical prophylaxis alone were significantly associated with DVT formation. CONCLUSIONS DVT formation most commonly occurs in the first 2 weeks following aSAH, with detection in this cohort peaking between Days 5 and 9. Chemoprophylaxis is associated with a significantly lower incidence of DVT. PMID:26162047
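
    The window comparison can be sketched as a chi-square goodness-of-fit test on the first-detection counts reported above (3 in days 0-3, 29 in days 3-14, 10 after day 14). Assuming equal expected counts per window is a deliberate simplification for illustration (the real windows differ in length), and for df = 2 the chi-square tail probability has the closed form exp(-x/2):

```python
# Chi-square goodness-of-fit on the first-detection counts reported above:
# 3 (days 0-3), 29 (days 3-14), 10 (after day 14). Equal expected counts
# per window is a deliberate simplification (the windows differ in length).
from math import exp

def chi2_gof(observed, expected):
    """Pearson chi-square goodness-of-fit statistic."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [3, 29, 10]
n = sum(observed)
expected = [n / 3] * 3
stat = chi2_gof(observed, expected)
p_value = exp(-stat / 2)  # closed-form chi-square tail for df = 2
```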

  12. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme provides excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart rate changes during sport. To our knowledge, this is the first work to address the issue of sport in ECG verification.
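
    A minimal sketch of the mean-interval idea with a multiple-state extension follows; the interval values and tolerance are hypothetical, and the actual enrollment and matching details are in the paper:

```python
# Hypothetical sketch of mean-interval matching: enroll per-state templates
# (e.g., rest and exercise) as mean R-R intervals; accept if the test
# recording's mean interval is within a tolerance of any enrolled state.

def mean_interval(rr_ms):
    return sum(rr_ms) / len(rr_ms)

def verify(templates_ms, test_rr_ms, tol_ms=60.0):
    m = mean_interval(test_rr_ms)
    return any(abs(m - t) <= tol_ms for t in templates_ms)

templates = [820.0, 560.0]                       # rest and exercise states
ok = verify(templates, [805, 790, 830, 815])     # resting recording
fast = verify(templates, [540, 555, 570, 565])   # post-exercise recording
```

    The multiple-state extension is simply the use of several enrolled templates per user, so a recording taken during exercise still matches one of the user's own states.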

  13. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  14. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures; on the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow with little effort. A rule-based approach with Prolog makes it possible to characterize the verification goals in a concise and declarative way. In this paper, we describe our approach to verifying the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then tested against verification goals. The program flow branches introduced by control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study in which we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
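
    The semaphore check can be sketched in miniature. This is not the paper's Prolog/CTL machinery; it is a hypothetical Python stand-in that flags call paths which take a semaphore without releasing it, which is the kind of property the derived execution-sequence subtrees are verified against:

```python
# Toy stand-in for the rule-based check: walk one flattened call path
# (one branch of the execution sequence) and flag paths that take a
# semaphore without releasing it. Call names are hypothetical.

def unbalanced_semaphore(path):
    depth = 0
    for call in path:
        if call == "sem.take":
            depth += 1
        elif call == "sem.release":
            if depth == 0:
                return True      # release without a matching take
            depth -= 1
    return depth != 0            # a take left open at the end of the path

bad = unbalanced_semaphore(["sem.take", "write_block"])                  # leak
good = unbalanced_semaphore(["sem.take", "write_block", "sem.release"])  # balanced
```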

  15. Estimated effect of an integrated approach to suspected deep venous thrombosis using limited-compression ultrasound.

    Science.gov (United States)

    Poley, Rachel A; Newbigging, Joseph L; Sivilotti, Marco L A

    2014-09-01

    Deep vein thrombosis (DVT) is both common and serious, yet the desire to never miss the diagnosis, coupled with the low specificity of D-dimer testing, results in high imaging rates, return visits, and empirical anticoagulation. The objective of this study was to evaluate a new approach incorporating bedside limited-compression ultrasound (LC US) by emergency physicians (EPs) into the workup strategy for DVT. This was a cross-sectional observational study of emergency department (ED) patients with suspected DVT. Patients on anticoagulants; those with chronic DVT, leg cast, or amputation; or when the results of comprehensive imaging were already known were excluded. All patients were treated in the usual fashion based on the protocol in use at the center, including comprehensive imaging based on the modified Wells score and serum D-dimer testing. Seventeen physicians were trained and performed LC US in all subjects. The authors identified a priori an alternate workup strategy in which DVT would be ruled out in "DVT unlikely" (Wells score return visits for imaging and 10 (4.4%) cases of unnecessary anticoagulation. In 19% of cases, the treating and scanning physician disagreed whether the patient was DVT likely or DVT unlikely based on Wells score (κ = 0.62; 95% CI = 0.48 to 0.77). Limited-compression US holds promise as one component of the diagnostic approach to DVT, but should not be used as a stand-alone test due to imperfect sensitivity. Tradeoffs in diagnostic efficiency for the sake of perfect sensitivity remain a difficult issue collectively in emergency medicine (EM), but need to be scrutinized carefully in light of the costs of overinvestigation, delays in diagnosis, and risks of empirical anticoagulation. © 2014 by the Society for Academic Emergency Medicine.
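
    The shape of such a workup strategy can be sketched as a decision rule. The threshold and branch outcomes below reflect common practice with the modified Wells score combined with D-dimer and limited-compression US, and are illustrative only, not the study's exact protocol:

```python
# Hedged sketch of a combined DVT workup rule: classify by the modified
# Wells score, then combine D-dimer and limited-compression US (LC US)
# results. Thresholds and outcomes are illustrative, not the study protocol.

def dvt_workup(wells_score, d_dimer_negative, lc_us_negative):
    likely = wells_score >= 2      # "DVT likely" on the modified Wells score
    if not likely and d_dimer_negative and lc_us_negative:
        return "ruled out"
    if likely and not lc_us_negative:
        return "treat / comprehensive imaging"
    return "comprehensive imaging"

triage = dvt_workup(1, True, True)  # DVT unlikely, both tests negative
```

    The abstract's caution applies directly to this sketch: because LC US sensitivity is imperfect, the "ruled out" branch should never rest on LC US alone.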

  16. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  17. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  18. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a "Non-destructive Assay Fingerprint" to preserve verification knowledge on the contents of each cylinder throughout its life in the facility. As the IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  19. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  20. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  1. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  2. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  3. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    Science.gov (United States)

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  4. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  5. Self-verification and depression among youth psychiatric inpatients.

    Science.gov (United States)

    Joiner, T E; Katz, J; Lew, A S

    1997-11-01

    According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.

  6. Grip-Pattern Verification for Smart Gun Based on Maximum-Pairwise Comparison and Mean-Template Comparison

    NARCIS (Netherlands)

    Shang, X.; Veldhuis, Raymond N.J.

    2008-01-01

    In our biometric verification system for a smart gun, the rightful user of the gun is authenticated by grip-pattern recognition. In this work, verification is done using two types of comparison methods. One is mean-template comparison, where the matching score between a test image and

  7. Testing and Demonstrating Speaker Verification Technology in Iraqi-Arabic as Part of the Iraqi Enrollment Via Voice Authentication Project (IEVAP) in Support of the Global War on Terrorism (GWOT)

    National Research Council Canada - National Science Library

    Withee, Jeffrey W; Pena, Edwin D

    2007-01-01

    This thesis documents the findings of an Iraqi-Arabic language test and concept of operations for speaker verification technology as part of the Iraqi Banking System in support of the Iraqi Enrollment...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    Science.gov (United States)

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel. To facilitate this verification, JCH repre...

  9. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
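    The power-law smoothing step can be sketched as follows. This is a minimal illustration, not the paper's calibrated procedure: each simulated epicenter spreads one unit of rate over all grid cells with a kernel assumed proportional to (r + d)^(-q), where the constants d and q are placeholder values.

```python
import math

def power_law_rate_map(events, grid, d=1.0, q=1.5):
    """Spread each epicenter's unit rate over all grid cells with a
    kernel ~ (r + d)**(-q), normalized per event so total rate is
    preserved (sum of rates == number of events)."""
    rates = [0.0] * len(grid)
    for ex, ey in events:
        weights = [(math.hypot(gx - ex, gy - ey) + d) ** (-q)
                   for gx, gy in grid]
        total = sum(weights)
        for i, w in enumerate(weights):
            rates[i] += w / total          # each event contributes 1.0
    return rates

# Toy example: two on-fault epicenters, a 3x3 grid of cell centers.
grid = [(x, y) for x in (0.0, 1.0, 2.0) for y in (0.0, 1.0, 2.0)]
events = [(0.0, 0.0), (2.0, 2.0)]
rates = power_law_rate_map(events, grid)
```

Cells near an epicenter get the largest share of its rate, but every cell receives some, which is what lets off-fault observed epicenters score against on-fault simulated seismicity.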

  10. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...
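    The kind of type check bytecode verification performs can be illustrated with a deliberately tiny abstract interpreter. This is a toy, not the JVM algorithm: it tracks only the *types* on the operand stack of a straight-line program for a made-up three-opcode machine, rejecting programs whose operand types do not match the instruction.

```python
def verify(program):
    """Abstractly execute a straight-line stack program, tracking
    value types instead of values; return False on a type error."""
    stack = []                            # abstract stack of type names
    for op, *args in program:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":                # needs two ints, produces one
            if len(stack) < 2 or stack[-1] != "int" or stack[-2] != "int":
                return False
            stack.pop(); stack.pop()
            stack.append("int")
        else:
            raise ValueError(f"unknown opcode {op}")
    return True

ok = verify([("push_int",), ("push_int",), ("iadd",)])   # type-correct
bad = verify([("push_int",), ("push_ref",), ("iadd",)])  # int + ref: rejected
```

Real verifiers additionally handle branches (hence the iterative data-flow analysis the abstract mentions, or the SSA-based alternative the authors propose), but the per-instruction typing discipline is the same idea.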

  11. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geolocated Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning; instrument and product calibration; data quality support and monitoring; and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs), and it hosts a number of DQA offline tools that take advantage of its proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which runs the latest installation of the IDPS algorithms on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  12. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  13. Exclusion and diagnosis of pulmonary embolism by a rapid ELISA D-dimer test and noninvasive imaging techniques within the context of a clinical model.

    Science.gov (United States)

    Michiels, J J; Pattynama, P M

    2000-01-01

    A negative rapid ELISA D-dimer test alone in outpatients with a low to moderate clinical probability (CP) of pulmonary embolism (PE) is predicted to safely exclude pulmonary embolism. The combination of a negative rapid ELISA D-dimer test and a low to moderate CP of PE, followed by compression ultrasonography (CUS) for the detection of deep vein thrombosis (DVT), is safe and cost-effective, as it reduces the need for noninvasive imaging techniques to about 50% to 60% of outpatients with suspected PE. A high-probability ventilation-perfusion (VP) scan or a positive spiral CT consistent with PE, and the detection of DVT by CUS, are currently considered clear indications for anticoagulant treatment. Subsequent pulmonary angiography (PA) is the gold standard diagnostic strategy to exclude or diagnose PE in suspected outpatients with a negative CUS, a positive rapid ELISA D-dimer test, and a nondiagnostic VP scan or negative spiral CT, in order to prevent overtreatment with anticoagulants. However, the willingness of clinicians and the availability of resources to perform PA is restricted, a fact that has provided an impetus for clinical investigators to search for alternative noninvasive strategies to exclude or detect venous thromboembolism (VTE). Serial CUS testing for the detection of DVT in patients with a low to moderate CP of PE and a nondiagnostic VP scan or negative spiral CT is predicted to be safe and will reduce the need for PA to less than 10% or even less than 5%. This noninvasive serial CUS strategy restricts the need for invasive PA to a minor group of patients with a nondiagnostic VP scan or negative spiral CT and a high CP of PE. Prospective evaluations are warranted to implement and validate the advantages and disadvantages of the various combinations of noninvasive strategies, and to compare serial CUS testing versus PA in randomized clinical management studies of outpatients with suspected pulmonary embolism.
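    The decision flow described in the abstract can be sketched as a small rule cascade. This is purely illustrative (the rule labels and return strings are invented for the sketch, and it is not clinical guidance); it only shows how the combinations of CP, D-dimer, CUS and imaging results map to the next step.

```python
def pe_workup(clinical_prob, d_dimer_negative, cus_positive=None,
              imaging_result=None):
    """Next action for a suspected-PE outpatient (illustrative only).
    clinical_prob: 'low', 'moderate' or 'high'
    imaging_result: 'diagnostic_pe', 'nondiagnostic' or None"""
    # Rule 1: negative rapid ELISA D-dimer + low/moderate CP excludes PE.
    if clinical_prob in ("low", "moderate") and d_dimer_negative:
        return "PE excluded - no anticoagulation"
    # Rule 2: DVT on CUS is an indication for anticoagulant treatment.
    if cus_positive:
        return "treat - DVT detected by CUS"
    # Rule 3: high-probability VP scan / positive spiral CT also treats.
    if imaging_result == "diagnostic_pe":
        return "treat - imaging consistent with PE"
    # Rule 4: nondiagnostic imaging -> serial CUS, reserving PA.
    if imaging_result == "nondiagnostic":
        return "serial CUS testing (PA only if still unresolved)"
    return "proceed to noninvasive imaging"
```

The point of the cascade is visible in the code: most patients exit at rules 1-3, so invasive PA is pushed to the small residual group.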

  14. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper, a prototype Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  15. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper, a prototype Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  16. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
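    The "error behavior of algorithms" objective above is usually quantified as an observed order of accuracy. A minimal sketch, assuming the standard convention that the discretization error behaves as e(h) ≈ C·h^p, so that errors at two grid spacings h and h/r give p = log(e1/e2)/log(r):

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grids related by
    the given refinement ratio, assuming e(h) ~ C * h**p."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Halving h cut the error by a factor of 4: second-order behavior.
p = observed_order(4.0e-3, 1.0e-3)   # -> 2.0
```

A code verification assessment passes when the observed p matches the scheme's formal order as the grid is refined.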

  17. Prevalence of deep vein thrombosis and associated factors in adult ...

    African Journals Online (AJOL)

    Without a high index of suspicion, lower limb DVT is likely to be missed. The pretest Wells score correlated well with the USS findings and could be used as a rule-out test for those with suspected DVT. A follow-up study to evaluate genetic and biochemical factors that predispose to DVT needs to be undertaken in the near ...

  18. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms such as the MLP and, more recently, the SVM has steadily improved over the past few years. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. These two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP is better than an SVM on this particular task.

  19. Deep venous thrombosis and pulmonary embolism in patients with acute spinal cord injury: a comparison with nonparalyzed patients immobilized due to spinal fractures

    International Nuclear Information System (INIS)

    Myllynen, P.; Kammonen, M.; Rokkanen, P.; Boestman, O.L.; Lalla, M.; Laasonen, E.

    1985-01-01

    The occurrence of deep venous thrombosis (DVT) was studied in a series of 23 consecutive patients with acute spinal cord injury and 14 immobilized patients with spinal fractures without paralysis. The incidence of DVT in paralyzed patients was 100% as detected by the 125I-labeled fibrinogen test and confirmed by contrast venography, and 64% as detected by repeated clinical examinations and confirmed by contrast venography. The respective incidence of DVT in nonparalyzed patients with spinal fractures was 0%. The diagnosis of DVT was reached earlier with the radiofibrinogen test than with clinical follow-up (5 days vs. 25 days). Two of the 23 paralyzed patients (9%) developed nonfatal clinical pulmonary embolism (PE). There were no differences in the values of routine coagulation tests. These results justify prophylactic anticoagulant therapy in all cases of spinal cord injury during the acute post-traumatic phase.

  20. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  1. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
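    A code verification benchmark built on an exact analytical solution can be sketched end-to-end. This is an illustrative stand-alone example, not one of the paper's benchmarks: for u''(x) = f(x) on [0,1] with u(0) = u(1) = 0, choose u(x) = sin(πx), derive f = -π²sin(πx), solve with second-order central differences (Thomas algorithm), and confirm the error decays like h².

```python
import math

def solve_poisson(n):
    """Solve u'' = -pi^2 sin(pi x), u(0)=u(1)=0 on an n-interval grid
    with central differences; return max error vs. the exact sin(pi x)."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [-math.pi**2 * math.sin(math.pi * xi) for xi in x]
    # Thomas algorithm for u_{i-1} - 2 u_i + u_{i+1} = h^2 f_i, i=1..n-1
    a, b, c = 1.0, -2.0, 1.0
    n_in = n - 1
    cp, dp = [0.0] * n_in, [0.0] * n_in
    cp[0] = c / b
    dp[0] = h * h * f[1] / b
    for i in range(1, n_in):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (h * h * f[i + 1] - a * dp[i - 1]) / m
    u = [0.0] * (n + 1)                  # boundary values stay zero
    for i in range(n_in - 1, -1, -1):
        u[i + 1] = dp[i] - cp[i] * u[i + 2]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e1, e2 = solve_poisson(16), solve_poisson(32)
order = math.log(e1 / e2) / math.log(2.0)   # expect ~2 for this scheme
```

Manufactured solutions generalize this: pick any smooth u, substitute it into the governing equations to manufacture the source term, and test the code against the known u.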

  2. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    Science.gov (United States)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
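    The shape of such a requirement check can be sketched without any of the real tooling. This is an illustrative stand-in (neither C-MAPSS40k nor CoCoSim is reproduced; the first-order lag plant, time constant, and settling requirement are all invented for the sketch): simulate a simple thrust-response model and assert a settling-time requirement against the trace.

```python
def simulate_step(tau=1.0, dt=0.01, t_end=6.0, demand=1.0):
    """Forward-Euler simulation of a first-order thrust lag
    d(thrust)/dt = (demand - thrust)/tau; returns the thrust trace."""
    thrust, trace = 0.0, []
    for _ in range(int(t_end / dt)):
        thrust += dt * (demand - thrust) / tau
        trace.append(thrust)
    return trace

def meets_settling_req(trace, demand=1.0, tol=0.05, dt=0.01, t_req=5.0):
    """Requirement (example): from t_req onward, thrust stays within
    tol of the demanded value."""
    start = int(t_req / dt)
    return all(abs(x - demand) <= tol * demand for x in trace[start:])

trace = simulate_step()
ok = meets_settling_req(trace)
```

Model checkers like CoCoSim prove such properties over all admissible inputs rather than over one simulated trace, which is what makes them stronger than simulation-based testing.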

  3. Acceptance test report: Field test of mixer pump for 241-AN-107 caustic addition project

    International Nuclear Information System (INIS)

    Leshikar, G.A.

    1997-01-01

    The field acceptance test of a 75 HP mixer pump (Hazleton serial number N-20801) installed in Tank 241-AN-107 was conducted from October 1995 through February 1996. The objectives defined in the acceptance test were successfully met, with two exceptions recorded. The acceptance test encompassed field verification of mixer pump turntable rotation set-up and operation, verification that the pump instrumentation functions within established limits, facilitation of baseline data collection from the mixer-pump-mounted ultrasonic instrumentation, verification of mixer pump water flush system operation and validation of a procedure for its operation, and several brief test runs (bumps) of the mixer pump.

  4. Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.

    Science.gov (United States)

    Okun, Morris A.; Fournet, Lee M.

    1993-01-01

    The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)

  5. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  6. Verification of absorbed dose calculation with XIO Radiotherapy Treatment Planning System

    International Nuclear Information System (INIS)

    Bokulic, T.; Budanec, M.; Frobe, A.; Gregov, M.; Kusic, Z.; Mlinaric, M.; Mrcela, I.

    2013-01-01

    Modern radiotherapy relies on computerized treatment planning systems (TPS) for absorbed dose calculation. Most TPS require a detailed model of a given machine and its therapy beams. The International Atomic Energy Agency (IAEA) recommends acceptance testing for the TPS (IAEA-TECDOC-1540). In this study we present a customization of those tests for the purpose of verifying the beam models intended for clinical use in our department. Elekta Synergy S linear accelerator installation and data acquisition for the Elekta CMS XiO 4.62 TPS were finished in 2011. After the completion of beam modelling in the TPS, tests were conducted in accordance with the IAEA protocol for TPS dose calculation verification. The deviations between the measured and calculated dose were recorded for 854 points and 11 groups of tests in a homogeneous phantom. Most of the deviations were within tolerance. Similar to previously published results, the results for the irregular L-shaped field and asymmetric wedged fields were out of tolerance for certain groups of points. (author)

  7. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  8. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report (Initial Comprehensive Performance Test Report), P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure take place is shown in Figure 1, but the phases can be performed in any order.

  10. Verification and validation of COBRA-SFS transient analysis capability

    International Nuclear Information System (INIS)

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 3.0 and 4.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0.
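    A transient verification comparison of the kind described in Section 3.0 can be sketched in miniature. This is an illustrative stand-alone example, not a COBRA-SFS case: an explicit FTCS scheme for the 1-D heat equation u_t = α·u_xx with u(0,t) = u(1,t) = 0 and initial condition sin(πx), checked against the exact solution sin(πx)·exp(-απ²t).

```python
import math

def ftcs_error(nx=20, alpha=1.0, t_end=0.05):
    """March the explicit FTCS scheme to t_end and return the max
    pointwise error against the exact decaying sine mode."""
    dx = 1.0 / nx
    dt = 0.25 * dx * dx / alpha          # r = alpha*dt/dx^2 = 0.25 <= 0.5
    steps = round(t_end / dt)
    r = alpha * dt / (dx * dx)
    u = [math.sin(math.pi * i * dx) for i in range(nx + 1)]
    for _ in range(steps):
        u = [0.0] + [u[i] + r * (u[i + 1] - 2.0 * u[i] + u[i - 1])
                     for i in range(1, nx)] + [0.0]
    t = steps * dt
    exact = [math.sin(math.pi * i * dx) * math.exp(-alpha * math.pi**2 * t)
             for i in range(nx + 1)]
    return max(abs(a - b) for a, b in zip(u, exact))

err = ftcs_error()                       # small: scheme tracks the decay
```

The same pattern scales up: pick a transient problem with a closed-form solution, run the code's transient solver, and bound the discrepancy.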

  11. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures for radiotherapy. This study evaluated the Monitor Unit (MU) calculation accuracy of a third-party QA software package and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS, according to International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. Test plans were delivered and measured in the phantom. The delivered doses were input to the QA software and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 +/- 0.9)%; the biggest difference fell on the plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 +/- 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 +/- 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
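    The core comparison reported above is just a percent deviation with a pass/fail tolerance per field. A minimal sketch (the 2% action level here is an assumed example value, not the study's criterion):

```python
def percent_deviation(independent, reference):
    """Signed percent deviation of an independent calculation from a
    reference value (TPS calculation or measurement)."""
    return 100.0 * (independent - reference) / reference

def plan_check(pairs, tolerance=2.0):
    """pairs: list of (independent_MU, reference_MU) per field.
    Returns all deviations and the indices of out-of-tolerance fields."""
    devs = [percent_deviation(a, b) for a, b in pairs]
    flagged = [i for i, d in enumerate(devs) if abs(d) > tolerance]
    return devs, flagged

# Example: three fields; the 2.8% field would be flagged for review.
devs, flagged = plan_check([(100.0, 100.0), (102.8, 100.0), (99.2, 100.0)])
```

In clinical use, a flagged field triggers investigation of the plan (or of the independent model) before treatment.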

  12. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies a user matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: ADI INTERNATIONAL INC. ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®; PHASE II

    Science.gov (United States)

    Verification testing of the ADI International Inc. Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8, 2003 through May 28,...

  14. Functional verification of a safety class controller for NPPs using a UVM register Model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyu Chull [Dept. of Applied Computer Engineering, Dankook University, Cheonan (Korea, Republic of)

    2014-06-15

    A highly reliable safety class controller for NPPs (Nuclear Power Plants) is mandatory, as even a minor malfunction can lead to disastrous consequences for people, the environment or the facility. In order to enhance the reliability of a safety class digital controller for NPPs, we employed a diversity approach, in which a PLC-type controller and a PLD-type controller are operated in parallel. We built and used structured testbenches based on the classes supported by UVM for functional verification of the PLD-type controller designed for NPPs. We incorporated a UVM register model into the testbenches in order to increase the controllability and the observability of the DUT (Device Under Test). With the increased testability, we could easily verify the datapaths between I/O ports and the register sets of the DUT; otherwise we would have had to perform black-box tests of the datapaths, which is cumbersome and time-consuming. We were also able to perform constrained random verification easily and systematically. From the study, we confirmed the various advantages of using the UVM register model in verification, such as scalability, reusability and interoperability, and set some design guidelines for verification of the NPP controllers.

  15. MO-F-16A-01: Implementation of MPPG TPS Verification Tests On Various Accelerators

    International Nuclear Information System (INIS)

    Smilowitz, J; Bredfeldt, J; Geurts, M; Miller, J

    2014-01-01

    Purpose: To demonstrate the implementation of the Medical Physics Practice Guideline (MPPG) for dose calculation and beam parameters verification of treatment planning systems (TPS). Methods: We implemented the draft TPS MPPG for three linacs: Varian Trilogy, TomoHDA and Elekta Infinity. Static and modulated test plans were created. The static fields are different than used in commissioning. Data was collected using ion chambers and diodes in a scanning water tank, Delta4 phantom and a custom phantom. MATLAB and Microsoft Excel were used to create analysis tools to compare reference DICOM dose with scan data. This custom code allowed for the interpolation, registration and gamma analysis of arbitrary dose profiles. It will be provided as open source code. IMRT fields were validated with Delta4 registration and comparison tools. The time for each task was recorded. Results: The tests confirmed the strengths, and revealed some limitations, of our TPS. The agreement between calculated and measured dose was reported for all beams. For static fields, percent depth dose and profiles were analyzed with criteria in the draft MPPG. The results reveal areas of slight mismatch with the model (MLC leaf penumbra, buildup region). For TomoTherapy, the IMRT plan 2%/2 mm gamma analysis revealed poorest agreement in the low dose regions. For one static test plan for all 10MV Trilogy photon beams, the plan generation, scan queue creation, data collection, data analysis and report took 2 hours, excluding tank setup. Conclusions: We have demonstrated the implementation feasibility of the TPS MPPG. This exercise generated an open source tool for dose comparisons between scan data and DICOM dose data. An easily reproducible and efficient infrastructure with streamlined data collection was created for repeatable robust testing of the TPS. The tests revealed minor discrepancies in our models and areas for improvement that are being investigated.
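    The gamma analysis mentioned in the abstract can be illustrated in one dimension. This is a simplified sketch (real tools operate on 2-D/3-D dose grids with finer search interpolation; the 2%/2 mm global criteria follow the abstract, the profile values are invented): for each reference point, gamma is the minimum over evaluated points of the combined dose-difference/distance metric, and a point passes when gamma ≤ 1.

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose,
             dose_tol=0.02, dist_tol=2.0):
    """1-D global gamma: for each reference point, minimize
    sqrt((dose_diff/dose_tol)^2 + (dist/dist_tol)^2) over eval points.
    dose_tol is a fraction of the max reference dose; dist_tol in mm."""
    d_max = max(ref_dose)                    # global normalization
    gammas = []
    for xr, dr in zip(ref_pos, ref_dose):
        g = min(math.sqrt(((de - dr) / (dose_tol * d_max)) ** 2 +
                          ((xe - xr) / dist_tol) ** 2)
                for xe, de in zip(eval_pos, eval_dose))
        gammas.append(g)
    return gammas

pos = [0.0, 1.0, 2.0, 3.0]                   # mm
ref = [1.00, 0.90, 0.60, 0.30]               # calculated (normalized)
meas = [1.01, 0.91, 0.61, 0.31]              # measured: +1% everywhere
gammas = gamma_1d(pos, ref, pos, meas)
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
```

A uniform 1% dose offset sits at half the 2% tolerance, so every point passes; a 3% offset in a flat region would fail, which is why gamma is most forgiving in steep-gradient regions.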

  16. MO-F-16A-01: Implementation of MPPG TPS Verification Tests On Various Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Smilowitz, J; Bredfeldt, J; Geurts, M; Miller, J [University of Wisconsin, Madison, WI (United States)

    2014-06-15

Purpose: To demonstrate the implementation of the Medical Physics Practice Guideline (MPPG) for dose calculation and beam parameter verification of treatment planning systems (TPS). Methods: We implemented the draft TPS MPPG for three linacs: Varian Trilogy, TomoHDA and Elekta Infinity. Static and modulated test plans were created. The static fields are different from those used in commissioning. Data were collected using ion chambers and diodes in a scanning water tank, a Delta4 phantom and a custom phantom. MATLAB and Microsoft Excel were used to create analysis tools to compare reference DICOM dose with scan data. This custom code allowed for the interpolation, registration and gamma analysis of arbitrary dose profiles. It will be provided as open source code. IMRT fields were validated with Delta4 registration and comparison tools. The time for each task was recorded. Results: The tests confirmed the strengths, and revealed some limitations, of our TPS. The agreement between calculated and measured dose was reported for all beams. For static fields, percent depth dose and profiles were analyzed with the criteria in the draft MPPG. The results reveal areas of slight mismatch with the model (MLC leaf penumbra, buildup region). For TomoTherapy, the IMRT plan 2%/2 mm gamma analysis revealed the poorest agreement in the low dose regions. For one static test plan for all 10 MV Trilogy photon beams, the plan generation, scan queue creation, data collection, data analysis and report took 2 hours, excluding tank setup. Conclusions: We have demonstrated the implementation feasibility of the TPS MPPG. This exercise generated an open source tool for dose comparisons between scan data and DICOM dose data. An easily reproducible and efficient infrastructure with streamlined data collection was created for repeatable, robust testing of the TPS. The tests revealed minor discrepancies in our models and areas for improvement that are being investigated.
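The 2%/2 mm gamma analysis used above combines a dose-difference criterion with a distance-to-agreement criterion at every evaluation point. A minimal 1D sketch in Python (this is not the authors' MATLAB tool; the Gaussian profiles and global normalization below are illustrative assumptions):

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, meas_pos, meas_dose,
                dose_tol=0.02, dist_tol=2.0):
    """1D gamma analysis with global dose normalization.

    dose_tol -- dose criterion as a fraction of the reference maximum (2%)
    dist_tol -- distance-to-agreement criterion in mm (2 mm)
    Returns the gamma value at each reference point; gamma <= 1 passes.
    """
    d_max = ref_dose.max()
    gammas = np.empty_like(ref_dose)
    for i in range(len(ref_pos)):
        dose_term = (meas_dose - ref_dose[i]) / (dose_tol * d_max)
        dist_term = (meas_pos - ref_pos[i]) / dist_tol
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

# Illustrative profiles: a measured Gaussian shifted 0.5 mm relative to
# the reference -- well inside a 2%/2 mm tolerance.
x = np.linspace(-50.0, 50.0, 501)            # positions in mm
ref = np.exp(-x ** 2 / (2 * 15.0 ** 2))
meas = np.exp(-(x - 0.5) ** 2 / (2 * 15.0 ** 2))
g = gamma_index(x, ref, x, meas)
print(f"gamma pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```

A 2D or 3D analysis (as in a Delta4 comparison) follows the same formula with the distance term evaluated over a search region around each point.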

  17. A new 125I-fibrinogen technique for detection and depth localization of post-operative venous thrombosis

    International Nuclear Information System (INIS)

    Bernstein, K.

    1981-10-01

The reliability and sensitivity of the 125I-fibrinogen uptake test (FUT) was improved by using equipment that allowed frequent checks of its sensitivity. A new technique, the 125I-fibrinogen-sum-coincidence method (FSC), which can be used in combination with the conventional FUT for detection of deep venous thrombosis (DVT), was developed. With this new method the depths of the fibrin deposits detected by the FUT could be determined. Very good agreement was demonstrated between depth determinations of thrombi by the FSC technique and by phlebography. The new technique permits differentiation between true DVT and superficial venous thrombosis. Altogether 354 patients subjected to gynecological surgery were studied postoperatively with the improved FUT, and 65 of them had signs of lower limb DVT with this test. 41 patients with a positive FUT were investigated with the FSC technique as well, and in 37 of them the diagnosis of DVT was confirmed. Advanced age and malignancy were preoperative risk factors for the development of DVT, whereas the method of anaesthesia (general or epidural) had no significant influence on the rate of DVT. The five-fold increase in the rate of DVT after preoperative treatment with synthetic oestrogens necessitated a change in the preoperative administration of such drugs. The new FSC technique offers the possibility of both determining the true 125I activity in a thrombosis and of following its course for several weeks. It is recommended that thrombi with a maximum net activity >2 kBq and with no sign of lysis when checked by the FSC test should be treated. (author)

  18. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of test results for a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of a thermal-hydraulic model of a high-pressure turbine blade with cooling allowing asymmetrical heat removal from the pressure and suction sides was carried out using the developed method. An analysis of the heat transfer coefficients confirmed the high level of heat transfer at the leading edge, whose value is comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge towards the pressure side.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    Science.gov (United States)

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  20. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    Science.gov (United States)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  1. Installation with magnetic suspension of test bodies for measurement of small forces. Verification of equivalence of inertial and gravitational mass

    International Nuclear Information System (INIS)

    Kalebin, S.M.

    1988-01-01

A torsion installation with magnetic suspension of test bodies for the detection of small forces is considered. Application of the installation to verification of the equivalence of inertial and gravitational mass, both for test bodies falling toward the Earth (Eötvös experiment) and for bodies falling toward the Sun (Dicke experiment), is discussed. The total mass of the test bodies, produced in the form of cylinders with 3 cm radius, equals 50 kg (one lead body and one copper body); the beam radius of the test bodies equals 3 cm (the cylinders are tight against one another); a ferrite cylinder with 3 cm radius and 10 cm height is used for their suspension in the magnetic field. The effect of thermal noise and electromagnetic force disturbances on the measurement results is considered. The calculations show that the suggested installation would improve the accuracy of verifying the mentioned equivalence by at least one order of magnitude, which makes such an installation of interest for experiments on small-force detection.
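For context, the thermal noise considered above sets a well-known floor on the smallest detectable torque. A standard fluctuation-dissipation estimate, not taken from this record (kappa is the torsion constant, Q the quality factor, omega_0 the resonance angular frequency, T the temperature and Delta f the measurement bandwidth), is:

```latex
\tau_{\min} \approx \sqrt{\frac{4 k_B T \kappa \, \Delta f}{Q \, \omega_0}}
```

Any small-force experiment of this kind must keep the expected signal torque above this thermal floor, which is why the abstract's attention to thermal noise matters.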

  2. Proposal for a verification facility of ADS in China

    International Nuclear Information System (INIS)

    Guan Xialing; Luo Zhanglin

    1999-01-01

    The concept, general layout and some specifications of a proposed verification facility of the accelerator driven radioactive clean nuclear power system (AD-RCNPS) in China are described. It is composed of a 150 MeV/3 mA low energy accelerator, a swimming pool reactor and some basic research facilities. The 150 MeV accelerator consists of an ECR proton source, LEBT, RFQ, CCDTL and SCC. As the sub-critical reactor, the swimming pool reactor is an existing research reactor at the China Institute of Atomic Energy, whose maximum output power is 3.5 MW. The effect of the instability of proton beam and possibility of simulation tests on the verification facility have been analysed. (author)

  3. Proposal for a verification facility of ADS in China

    International Nuclear Information System (INIS)

    Guan Xialing; Luo Zhanglin

    2000-01-01

The concept, general layout and some specifications of a proposed verification facility of the accelerator driven radioactive clean nuclear power system (AD-RCNPS) in China are described. It is composed of a 150 MeV/3 mA low energy accelerator, a swimming pool reactor and some basic research facilities. The 150 MeV accelerator consists of an ECR proton source, LEBT, RFQ, CCDTL and SCC. As the sub-critical reactor, the swimming pool reactor is an existing research reactor at the China Institute of Atomic Energy, whose maximum output power is 3.5 MW. The effect of the instability of the proton beam and the possibility of simulation tests on the verification facility have been analyzed.

  4. Cleanup Verification Package for the 300 VTS Waste Site

    International Nuclear Information System (INIS)

    Clark, S.W.; Mitchell, T.H.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 300 Area Vitrification Test Site, also known as the 300 VTS site. The site was used by Pacific Northwest National Laboratory as a field demonstration site for in situ vitrification of soils containing simulated waste

  5. Cleanup Verification Package for the 300 VTS Waste Site

    Energy Technology Data Exchange (ETDEWEB)

    S. W. Clark and T. H. Mitchell

    2006-03-13

    This cleanup verification package documents completion of remedial action for the 300 Area Vitrification Test Site, also known as the 300 VTS site. The site was used by Pacific Northwest National Laboratory as a field demonstration site for in situ vitrification of soils containing simulated waste.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  7. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  8. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are a preparation of the software planning documentation, a verification of the software according to the software life cycle, a software safety analysis and a software configuration management. The verification works for the Software Requirement Specification (SRS) of the DRPS consist of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  9. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  10. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
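Code verification benchmarks based on manufactured or analytical solutions, as recommended above, can be sketched briefly: a second-order finite-difference Poisson solver is checked against the manufactured solution u(x) = sin(pi x), and the observed order of accuracy is estimated by grid refinement. This is an illustrative Python sketch, not code from the paper:

```python
import numpy as np

def solve_poisson(n):
    """Second-order FD solve of -u'' = f on (0,1), u(0) = u(1) = 0,
    with the source term manufactured from u(x) = sin(pi*x)."""
    h = 1.0 / n
    xi = np.linspace(0.0, 1.0, n + 1)[1:-1]   # interior nodes
    f = np.pi ** 2 * np.sin(np.pi * xi)       # manufactured source
    A = (np.diag(np.full(n - 1, 2.0))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    exact = np.sin(np.pi * xi)                # manufactured solution
    return np.max(np.abs(u - exact))          # max-norm discretization error

e_coarse = solve_poisson(64)
e_fine = solve_poisson(128)
order = np.log2(e_coarse / e_fine)
print(f"observed order of accuracy: {order:.2f}")  # expect ~2 for a 2nd-order scheme
```

If the observed order falls short of the scheme's formal order, the code (or its boundary handling) contains a defect; this is the essence of a manufactured-solution verification benchmark.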

  11. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community

  12. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  13. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power-system with an organic working fluid. The technology verification phase's purposes are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  14. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  15. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation-based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for verifying the conformance of models generated at higher levels of abstraction in the design process to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.
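The equivalence checking of refined models against a reference can be illustrated with a toy stand-in. The paper uses symbolic simulation over recurrence equations; the sketch below instead compares two models of the same moving-average filter, one direct and one written as a recurrence, on randomized numeric traces (all names and parameters here are illustrative, and random testing is weaker than the symbolic method described above):

```python
import random

def avg_direct(xs, n=4):
    """Reference model: n-tap moving average by direct window summation."""
    return [sum(xs[max(0, k - n + 1): k + 1]) / n for k in range(len(xs))]

def avg_recurrence(xs, n=4):
    """Refined model: the same filter as a recurrence,
    y[k] = y[k-1] + (x[k] - x[k-n]) / n."""
    out, acc = [], 0.0
    for k, x in enumerate(xs):
        acc += x / n
        if k >= n:
            acc -= xs[k - n] / n
        out.append(acc)
    return out

# Randomized conformance check: a numeric stand-in for equivalence checking.
random.seed(0)
for _ in range(100):
    xs = [random.uniform(-1.0, 1.0) for _ in range(200)]
    pairs = zip(avg_direct(xs), avg_recurrence(xs))
    assert all(abs(a - b) < 1e-9 for a, b in pairs)
print("models agree on all randomized traces")
```

A symbolic approach would instead prove the two recurrence formulations equal for all inputs, which is exactly the exhaustiveness that numeric simulation cannot provide.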

  16. Testing of the dual slab verification detector for attended measurements of the BN-350 dry storage casks

    Energy Technology Data Exchange (ETDEWEB)

    Santi, Peter A [Los Alamos National Laboratory; Browne, Michael C [Los Alamos National Laboratory; Williams, Richard B [Los Alamos National Laboratory; Parker, Robert F [Los Alamos National Laboratory

    2009-01-01

The Dual Slab Verification Detector (DSVD) has been developed and built by Los Alamos National Laboratory in cooperation with the International Atomic Energy Agency (IAEA) as part of the dry storage safeguards system for the spent fuel from the BN-350 fast reactor. The detector consists of two rows of ³He tubes embedded in a slab of polyethylene which has been designed to be placed on the outer surface of a dry storage cask. The DSVD will be used to measure the neutron flux emanating from inside the dry storage cask at several locations around each cask to establish a neutron 'fingerprint' that is sensitive to the contents of the cask. The sensitivity of the fingerprinting technique to the removal of a specific amount of nuclear material from the cask is determined by the characteristics of the detector used to perform the measurements, the characteristics of the spent fuel being measured, and systematic uncertainties associated with the dry storage scenario. MCNPX calculations of the BN-350 dry storage casks and layout have shown that the neutron fingerprint verification technique using measurements from the DSVD would be sensitive to both the amount and the location of material present within an individual cask. To confirm the performance of the neutron fingerprint technique in verifying the presence of BN-350 spent fuel in dry storage, an initial series of measurements has been performed to test the performance and characteristics of the DSVD. Results of these measurements will be presented and compared with MCNPX results.
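The neutron 'fingerprint' comparison described above can be sketched as a simple statistical consistency test. All rates, the live time, and the 3-sigma threshold below are hypothetical, and Poisson counting statistics are assumed; the actual BN-350 verification procedure is not public in this record:

```python
import math

def fingerprint_consistent(baseline, current, t_live=600.0, n_sigma=3.0):
    """Flag positions whose measured rate deviates from the baseline
    fingerprint by more than n_sigma, assuming Poisson counting
    statistics for rates (counts/s) each measured over t_live seconds."""
    flags = []
    for r0, r1 in zip(baseline, current):
        sigma = math.sqrt((r0 + r1) / t_live)  # combined 1-sigma uncertainty
        flags.append(abs(r1 - r0) > n_sigma * sigma)
    return flags

# Hypothetical rates at four detector positions around one cask.
baseline = [52.1, 48.7, 50.3, 49.9]
current  = [51.8, 49.2, 50.6, 42.0]   # last position reads low
print(fingerprint_consistent(baseline, current))  # → [False, False, False, True]
```

A flagged position indicates a change in the neutron emission pattern, consistent with material having been moved or removed, and would trigger further investigation rather than an automatic conclusion.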

  17. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

A number of inspection and monitoring systems throughout the world over the last decades have been structured by drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  18. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. With a system of this size, s/w and h/w failures occur quite often. To minimize system downtime, the Trigger-DAQ control system shall include advanced verification and diagnostics facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, automatically if possible. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  19. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%, however the spiral phantom QA technique provides a more complete dosimetric verification while being less time consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan
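The point-by-point comparison against a 5% clinical tolerance described above can be sketched as follows; the predicted and measured doses are hypothetical, and deviations are taken as a percentage of the 1.8 Gy fraction dose (an assumed normalization, since the record does not state one):

```python
def within_tolerance(predicted, measured, prescription, tol_pct=5.0):
    """Return the worst point deviation (as % of the prescription dose)
    and whether every point is within the tolerance."""
    devs = [100.0 * abs(m - p) / prescription
            for p, m in zip(predicted, measured)]
    return max(devs), all(d <= tol_pct for d in devs)

# Hypothetical predicted vs. measured doses (Gy) at points along the
# spiral, for a 1.8 Gy fraction and a 5% clinical tolerance.
predicted = [1.80, 1.75, 0.92, 0.45, 1.10]
measured  = [1.78, 1.77, 0.98, 0.44, 1.08]
worst, ok = within_tolerance(predicted, measured, prescription=1.8)
print(f"max deviation: {worst:.1f}% of prescription, pass: {ok}")
```

Points in high dose-gradient regions, like the TLD noted in the abstract, are usually judged by distance-to-agreement rather than raw dose difference, since a sub-millimeter spatial offset there can produce a large apparent dose error.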

  20. On site PWR fuel inspection measurements for operational and design verification

    International Nuclear Information System (INIS)

    1996-01-01

    The on-site inspection of irradiated Pressurized Water Reactor (PWR) fuel and Non-Fuel Bearing Components (NFBC) is typically limited to visual inspections during refuelings using underwater TV cameras and is intended primarily to confirm whether the components will continue in operation. These inspections do not normally provide data for design verification nor information to benefit future fuel designs. Japanese PWR utilities and Nuclear Fuel Industries Ltd. designed, built, and performed demonstration tests of on-site inspection equipment that confirms operational readiness of PWR fuel and NFBC and also gathers data for design verification of these components. 4 figs, 3 tabs

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DEVILBISS JGHV-531-46FF HVLP SPRAY GUN

    Science.gov (United States)

    This report presents the results of the verification test of the DeVilbiss JGHV-531-46FF high-volume, low-pressure pressure-feed spray gun, hereafter referred to as the DeVilbiss JGHV, which is designed for use in industrial finishing. The test coating chosen by ITW Industrial Fi...

  2. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of a dynamic model of a drive system with gear. Tests were carried out on the real object under different operating conditions, and simulation studies were carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research with simulations using the dynamic model.

  3. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. The Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as smooth as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; for example, criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial considerations include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  4. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system is a system with which even a user who does not have much experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, subjects such as the functions, usability, and the precision and accuracy of the analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer equipped in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k{sub 0} method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k{sub 0} method adopted in this system has recently been utilized primarily in Europe, and it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values guaranteed as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed to an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  5. OpenMP 4.5 Validation and Verification Suite

    Energy Technology Data Exchange (ETDEWEB)

    2017-12-15

    OpenMP, a directive-based programming API, introduces directives for accelerator devices that programmers are starting to use more frequently in production codes. To make sure OpenMP directives work correctly across architectures, it is critical to have a mechanism that tests for an implementation's conformance to the OpenMP standard. This testing process can uncover ambiguities in the OpenMP specification, which helps compiler developers and users make better use of the standard. We fill this gap with our validation and verification test suite, which focuses on the offload directives available in OpenMP 4.5.

  6. Fault-specific verification (FSV) - An alternative VV&T strategy for high reliability nuclear software systems

    International Nuclear Information System (INIS)

    Miller, L.A.

    1994-01-01

    The author puts forth an argument that digital instrumentation and control systems can be safely applied in the nuclear industry, but that it will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product, over the whole development process; while collecting these data, developing and validating different methods for software verification, validation and testing, and applying them against all the detected faults; driving this development toward an automated testing product; and continuing to develop, expand, test, and share these testing methods across a wide array of software products.

  7. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verify burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations

  8. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types target-wrong and impostor-wrong, while it maintains comparable TD-SV performance with respect to the conventional system when impostors speak a correct utterance.
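    The maximum-likelihood model-selection step described in this abstract can be sketched as follows. Real TD-SV systems score GMMs over cepstral feature vectors; this toy version scores a single 1-D Gaussian per pass-phrase, and all model names and numbers are illustrative assumptions.

```python
# Toy sketch of ML selection of a pass-phrase dependent background model:
# score the test utterance's frames against each PBM and keep the best.
import math

def gaussian_loglik(x, mean, var):
    """Log-likelihood of one frame under a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def select_pbm(frames, pbms):
    """Return the pass-phrase whose model maximizes total log-likelihood."""
    def score(model):
        mean, var = model
        return sum(gaussian_loglik(x, mean, var) for x in frames)
    return max(pbms, key=lambda name: score(pbms[name]))

# Illustrative pass-phrase models (mean, variance) and a test utterance.
pbms = {"open sesame": (0.0, 1.0), "my voice is my passport": (5.0, 1.0)}
frames = [4.8, 5.1, 5.3]  # features lie closest to the second pass-phrase
print(select_pbm(frames, pbms))  # -> my voice is my passport
```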

  9. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  10. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    1999-01-01

    This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate

  11. Systemverilog for verification a guide to learning the testbench language features

    CERN Document Server

    Spear, Chris

    2012-01-01

    Based on the highly successful second edition, this extended edition of SystemVerilog for Verification: A Guide to Learning the Testbench Language Features teaches all verification features of the SystemVerilog language, providing hundreds of examples to clearly explain the concepts and basic fundamentals. It contains materials for both the full-time verification engineer and the student learning this valuable skill. In the third edition, authors Chris Spear and Greg Tumbush start with how to verify a design, and then use that context to demonstrate the language features,  including the advantages and disadvantages of different styles, allowing readers to choose between alternatives. This textbook contains end-of-chapter exercises designed to enhance students’ understanding of the material. Other features of this revision include: New sections on static variables, print specifiers, and DPI from the 2009 IEEE language standard Descriptions of UVM features such as factories, the test registry, and the config...

  12. Assessment of Condylar Changes in Patients with Temporomandibular Joint Pain Using Digital Volumetric Tomography

    International Nuclear Information System (INIS)

    Shetty, U.Sh.; Burde, K.N.; Naikmasur, V.G.; Sattur, A.P.

    2014-01-01

    Objective. To evaluate the efficiency of DVT in comparison with OPG in the assessment of bony condylar changes in patients of TMJ pain. Methods. 100 temporomandibular joints of 62 patients with the complaint of temporomandibular joint pain were included in the study. DVT and OPG radiographs were taken for all the 100 joints. Three observers interpreted the DVT and OPG radiographs for the bony changes separately two times with an interval of one week. The bony changes seen in the condyle were given coding from 0 to 6 (0: Normal, 1: Erosion, 2: Flattening, 3: Osteophyte, 4: Sclerosis, 5: Resorption, and 6: other changes). Interobserver and intraobserver variability was assessed with one-way ANOVA statistics. Z test was used to see the significant difference between OPG and DVT. Results. In the present study the interexaminer reliability for OPG and DVT was 0.903 and 0.978, respectively. Intraexaminer reliability for OPG and DVT was 0.908 and 0.980, respectively. The most common condylar bony change seen in OPG and DVT was erosion, followed by flattening and osteophyte. There was a significant difference between OPG and DVT in detecting erosion and osteophytes. The other changes observed in our study were Ely's cyst, pointed condyle, and bifid condyle. All the bony changes are more commonly seen in females than males. Conclusion. DVT provides more valid and accurate information on condylar bony changes. DVT has the added advantages of lower radiation exposure to the patient and cost effectiveness, and could be easily accessible in a dental hospital

  13. Assessment of Condylar Changes in Patients with Temporomandibular Joint Pain Using Digital Volumetric Tomography

    Directory of Open Access Journals (Sweden)

    Ujwala Shivarama Shetty

    2014-01-01

    Full Text Available Objective. To evaluate the efficiency of DVT in comparison with OPG in the assessment of bony condylar changes in patients of TMJ pain. Methods. 100 temporomandibular joints of 62 patients with the complaint of temporomandibular joint pain were included in the study. DVT and OPG radiographs were taken for all the 100 joints. Three observers interpreted the DVT and OPG radiograph for the bony changes separately for two times with an interval of one week. The bony changes seen in the condyle were given coding from 0 to 6 (0: Normal, 1: Erosion, 2: Flattening, 3: Osteophyte, 4: Sclerosis, 5: Resorption, and 6: other changes). Interobserver and intraobserver variability was assessed with one-way ANOVA statistics. Z test was used to see the significant difference between OPG and DVT. Results. In the present study the interexaminer reliability for OPG and DVT was 0.903 and 0.978, respectively. Intraexaminer reliability for OPG and DVT was 0.908 and 0.980, respectively. The most common condylar bony change seen in OPG and DVT was erosion followed by flattening and osteophyte. There was significant difference between OPG and DVT in detecting erosion and osteophytes. The other changes observed in our study were Ely’s cyst, pointed condyle, and bifid condyle. All the bony changes are more commonly seen in females than males. Conclusion. DVT provides more valid and accurate information on condylar bony changes. The DVT has an added advantage of lesser radiation exposure to the patient and cost effectiveness and could be easily accessible in a dental hospital.
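    The abstracts above report a Z test for the difference between OPG and DVT detection rates. The standard two-proportion z statistic can be sketched as follows; the counts are hypothetical, for illustration only, not the study's data.

```python
# Two-proportion z test, as used to compare detection rates between two
# imaging modalities. Counts below are made-up illustrative numbers.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled estimate under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: erosion detected in 35/100 joints on DVT vs 20/100 on OPG.
z = two_proportion_z(35, 100, 20, 100)
print(round(z, 2))  # |z| > 1.96 indicates significance at the 5% level
```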

  14. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  15. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
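    The allocation idea described above can be sketched as a greedy loop: repeatedly assign a unit of examination effort to the subsystem with the largest marginal verification risk until the total residual risk falls below a stop criterion. The risk model used here (each examination halves a subsystem's risk score) and all subsystem names and numbers are assumptions for illustration, not the paper's verification risk function.

```python
# Toy greedy allocation of examination effort driven by marginal risk.
# Assumption: one unit of examination halves a subsystem's risk score.

def allocate_effort(initial_risks, stop_criterion):
    """Greedy allocation; returns units of examination effort per subsystem."""
    risks = dict(initial_risks)
    effort = {name: 0 for name in risks}
    while sum(risks.values()) > stop_criterion:
        worst = max(risks, key=risks.get)  # largest marginal risk first
        effort[worst] += 1
        risks[worst] /= 2.0                # assumed effect of one examination
    return effort

# Hypothetical subsystems with initial risk scores and a stop criterion.
effort = allocate_effort({"propulsion": 8.0, "ballast": 4.0, "power": 2.0}, 5.0)
print(effort)  # most effort goes to the riskiest subsystem
```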

  16. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/ porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which has been used to test the numerical solution techniques and coding of MOTIF, as well as demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs
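    The MOTIF verification cases compare code output against analytical or independent numerical solutions. A generic form of such a check, sketched here with made-up values rather than MOTIF results, computes the maximum relative deviation between the two solutions.

```python
# Generic code-verification check: maximum relative deviation between a
# numerical solution and a reference (analytical) solution. Data are
# hypothetical, for illustration only.

def max_relative_error(numerical, analytical):
    """Largest |num - ref| / |ref| over all sample points."""
    return max(abs(n, ) if False else abs(n - a) / abs(a)
               for n, a in zip(numerical, analytical))

# Hypothetical hydraulic-head values at a few nodes.
analytical = [10.0, 8.0, 5.0, 2.0]
numerical = [10.02, 7.99, 5.01, 1.98]
print(max_relative_error(numerical, analytical) < 0.02)  # within 2 percent
```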

  17. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  18. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    Energy Technology Data Exchange (ETDEWEB)

    Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P. [Euratom, Communaute europeenne de l' energie atomique - CEEA (European Commission (EC))

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  19. KNGR core protection calculator software, verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. This document is intended for a verifier or reviewer who is involved in performing software verification and validation task activities in KNGR projects. This document includes the basic philosophy, performance of the V and V effort, software testing techniques, and criteria for review and audit of the safety software V and V activity. Major review topics on safety software address three kinds of characteristics based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14, when reviewing the SVVP: management characteristics, implementation characteristics and resources characteristics. Based on the major topics of this document, we have produced a list of evaluation items in the form of a checklist in Appendix A

  20. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of the verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated

  1. Construction and commissioning test report of the CEDM test facility

    Energy Technology Data Exchange (ETDEWEB)

    Chung, C. H.; Kim, J. T.; Park, W. M.; Youn, Y. J.; Jun, H. G.; Choi, N. H.; Park, J. K.; Song, C. H.; Lee, S. H.; Park, J. K

    2001-02-01

    The test facility for performance verification of the control element drive mechanism (CEDM) of the next generation power plant was installed at the KAERI site. The CEDM features a mechanism consisting of complicated mechanical parts and an electromagnetic control system. Thus, a new CEDM design should go through performance verification tests prior to its application in a reactor. The test facility can simulate reactor operating conditions such as temperature, pressure and water quality, and is equipped with a test chamber to accommodate a CEDM as installed in the power plant. This test facility can be used for the following tests: endurance test, coil cooling test, power measurement and reactivity rod drop test. The commissioning tests for the test facility were performed up to the CEDM test conditions of 320 C and 150 bar, and the required water chemistry was obtained by operating the on-line water treatment system.

  2. Construction and commissioning test report of the CEDM test facility

    International Nuclear Information System (INIS)

    Chung, C. H.; Kim, J. T.; Park, W. M.; Youn, Y. J.; Jun, H. G.; Choi, N. H.; Park, J. K.; Song, C. H.; Lee, S. H.; Park, J. K.

    2001-02-01

    The test facility for performance verification of the control element drive mechanism (CEDM) of the next generation power plant was installed at the KAERI site. The CEDM features a mechanism consisting of complicated mechanical parts and an electromagnetic control system. Thus, a new CEDM design should go through performance verification tests prior to its application in a reactor. The test facility can simulate reactor operating conditions such as temperature, pressure and water quality, and is equipped with a test chamber to accommodate a CEDM as installed in the power plant. This test facility can be used for the following tests: endurance test, coil cooling test, power measurement and reactivity rod drop test. The commissioning tests for the test facility were performed up to the CEDM test conditions of 320 C and 150 bar, and the required water chemistry was obtained by operating the on-line water treatment system.

  3. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  4. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    International Nuclear Information System (INIS)

    Maruyama, Soh; Fujimoto, Nozomu; Sudo, Yukio; Kiso, Yoshihiro; Murakami, Tomoyuki.

    1988-09-01

    This report presents the verification results of the combined thermal-hydraulic and heat conduction analysis code, FLOWNET/TRUMP, which has been utilized for the core thermal hydraulic design, especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis and the estimation of fuel temperature in the case of fuel block coolant channel blockage accident in the design of the High Temperature Engineering Test Reactor (HTTR), which the Japan Atomic Energy Research Institute has been planning to construct in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for promotion of innovative high temperature new frontier technologies. The verification of the code was done through the comparison between the analytical results and experimental results of the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks. (author)

  5. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    Science.gov (United States)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results of the combined thermal-hydraulic and heat conduction analysis code, FLOWNET/TRUMP, which has been utilized for the core thermal hydraulic design, especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis and the estimation of fuel temperature in the case of fuel block coolant channel blockage accident in the design of the High Temperature Engineering Test Reactor (HTTR), which the Japan Atomic Energy Research Institute has been planning to construct in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for promotion of innovative high temperature new frontier technologies. The verification of the code was done through the comparison between the analytical results and experimental results of the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.

  6. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  7. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  8. Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis

    International Nuclear Information System (INIS)

    Goodacre, Steve; Sampson, Fiona; Thomas, Steve; Beek, Edwin van; Sutton, Alex

    2005-01-01

    Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy. We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random effects meta-regression was used to identify study-level covariates that predicted diagnostic performance. We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) was 94.2% (93.2 to 95.0) for proximal DVT and 63.5% (59.8 to 67.0) for distal DVT; specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT, 71.2% (64.6 to 77.2) for distal DVT and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT, 75.2% (67.7 to 81.6) for distal DVT and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT, 56.8% (49.0 to 66.4) for distal DVT and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US
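
    The pooled estimates above come from random-effects meta-analysis of study-level counts. As a rough illustration (not the review's actual data or software), the following sketch pools sensitivity on the logit scale with a DerSimonian-Laird estimate of the between-study variance; the per-study true-positive/false-negative counts are invented.

```python
import math

# Hypothetical per-study (true positives, false negatives) counts.
studies = [(45, 3), (88, 6), (30, 4), (120, 5)]

# Logit-transformed sensitivity and within-study variance per study,
# with a 0.5 continuity correction.
logits, variances = [], []
for tp, fn in studies:
    tp_c, fn_c = tp + 0.5, fn + 0.5
    logits.append(math.log(tp_c / fn_c))
    variances.append(1.0 / tp_c + 1.0 / fn_c)

# DerSimonian-Laird between-study variance (tau^2).
w = [1.0 / v for v in variances]
fixed = sum(wi * yi for wi, yi in zip(w, logits)) / sum(w)
q_stat = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logits))
df = len(studies) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q_stat - df) / c)

# Random-effects pooled logit and 95% confidence interval,
# transformed back to the probability scale.
w_re = [1.0 / (v + tau2) for v in variances]
pooled = sum(wi * yi for wi, yi in zip(w_re, logits)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))
inv = lambda x: 1.0 / (1.0 + math.exp(-x))
print("pooled sensitivity: %.3f (%.3f to %.3f)"
      % (inv(pooled), inv(pooled - 1.96 * se), inv(pooled + 1.96 * se)))
```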

  9. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Min, Moon Gi; Lee, Jae Ki; Lee, Kwang Hyun; Lee, Dong Il; Lim, Hee Taek [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)

    2017-08-15

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed during these phases because not all relevant equipment is completely installed. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover function works correctly under certain abnormal conditions during the installation and commissioning phases and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for verification of DCS network failover during the installation and commissioning phases. It is expected to provide insight into verification methodology and the failover effects of DCS redundant networks, and it also provides test results of network performance from DCS network failover in digitalized domestic nuclear power plants (NPPs).
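
    One simple way to quantify a failover from collected packets, in the spirit of the study's packet-based measurements (though not its actual tooling), is to look for the largest inter-arrival gap in a cyclic data stream. The timestamps and nominal period below are invented.

```python
# Hypothetical packet-arrival timestamps (seconds) captured from a cyclic
# DCS data stream; the interruption caused by a forced network failover
# shows up as an unusually long inter-arrival gap.
timestamps = [0.00, 0.10, 0.20, 0.30, 0.95, 1.05, 1.15, 1.25]
nominal_period = 0.10  # expected transmission cycle

# Inter-arrival gaps between consecutive packets.
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
worst = max(gaps)

# Interruption beyond one normal cycle = estimated failover time.
failover_time = worst - nominal_period
print(f"worst gap: {worst:.2f}s, estimated failover interruption: {failover_time:.2f}s")
```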

  10. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree sufficient (1) to ensure that it can be run without user intervention and (2) to ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  11. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  12. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  13. Guidelines for Sandia ASCI Verification and Validation Plans - Content and Format: Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    TRUCANO,TIMOTHY G.; MOYA,JAIME L.

    1999-12-01

    This report summarizes general guidelines for the development of Verification and Validation (V and V) plans for ASCI code projects at Sandia National Laboratories. The main content categories recommended by these guidelines for explicit treatment in Sandia V and V plans are (1) the stockpile drivers influencing the code development project; (2) the key phenomena to be modeled by the individual code; (3) the software verification strategy and test plan; and (4) the code validation strategy and test plans. The authors of this document anticipate that the needed content of the V and V plans for the Sandia ASCI codes will evolve as time passes. These needs will be reflected in future versions of this document.

  14. Cleanup Verification Package for the 600-259 Waste Site

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Capron

    2006-02-09

    This cleanup verification package documents completion of remedial action for the 600-259 waste site. The site was the former site of the Special Waste Form Lysimeter, consisting of commercial reactor isotope waste forms in contact with soils within engineered caissons, and was used by Pacific Northwest National Laboratory to collect data regarding leaching behavior for target analytes. A Grout Waste Test Facility also operated at the site, designed to test leaching rates of grout-solidified low-level radioactive waste.

  15. Cleanup Verification Package for the 600-259 Waste Site

    International Nuclear Information System (INIS)

    Capron, J.M.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 600-259 waste site. The site was the former site of the Special Waste Form Lysimeter, consisting of commercial reactor isotope waste forms in contact with soils within engineered caissons, and was used by Pacific Northwest National Laboratory to collect data regarding leaching behavior for target analytes. A Grout Waste Test Facility also operated at the site, designed to test leaching rates of grout-solidified low-level radioactive waste

  16. Do the TTBT and JVE provide a framework for 'effective' verification?

    International Nuclear Information System (INIS)

    Vergino, E.S.

    1998-01-01

    The Threshold Test Ban Treaty (TTBT) was signed in 1974 by Richard Nixon and Leonid Brezhnev, with both the US and USSR agreeing to adhere to the treaty's 150 kt limit as of March 31, 1976. Yet the treaty remained unratified for more than twelve years, during which time, at the height of the Cold War, the US and USSR continued to accuse one another of violating it. In late 1987, during the Nuclear Testing Talks in Geneva, the Joint Verification Experiment (JVE) was discussed; it was then formally announced at the Shultz/Shevardnadze meeting in December 1987. In the course of arranging the JVE, information and data for five Soviet and five US nuclear tests were exchanged. JVE activity culminated with Kearsarge, detonated on August 17, 1988, and Shagan, detonated on September 14, 1988. The JVE provided a unique opportunity for US and USSR technical experts to work together to demonstrate that effective verification of the TTBT could be achieved. The TTBT was the first treaty in which the US pursued a series of complex protocols involving additional, intrusive verification measures. These required extensive collaboration between the scientific and political communities, a collaboration necessary to address the balance between technical capabilities and requirements on the one hand and political drivers and needs on the other. In this talk the author discusses this balance, how it changed with time, the drivers for change, the lessons learned, and whether there are lessons applicable to the development of other, future arms control agreements

  17. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Mayo, Jackson

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  18. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution times of the compositional and the classical verification, showing a huge difference.

  19. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  20. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  1. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated
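
    The key design element described above, an information barrier that reduces classified measurements to unclassified pass/fail attributes, can be sketched schematically. The attribute names and thresholds below are invented for illustration and are not actual treaty or IAEA criteria; the point is only that raw measurement values never cross the barrier.

```python
# Illustrative only: classified measurement values stay inside the
# barrier; only unclassified attribute names and pass/fail bits leave.

def information_barrier(measurements: dict) -> dict:
    """Reduce classified measurements to unclassified boolean attributes."""
    checks = {
        "plutonium_present": lambda m: m["pu_mass_g"] > 0,
        "mass_above_threshold": lambda m: m["pu_mass_g"] >= 500.0,
        "isotopic_ratio_weapons_grade": lambda m: m["pu240_to_pu239"] < 0.1,
    }
    # Only the attribute name and a boolean verdict cross the barrier.
    return {name: bool(check(measurements)) for name, check in checks.items()}

# Hypothetical classified measurement values (never displayed directly).
classified = {"pu_mass_g": 1800.0, "pu240_to_pu239": 0.06}
declared = information_barrier(classified)
print(declared)
```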

  2. An overview of CTBT verification technologies and status of IMS networks

    International Nuclear Information System (INIS)

    Basham, P.

    2002-01-01

    The CTBT history is described in brief. Nuclear test environments and verification regime are presented with illustrations. Geographical location of seismic, hydroacoustic, infrasound and radionuclide monitoring networks are indicated on globe maps. Benefits of the International Monitoring System (IMS) to host countries are listed

  3. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on
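
    The collaborative-monitor idea can be made concrete with a minimal sketch. This is not the paper's formal model: each local monitor below checks a safety predicate over the events of its own component only, and a coordinator combines the local verdicts conjunctively. Component names, events, and predicates are invented.

```python
class LocalMonitor:
    """A monitor with a purely local view of the system execution."""

    def __init__(self, component, predicate):
        self.component = component
        self.predicate = predicate
        self.ok = True

    def observe(self, event):
        # Local view: events from other components are invisible here.
        if event["component"] == self.component:
            self.ok = self.ok and self.predicate(event)

# A hypothetical system execution (sequence of events).
trace = [
    {"component": "sensor", "value": 3},
    {"component": "actuator", "value": 7},
    {"component": "sensor", "value": 4},
]

# One local monitor per component, each checking "value stays below 10".
monitors = [
    LocalMonitor("sensor", lambda e: e["value"] < 10),
    LocalMonitor("actuator", lambda e: e["value"] < 10),
]
for event in trace:
    for m in monitors:
        m.observe(event)

# Coordinator: the global property holds iff every local monitor agrees.
verdict = all(m.ok for m in monitors)
print("property holds:", verdict)
```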

  4. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty.
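
    The identifiability problem raised above can be seen in a deliberately trivial model. In the sketch below (invented data), the model y = a*b*x fits the data perfectly for infinitely many (a, b) pairs because only the product a*b is constrained, so calibration alone cannot identify the individual parameters.

```python
# Data generated with a*b = 6; any factorization of 6 fits equally well.
data_x = [1.0, 2.0, 3.0, 4.0]
data_y = [6.0, 12.0, 18.0, 24.0]

def sse(a, b):
    """Sum of squared errors of the model y = a*b*x against the data."""
    return sum((a * b * x - y) ** 2 for x, y in zip(data_x, data_y))

# Two structurally different parameter sets, identical (perfect) fit:
fit_1 = sse(2.0, 3.0)
fit_2 = sse(1.0, 6.0)
print(fit_1, fit_2)
```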

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SHARPE MANUFACTURING TITANIUM T1-CG SPRAY GUN

    Science.gov (United States)

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ANEST IWATA CORPORATION W400-LV SPRAY GUN

    Science.gov (United States)

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BIOQUELL, INC. CLARUS C HYDROGEN PEROXIDE GAS GENERATOR

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Clarus C Hydrogen Peroxide Gas Generator, a biological decontamination device manufactured by BIOQUELL, Inc. The unit was tested by evaluating its ability to decontaminate seven types...

  8. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  9. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  10. The verification basis of the PM-ALPHA code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Angelini, S. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the PM-ALPHA code is presented and implemented. The approach consists of a stepwise testing procedure focused principally on the multifield aspects of the premixing phenomenon. Breakup is treated empirically, but it is shown that, through reasonable choices of the breakup parameters, consistent interpretations of existing integral premixing experiments can be obtained. The present capability is deemed adequate for bounding energetics evaluations. (author)

  11. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    Science.gov (United States)

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with and somewhat more committed to partners when they perceived that partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  12. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    Science.gov (United States)

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  13. Reuse of the test information

    International Nuclear Information System (INIS)

    Markoski, Branko; Malbaski, Dushan; Hotomski, Petar

    2006-01-01

    Within the software life cycle, program testing is very important, since the quality of the specified requirements, the design, and the implementation must be proven. Testing of large and complicated programs must be done as systematically as possible in order to obtain reliability. For large and complex systems and their operating systems, ad hoc testing is often used, which frequently cannot prove quality or validity with respect to specification, construction, or application. Validation and verification are terms often connected with program testing. Verification is the checking and testing of objects (or programs) to determine whether they are in accordance with their specifications; it comprises analysis, inspection, review, and testing of the program. In testing software, we ordinarily perform static analysis (examining the program without executing it, searching for basic problems and collecting data) and dynamic analysis (examining the behavior of the program during execution to acquire data about execution paths, timing, and the completeness of testing). Every company that develops software tests its products, yet software on the market usually still contains complex combinations of defects; sometimes it is difficult to understand how testing can miss such obvious errors.
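
    The static/dynamic distinction drawn above can be made concrete in a few lines of Python. This is only an illustrative sketch: the static side inspects the program text with the standard ast module without running it, while the dynamic side executes the program and observes an actual failure.

```python
import ast

source = """
def divide(a, b):
    return a / b
"""

# Static analysis: inspect the code without running it, e.g. flag any
# division expression as a potential ZeroDivisionError site.
tree = ast.parse(source)
static_warnings = [
    f"line {node.lineno}: unchecked division"
    for node in ast.walk(tree)
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div)
]

# Dynamic analysis: execute the code and observe its actual behavior.
namespace = {}
exec(source, namespace)
try:
    namespace["divide"](1, 0)
    dynamic_result = "no failure observed"
except ZeroDivisionError:
    dynamic_result = "ZeroDivisionError on input (1, 0)"

print(static_warnings)
print(dynamic_result)
```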

  14. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
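
    A filter-bank representation of the kind described can be sketched with ordinary Gabor filters. The code below is an illustration, not the study's implementation: it builds real Gabor kernels at four orientations, applies them to a synthetic ridge patch via FFT-based convolution, and uses the response energies as a small feature vector. All parameters (kernel size, frequency, sigma) are invented.

```python
import numpy as np

def gabor_kernel(size, theta, freq=0.1, sigma=4.0):
    """Real Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

# Synthetic "fingerprint patch": vertical ridges (intensity varies along x).
h = w = 33
xs = np.arange(w)
patch = np.tile(np.cos(2 * np.pi * 0.1 * xs), (h, 1))

# Filter bank at 4 orientations; one feature per filter = response energy.
thetas = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
features = []
for theta in thetas:
    k = gabor_kernel(33, theta)
    resp = np.fft.ifft2(np.fft.fft2(patch) * np.fft.fft2(k, s=patch.shape)).real
    features.append(float(np.std(resp)))

print(["%.2f" % f for f in features])
# The 0-rad filter, tuned to ridges varying along x, responds most strongly.
```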

  15. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
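
    The report's internal formulas are not reproduced in the abstract; the sketch below only shows the textbook way to combine per-technology detection probabilities under an independence assumption, P = 1 - prod(1 - p_i). The per-technology probabilities are invented, not IVSEM outputs.

```python
def combined_detection(probabilities):
    """Probability that at least one independent subsystem detects an event."""
    missed = 1.0
    for p in probabilities:
        missed *= (1.0 - p)
    return 1.0 - missed

# Hypothetical per-technology detection probabilities for one event.
p_by_technology = {
    "seismic": 0.80,
    "infrasound": 0.30,
    "radionuclide": 0.50,
    "hydroacoustic": 0.10,
}
p_total = combined_detection(p_by_technology.values())
print(f"integrated probability of detection: {p_total:.3f}")
```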

  16. Comparison between 99Tcm-porcine plasmin and 99Tcm-labelled erythrocytes in diagnosis of deep vein thrombosis

    Energy Technology Data Exchange (ETDEWEB)

    Edenbrandt, C.M.; Dahlstroem, J.A.; Nilsson, J.; Ohlin, P.

    1984-06-01

    In 20 patients with suspected deep venous thrombosis (DVT), scintillation detector measurements were performed over each leg during the first 60 min after intravenous injection of 99Tcm-porcine plasmin. Thereafter, 99Tcm-labelled autologous erythrocytes were injected i.v. and repeat measurements were performed. Finally, scintillation camera images of both legs were obtained. Phlebography was used as the reference method. A close relationship was found between the scintillation detector measurements, both in patients with DVT (n = 11) and in patients without DVT (n = 9). Thus, 99Tcm-plasmin is not specifically bound to the thrombus; rather, the clinical utility of the test depends mainly on circulatory changes secondary to the thrombus. Scintillation camera images of 99Tcm-erythrocytes in the legs were not useful for diagnosis of DVT in the calves but showed a high specificity for DVT in the popliteal and femoral veins.

  17. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  18. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented in a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
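The core primitive here, phase-only correlation, normalizes the cross-power spectrum to unit magnitude so that a genuine key/lock pair yields a sharp correlation peak. The paper's dual-step algorithm with nonlinear encoding is not reproduced here; this is only a minimal single-correlation sketch in NumPy, with an illustrative threshold:

```python
import numpy as np

def phase_only_correlation(a, b):
    """Inverse FFT of the unit-magnitude cross-power spectrum of a and b.
    A matching pair produces a near-delta peak (value close to 1)."""
    cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    cross /= np.abs(cross) + 1e-12          # keep phase, discard magnitude
    return np.fft.ifft2(cross).real

def verify(key, lock, threshold=0.5):
    """Accept the key if the correlation peak clears a preset threshold.
    The threshold value is an assumption, not taken from the paper."""
    return phase_only_correlation(key, lock).max() >= threshold
```

A genuine pair peaks near 1.0 at the relative shift between the two patterns, while an unrelated pair stays near the noise floor; that separation is what makes thresholded peak detection usable for verification.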

  19. Partial Defect Verification of Spent Fuel Assemblies by PDET: Principle and Field Testing in Interim Spent Fuel Storage Facility (CLAB) in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Ham, Y.S.; Kerr, P.; Sitaraman, S.; Swan, R. [Global Security Directorate, Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States)]; Rossa, R. [SCK-CEN, Mol (Belgium)]; Liljenfeldt, H. [SKB, Oskarshamn (Sweden)]

    2015-07-01

    The need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over the past few decades in the safeguards communities, as diverted spent fuel pins can be a source of material for nuclear terrorism or nuclear devices. The need has become even more important and urgent as many countries have started to transfer spent fuel to so-called 'difficult-to-access' areas such as dry storage casks, reprocessing facilities or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into 'difficult-to-access' areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies that does not use operator data, and further reported validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust, as it is relatively invariant to characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17x17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed in a commercial spent fuel storage site known as the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator-declared average burnup of the assembly as well as intra-assembly burnup levels. (authors)

  20. Software verification in on-line systems

    International Nuclear Information System (INIS)

    Ehrenberger, W.

    1980-01-01

    Operator assistance is more and more provided by computers. Computers run programs whose quality should be above a certain level before they are allowed to be used in reactor control rooms. Several possibilities for obtaining software reliability figures are discussed in this paper. By supervising the testing procedure of a program, one can estimate the number of remaining programming errors. Such an estimation, however, is not very accurate. With mathematical proving procedures one can gain some knowledge of program properties. Such proving procedures are important for the verification of general WHILE-loops, which tend to be error-prone. Program analysis decomposes a program into its parts. First the program structure is made visible, including the data movements and the control flow. From this analysis, test cases can be derived that lead to a complete test. Program analysis can be done by hand or automatically. A statistical program test normally requires a large number of test runs. This number is diminished if details concerning either the program to be tested or its use are known in advance. (orig.)

  1. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb, released in 2001. Since then, various improvements have been implemented in the code, including the NXT: module, which can track assemblies of clusters in 2-D and 3-D geometries. Here we discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  2. Working Group 3: Broader Perspectives on Non-proliferation and Nuclear Verification

    International Nuclear Information System (INIS)

    Dreicer, M.; Pregenzer, A.; Stein, G.

    2013-01-01

    This working group (WG) focused on technical topics related to international security and stability in global nonproliferation and arms control regimes, and asked how nonproliferation tools and culture might facilitate verification of future nuclear treaties. The review of existing and future nonproliferation and disarmament regimes (Comprehensive Test Ban Treaty - CTBT, UNSC Resolution 1540, the UK/Norway/VERTIC exercise, Fissile Material Cut-off Treaty - FMCT) offered a view of the challenges, possibilities, and limitations of future initiatives. The concepts the WG considered, with potential use in implementing future nuclear verification treaties, are: Triple S Culture (Safety, Security, Safeguards), the State-Level Approach, Safeguards-by-Design, risk-based approaches, managed access, inspections, and protection of sensitive information. Under these concepts, many existing tools considered by the WG could be used for nuclear verification. Export controls work to restrict sensitive technology and expertise; global implementation is complicated and multi-faceted and would benefit from greater consistency and efficiency. In most cases, international cooperation and the development of international capability would supplement these efforts. This document is composed of the slides and the paper of the presentation. (A.C.)

  3. Deep vein thrombosis of the lower limbs in intravenous drug users

    Directory of Open Access Journals (Sweden)

    Wiesława Kwiatkowska

    2015-04-01

    Full Text Available Addiction to intravenously administered drugs has been a serious epidemiological problem for years. Among the related health complications, deep vein thrombosis (DVT) is one of the most important. This paper provides an illustrative presentation of DVT in intravenous drug users (IDUs), HIV-positive subjects among them. We searched PubMed, Ovid Journals, Scopus, ScienceDirect, the Cochrane Library, Google Scholar and references from articles obtained. The main terms used to identify appropriate studies of DVT in IDUs were ‘intravenous drug users’, ‘substance-related disorders’ and ‘deep vein thrombosis’. No guidelines exist for DVT in intravenous drug users. As many as 47.6% of IDUs report having suffered from DVT. IDUs may constitute approx. 50% of patients under 40 years of age with DVT, this being promoted by multiple vein punctures, groin injections, lack of sterility, insoluble microparticles and other factors. The clinical appearance is more complex than in the general population, which also makes prognosis more difficult. HIV infection can worsen DVT. It often appears as proximal iliofemoral thrombosis, accompanied by local and general complications. Ultrasound with a compression test is the objective method of choice, but must often be complemented with computed tomography. Antithrombotic therapy in IDUs needs to be applied individually. The optimal method is supervised therapy at addiction treatment services. Individual and public preventive measures, among them locally prepared guidelines for DVT in IDUs, may be the most important processes capable of effectively reducing the morbidity of septic and non-septic DVT.

  4. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  5. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  6. Verification of thermal-irradiation stress analytical code VIENUS of graphite block

    International Nuclear Information System (INIS)

    Iyoku, Tatsuo; Ishihara, Masahiro; Shiozawa, Shusaku; Shirai, Hiroshi; Minato, Kazuo.

    1992-02-01

    The core graphite components of the High Temperature Engineering Test Reactor (HTTR) show both the dimensional change (irradiation shrinkage) and creep behavior due to fast neutron irradiation under the temperature and the fast neutron irradiation conditions of the HTTR. Therefore, thermal/irradiation stress analytical code, VIENUS, which treats these graphite irradiation behavior, is to be employed in order to design the core components such as fuel block etc. of the HTTR. The VIENUS is a two dimensional finite element viscoelastic stress analytical code to take account of changes in mechanical properties, thermal strain, irradiation-induced dimensional change and creep in the fast neutron irradiation environment. Verification analyses were carried out in order to prove the validity of this code based on the irradiation tests of the 8th OGL-1 fuel assembly and the fuel element of the Peach Bottom reactor. This report describes the outline of the VIENUS code and its verification analyses. (author)

  7. Evaluating the Use of a Negative D-Dimer and Modified Low Wells Score in Excluding above Knee Deep Venous Thrombosis in an Outpatient Population, Assessing Need for Diagnostic Ultrasound

    International Nuclear Information System (INIS)

    Rahiminejad, Maryam; Rastogi, Anshul; Prabhudesai, Shirish; Mcclinton, David; MacCallum, Peter; Platton, Sean; Friedman, Emma

    2014-01-01

    Aims. Colour Doppler ultrasonography (CDUS) is widely used in the diagnosis of deep venous thrombosis (DVT); however, the number of scans positive for above knee DVT is low. The present study evaluates the reliability of the D-dimer test combined with a clinical probability score (Wells score) in ruling out an above knee DVT and identifying patients who do not need a CDUS. Materials and Methods. This study is a retrospective audit and reaudit of a total of 816 outpatients presenting with suspected lower limb DVT from March 2009 to March 2010 and from September 2011 to February 2012. Following the initial audit, a revised clinical diagnostic pathway was implemented. Results. In our initial audit, seven patients (4.9%) with a negative D-dimer and a low Wells score had a DVT. On review, all seven had a risk factor identified that was not included in the Wells score. No patient with a negative D-dimer, a low Wells score and no extra clinical risk factor had a DVT on CDUS (negative predictive value 100%). A reaudit confirmed adherence to our revised clinical diagnostic pathway. Conclusions. A negative D-dimer together with a low Wells score and no risk factors effectively excludes a lower limb DVT, and an ultrasound is unnecessary in these patients.
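The revised pathway in this audit amounts to a simple rule-out predicate: skip the CDUS scan only when the Wells score is low, the D-dimer is negative, and no risk factor outside the Wells score is present. A sketch of that triage logic; the low-score cutoff (<= 1, i.e. "DVT unlikely") and the example risk factor are assumptions, since the abstract does not list them:

```python
def needs_ultrasound(wells_score, d_dimer_negative, extra_risk_factors):
    """Return True when a CDUS scan is still warranted.

    DVT is ruled out (no scan needed) only when ALL of these hold:
      * low clinical probability (assumed cutoff: modified Wells score <= 1)
      * negative D-dimer
      * no additional risk factors outside the Wells score
    """
    low_wells = wells_score <= 1
    ruled_out = low_wells and d_dimer_negative and not extra_risk_factors
    return not ruled_out
```

The third condition encodes the audit's key finding: all seven false rule-outs had a risk factor not captured by the Wells score, so adding that check raised the pathway's negative predictive value to 100%.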

  8. The verification basis of the ESPROSE.m code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Freeman, K.; Chen, X. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the ESPROSE.m code is presented and implemented. The approach consists of a stepwise testing procedure from wave dynamics aspects to explosion coupling at the local level, and culminates with the consideration of propagating explosive events. Each step in turn consists of an array of analytical and experimental tests. The results indicate that, given the premixture composition, the prediction of energetics of large scale explosions in multidimensional geometries is within reach. The main need identified is for constitutive laws for microinteractions with reactor materials; however, reasonably conservative assessments are presently possible. (author)

  9. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion focuses on the technical issues raised by warhead arms control. Technical complications arise from several sources; these are discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  10. Anatomic variation of the deep venous system and its relationship with deep vein thrombosis found on the lower extremity venograms that were obtained after artificial joint replacements

    International Nuclear Information System (INIS)

    Lee, Min Sun; Lee, Jee Eun; Hwang, Ji Young; Shim, Sung Shine; Yoo, Jeong Hyun; Suh, Jeong Soo; Park, Jae Young

    2006-01-01

    We wanted to evaluate the anatomic variations, the number of valves and the presence of deep vein thrombosis (DVT) on lower extremity venograms obtained after artificial joint replacements, and to determine the correlation of the incidence of DVT with these factors and with the operation sites. From January to June 2004, conventional ascending contrast venographies of the lower extremities were performed in 119 patients at 7-10 days after artificial joint replacement; all the patients were asymptomatic. Total knee replacement was done in 152 cases and total hip replacement in 34 cases. On all the venographic images of 186 limbs, the anatomic variations were classified and the presence of DVT was evaluated; the number of valves in the superficial femoral vein (SFV) and calf veins was counted. The sites of DVT were classified as calf, thigh and pelvis. Statistically, chi-square tests and Fisher's exact tests were performed to determine the correlation of the incidence of DVT with the anatomic variations, the numbers of valves and the operation sites. Theoretically, nine types of anatomical variation in the deep vein system of the lower extremity can be classified, but only seven types were observed in this study. The most frequent type was the normal single SFV type, noted in 117 cases (63%); the others were all variations (69 cases, 37%). The incidence of DVT was 22.2% in the normal single SFV type and 26.4% in the other variations; no significant difference was noted between the two groups. In addition, no significant statistical differences were noted in the incidences of DVT between the single or variant multiple veins in the SFV and the popliteal vein (PV), respectively, between the groups with small or large numbers of valves in the thigh and calf, respectively, or between the different operation sites of the hip or knee.
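The comparisons above are 2x2 independence tests (DVT vs. no DVT across two anatomy groups). As a small self-contained illustration, here is a Pearson chi-square for a 2x2 table with its 1-degree-of-freedom p-value; the counts in the example are illustrative, not the study's exact table:

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]].

    Returns (statistic, p) with p from the chi-square(1) survival
    function, which for 1 df equals erfc(sqrt(stat / 2)).
    """
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return stat, erfc(sqrt(stat / 2.0))

# Illustrative counts: DVT / no DVT in normal-SFV vs. variant limbs
stat, p = chi2_2x2(26, 91, 18, 51)
```

With counts in this range the statistic falls well below the 3.84 critical value at alpha = 0.05, consistent with the study's finding of no significant difference between the groups.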

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  12. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy; Jonkman, Jason; Hall, Matthew

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  13. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  14. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  15. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained. - A process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities and defining a validated domain (a domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available. - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  16. Practical problems in the verification of a family of inventory items

    International Nuclear Information System (INIS)

    Rota, A.; Stricht, E. van der; Stanners, W.

    1976-01-01

    This paper deals with the problem of checking medium-size families of items of similar specification during inventory verifications. An example of quantitative analysis using the signed rank test is given. The hypotheses commonly accepted in the statistical tests performed are extensively discussed. The difficulties met in practice which limit the application of the theory are pointed out and the solution adopted in the case considered is illustrated. (author)
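As a concrete illustration of the signed-rank approach to inventory verification, the sketch below compares operator-declared values with inspector measurements using a Wilcoxon signed-rank test under the usual large-sample normal approximation. The code is illustrative; the paper's exact test variant, family sizes and acceptance thresholds are not reproduced here:

```python
from math import erfc, sqrt

def wilcoxon_signed_rank(declared, measured):
    """Wilcoxon signed-rank test on paired differences (normal approximation).

    Returns (W+, two-sided p). Zero differences are dropped; tied absolute
    differences receive their average rank.
    """
    d = [m - x for x, m in zip(declared, measured) if m != x]
    n = len(d)
    ranked = sorted((abs(v), i) for i, v in enumerate(d))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and ranked[j + 1][0] == ranked[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank over the tie block
        for k in range(i, j + 1):
            ranks[ranked[k][1]] = avg
        i = j + 1
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    mu = n * (n + 1) / 4
    sigma = sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    return w_plus, erfc(abs(z) / sqrt(2))
```

A small p-value flags a systematic declared-versus-measured discrepancy across the family; in practice the normal approximation needs a reasonable number of nonzero paired differences (roughly n >= 10), and exact tables are used below that.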

  17. Diagnosis of deep venous thrombosis by phlebography and 99Tcm-Plasmin

    Energy Technology Data Exchange (ETDEWEB)

    Edenbrandt, C.M.; Nilsson, J.; Ohlin, P. (County Hospital, Helsingborg (Sweden). Dept. of Medicine, Diagnostic Radiology and Clinical Physiology)

    1982-01-01

    One hundred and thirty-four patients admitted to the medical emergency ward due to suspected deep venous thrombosis (DVT) were examined. The uptake of intravenously injected porcine 99Tcm-plasmin was estimated in both legs. Thereafter, phlebography was performed using a high osmolar contrast medium. All phlebographies were evaluated independently. All patients with negative phlebography were examined after 3-5 days. The plasmin test and phlebography were repeated when called for. The sensitivity of the plasmin test was 100% and the specificity 51% when compared to phlebography. The extension of the DVT as demonstrated by the plasmin test was similar to that determined by phlebography. Post-phlebographic thrombosis was very rare. It is concluded that the 99Tcm-plasmin test is a rapid method, convenient for the patient and well suited as a screening test. The results indicate that a negative plasmin test excludes DVT while a positive test necessitates additional examination by phlebography.

  18. Diagnosis of deep venous thrombosis by phlebography and 99Tcm-Plasmin

    International Nuclear Information System (INIS)

    Edenbrandt, C-M.; Nilsson, J.; Ohlin, P.

    1982-01-01

    One hundred and thirty-four patients admitted to the medical emergency ward due to suspected deep venous thrombosis (DVT) were examined. The uptake of intravenously injected porcine 99Tcm-plasmin was estimated in both legs. Thereafter, phlebography was performed using a high osmolar contrast medium. All phlebographies were evaluated independently. All patients with negative phlebography were examined after 3-5 days. The plasmin test and phlebography were repeated when called for. The sensitivity of the plasmin test was 100% and the specificity 51% when compared to phlebography. The extension of the DVT as demonstrated by the plasmin test was similar to that determined by phlebography. Post-phlebographic thrombosis was very rare. It is concluded that the 99Tcm-plasmin test is a rapid method, convenient for the patient and well suited as a screening test. The results indicate that a negative plasmin test excludes DVT while a positive test necessitates additional examination by phlebography. (Authors)
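The reported operating characteristics follow directly from the screening counts against the phlebography reference. A small sketch; the counts below are illustrative, chosen only to reproduce the reported 100% sensitivity and roughly 51% specificity, not the paper's actual tabulation:

```python
def diagnostic_performance(tp, fn, tn, fp):
    """Sensitivity and specificity of a screening test vs. a reference method."""
    sensitivity = tp / (tp + fn)   # fraction of true DVT cases flagged
    specificity = tn / (tn + fp)   # fraction of DVT-free patients cleared
    return sensitivity, specificity

# Illustrative split of the 134 patients (not the paper's actual counts)
sens, spec = diagnostic_performance(tp=60, fn=0, tn=38, fp=36)
```

With fn = 0 the sensitivity is exactly 1.0, which is why a negative plasmin test can exclude DVT, while the ~51% specificity means roughly half of thrombosis-free patients still need confirmatory phlebography.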

  19. Technology Foresight and nuclear test verification: a structured and participatory approach

    Science.gov (United States)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    As part of its mandate, the CTBTO's nuclear explosion monitoring programme aims to maintain its sustainability, effectiveness and long-term relevance to the verification regime. As such, the PTS is conducting a Technology Foresight programme of activities to identify technologies, processes, concepts and ideas that may serve this purpose and become applicable within the next 20 years. Through the Technology Foresight activities (online conferences, interviews, surveys, workshops and others) we have involved the wider science community in the fields of seismology, infrasound, hydroacoustics, radionuclide technology, remote sensing and geophysical techniques. We have assembled a catalogue of over 200 items comprising technologies, processes, concepts and ideas with direct future relevance to the IMS (International Monitoring System), IDC (International Data Centre) and OSI (On-Site Inspection) activities within the PTS. In order to render this catalogue as applicable and useful as possible for strategy and planning, we have devised a "taxonomy" based on seven categories, against which each technology is assessed through a peer-review mechanism. These categories are: 1. Focus area of the technology in question: whether the technology relates to one or more of the following: improving our understanding of sources and source physics; propagation modelling; data acquisition; data transport; data processing; broad modelling concepts; quality assurance; and data storage. 2. Current development stage of the technology in question: based on a scale from one to six, this measure is specific to PTS needs and broadly reflects Technology Readiness Levels (TRLs). 3. Impact of the technology on each of the following capabilities: detection, location, characterization, sustainment and confidence building. 4. Development cost: the anticipated monetary cost of validating a prototype (i.e. Development Stage 3) of the technology in question. 5. Time to

  20. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become much more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the sheer scale of the verification effort required. In this paper we report on lessons learned for verification of large software systems, based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  1. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  2. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  3. Role of experiments in soil-structure interaction methodology verification

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1986-01-01

    Different kinds of experimental data may be useful for partial or full verification of SSI analysis methods. The great bulk of existing data comes from earthquake records and dynamic testing of as-built structures. However, much of this data may not be suitable for the present purpose as the measurement locations were not selected with the verification of SSI analysis in mind and hence are too few in number or inappropriate in character. Data from scale model testing that include the soil in the model - both in-situ and laboratory - are relatively scarce. If the difficulty in satisfying the requirements of similitude laws on the one hand and simulating realistic soil behavior on the other can be resolved, scale model testing may generate very useful data for relatively low cost. The current NRC sponsored programs are expected to generate data very useful for verifying analysis methods for SSI. A systematic effort to inventory, evaluate and classify existing data is first necessary. This effort would probably show that more data is needed for the better understanding of SSI aspects such as spatial variation of ground motion and the related issue of foundation input motion, and soil stiffness. Collection of response data from in-structure and free field (surface and downhole) through instrumentation of selected as-built structures in seismically active regions may be the most efficient way to obtain the needed data. Augmentation of this data from properly designed scale model tests should also be considered

  4. LLNL's partnership with selected US mines, for CTBT verification: A pictorial and some reflections

    Energy Technology Data Exchange (ETDEWEB)

    Heuze, F.E.

    1996-01-01

    The verification of an upcoming Comprehensive Test Ban Treaty (CTBT) will involve seismic monitoring and will provide for on-site inspections which may include drilling. Because mining operations can send out strong seismic signals, many mining districts in the US and abroad may come under special scrutiny. The seismic signals can be generated by the use of large quantities of conventional explosives, by the collapse of underground workings, or by sudden energy release in the ground, such as in rock bursts and coal bumps. These mining activities may be the cause of false alarms, but may also offer opportunities for evasive nuclear testing. So in preparing for future verification of a CTBT it becomes important to address the mining-related questions. For the United States, these are questions to be answered with respect to foreign mines. But there is a good amount of commonality in mining methods worldwide. Studies conducted at US mine sites can provide good analogs of activities that may be carried out for overseas CTBT verification, save for the expected logistical impediments.

  5. Verification and benchmarking of PORFLO: an equivalent porous continuum code for repository scale analysis

    International Nuclear Information System (INIS)

    Eyler, L.L.; Budden, M.J.

    1984-11-01

    The objective of this work was to perform an assessment of prediction capabilities and features of the PORFLO code in relation to its intended use in the Basalt Waste Isolation Project. This objective was to be accomplished through a code verification and benchmarking task. Results were to be documented which either support correctness of prediction capabilities or identify areas of intended application in which the code exhibits weaknesses. A test problem set consisting of 10 problems was developed. Results of PORFLO simulations of these problems were provided for use in this work. The 10 problems were designed to test the three basic computational capabilities or categories of the code. Broken down by physical process, these are heat transfer, fluid flow, and radionuclide transport. Two verification problems were included within each of these categories. They were problems designed to test basic features of PORFLO for which analytical solutions are available for use as a known comparison basis. Hence they are referred to as verification problems. Of the remaining four problems, one repository scale problem representative of intended PORFLO use within BWIP was included in each of the three basic capabilities categories. The remaining problem was a case specifically designed to test features of decay and retardation in radionuclide transport. These four problems are referred to as benchmarking problems, because results computed with an additional computer code were used as a basis for comparison. 38 figures

  6. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  7. Incorrect, fake, and false: Journalists' perceived online source credibility and verification behavior

    NARCIS (Netherlands)

    Vergeer, M.R.M.

    2018-01-01

    This study focuses on the extent journalists verify information provided by online sources, and tests to what extent this verification behavior can be explained by journalists' perceived credibility of online information and other factors, such as journalism education of journalists, work and

  8. Performance Assessment and Scooter Verification of Nano-Alumina Engine Oil

    Directory of Open Access Journals (Sweden)

    Yu-Feng Lue

    2016-09-01

    Full Text Available The performance assessment and vehicle verification of nano-alumina (Al2O3) engine oil (NAEO) were conducted in this study. The NAEO was produced by mixing Al2O3 nanoparticles with engine oil using a two-step synthesis method. The weight fractions of the Al2O3 nanoparticles in the four test samples were 0 (base oil), 0.5, 1.5, and 2.5 wt. %. The measurement of basic properties included: (1) density; (2) viscosity at various sample temperatures (20–80 °C). A rotary tribology testing machine with a pin-on-disk apparatus was used for the wear test. The measurement of the before-and-after difference in specimen (disk) weight in the wear test indicates that the NAEO with 1.5 wt. % Al2O3 nanoparticles (1.5 wt. % NAEO) was the chosen candidate for further study. For the scooter verification on an auto-pilot dynamometer, there were three tests: (1) the European Driving Cycle (ECE40); (2) constant speed (50 km/h); and (3) constant throttle positions (20%, 40%, 60%, and 90%). For the ECE40 driving cycle and the constant speed tests, the fuel consumption was decreased on average by 2.75%, while it was decreased by 3.57% for the constant throttle case. The experimental results prove that the engine oil with added Al2O3 nanoparticles significantly decreased the fuel consumption. In the future, experiments with property tests of other nano-engine oils and a performance assessment of the nano-engine-fuel will be conducted.

  9. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light-water-cooled, medium-power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. In the SMART design verification phase, various separate-effect tests and comprehensive integral-effect tests were conducted. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transient and Accidents) was constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and ECT is sufficient to enable natural circulation of the coolant

  10. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  11. Verification of alternative dew point hygrometer for CV-LRT in MONJU. Short- and long-term verification of capacitance-type dew point hygrometer (Translated document)

    International Nuclear Information System (INIS)

    Ichikawa, Shoichi; Chiba, Yusuke; Ono, Fumiyasu; Hatori, Masakazu; Kobayashi, Takanori; Uekura, Ryoichi; Hashiri, Nobuo; Inuzuka, Taisuke; Kitano, Hiroshi; Abe, Hisashi

    2017-03-01

    To reduce the influence of maintenance of dew point hygrometers on the plant schedule at the prototype fast-breeder reactor MONJU, Japan Atomic Energy Agency examined a capacitance-type dew point hygrometer as an alternative to the lithium-chloride dew point hygrometer used in the containment vessel leak rate test. As verification, the capacitance-type dew point hygrometer was compared with a lithium-chloride dew point hygrometer under containment vessel leak rate test conditions, and with a high-precision mirror-surface dew point hygrometer over a long term (2 years) in the containment vessel, an unprecedented trial. The comparison with the lithium-chloride dew point hygrometer in a containment vessel leak rate test (atmosphere: nitrogen; testing time: 24 h) revealed no significant difference between the capacitance-type dew point hygrometer and the lithium-chloride dew point hygrometer. The comparison with the high-precision mirror-surface dew point hygrometer in the long-term verification (atmosphere: air; testing time: 24 months) revealed that the capacitance-type dew point hygrometer satisfied the instrumental specification (synthesized precision of detector and converter: ±2.04°C) specified in the Leak Rate Test Regulations for Nuclear Reactor Containment Vessel. It was confirmed that the capacitance-type dew point hygrometer can be used as a long-term alternative to the lithium-chloride dew point hygrometer without affecting the dew point hygrometer maintenance schedule of the MONJU plant. (author)

  12. Traumatic deep vein thrombosis in a soccer player: A case study

    Directory of Open Access Journals (Sweden)

    Upshur Ross EG

    2004-10-01

    Full Text Available Abstract A 42-year-old male former semi-professional soccer player sustained a right lower extremity popliteal contusion during a soccer game. He was clinically diagnosed with a possible traumatic deep vein thrombosis (DVT) and sent for confirmatory tests. A duplex Doppler ultrasound was positive for DVT, and the patient was admitted to hospital for anticoagulation (unfractionated heparin, warfarin). Upon discharge from hospital the patient continued oral warfarin anticoagulation (six months) and the use of compression stockings (nine months). He followed up with his family doctor at regular intervals for serial coagulation measurements and ultrasound examinations. The patient's only identified major thrombotic risk factor was the traumatic injury. One year after the initial deep vein thrombosis (DVT) the patient returned to contact sport; however, he continued to have intermittent symptoms of right lower leg pain and right knee effusion. Athletes can develop vascular injuries in a variety of contact and non-contact sports. Trauma is one of the most common causes of lower extremity deep vein thrombosis (DVT); however, athletic injuries involving lower extremity traumatic DVT are seldom reported. This diagnosis and the associated risk factors must be considered during the initial physical examination. The primary method of radiological diagnosis of lower extremity DVT is a complete bilateral duplex sonography, which can be augmented by other methods such as evidence-based risk factor analysis. Antithrombotic medication is the current standard of treatment for DVT. Acute thrombolytic treatment has demonstrated improved therapeutic efficacy and a decrease in post-DVT symptoms. There is a lack of scientific literature concerning the return-to-sport protocol following a DVT event. Athletic individuals who desire to return to sport after a DVT need to be fully informed about their treatment and risk of recurrence, so that appropriate decisions can be

  13. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  14. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  15. US monitoring and verification technology: on-site inspection experience and future challenges

    International Nuclear Information System (INIS)

    Gullickson, R.L.; Carlson, D.; Ingraham, J.; Laird, B.

    2013-01-01

    The United States has a long and successful history of cooperation with treaty partners in monitoring and verification. For strategic arms reduction treaties, our collaboration has resulted in the development and application of systems with limited complexity and intrusiveness. As we progress beyond New START (NST) along the 'road to zero', the reduced number of nuclear weapons is likely to require increased confidence in monitoring and verification techniques. This may place increased demands on the technology to verify the presence of a nuclear weapon and even confirm the presence of a certain type. Simultaneously, this technology must include the ability to protect each treaty partner's sensitive nuclear weapons information. Mutual development of this technology by treaty partners offers the best approach for acceptance in treaty negotiations. This same approach of mutual cooperation and development is essential for developing nuclear test monitoring technology in support of the Comprehensive Nuclear Test Ban Treaty (CTBT). Our ability to detect low yield and evasive testing will be enhanced through mutually developed techniques and experiments using laboratory laser experiments and high explosives tests in a variety of locations and geologies. (authors)

  16. Computer Generated Inputs for NMIS Processor Verification

    International Nuclear Information System (INIS)

    J. A. Mullens; J. E. Breeding; J. A. McEvers; R. W. Wysor; L. G. Chiang; J. R. Lenarduzzi; J. T. Mihalczo; J. K. Mattingly

    2001-01-01

    Proper operation of the Nuclear Materials Identification System (NMIS) processor can be verified using computer-generated inputs [BIST (Built-In Self-Test)] at the digital inputs. Preselected sequences of input pulses to all channels, with known correlation functions, are compared to the output of the processor. These types of verification have been utilized in NMIS-type correlation processors at the Oak Ridge National Laboratory since 1984. The use of this test confirmed a malfunction in an NMIS processor at the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF) in 1998. The NMIS processor boards were returned to the U.S. for repair and subsequently used in NMIS passive and active measurements with Pu at VNIIEF in 1999
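
    The self-test idea described above can be sketched as follows. This is an illustrative toy, not the NMIS implementation: the pulse density, injected delay, and correlation window are assumed values chosen for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical built-in self-test: inject a known sparse pulse train on one
    # channel and a delayed copy on another, then check that the computed
    # cross-correlation peaks at the injected (known) delay.
    n, delay = 10_000, 7
    a = (rng.random(n) < 0.05).astype(float)   # channel A: sparse pulse train
    b = np.roll(a, delay)                      # channel B: A delayed by 7 samples

    def cross_correlation(x, y, max_lag):
        """Unnormalized cross-correlation of x and y for lags 0..max_lag."""
        return np.array([np.dot(x[: len(x) - k], y[k:]) for k in range(max_lag + 1)])

    corr = cross_correlation(a, b, max_lag=20)
    assert corr.argmax() == delay              # self-test passes only if the
    print("BIST check passed at lag", delay)   # peak lands at the known delay
    ```

    The same pattern generalizes: any preselected input with a known correlation function can serve as the reference, and any mismatch between computed and expected correlations flags a processor fault.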

  17. Verification of the cross-section and depletion chain processing module of DRAGON 3.06

    International Nuclear Information System (INIS)

    Chambon, R.; Marleau, G.; Zkiek, A.

    2008-01-01

    In this paper we present a verification of the module of the lattice code DRAGON 3.06 used for processing microscopic cross-section libraries, including their associated depletion chain. This verification is performed by reprogramming the capabilities of DRAGON in another language (MATLAB) and testing them on different problems typical of the CANDU reactor. The verification procedure consists in first programming MATLAB m-files to read the different cross section libraries in ASCII format and to compute the reference cross-sections and depletion chains. The same information is also recovered from the output files of DRAGON (using different m-files) and the resulting cross sections and depletion chain are compared with the reference library, the differences being evaluated and tabulated. The results show that the cross-section calculations and the depletion chains are correctly processed in version 3.06 of DRAGON. (author)
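
    The comparison step of such a verification, recovering values from two independent readers and tabulating the differences against a tolerance, can be sketched as follows. The isotope names and cross-section values below are illustrative placeholders, not actual library data, and the nested-dictionary layout is an assumption for the example.

    ```python
    # Reference values read independently from the library (illustrative barns),
    # and the same quantities recovered from the code's output files.
    reference = {"U235": {"fission": 577.0, "capture": 101.0},
                 "U238": {"fission": 0.000017, "capture": 2.68}}
    recovered = {"U235": {"fission": 577.0, "capture": 101.0},
                 "U238": {"fission": 0.000017, "capture": 2.68}}

    def compare(ref, out, rtol=1e-6):
        """Tabulate relative differences; return entries above tolerance."""
        failures = []
        for isotope, reactions in ref.items():
            for reaction, value in reactions.items():
                diff = abs(out[isotope][reaction] - value) / max(abs(value), 1e-30)
                if diff > rtol:
                    failures.append((isotope, reaction, diff))
        return failures

    assert compare(reference, recovered) == []   # empty list: all entries agree
    ```

    An empty failure list corresponds to the paper's conclusion that the processed cross-sections match the reference library; any populated entry would be the tabulated difference to investigate.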

  18. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  19. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  20. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock and the porous medium the soil cover on top of the rock; groundwater flow is hence the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)
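
    A verification case of the kind described, comparing a numerical solution against an analytical solution for an idealized situation, can be sketched as follows. The 1D steady Darcy problem, grid size, and boundary heads are assumed for illustration and are not a DarcyTools test case.

    ```python
    import numpy as np

    # Steady 1D flow between fixed heads has the exact linear solution
    # h(x) = h0 + (h1 - h0) * x / L, so a finite-difference solve of
    # d2h/dx2 = 0 should reproduce it to round-off.
    L_dom, n = 10.0, 51
    x = np.linspace(0.0, L_dom, n)
    h0, h1 = 5.0, 2.0                     # boundary heads (illustrative)

    # Assemble the interior tridiagonal system for the discrete Laplacian.
    A = (np.diag(np.full(n - 2, -2.0))
         + np.diag(np.ones(n - 3), 1)
         + np.diag(np.ones(n - 3), -1))
    b = np.zeros(n - 2)
    b[0] -= h0                            # fold boundary values into the RHS
    b[-1] -= h1
    h = np.concatenate(([h0], np.linalg.solve(A, b), [h1]))

    h_exact = h0 + (h1 - h0) * x / L_dom
    assert np.max(np.abs(h - h_exact)) < 1e-10   # verification: match to round-off
    ```

    Validation cases differ in kind: there the comparison basis is measured data rather than an exact solution, so agreement is judged against experimental uncertainty instead of round-off.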

  1. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among them: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
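
    As a concrete example of one verification item, carryover is commonly estimated by running a high sample three times followed by a low sample three times. The sketch below uses made-up WBC values and one commonly quoted formula; the exact protocol and acceptance limit are at the discretion of the laboratory, as the abstract notes.

    ```python
    # Carryover sketch: three consecutive runs of a high sample (H1..H3),
    # then three of a low sample (L1..L3). Carryover is the fraction of the
    # high material contaminating the first low run:
    #   carryover % = (L1 - L3) / (H3 - L3) * 100
    high = [18.2, 18.1, 18.0]   # WBC in 10^9/L, illustrative values
    low = [0.52, 0.40, 0.39]

    carryover_pct = (low[0] - low[2]) / (high[2] - low[2]) * 100
    print(f"carryover = {carryover_pct:.2f}%")
    ```

    The computed percentage is then checked against the laboratory's own verification limit; the other items (precision, linearity, comparability) follow the same pattern of a small designed run plus a predefined acceptance criterion.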

  2. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flows, which is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups, thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  3. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Abstract Background A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area under the curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results In a single simulated data set, varying false negatives from 0 to 4 led to verification-bias-corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th–97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
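
    The verification-bias setup can be sketched with a small simulation. All parameters (prevalence, sensitivity, specificity, verification rate) are assumed for illustration, and the correction shown is simple inverse-probability weighting, a stand-in for the class of corrections the study evaluates, not the paper's exact method.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Everyone who screens positive gets the gold standard; only a fraction of
    # screen negatives do. Weighting verified subjects by 1 / P(verified | test)
    # corrects the sensitivity estimate, but with few false negatives the
    # estimate rests on a handful of heavily weighted subjects.
    n, prevalence, sens, spec, p_verify_neg = 100_000, 0.01, 0.95, 0.90, 0.05

    disease = rng.random(n) < prevalence
    test_pos = np.where(disease, rng.random(n) < sens, rng.random(n) < (1 - spec))
    verified = test_pos | (rng.random(n) < p_verify_neg)

    weights = np.where(test_pos, 1.0, 1.0 / p_verify_neg)[verified]
    d, t = disease[verified], test_pos[verified]

    sens_hat = np.sum(weights * (d & t)) / np.sum(weights * d)
    naive_sens = np.sum(d & t) / np.sum(d)   # ignores the unverified negatives
    print(f"corrected sensitivity = {sens_hat:.3f}, naive = {naive_sens:.3f}")
    ```

    Rerunning with different seeds shows the paper's point: the naive estimate is stably (and optimistically) high, while the corrected estimate swings widely because it hinges on the few verified false negatives.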

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, HVLP COATING EQUIPMENT, SHARPE MANUFACTURING COMPANY PLATINUM 2012 HVLP SPRAY GUN

    Science.gov (United States)

    This report presents the results of the verification test of the Sharpe Platinum 2012 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the Sharpe Platinum, which is designed for use in automotive refinishing. The test coating chosen by Sharpe Manufacturi...

  5. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
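
    The verification mechanism referred to above can be illustrated with the two classical verification rules on a sparse binary measurement matrix. The matrix and signal below are a made-up toy instance, not an LDPC construction, and this sketch shows only the verification step, not the combined IP algorithm of the paper.

    ```python
    import numpy as np

    # Two basic verification rules for a nonnegative signal x measured as
    # y = A x with a sparse binary matrix A (noiseless case):
    #   rule 1: a zero measurement verifies all of its neighbors as zero;
    #   rule 2: a check with exactly one unverified neighbor verifies that
    #           neighbor with the residual value.
    A = np.array([[1, 1, 0, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 1]], dtype=float)
    x_true = np.array([0.0, 0.0, 3.0, 0.0])   # nonnegative sparse signal
    y = A @ x_true                            # measurements

    x = np.full(4, np.nan)                    # NaN marks "not yet verified"
    for _ in range(10):                       # iterate until the rules settle
        for i, row in enumerate(A):
            nbrs = np.flatnonzero(row)
            unverified = [j for j in nbrs if np.isnan(x[j])]
            if y[i] == 0:
                x[nbrs] = 0.0                 # rule 1
            elif len(unverified) == 1:
                verified_sum = sum(x[j] for j in nbrs if not np.isnan(x[j]))
                x[unverified[0]] = y[i] - verified_sum   # rule 2

    assert np.allclose(x, x_true)             # toy signal fully recovered
    ```

    Once a variable is verified it is removed from the remaining checks, which is what makes combining these rules with interval passing attractive: each pass can unlock further verifications.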

  6. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or "round robin" program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  7. Verification lessons learned and CTBT's contribution to disarmament and nonproliferation

    International Nuclear Information System (INIS)

    Zerbo, L.

    2013-01-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all nuclear tests for any purpose. The CTBT is essential for peace and security; it is a core element of the nonproliferation regime and limits the ability of countries to develop advanced nuclear weapons technology. It is enforced through the extensive International Monitoring System, designed to detect and deter nuclear explosions in the atmosphere, underwater and underground. In the course of monitoring for potential nuclear tests, the system registers over 30,000 events a year, the vast majority of them earthquakes. The civil and scientific applications of CTBT data can be used to gain a better understanding of the earth, of climate change, of volcanic ash clouds, of tsunamis, of the movements of whales and much more. The CTBT sets a new legal and verification standard for nuclear weapons. It is a non-discriminatory Treaty with the same rights and obligations for all Member States. Its verification regime is equally non-discriminatory and provides equal access for all Member States to CTBTO data. We are driving to achieve 160 ratifications by the end of the year, which will provide additional momentum towards entry into force (EIF) and universality. (A.C.)

  8. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software must be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files. Any difference between any of the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error

  9. TH-AB-201-01: A Feasibility Study of Independent Dose Verification for CyberKnife

    International Nuclear Information System (INIS)

    Sato, A; Noda, T; Keduka, Y; Kawajiri, T; Itano, M; Yamazaki, T; Tachibana, H

    2016-01-01

    Purpose: CyberKnife irradiation is delivered with many small, intensity-modulated beams compared to conventional linacs. Few publications on independent dose calculation verification for CyberKnife have been reported. In this study, we evaluated the feasibility of independent dose verification for CyberKnife treatment as a secondary check. Methods: The following were measured: test plans using static, single beams, clinical plans in a phantom, and clinical plans using patients' CT. 75 patient plans were collected from several treatment sites: brain, lung, liver and bone. In the test plans and the phantom plans, a pinpoint ion-chamber measurement was performed to assess dose deviation for a treatment planning system (TPS) and an independent verification program, Simple MU Analysis (SMU). In the clinical plans, the dose deviation between the SMU and the TPS was assessed. Results: In the test plans, the dose deviations were 3.3±4.5% and 4.1±4.4% for the TPS and the SMU, respectively. In the phantom measurements for the clinical plans, the dose deviations were −0.2±3.6% for the TPS and −2.3±4.8% for the SMU. In the clinical plans using the patients' CT, the dose deviations were −3.0±2.1% (Mean±1SD). The systematic difference was partially derived from the inverse square law and penumbra calculation. Conclusion: The independent dose calculation for CyberKnife shows −3.0±4.2% (Mean±2SD), and in our study the confidence limit was achieved within the 5% tolerance level from AAPM Task Group 114 for non-IMRT treatment. Thus, it may be feasible to use independent dose calculation verification for CyberKnife treatment as the secondary check. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
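The confidence-limit check mentioned in the conclusion can be sketched as follows. This is a hypothetical helper, not the authors' SMU program, and it assumes the common TG114-style form of the confidence limit, |mean| + 2·SD of the dose deviations:

```python
import statistics

def confidence_limit(deviations_pct):
    """TG114-style confidence limit on a set of dose deviations (in %):
    |mean| + 2 * sample standard deviation. The check passes when this
    value stays under the chosen tolerance (e.g. 5%)."""
    mean = statistics.mean(deviations_pct)
    sd = statistics.stdev(deviations_pct)
    return abs(mean) + 2 * sd

def within_tolerance(deviations_pct, tolerance_pct=5.0):
    """True when the confidence limit is inside the tolerance level."""
    return confidence_limit(deviations_pct) <= tolerance_pct
```

With per-plan deviations collected as above, `within_tolerance(devs)` gives the pass/fail decision against the 5% action level.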

  10. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 
The

  12. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  13. FLUENT Test and Verification Document

    International Nuclear Information System (INIS)

    LEE, SI

    2006-01-01

    The FLUENT 6 CFD code has been benchmarked for a wide range of simple, classical, and complex physical problems associated with turbulent gas flow, natural convection, and turbulent mixing phenomena. The results validate the application of previous scoping calculations for the Tank 50/Tank 48 vapor space mixing. The benchmarked problems consisted of three groups. The first group comprised well-defined, classical problems for which analytical solutions exist. The other groups comprised complex physical problems for which analytical solutions are difficult to obtain; for these test problems, CFD results were compared and verified against experimental results. The benchmarking of the FLUENT 6 code showed that the code predictions are in good agreement with the analytical solutions or experimental test data. The code was shown to be sufficiently accurate to make reliable decisions based on calculated results for those applications that fall within the scope of the benchmarking test cases. For applications that fall outside the range of the benchmarking results, particularly for significantly higher benzene concentrations or for flow geometries not adequately represented by the k-ε turbulence model, further benchmarking work would be required

  14. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was tested on 900 signatures belonging to 50 participants: 3 signatures for reference and 5 test signatures each from the original user, simple impostors and trained impostors. The final system was tested with 50 participants and 3 references. This test showed that system accuracy without impostors is 90.44897959% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.35102%; with impostors, system accuracy is 80.1361% at threshold 27, with a rejection error (FNMR) of 15.6% and an average acceptance error (FMR) of 4.263946%, with details as follows: an acceptance error of 0.391837%, an acceptance error for simple impostors of 3.2%, and an acceptance error for trained impostors of 9.2%.
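The dynamic-time-warping matcher at the core of such a system can be sketched as follows. This is a textbook DTW distance on one-dimensional feature sequences; the function name and the use of absolute difference as the local cost are assumptions, not the paper's exact implementation:

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two feature
    sequences (e.g. pen-position time series). A claimed signature is
    accepted when its DTW distance to the reference falls below a
    tuned threshold."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal cumulative cost aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Varying the acceptance threshold trades FNMR against FMR, which is how the thresholds 44 and 27 above would be tuned.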

  15. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  16. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multileaf intensity-modulating collimator), and the overall process of verifying the created plan. The aim of the verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  17. Decontamination technology verification test on scraping surface soil on the highway roadside slopes using unmanned scraping machine

    International Nuclear Information System (INIS)

    Fujinaka, Hiroyuki; Kubota, Mitsuru; Shibuya, Susumu; Kasai, Yoshimitsu

    2013-01-01

    To restore normal life in the contaminated area, reconstruction of the infrastructure is necessary, and early decontamination of highways and their roadside slopes is required. Decontamination of roadside slopes has so far been conducted only by hand, but on high and steep roadside slopes it is desirable to carry out the decontamination work with an unmanned scraping machine to reduce working hours and improve safety. In this verification test, decontamination of a highway roadside slope, 20 m wide by 15 m long and divided into two sections, was carried out either by the machine or by hand, and working hours and radiation exposure dose were measured. As a result, working hours and radiation exposure dose with the machine were 49% and 63%, respectively, of those by hand. Based on these results, cost and radiation dose for decontamination work on larger slopes were evaluated; the cost with the machine is estimated to be less than that by hand for areas over 4,000 m². It is confirmed that decontamination of roadside slopes by the machine can be done more quickly and safely than by hand. (author)

  18. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    Energy Technology Data Exchange (ETDEWEB)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting "known" assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations
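The transmission measurement underlying HRFNS follows the usual exponential attenuation law; a minimal sketch (a hypothetical helper at a single energy point with a known cross section, not the laboratory's analysis code) inverts it for the areal density of the intervening material:

```python
import math

def areal_density(I0, I, sigma_barns):
    """Infer the areal density (atoms/cm^2) of an intervening material
    from a neutron transmission measurement at one energy:
        I = I0 * exp(-N * sigma)   =>   N = -ln(I / I0) / sigma
    where sigma is the absorption cross section in barns
    (1 barn = 1e-24 cm^2), I0 is the unattenuated count rate and I
    the transmitted count rate."""
    sigma_cm2 = sigma_barns * 1e-24
    return -math.log(I / I0) / sigma_cm2
```

Repeating this across the 0.5- to 5-MeV resonance region, with per-nuclide cross sections, is what would let the spectrum constrain the stoichiometric composition rather than a single areal density.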

  19. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate verification and validation of on-board satellite software. The process is applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.

  20. Verification steps for the CMS event-builder software

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The CMS event-builder software is used to assemble event fragments into complete events at 100 kHz. The data originates at the detector front-end electronics, passes through several computers and is transported from the underground to the high-level trigger farm on the surface. I will present the testing and verifications steps a new software version has to pass before it is deployed in production. I will discuss the current practice and possible improvements.

  1. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
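One simple way such a model can integrate detection results across technologies, assuming independent subsystems (an illustrative assumption, not necessarily IVSEM's actual method), is to note that an event is missed only if every technology misses it:

```python
def integrated_detection_probability(p_subsystems):
    """Combine per-technology detection probabilities (e.g. seismic,
    infrasound, radionuclide, hydroacoustic) into a system-level
    probability of detection, assuming the subsystems detect
    independently: P_system = 1 - prod(1 - p_i)."""
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss
```

This captures the synergy effect the abstract mentions: even modest per-technology probabilities combine into a much higher system-level detection probability, although correlated failure modes (e.g. seismic decoupling) would break the independence assumption.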

  2. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVES OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  3. Technical verification of advanced nuclear fuel for KSNPs

    International Nuclear Information System (INIS)

    Lee, C. B.; Bang, J. G.; Kim, D. H. and others

    2002-03-01

    KNFC has developed the advanced 16x16 fuel assembly for the Korean Standard Nuclear Plants through a three-year R and D project (April 1999 to March 2002) under the Nuclear R and D program of MOST. The purpose of this project was to verify the advanced 16x16 fuel assembly being developed by KNFC during the same period. Verification tests of the advanced fuel assembly and its components, such as characteristic tests on the spacer grid spring and dimple, static buckling and dynamic impact tests on the 5x5 partial spacer grid, a fuel rod vibration test with the PLUS7 mid-spacer grid, a fretting wear test, a turbulent flow structure test in a wind tunnel and a corrosion test, were performed using the KAERI facilities. Design reports and test results produced by KNFC were technically reviewed. For domestic production of burnable poison rods, manufacturing technology for burnable poison pellets was developed

  4. The CTBT Verification Regime: Monitoring the Earth for nuclear explosions

    International Nuclear Information System (INIS)

    2011-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) bans all nuclear weapon tests. Its unique verification regime is designed to detect nuclear explosions anywhere on the planet - in the oceans, underground and in the atmosphere. Once complete, the International Monitoring System (IMS) will consist of 337 facilities located in 89 countries around the globe. The IMS is currently operating in test mode, so that data are already transmitted for analysis from monitoring facilities to the International Data Centre (IDC) at the headquarters of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna. Data and analysis results are shared with Member States.

  5. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  6. Utilization of the capsule out-pile test facilities(2000-2003)

    Energy Technology Data Exchange (ETDEWEB)

    Cho, M. S.; Oh, J. M.; Cho, Y. G. and others

    2003-06-01

    Two out-pile test facilities were installed and are being utilized for non-irradiation tests outside the HANARO: the irradiation equipment design verification test facilities and the one-channel flow test device. In these facilities, performance tests of all capsules manufactured before loading in the HANARO and design verification tests for newly developed capsules were performed. The tests include loading/unloading, pressure drop, endurance and vibration tests, etc., of capsules. In the period 2000~2003, performance tests for 8 material capsules, 99M-01K~02M-05U, were carried out, and design verification tests of the newly developed creep and fuel capsules were performed. For development of the creep capsule, pressure drop measurement, operation tests of the heater, T/C and LVDT, and a stress loading test were performed. In the design stage of the fuel capsule, endurance and vibration tests in addition to those mentioned above were carried out to verify safe operation during irradiation tests in the HANARO. The in-chimney bracket and the capsule supporting system were fixed, and the flow tubes and handling tools were manufactured for use at the facilities.

  7. Verification of in-core thermal and hydraulic analysis code FLOWNET/TRUMP for the high temperature engineering test reactor (HTTR) at JAERI

    International Nuclear Information System (INIS)

    Maruyama, Soh; Sudo, Yukio; Saito, Shinzo; Kiso, Yoshihiro; Hayakawa, Hitoshi

    1989-01-01

    The FLOWNET/TRUMP code combines a flow network analysis code, FLOWNET, for calculation of the coolant flow distribution and coolant temperature distribution in the core, with a thermal conduction analysis code, TRUMP, for calculation of the temperature distribution in solid structures. The verification of FLOWNET/TRUMP was made by comparing the analytical results with the results of steady-state experiments on the HENDEL multichannel test rig, T1-M, which consisted of twelve electrically heated simulated fuel rods and eleven hexagonal graphite fuel blocks. The T1-M simulated one fuel column in the core. The analytical results agreed well with the results of the experiment, in which the HTTR operating conditions were simulated. (orig.)

  8. Dosimetric accuracy of Kodak EDR2 film for IMRT verifications.

    Science.gov (United States)

    Childress, Nathan L; Salehpour, Mohammad; Dong, Lei; Bloch, Charles; White, R Allen; Rosen, Isaac I

    2005-02-01

    Patient-specific intensity-modulated radiotherapy (IMRT) verifications require an accurate two-dimensional dosimeter that is not labor-intensive. We assessed the precision and reproducibility of film calibrations over time, measured the elemental composition of the film, measured the intermittency effect, and measured the dosimetric accuracy and reproducibility of calibrated Kodak EDR2 film for single-beam verifications in a solid water phantom and for full-plan verifications in a Rexolite phantom. Repeated measurements of the film sensitometric curve in a single experiment yielded overall uncertainties in dose of 2.1% local and 0.8% relative to 300 cGy. 547 film calibrations over an 18-month period, exposed to a range of doses from 0 to a maximum of 240 MU or 360 MU and using 6 MV or 18 MV energies, had optical density (OD) standard deviations that were 7%-15% of their average values. This indicates that daily film calibrations are essential when EDR2 film is used to obtain absolute dose results. An elemental analysis of EDR2 film revealed that it contains 60% as much silver and 20% as much bromine as Kodak XV2 film. EDR2 film also has an unusual 1.69:1 silver:halide molar ratio, compared with the XV2 film's 1.02:1 ratio, which may affect its chemical reactions. To test EDR2's intermittency effect, the OD generated by a single 300 MU exposure was compared to the ODs generated by exposing the film 1 MU, 2 MU, and 4 MU at a time to a total of 300 MU. An ion chamber recorded the relative dose of all intermittency measurements to account for machine output variations. Using small MU bursts to expose the film resulted in delivery times of 4 to 14 minutes and lowered the film's OD by approximately 2% for both 6 and 18 MV beams. This effect may result in EDR2 film underestimating absolute doses for patient verifications that require long delivery times. After using a calibration to convert EDR2 film's OD to dose values, film measurements agreed within 2% relative
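The daily OD-to-dose conversion described above can be sketched as a simple interpolation of the day's sensitometric curve. This is a minimal illustration, not the authors' analysis code; real film dosimetry typically fits a smooth sensitometric model rather than piecewise-linear interpolation:

```python
import numpy as np

def od_to_dose(od_measured, cal_od, cal_dose):
    """Convert a measured optical density to dose by interpolating a
    sensitometric curve built from calibration films exposed to known
    doses the same day (the 7%-15% OD spread over 18 months reported
    above is why a daily calibration is needed). cal_od must be
    monotonically increasing."""
    cal_od = np.asarray(cal_od, dtype=float)
    cal_dose = np.asarray(cal_dose, dtype=float)
    return np.interp(od_measured, cal_od, cal_dose)
```

Applied pixel-by-pixel to a scanned film, this yields the absolute dose map that is then compared against the planning system's calculation.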

  9. A Verification Study on the Loop-Breaking Logic of FTREX

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2008-01-01

    The logical loop problem in fault tree analysis (FTA) has been solved by manually or automatically breaking the circular logic. The breaking of logical loops is one of the uncertainty sources in fault tree analyses. A practical method that can verify fault tree analysis results was developed by Choi. The method can handle logical loop problems and has been implemented in a FORTRAN program called the VETA (Verification and Evaluation of fault Tree Analysis results) code. FTREX, a well-known fault tree quantifier developed by KAERI, has an automatic loop-breaking logic. In order to confirm the correctness of the loop-breaking logic of FTREX, some typical trees with complex loops were developed and applied in this study. This paper presents verification results for the loop-breaking logic tested with the VETA code
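Finding the logical loops that such tools must break reduces to cycle detection on the gate dependency graph; a minimal depth-first-search sketch (a hypothetical function, not VETA's or FTREX's implementation) is:

```python
def find_logical_loops(gates):
    """Detect logical loops in a fault tree, given gates as a dict
    mapping each gate name to the list of gates/basic events it
    references. Returns the set of gates lying on some cycle
    (found via back edges in a depth-first search)."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / on current path / done
    color = {g: WHITE for g in gates}
    on_cycle = set()

    def dfs(g, stack):
        color[g] = GRAY
        stack.append(g)
        for child in gates.get(g, []):
            # basic events (not in `gates`) cannot start a cycle
            if color.get(child, BLACK) == GRAY:
                # back edge: every gate from `child` up the stack loops
                on_cycle.update(stack[stack.index(child):])
            elif color.get(child, BLACK) == WHITE:
                dfs(child, stack)
        stack.pop()
        color[g] = BLACK

    for g in gates:
        if color[g] == WHITE:
            dfs(g, [])
    return on_cycle
```

A loop-breaking procedure would then cut one edge on each detected cycle; comparing quantification results before and after different cut choices is one way to bound the uncertainty the abstract refers to.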

  10. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  11. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities depends in part on the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, defines the degree of visual observation to be performed, and documents the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  12. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMCs) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMCs pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far, however, IMC verification has not exploited compositionality properties, considering only closed systems. This paper discusses the evaluation of IMCs in an open, and thus compositional, interpretation....
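    As a rough illustration of the definition above, an IMC carries two kinds of transitions per state: action-labelled interactive transitions (the labeled-transition-system part) and rate-labelled Markovian transitions (the continuous-time Markov chain part). A minimal data-structure sketch, with hypothetical state names and no connection to the paper's algorithms:

```python
import math

class IMC:
    """Minimal interactive Markov chain: per state, action-labelled
    interactive transitions plus rate-labelled Markovian transitions."""

    def __init__(self):
        self.interactive = {}  # state -> [(action, successor)]
        self.markovian = {}    # state -> [(rate, successor)]

    def add_interactive(self, s, action, t):
        self.interactive.setdefault(s, []).append((action, t))

    def add_markovian(self, s, rate, t):
        self.markovian.setdefault(s, []).append((rate, t))

    def exit_rate(self, s):
        """Total Markovian exit rate of state s."""
        return sum(r for r, _ in self.markovian.get(s, []))

    def p_leave_within(self, s, time):
        """Probability of taking a Markovian step from s within `time`,
        assuming no enabled interactive transition preempts it
        (i.e. ignoring maximal-progress effects)."""
        return 1.0 - math.exp(-self.exit_rate(s) * time)
```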

  13. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature not only provides first-line recognition by the person on the street but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof of concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed that allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices give store clerks and bank tellers a tool to unambiguously determine the validity of banknotes in the time it takes to open the cash drawer.

  14. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper, the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis are recorded simultaneously by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context, we studied the potential of the agent-based modelling technique for the verification of people trajectories
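    One simple way to verify an image-derived trajectory against a simultaneously recorded smartphone track is to compare time-aligned positions. The sketch below uses a mean point-wise deviation with a hypothetical tolerance; it is a baseline check, not the agent-based method studied in the paper:

```python
import math

def mean_deviation(track_a, track_b):
    """Mean Euclidean distance between two trajectories sampled at the
    same time stamps (each a list of (x, y) positions in metres)."""
    if len(track_a) != len(track_b):
        raise ValueError("trajectories must be sampled at the same times")
    dists = [math.dist(p, q) for p, q in zip(track_a, track_b)]
    return sum(dists) / len(dists)

def verify_trajectory(image_track, phone_track, tol_m=5.0):
    """Accept the image-derived trajectory if it stays, on average,
    within `tol_m` metres of the smartphone track. The tolerance is a
    hypothetical choice, not a value from the study."""
    return mean_deviation(image_track, phone_track) <= tol_m
```

    An agent-based approach would go further, simulating plausible pedestrian behaviour and asking whether the observed trajectory is consistent with it rather than just spatially close to a reference track.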

  15. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  16. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost of electronic systems. Many studies have shown that up to 70% of design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences, both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)
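    Assertion-based verification expresses design intent as properties that are checked continuously against simulation traces. As a loose Python illustration of the idea (the actual work targets HDL assertions such as SVA, and the pump-speed limits below are hypothetical):

```python
def check_ramp_limit(speeds, max_step):
    """Assertion: between consecutive samples, the commanded pump speed
    never changes by more than `max_step` rpm (limit is hypothetical).
    Returns the index of the first violating sample, or None."""
    for i in range(1, len(speeds)):
        if abs(speeds[i] - speeds[i - 1]) > max_step:
            return i
    return None

def check_range(speeds, lo, hi):
    """Assertion: every sample stays inside the permitted speed band."""
    return all(lo <= s <= hi for s in speeds)
```

    The value of such checks is that a violation is flagged at the exact cycle it occurs, long before the error would propagate to an observable output.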

  17. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To make a simulation-based mask check meaningful, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification
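    The difference between a geometric XOR check and a simulation-based check can be illustrated in one dimension: blur the ideal edge with a Gaussian standing in for a mask process model, find where the result crosses the print threshold, and report the edge placement error (EPE). All parameters below are hypothetical, and this sketch is unrelated to TrueModel®:

```python
import math

def simulate_edge(target_edge, sigma, grid=0.25, span=40.0):
    """1-D stand-in for a mask process model: the ideal step pattern
    (1 for x >= target_edge) blurred by a Gaussian of width `sigma` nm.
    Returns (positions, intensities) on a uniform grid."""
    xs = [i * grid - span for i in range(int(2 * span / grid) + 1)]
    # Gaussian blur of a unit step has the closed form 0.5*erfc(...).
    ys = [0.5 * math.erfc((target_edge - x) / (sigma * math.sqrt(2)))
          for x in xs]
    return xs, ys

def printed_edge(xs, ys, threshold=0.5):
    """Locate the position where the blurred intensity crosses the print
    threshold, interpolating linearly between grid points."""
    for i in range(1, len(xs)):
        if (ys[i - 1] - threshold) * (ys[i] - threshold) <= 0:
            t = (threshold - ys[i - 1]) / (ys[i] - ys[i - 1])
            return xs[i - 1] + t * (xs[i] - xs[i - 1])
    return None

def edge_placement_error(target_edge, sigma=8.0):
    """EPE in nm: simulated printed edge minus the design target."""
    xs, ys = simulate_edge(target_edge, sigma)
    return printed_edge(xs, ys) - target_edge
```

    A dose-margin check follows the same pattern: evaluate the printed edge at thresholds slightly above and below nominal and flag locations where the edge moves more than an allowed amount.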

  18. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...
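    The classification of extensometer systems by error magnitude can be sketched as follows. The class names and limits below are hypothetical placeholders for illustration only, not the limits defined in the ASTM practice:

```python
# Hypothetical class limits: (name, fixed strain error, relative error).
# Placeholders for illustration, not values from the ASTM practice.
CLASSES = [
    ("A",  0.00002, 0.001),
    ("B1", 0.0001,  0.005),
    ("B2", 0.0002,  0.010),
    ("C",  0.001,   0.010),
]

def classify(indicated, applied):
    """Assign the best (first) class whose limits are met at every
    verification point. At each point the error may not exceed the
    larger of the class's fixed limit and its relative limit, mirroring
    the usual 'greater of' convention for such classifications."""
    for name, fixed, relative in CLASSES:
        ok = all(abs(i - a) <= max(fixed, relative * abs(a))
                 for i, a in zip(indicated, applied))
        if ok:
            return name
    return "unclassified"
```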

  19. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end, design verification, although limited to the safeguards aspects of the plant, must be a systematic activity that starts during the design phase, continues during the construction phase, and is performed in particular during the various steps of the plant's commissioning phase. The detailed procedures for design information verification at commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards-related activity, and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e., a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary, to supplement the accountancy data. A complete 'design verification' strategy comprises: informing the Agency of any changes in the plant system that are defined as 'safeguards relevant'; and re-verification by the Agency, upon receiving notice of any changes from the Operator, of the 'design information'. 13 refs

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IN-DRAIN TREATMENT DEVICE. HYDRO INTERNATIONAL UP-FLO™ FILTER

    Science.gov (United States)

    Verification testing of the Hydro International Up-Flo™ Filter with one filter module and CPZ Mix™ filter media was conducted at the Penn State Harrisburg Environmental Engineering Laboratory in Middletown, Pennsylvania. The Up-Flo™ Filter is designed as a passive, modular filtr...