WorldWideScience

Sample records for verification performance analysis

  1. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  2. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package to integrate the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants automatically, under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects, including systematic uncertainty analysis, importance analysis, sensitivity analysis, and data interpretation; reporting aspects, including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with those of an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  3. Quality Assurance in Environmental Technology Verification (ETV): Analysis and Impact on the EU ETV Pilot Programme Performance

    Science.gov (United States)

    Molenda, Michał; Ratman-Kłosińska, Izabela

    2018-03-01

    Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This is a serious obstacle on their way to the market. A lack of credible data on the performance of a technology causes investors to mistrust innovations, especially in the public sector, where buyers seek effective solutions but cannot accept the technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way, ETV as a tool helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. ETV in the European Union was implemented as a voluntary scheme in the form of a pilot programme. The European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV set up and put into operation under the EU ETV Pilot Programme. The goal, objectives, technological scope and involved entities are presented. An attempt has been made to summarise the results of the EU ETV scheme available for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme. It was based on an analysis of the documents governing the operation of the EU ETV system. For this purpose, a statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.

  4. Performance Verification of GOSAT-2 FTS-2 Simulator and Sensitivity Analysis for Greenhouse Gases Retrieval

    Science.gov (United States)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Matsunaga, T.

    2015-12-01

    This paper reports the performance verification of the GOSAT-2 FTS-2 simulator and describes the future prospects for Level 2 retrieval. In addition, we present various sensitivity analyses relating to the engineering parameters and the atmospheric conditions in Level 1 processing for greenhouse gas retrieval.

  5. Camera calibration in a hazardous environment performed in situ with automated analysis and verification

    International Nuclear Information System (INIS)

    DePiero, F.W.; Kress, R.L.

    1993-01-01

    Camera calibration using the method of Two Planes is discussed. An implementation of the technique is described that may be performed in situ, e.g., in a hazardous or contaminated environment, thus eliminating the need for decontamination of camera systems before recalibration. Companion analysis techniques used for verifying the correctness of the calibration are presented
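    The two-plane idea summarized above can be sketched in a few lines: calibrate a pixel-to-world mapping on each of two known planes, then recover a pixel's line of sight as the 3-D line through its two plane intersections. The affine per-plane mapping and the calibration data below are illustrative assumptions, not the implementation of DePiero and Kress.

```python
import numpy as np

def fit_plane_map(pix, world):
    """Least-squares affine map from image pixels (u, v) to world
    coordinates (x, y) on one calibration plane: the per-plane
    interpolation step of the Two Planes method."""
    A = np.hstack([pix, np.ones((len(pix), 1))])      # rows [u, v, 1]
    coef, *_ = np.linalg.lstsq(A, world, rcond=None)  # 3x2 coefficients
    return lambda uv: np.append(uv, 1.0) @ coef

# Hypothetical calibration data: the same target grid imaged on two
# known planes, z = 0 and z = 1 (metres).
pix = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
w0 = pix * 0.01           # plane z = 0: 0.01 m per pixel
w1 = pix * 0.01 + 0.002   # plane z = 1: slightly shifted (oblique view)
m0, m1 = fit_plane_map(pix, w0), fit_plane_map(pix, w1)

def line_of_sight(uv):
    """A pixel's viewing ray: the 3-D line through its back-projections
    on the two calibration planes (returned as point + direction)."""
    p0 = np.append(m0(uv), 0.0)
    p1 = np.append(m1(uv), 1.0)
    return p0, p1 - p0
```

Richer per-plane mappings (bilinear, polynomial) drop in by changing the fitted basis; the line-of-sight construction is unchanged.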

  6. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the temperature calculations of the FRAPCON thermal model agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, the code input, and the calculation results.
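    The style of verification described, comparing a code's thermal solution against an independent reference, can be illustrated with a minimal sketch: a finite-difference solution of steady radial conduction in a cylindrical fuel rod checked against the closed-form profile T(r) = Ts + q'''(R² − r²)/(4k). All numbers are illustrative; this is not FRAPCON or ABAQUS input.

```python
import numpy as np

def fd_rod_temperature(q_vol, k, radius, t_surface, n=200):
    """Finite-difference temperature profile for steady 1-D radial
    conduction in a rod with uniform volumetric heating q''' (W/m^3):
    integrates dT/dr = -q''' r / (2k) inward from the surface."""
    r = np.linspace(0.0, radius, n)
    dr = r[1] - r[0]
    t = np.empty(n)
    t[-1] = t_surface
    for i in range(n - 2, -1, -1):
        rm = 0.5 * (r[i] + r[i + 1])          # midpoint radius
        t[i] = t[i + 1] + q_vol * rm / (2.0 * k) * dr
    return r, t

# Illustrative rod: q''' = 3e8 W/m^3, k = 3 W/(m.K), R = 5 mm, Ts = 700 K.
r, t = fd_rod_temperature(q_vol=3e8, k=3.0, radius=0.005, t_surface=700.0)
t_exact = 700.0 + 3e8 * (0.005**2 - r**2) / (4 * 3.0)  # analytic profile
```

Agreement between `t` and `t_exact` is the code-verification evidence; a persistent discrepancy would point at a model or discretisation error.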

  7. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine the long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer-term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the standards of accuracy, reproducibility and stability required to reduce verification will not be accepted by the radiotherapy community.
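    The dose/distance-to-agreement criterion recommended above (e.g. 3%/3 mm) can be sketched as a composite test: a point passes if its local dose difference is within tolerance, or if some planned point within the distance-to-agreement radius matches the measured dose. This 1-D sketch is illustrative; clinical systems typically compute the full gamma index.

```python
import numpy as np

def pass_rate(measured, planned, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Fraction of points meeting a composite dose-difference /
    distance-to-agreement criterion (default 3%/3 mm)."""
    n = len(measured)
    reach = int(round(dta_mm / spacing_mm))   # grid points inside DTA radius
    ok = np.zeros(n, bool)
    for i in range(n):
        if abs(measured[i] - planned[i]) <= dose_tol * planned[i]:
            ok[i] = True                      # local dose agreement
            continue
        lo, hi = max(0, i - reach), min(n, i + reach + 1)
        # DTA: a nearby planned point carries the measured dose value
        ok[i] = any(abs(measured[i] - planned[j]) <= dose_tol * planned[j]
                    for j in range(lo, hi))
    return float(ok.mean())
```

A uniform profile with a 10% delivery error fails everywhere, while a small spatial shift inside the DTA radius still passes, which is exactly the tolerance the composite criterion is meant to encode.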

  8. Performance analysis and experimental verification of mid-range wireless energy transfer through non-resonant magnetic coupling

    DEFF Research Database (Denmark)

    Peng, Liang; Wang, Jingyu

    2011-01-01

    In this paper, the efficiency analysis of a mid-range wireless energy transfer system through non-resonant magnetic coupling is performed. It is shown that the self-resistance of the coils and the mutual inductance are critical in achieving a high efficiency, which is indicated by our theoretical...

  9. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake-resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic numerical simulations of large shaking table tests conducted on a system comprising free-field ground and a reinforced concrete two-box culvert structure. In these simulations, the structure was modeled by beam-type elements having a tri-linear curve for the relation between curvature and flexural moment. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former model only captures the non-linearity of shear modulus with respect to strain and initial stress conditions, whereas the latter can express the non-linearity of shear modulus caused by changes of mean effective stress during ground excitation and by the dilatancy of the soil. The elasto-plastic constitutive model could therefore precisely simulate the vertical acceleration and displacement response on the ground surface, which were produced by soil dilation during shaking with a horizontal base input in the model tests. In addition, this model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could express the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, they were equally applicable to the evaluation of seismic performance for structures similar to the model structure of this study. (author)
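    The Ramberg-Osgood soil model mentioned above expresses the strain-dependent non-linearity of shear modulus. A minimal sketch, with illustrative hardening parameters and soil properties rather than values from the study:

```python
import numpy as np

def ramberg_osgood_strain(tau, g0, tau_f, alpha=2.0, beta=1.0):
    """Shear strain under the Ramberg-Osgood model:
    gamma = tau/G0 * (1 + alpha * |tau/tau_f|**beta).
    alpha and beta are illustrative hardening parameters."""
    return tau / g0 * (1.0 + alpha * np.abs(tau / tau_f) ** beta)

# Hypothetical soil: small-strain modulus G0 and shear strength tau_f.
g0, tau_f = 50e6, 100e3                      # Pa
tau = np.linspace(1e3, 90e3, 5)              # applied shear stresses
gamma = ramberg_osgood_strain(tau, g0, tau_f)
g_sec = tau / gamma                          # secant shear modulus G
# G/G0 falls as the stress (and strain) level grows: the shear-modulus
# degradation with strain that the abstract attributes to this model.
```

Note what the sketch also makes plain: the model depends on shear stress alone, so it cannot represent mean-effective-stress changes or dilatancy, which is why the elasto-plastic model performed better for the vertical response.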

  10. Verification of the code ATHLET by post-test analysis of two experiments performed at the CCTF integral test facility

    International Nuclear Information System (INIS)

    Krepper, E.; Schaefer, F.

    2001-03-01

    In the framework of the external validation of the thermohydraulic code ATHLET Mod 1.2 Cycle C, developed by GRS, post-test analyses of two experiments performed at the Japanese test facility CCTF were carried out. The test facility CCTF is a 1:25 volume-scaled model of a 1000 MW pressurized water reactor. The tests simulate a double-ended break in the cold leg of the PWR, with ECC injection into the cold leg and with combined ECC injection into the hot and cold legs. The evaluation of the calculated results shows that the main phenomena can be calculated in good agreement with the experiment. In particular, the behaviour of the quench front and the core cooling are calculated very well. Applying a two-channel representation in the reactor model, the radial behaviour of the quench front could be reproduced. Deviations between calculation and experiment can be observed when simulating the emergency injection at the beginning of the transient: very high condensation rates were calculated, and the pressure decrease in this phase of the transient is overestimated. Besides that, the pressurization due to evaporation in the refill phase is underestimated by ATHLET. (orig.)

  11. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling and planning problems, response time optimization, etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model-checker instances execute in parallel on a computer cluster using different, typically randomized, search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging the costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
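    The cooperative swarm idea, many randomized searchers sharing the best cost found so far as a branch-and-bound pruning bound, can be sketched on a toy scheduling problem (minimising the makespan of jobs on two machines). The problem and parameters are illustrative; Uppaal's swarm searches timed-automata state spaces, not job lists.

```python
import random

def swarm_search(durations, workers=4, iters=200, seed=0):
    """Toy cooperative swarm: each of `workers` randomized searchers
    builds greedy two-machine schedules in random job orders, pruning
    any partial schedule whose makespan already reaches the shared
    incumbent cost (branch-and-bound style)."""
    rng = random.Random(seed)
    best = float("inf")                      # shared incumbent makespan
    for _ in range(workers):
        for _ in range(iters):
            loads = [0, 0]
            for d in rng.sample(durations, len(durations)):
                loads[loads.index(min(loads))] += d   # least-loaded machine
                if max(loads) >= best:
                    break                    # prune: cannot beat incumbent
            else:
                best = min(best, max(loads))
    return best
```

The shared incumbent is what makes the searchers cooperative: once any instance finds a good schedule, every other instance prunes against it.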

  12. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  13. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)]

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  14. Development of advanced earthquake resistant performance verification on reinforced concrete underground structures. Pt. 3. Applicability of soil-structure interaction analysis using nonlinear member model

    International Nuclear Information System (INIS)

    Matsui, Jun; Ohtomo, Keizo; Kawai, Tadashi; Kanatani, Mamoru; Matsuo, Toyofumi

    2003-01-01

    The objective of this study is to obtain verification data concerning the performance of RC duct-type underground structures subjected to strong earthquakes. This paper presents the results of numerical simulations of shaking table tests of box-type structure models at a scale of about 1/2. We proposed practical nonlinear member models, in which the mechanical properties of RC members and soil are defined by hysteresis models (RC: axial-force-dependent degrading tri-linear model; soil: modified Ramberg-Osgood model), and joint elements are used to evaluate the interaction along the interface between the two materials, soil and RC structure, including slippage and separation. The proposed models could consequently simulate the test results for the deformation of the soil and the RC structure, as well as the damage to the RC structures, which is important in verifying their seismic performance with practical accuracy. (author)

  15. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  16. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussion, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems investigated include hydrogen-bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  17. Verification Test of Hydraulic Performance for Reactor Coolant Pump

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jun; Kim, Jae Shin; Ryu, In Wan; Ko, Bok Seong; Song, Keun Myung [Samjin Ind. Co., Seoul (Korea, Republic of)

    2010-01-15

    In this project, the basic design of the prototype pump and model pump of the reactor coolant pump, and of the test facilities, has been completed. The basic design of the prototype pump, establishing its structure, dimensions, and hydraulic performance, has been completed, and through preliminary flow analysis by computational fluid dynamics (CFD), the flow characteristics and hydraulic performance have been established. The pump was designed as a mixed-flow pump with the following design requirements: specific speed (Ns), 1080.9 (rpm, m³/min, m); capacity, 3115 m³/h; total head, 26.3 m; pump speed, 1710 rpm; pump efficiency, 77.0%; impeller outer diameter, 349 mm; motor output, 360 kW; design pressure, 17 MPaG. The features of the pump are freedom from leakage, owing to the absence of a mechanical seal on the pump shaft, which ensures reactor safety, and low noise and vibration levels, owing to the absence of a cooling fan on the motor, which makes it an eco-friendly product. The model pump was reduced to 44% of the prototype pump size for the verification test of the hydraulic performance of the reactor coolant pump and was designed as a mixed-flow pump with a canned motor and the following design requirements: specific speed (Ns), 1060.9 (rpm, m³/min, m); capacity, 539.4 m³/h; total head, 21.0 m; pump speed, 3476 rpm; pump efficiency, 72.9%; impeller outer diameter, 154 mm; motor output, 55 kW; design pressure, 1.0 MPaG. The test facilities were designed for verification of hydraulic performance, suitable for the pump performance test, homologous test, NPSH (cavitation) test, coast-down test, and pressure pulsation tests of the inlet and outlet ports. The test tank was designed with a testing capacity of up to 2000 m³/h and a design pressure of 1.0 MPaG. The auxiliary pump was designed as a centrifugal pump with capacity 1100 m³/h, total head 42.0 m, and motor output 190 kW
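    The homologous (model) test mentioned above rests on the pump affinity laws, Q ∝ N·D³ and H ∝ N²·D². Applying them to the prototype and model figures quoted in the abstract shows that the two design points are mutually consistent to within about 1%:

```python
def affinity_scale(q, h, n_ratio, d_ratio):
    """Homologous (affinity-law) scaling between geometrically similar
    pumps: Q scales with N * D**3, H with N**2 * D**2."""
    return q * n_ratio * d_ratio**3, h * n_ratio**2 * d_ratio**2

# Prototype -> model ratios from the abstract: speed 1710 -> 3476 rpm,
# impeller diameter 349 -> 154 mm (the 44% model).
n_ratio, d_ratio = 3476.0 / 1710.0, 154.0 / 349.0
q_m, h_m = affinity_scale(3115.0, 26.3, n_ratio, d_ratio)
# q_m and h_m land within about 1% of the model design point
# (539.4 m3/h, 21.0 m), i.e. the model is homologous to the prototype.
```

This closeness is what allows model-scale test results to be transposed back to the full-size reactor coolant pump.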

  18. Performance verification of 3D printers

    OpenAIRE

    Hansen, Hans Nørgaard; Nielsen, Jakob Skov; Rasmussen, Jakob; Pedersen, David Bue

    2014-01-01

    Additive Manufacturing continues to gain momentum as the next industrial revolution. While these layering technologies have demonstrated significant time and cost savings for prototype efforts, and enabled new designs with performance benefits, additive manufacturing has not been affiliated with 'precision' applications. In order to understand additive manufacturing's capabilities or shortcomings with regard to precision applications, it is important to understand the mechanics of the process...

  19. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave...

  20. Performance verification of 3D printers

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Nielsen, Jakob Skov; Rasmussen, Jakob

    2014-01-01

    Additive Manufacturing continues to gain momentum as the next industrial revolution. While these layering technologies have demonstrated significant time and cost savings for prototype efforts, and enabled new designs with performance benefits, additive manufacturing has not been affiliated with 'precision' applications. In order to understand additive manufacturing's capabilities or shortcomings with regard to precision applications, it is important to understand the mechanics of the process. GE Aviation's Additive Development Center [ADC] is in a unique position to comment on additive metal... This paper and presentation will take a deep dive into the hardware and mechanics of the modern-day DMLM machine from three of the largest equipment manufacturers. We will also look at typical post processes including the heat treats that are commonly applied to DMLM metal parts. Along the way, we'll mention...

  1. Performance Verification for Safety Injection Tank with Fluidic Device

    International Nuclear Information System (INIS)

    Yune, Seok Jeong; Kim, Da Yong

    2014-01-01

    In a LBLOCA, the SITs of a conventional nuclear power plant deliver excessive cooling water to the reactor vessel, causing the water to flow into the containment atmosphere. To make the injection more efficient, a Fluidic Device (FD) is installed inside each SIT of the Advanced Power Reactor 1400 (APR 1400). The FD, a completely passive controller that requires no actuating power, controls the injection flow rate, which depends on the flow resistance inside the vortex chamber of the FD. When the SIT Emergency Core Cooling (ECC) water level is above the top of the stand pipe, the water enters the vortex chamber through both the top of the stand pipe and the control ports, resulting in injection of the water at a large flow rate. When the water level drops below the top of the stand pipe, the water enters the vortex chamber only through the control ports, resulting in vortex formation in the chamber and a relatively small injection flow. Performance verification of the SITs must be carried out because they play an integral role in mitigating accidents. In this paper, a performance verification method for the SIT with FD is presented: equations for the calculation of the flow resistance coefficient (K) are derived to evaluate the on-site performance of the APR 1400 SIT with FD, then applied to the performance verification of the SIT with FD, and good results are obtained
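    In the usual hydraulic definition, a flow resistance coefficient such as the K evaluated here is the pressure loss normalised by dynamic pressure, K = ΔP/(ρv²/2). A minimal sketch with illustrative numbers (not plant data or the paper's derived equations) shows why the low-flow vortex phase corresponds to a much larger K at the same driving pressure:

```python
def flow_resistance_coefficient(dp_pa, flow_m3s, area_m2, rho=1000.0):
    """Flow resistance coefficient K = dP / (rho * v**2 / 2) for a flow
    path of cross-section area_m2 carrying volumetric flow flow_m3s."""
    v = flow_m3s / area_m2              # mean velocity in the flow path
    return dp_pa / (0.5 * rho * v * v)

# Same driving pressure, hypothetical geometry: high-flow phase (stand
# pipe uncovered) vs. low-flow phase (vortex formed in the chamber).
k_high = flow_resistance_coefficient(4.0e5, 0.30, 0.02)
k_low = flow_resistance_coefficient(4.0e5, 0.06, 0.02)
# A fivefold drop in flow at the same dP implies a 25-fold larger K,
# since K scales with 1/v**2.
```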

  2. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
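    The data-reduction idea, collapsing a sensitive image to a single non-sensitive yes/no attribute behind the information barrier, can be sketched as follows. The scalar chosen here (fraction of attenuating pixels) and the bounds are illustrative assumptions, not the paper's algorithms:

```python
import numpy as np

def attribute_check(image, lo, hi):
    """Reduce an image to one non-sensitive yes/no attribute: compute a
    single scalar behind the information barrier, compare it against
    pre-agreed bounds, and let only the boolean verdict leave."""
    frac = float((image > 0.5).mean())   # fraction of "dense" pixels
    return lo <= frac <= hi              # only this boolean is emitted

rng = np.random.default_rng(1)
real = (rng.random((64, 64)) < 0.30).astype(float)   # mock radiograph
spoof = np.zeros((64, 64))                           # empty container
```

A real design must also ensure the chosen scalar is too coarse to reconstruct the image, which is the tension the abstract's algorithms are built around.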

  3. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development business through performance analysis, covering the business division and definition, analysis of the current state of support, the substance of the basic plan for national energy and resource technology development, selection of analysis indices, results of performance analysis by index, performance results of the investigation, and the analysis and appraisal of the energy and resource technology development business in 2007.

  4. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  5. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb₃Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb₃Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb₃Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  6. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for the verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature on the analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  7. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...
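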

  8. Commitment to COT verification improves patient outcomes and financial performance.

    Science.gov (United States)

    Maggio, Paul M; Brundage, Susan I; Hernandez-Boussard, Tina; Spain, David A

    2009-07-01

After an unsuccessful American College of Surgeons Committee on Trauma visit, our level I trauma center initiated an improvement program that included (1) hiring new personnel (trauma director and surgeons, nurse coordinator, orthopedic trauma surgeon, and registry staff), (2) correcting deficiencies in trauma quality assurance and process improvement programs, and (3) development of an outreach program. Subsequently, our trauma center had two successful verifications. We examined the longitudinal effects of these efforts on volume, patient outcomes, and finances. The Trauma Registry was used to derive data for all trauma patients evaluated in the emergency department from 2001 to 2007. Clinical data analyzed included number of admissions, interfacility transfers, injury severity scores (ISS), length of stay, and mortality for 2001 to 2007. Financial performance was assessed for fiscal years 2001 to 2007. Data were divided into patients discharged from the emergency department and those admitted to the hospital. Admissions increased 30%, representing a 7.6% annual increase (p = 0.004), mostly due to a nearly fivefold increase in interfacility transfers. Severe trauma patients (ISS >24) increased 106%, and the mortality rate for ISS >24 decreased by 47% to almost half the average of the National Trauma Database. There was a 78% increase in revenue and a sustained increase in hospital profitability. A major hospital commitment to Committee on Trauma verification had several salient outcomes: increased admissions, interfacility transfers, and acuity. Despite more seriously injured patients, there has been a major, sustained reduction in mortality and a trend toward decreased intensive care unit length of stay. This resulted in a substantial increase in contribution to margin (CTM), net profit, and revenues. With a high level of commitment and favorable payer mix, trauma center verification improves outcomes for both patients and the hospital.

  9. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

Software Verification and Validation (V and V) is an important step in assuring the reliability and quality of software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in the verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules, etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
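One of the analyses the record describes, unreachable-code detection, reduces to reachability over a control-flow graph. The sketch below works on a toy listing whose mnemonics and format are illustrative only, not actual 8086/68000 syntax:

```python
# Hedged sketch: build a control-flow graph from a toy assembly listing and
# flag instructions that can never execute. Mnemonics here are invented:
# "jmp" = unconditional jump, "jcc" = conditional branch, "ret" = return.

def unreachable(listing):
    """listing: list of (label, op, arg); returns set of unreachable labels."""
    index = {label: i for i, (label, _, _) in enumerate(listing)}
    succs = {}
    for i, (label, op, arg) in enumerate(listing):
        s = []
        if op in ("jmp", "ret"):           # no fall-through
            if op == "jmp":
                s.append(index[arg])
        else:                               # fall-through to next instruction
            if i + 1 < len(listing):
                s.append(i + 1)
            if op == "jcc":                 # conditional branch: both successors
                s.append(index[arg])
        succs[i] = s
    seen, stack = set(), [0]               # reachability from the entry point
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(succs[n])
    return {listing[i][0] for i in range(len(listing)) if i not in seen}

prog = [("L0", "mov", None), ("L1", "jmp", "L3"),
        ("L2", "add", None),               # skipped over by the jmp: dead code
        ("L3", "ret", None)]
print(unreachable(prog))  # {'L2'}
```

A production analyser would of course parse real mnemonics and handle indirect jumps conservatively; the graph traversal itself is the same.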

  10. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.; Wehbe, Nimer; Hussain, Muhammad Mustafa

    2015-01-01

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn

  11. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

However much its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated for diseases such as the lately soaring malignant tumors is the most important factor in radiotherapy practice. In reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment (in general, doses administered to patients in radiotherapy are very large, about three times higher than lethal doses) is a pressing issue to be promptly addressed. Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, the legal and regulatory systems needed to implement a quality assurance program are not sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered to patients and to establish the procedures needed to maintain the continuing performance of radiotherapy machines and equipment. The QA program and procedures should ensure proper calibration of the machines and equipment, and definitely establish the safety of patients in radiotherapy practice. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and several accurate measurements, evaluations of the doses delivered to patients and verification of the performance of the therapy machine and equipment are

  12. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

Thanks to many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient as well as steady-state conditions. If a fault or an error exists in the control systems, the plant may experience unwanted and unexpected transients. The performance of these control systems and components should therefore be completely verified through the power ascension tests of the startup period. However, there are many occasions when control components must be replaced, control logic modified, or setpoints changed, and it is important to be able to verify the performance of the changed control system without redoing the power ascension tests. Up to now, simulation with the computer codes used for the design of nuclear power plants has commonly been used to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors that might be introduced during hardware manufacturing or software coding. Even so, these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss of main feedwater pump test and caused an excess turbine runback

  13. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

The results of the verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were used to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been proved

  14. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

The neutron activation analysis support system is a system with which even a user who has little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, subjects such as the functions, usability, precision and accuracy of the analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer installed in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k{sub 0} method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k{sub 0} method adopted in this system has primarily been utilized in Europe recently; it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values guaranteed as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed to an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)
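The accuracy check described in this record (analysed concentrations compared against certified values of a standard sample, with a 15% acceptance band) can be sketched as below; the element list and concentration values are made-up placeholders, not data from the report:

```python
# Hedged sketch of a certified-value accuracy check: flag any element whose
# analysed concentration deviates from the certified value by more than 15%.
# All numbers below are invented placeholders for illustration.

def within_band(measured, certified, band=0.15):
    """Return {element: (relative_deviation, within_band?)}."""
    report = {}
    for el, cert in certified.items():
        dev = abs(measured[el] - cert) / cert
        report[el] = (dev, dev <= band)
    return report

certified = {"Fe": 4.22, "Zn": 1.72e-2, "Cr": 4.0e-3}   # mass fractions, %
measured  = {"Fe": 4.10, "Zn": 1.80e-2, "Cr": 5.1e-3}
for el, (dev, ok) in within_band(measured, certified).items():
    print(f"{el}: {dev:.1%} {'OK' if ok else 'OUT OF BAND'}")
```

Here Fe and Zn pass while the hypothetical Cr value (27.5% off) would be flagged for re-analysis.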

  15. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is to build the product correctly, and validation is to build the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'What to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline is presented that classifies the particular system using these factors and shows how they lead to the selection of the V and V class. The second step is to determine 'How to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'What to do', as well as for specialists interested in 'How to do it'.
Finally
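The two-step procedure described in this record (score the four factors to pick a V and V class, then pick methods for that class) can be sketched as follows. The numeric scale, thresholds, and method lists below are invented for illustration; the paper does not publish numeric weights:

```python
# Hedged sketch of a graded V&V selection: map four factor scores to a V&V
# class, then look up recommended methods. Scale, thresholds, and method
# lists are assumptions for illustration, not taken from the paper.

FACTORS = ("required_integrity", "functional_complexity",
           "defense_in_depth", "development_environment")

def vv_class(scores):
    """scores: dict mapping each factor to 1 (low concern) .. 3 (high)."""
    total = sum(scores[f] for f in FACTORS)
    if total >= 10:
        return 1          # most stringent V&V
    if total >= 7:
        return 2
    return 3              # least stringent

METHODS = {1: ["formal inspection", "independent testing", "traceability audit"],
           2: ["structured walkthrough", "integration testing"],
           3: ["peer review", "functional testing"]}

scores = {"required_integrity": 3, "functional_complexity": 2,
          "defense_in_depth": 3, "development_environment": 2}
print(vv_class(scores), METHODS[vv_class(scores)])
```

The point of the structure, not the particular numbers, is that 'What to do' (the class) is decided once from system attributes, and 'How to do it' (the methods) follows mechanically from the class.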

  16. Fire-accident analysis code (FIRAC) verification

    International Nuclear Information System (INIS)

    Nichols, B.D.; Gregory, W.S.; Fenton, D.L.; Smith, P.R.

    1986-01-01

The FIRAC computer code predicts fire-induced transients in nuclear fuel cycle facility ventilation systems. FIRAC calculates simultaneously the gas-dynamic, material transport, and heat transport transients that occur in any arbitrarily connected network system subjected to a fire. The network system may include ventilation components such as filters, dampers, ducts, and blowers. These components are connected to rooms and corridors to complete the network for moving air through the facility. An experimental ventilation system has been constructed to verify FIRAC and other accident analysis codes. The design emphasizes network system characteristics and includes multiple chambers, ducts, blowers, dampers, and filters. A large industrial heater and a commercial dust feeder are used to inject thermal energy and aerosol mass. The facility is instrumented to measure volumetric flow rate, temperature, pressure, and aerosol concentration throughout the system. Aerosol release rates and mass accumulation on filters also are measured. We have performed a series of experiments in which a known rate of thermal energy is injected into the system. We then simulated this experiment with the FIRAC code. This paper compares and discusses the gas-dynamic and heat transport data obtained from the ventilation system experiments with those predicted by the FIRAC code. The numerically predicted data generally are within 10% of the experimental data

  17. Triple Modular Redundancy verification via heuristic netlist analysis

    Directory of Open Access Journals (Sweden)

    Giovanni Beltrame

    2015-08-01

Full Text Available Triple Modular Redundancy (TMR is a common technique to protect memory elements for digital processing systems subject to radiation effects (such as in space, high-altitude, or near nuclear sources. This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification using heuristic analysis. The purpose is detecting any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully showing a number of unprotected memory elements, namely 351 flip-flops.
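The property being checked can be illustrated in miniature: under TMR, every protected state bit should appear as three replica flip-flops feeding a majority voter. The sketch below uses a toy dict encoding of a netlist, not an actual EDIF/Verilog parser, and all instance names are invented:

```python
# Hedged sketch of a TMR completeness check: a register is "protected" only
# if it has exactly three replicas AND a majority voter. Netlist encoding
# and names below are toy assumptions, not the paper's actual data model.

def find_unprotected(flops, voters):
    """flops: dict base_name -> list of replica instance names;
    voters: set of base names that have a majority voter."""
    bad = []
    for base, replicas in flops.items():
        if len(replicas) != 3 or base not in voters:
            bad.append(base)
    return bad

flops = {"pc_reg":  ["pc_reg_a", "pc_reg_b", "pc_reg_c"],
         "sr_reg":  ["sr_reg_a", "sr_reg_b"],          # only duplicated
         "tmp_reg": ["tmp_reg_a", "tmp_reg_b", "tmp_reg_c"]}
voters = {"pc_reg"}                                    # tmp_reg lacks a voter
print(find_unprotected(flops, voters))  # ['sr_reg', 'tmp_reg']
```

The paper's heuristic netlist analysis does the hard part this sketch assumes away: grouping replica flip-flops and recognizing voter structures in a flattened, optimized netlist.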

  18. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    MacLean, G.; Fergusson, J.

    1998-01-01

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces, ballistic missiles, to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for uses in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  19. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases

  20. MSFC Turbine Performance Optimization (TPO) Technology Verification Status

    Science.gov (United States)

    Griffin, Lisa W.; Dorney, Daniel J.; Snellgrove, Lauren M.; Zoladz, Thomas F.; Stroud, Richard T.; Turner, James E. (Technical Monitor)

    2002-01-01

The capability to optimize turbine performance and accurately predict unsteady loads will allow for increased reliability, Isp, and thrust-to-weight ratio. The development of a fast, accurate, validated aerodynamic design, analysis, and optimization system is required.

  1. Self-Verification of Ability through Biased Performance Memory.

    Science.gov (United States)

    Karabenick, Stuart A.; LeBlanc, Daniel

    Evidence points to a pervasive tendency for persons to behave to maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…

  2. Taiwan Power Company's power distribution analysis and fuel thermal margin verification methods for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, P.H.

    1995-01-01

    Taiwan Power Company's (TPC's) power distribution analysis and fuel thermal margin verification methods for pressurized water reactors (PWRs) are examined. The TPC and the Institute of Nuclear Energy Research started a joint 5-yr project in 1989 to establish independent capabilities to perform reload design and transient analysis utilizing state-of-the-art computer programs. As part of the effort, these methods were developed to allow TPC to independently perform verifications of the local power density and departure from nucleate boiling design bases, which are required by the reload safety evaluation for the Maanshan PWR plant. The computer codes utilized were extensively validated for the intended applications. Sample calculations were performed for up to six reload cycles of the Maanshan plant, and the results were found to be quite consistent with the vendor's calculational results

  3. Verification and validation of COBRA-SFS transient analysis capability

    International Nuclear Information System (INIS)

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 2.0 and 3.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0
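The verification step described in this record (code calculations compared against analytical solutions for selected heat transfer problems) can be illustrated with the simplest such pairing: lumped-capacitance cooling, T(t) = T_inf + (T0 - T_inf)·exp(-t/tau), versus an explicit time-marching scheme. The parameter values below are arbitrary illustrations, not COBRA-SFS data:

```python
# Hedged sketch of transient-capability verification: march a trivial
# explicit Euler scheme and compare against the closed-form lumped-
# capacitance solution. Parameters are arbitrary illustrative values.

import math

def euler_cooling(T0, T_inf, tau, t_end, dt):
    """Explicit Euler integration of dT/dt = (T_inf - T) / tau."""
    T, t = T0, 0.0
    while t < t_end - 1e-12:
        T += dt * (T_inf - T) / tau
        t += dt
    return T

T0, T_inf, tau = 600.0, 300.0, 50.0        # K, K, s
t_end = 200.0
exact = T_inf + (T0 - T_inf) * math.exp(-t_end / tau)
approx = euler_cooling(T0, T_inf, tau, t_end, dt=0.1)
print(abs(approx - exact) / exact)          # small relative error
```

Verification in this sense checks that the equations are solved correctly; validation (Section 4.0 of the report) then checks the equations against cask test data.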

  4. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

In this paper, we present a novel approach and framework for run-time verification of large, safety critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety critical systems cannot be instrumented; therefore, verification and monitoring has to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
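The two errors the record reports, duplicate and out-of-order messages, are exactly the kind of trace property a runtime monitor checks. The plain-Python sketch below only imitates the idea; MESA itself specifies such properties in TraceContract's Scala DSL, and the message shape (stream, id, sequence number) is an assumption for illustration:

```python
# Hedged sketch of a runtime-verification monitor over a message trace:
# flag duplicate message ids and per-stream sequence-number regressions.
# Message shape (stream, msg_id, seq) is an invented illustration.

def check_trace(trace):
    """Return a list of (violation_kind, msg_id) found in the trace."""
    seen_ids, last_seq, errors = set(), {}, []
    for stream, msg_id, seq in trace:
        if msg_id in seen_ids:
            errors.append(("duplicate", msg_id))
        seen_ids.add(msg_id)
        if seq < last_seq.get(stream, -1):
            errors.append(("out-of-order", msg_id))
        last_seq[stream] = seq
    return errors

trace = [("wx", "m1", 0), ("wx", "m2", 1), ("wx", "m2", 1), ("wx", "m4", 0)]
print(check_trace(trace))  # [('duplicate', 'm2'), ('out-of-order', 'm4')]
```

Because the monitor only consumes the trace, it stays nonintrusive, matching the paper's constraint that the SWIM systems themselves cannot be instrumented.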

  5. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3 developed by Studsvik. In this report the verification and evaluation of uncertainty were performed for the code system application in nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core condition, and core follow calculations with startup physics test predictions of total 14 cycles of pressurized water reactors. Benchmark calculation include comparisons with reference solutions of IAEA and OECA/NEA problems and critical experiment measurements. The uncertainty evaluation is focused to safety related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied for PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied for SMART core, via supplemental comparisons with reference calculations by MCNP which is a probabilistic nuclear calculation code.

  7. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  8. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  9. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel
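The figure of merit quoted above can be illustrated with a short sketch. This is not the iRAP/ORNL analysis itself, and "average relative deviation" here is one plausible reading of the reported statistic; the helper function and the assembly values are fabricated for illustration only.

```python
# Hypothetical illustration of comparing predicted Fork detector signals
# against measurements. All numbers are made up for the sketch.

def avg_relative_deviation(predicted, measured):
    """Mean of |predicted - measured| / measured, as a percentage."""
    return 100.0 * sum(abs(p - m) / m for p, m in zip(predicted, measured)) / len(measured)

# Fabricated example: predicted vs. measured neutron count rates (arbitrary units)
predicted = [102.0, 98.5, 110.2, 95.0]
measured  = [100.0, 101.0, 108.0, 97.0]

print(round(avg_relative_deviation(predicted, measured), 2))  # → 2.14
```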

  10. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.

    2015-08-24

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn. We report a reduction of 0.1 V in the average built-in potential, and a reduction of 0.2 V in the average reverse bias breakdown voltage, as measured across the substrate. These reductions indicate that the band gap of the silicon lattice has been reduced due to the incorporation of Sn, as expected from the theoretical analysis. We report the experimentally calculated band gap of SiSn to be 1.11 ± 0.09 eV. This low-cost, CMOS compatible, and scalable process offers a unique opportunity to tune the band gap of silicon for specific applications.
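The link the abstract draws between a lower built-in potential and a narrower band gap can be sketched from the textbook relation Vbi ≈ Eg/q − (kT/q)·ln(Nc·Nv/(Na·Nd)) for an abrupt pn junction. The doping levels and effective densities of states below are generic silicon values assumed for illustration, not the paper's data.

```python
import math

k_B = 8.617e-5   # Boltzmann constant, eV/K
T = 300.0        # temperature, K

# Assumed (not from the paper): typical Si effective densities of states
Nc, Nv = 2.8e19, 1.0e19   # cm^-3
Na, Nd = 1e17, 1e17       # cm^-3, assumed doping on each side

def built_in_potential(Eg):
    """Vbi (volts) for an abrupt pn junction: Vbi = Eg/q - kT*ln(Nc*Nv/(Na*Nd))."""
    return Eg - k_B * T * math.log(Nc * Nv / (Na * Nd))

# A 0.1 eV band-gap narrowing lowers Vbi by the same 0.1 V, since the
# doping-dependent term is unchanged -- matching the abstract's reasoning.
dV = built_in_potential(1.12) - built_in_potential(1.02)
print(round(dV, 2))
```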

  11. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

Out-pile tests with a full-scale fuel assembly are performed to verify the design and to evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for the mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform the bending and vibration tests. The verification of the developed facilities was carried out by comparison with reference data for a fuel assembly obtained at the Westinghouse Co.; the compared data showed good agreement within uncertainties. FRETONUS, a simulator for high-temperature, high-pressure fretting wear and performance testing, was also developed. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. The technology for computational turbulent flow analysis and finite element analysis was developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  12. Numerical verification of composite rods theory on multi-story buildings analysis

    Science.gov (United States)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

The article proposes a verification of the composite rods theory for the structural analysis of skeletons of high-rise buildings. A test design model was formed in which the horizontal elements are represented by a multilayer cantilever beam working in transverse bending, with slabs connected by moment-non-transferring connections, and the vertical elements are represented by multilayer columns. These connections are sufficient to develop a shearing action that can be approximated by a certain shear force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the operating mechanism of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations when the rigidity characteristics of the structure need to be determined, and for a qualitative assessment of results obtained by other methods when performing calculations for verification purposes.

  13. The design and verification of probabilistic safety analysis platform NFRisk

    International Nuclear Information System (INIS)

    Hu Wenjun; Song Wei; Ren Lixia; Qian Hongtao

    2010-01-01

To increase the technical capability in the Probabilistic Safety Analysis (PSA) field in China, it is necessary and important to study and develop an indigenous professional PSA platform. Following the principle of 'from structure simplification, to modulization, to production of cut sets, to minimization of cut sets', the algorithms, including the simplification algorithm, the modulization algorithm, the algorithm for conversion from fault tree to binary decision diagram (BDD), the cut set solving algorithm, and the cut set minimization algorithm, were designed and developed independently; the design of the data management and operation platform was completed; and the verification and validation of the NFRisk platform based on 3 typical fault trees was carried out. (authors)
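The cut set pipeline named in the abstract can be sketched in a few lines. This is not the NFRisk implementation; the tiny fault tree and the recursive expansion below are a generic illustration of producing cut sets and minimizing them by the absorption law.

```python
# Hypothetical two-gate fault tree: gate name -> (gate type, children).
# Names not appearing as gates are basic events.
tree = {
    "TOP": ("OR",  ["G1", "B3"]),
    "G1":  ("AND", ["B1", "B2"]),
}

def cut_sets(node):
    """Return the list of cut sets (frozensets of basic events) for a node."""
    if node not in tree:                        # basic event
        return [frozenset([node])]
    kind, children = tree[node]
    if kind == "OR":                            # union of the children's cut sets
        return [cs for c in children for cs in cut_sets(c)]
    result = [frozenset()]                      # AND: cross-product combination
    for c in children:
        result = [a | b for a in result for b in cut_sets(c)]
    return result

def minimize(sets):
    """Drop any cut set that is a strict superset of another (absorption law)."""
    return sorted({s for s in sets if not any(t < s for t in sets)}, key=sorted)

mcs = minimize(cut_sets("TOP"))
print([sorted(s) for s in mcs])  # → [['B1', 'B2'], ['B3']]
```

Real PSA tools convert the tree to a BDD first, as the abstract describes, precisely because this naive expansion blows up on large trees.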

  14. TRACEABILITY OF PRECISION MEASUREMENTS ON COORDINATE MEASURING MACHINES – PERFORMANCE VERIFICATION OF CMMs

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Sobiecki, René; Tosello, Guido

    This document is used in connection with one exercise of 30 minutes duration as a part of the course VISION ONLINE – One week course on Precision & Nanometrology. The exercise concerns performance verification of the volumetric measuring capability of a small volume coordinate measuring machine...

  15. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis ... then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guaranties on power curves in complex terrain; investors and bankers experience with verification of power curves; power performance in relation to regional correction curves for Denmark...
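The power curve measurement underlying these procedures reduces paired (wind speed, power) samples with the "method of bins": samples are grouped into 0.5 m/s wide wind-speed bins and the power is averaged per bin. The sketch below illustrates that reduction with fabricated data; it omits the standard's air-density normalization and bin-completeness criteria.

```python
def method_of_bins(samples, width=0.5):
    """Return {bin_center: mean_power} from (wind_speed, power) pairs."""
    bins = {}
    for v, p in samples:
        center = (int(v / width) + 0.5) * width   # center of the 0.5 m/s bin
        bins.setdefault(center, []).append(p)
    return {c: sum(ps) / len(ps) for c, ps in sorted(bins.items())}

# Fabricated 10-minute averages: (wind speed m/s, power kW)
samples = [(5.1, 250.0), (5.3, 270.0), (5.6, 330.0), (5.9, 350.0)]
print(method_of_bins(samples))  # → {5.25: 260.0, 5.75: 340.0}
```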

  16. Fast and Safe Concrete Code Execution for Reinforcing Static Analysis and Verification

    Directory of Open Access Journals (Sweden)

    M. Belyaev

    2015-01-01

Full Text Available The problem of improving the precision of static analysis and verification techniques for C is hard due to the simplifying assumptions these techniques make about the code model. We present a novel approach to improving precision by executing the code model in a controlled environment that captures program errors and contract violations in a memory- and time-efficient way. We implemented this approach as an executor module, Tassadar, as a part of the bounded model checker Borealis. We tested Tassadar on two test sets, showing that its impact on the performance of Borealis is minimal. The article is published in the authors' wording.

  17. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) to evaluate waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  18. Verification of Memory Performance Contracts with KeY

    OpenAIRE

    Engel, Christian

    2007-01-01

    Determining the worst case memory consumption is an important issue for real-time Java applications. This work describes a methodology for formally verifying worst case memory performance constraints and proposes extensions to Java Modeling Language (JML) facilitating better verifiability of JML performance specifications.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, UNIVERSITY OF TENNESSEE RESEARCH CORPORATION, SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  20. Modified Truncated Multiplicity Analysis to Improve Verification of Uranium Fuel Cycle Materials

    International Nuclear Information System (INIS)

    LaFleur, A.; Miller, K.; Swinhoe, M.; Belian, A.; Croft, S.

    2015-01-01

    Accurate verification of 235U enrichment and mass in UF6 storage cylinders and the UO2F2 holdup contained in the process equipment is needed to improve international safeguards and nuclear material accountancy at uranium enrichment plants. Small UF6 cylinders (1.5'' and 5'' diameter) are used to store the full range of enrichments from depleted to highly-enriched UF6. For independent verification of these materials, it is essential that the 235U mass and enrichment measurements do not rely on facility operator declarations. Furthermore, in order to be deployed by IAEA inspectors to detect undeclared activities (e.g., during complementary access), it is also imperative that the measurement technique is quick, portable, and sensitive to a broad range of 235U masses. Truncated multiplicity analysis is a technique that reduces the variance in the measured count rates by only considering moments 1, 2, and 3 of the multiplicity distribution. This is especially important for reducing the uncertainty in the measured doubles and triples rates in environments with a high cosmic ray background relative to the uranium signal strength. However, we believe that the existing truncated multiplicity analysis throws away too much useful data by truncating the distribution after the third moment. This paper describes a modified truncated multiplicity analysis method that determines the optimal moment to truncate the multiplicity distribution based on the measured data. Experimental measurements of small UF6 cylinders and UO2F2 working reference materials were performed at Los Alamos National Laboratory (LANL). The data were analyzed using traditional and modified truncated multiplicity analysis to determine the optimal moment to truncate the multiplicity distribution to minimize the uncertainty in the measured count rates. The results from this analysis directly support nuclear safeguards at enrichment plants and provide a more accurate verification method for UF6
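The moments the abstract manipulates can be sketched briefly. In multiplicity counting, the first three reduced factorial moments of the measured multiplicity histogram correspond to the singles, doubles and triples rates, and "truncation" means ignoring moments above a chosen order. The histogram below is fabricated; this is the generic moment arithmetic, not the modified method the paper proposes.

```python
from math import comb

def factorial_moments(histogram, max_order=3):
    """Reduced factorial moments of a multiplicity histogram.

    histogram[n] = number of events observed with multiplicity n.
    Moment r is sum_n C(n, r) * P(n), for r = 1..max_order.
    """
    total = sum(histogram)
    return [sum(comb(n, r) * c for n, c in enumerate(histogram)) / total
            for r in range(1, max_order + 1)]

# Fabricated multiplicity distribution: counts of events with 0..4 detections
hist = [600, 250, 100, 40, 10]
print([round(m, 3) for m in factorial_moments(hist)])  # → [0.61, 0.28, 0.08]
```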

  1. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.

  2. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  3. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
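The statistic behind the three records above can be hand-rolled in a few lines: treat each observation period as a block, rank the turbines' power outputs within each block, and compute the Friedman chi-square from the rank sums. The data are fabricated, tie handling is omitted, and in practice one would use a library routine such as scipy.stats.friedmanchisquare.

```python
def friedman_statistic(blocks):
    """Friedman chi-square for `blocks`, a list of per-period rows with
    one measurement per turbine (no ties assumed in this toy sketch)."""
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for block in blocks:
        order = sorted(range(k), key=lambda j: block[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

# Fabricated power outputs (MW): each row is one period, columns are turbines A, B, C
blocks = [[1.2, 1.5, 1.1],
          [1.3, 1.6, 1.2],
          [1.1, 1.4, 1.0]]
print(round(friedman_statistic(blocks), 2))  # → 6.0
```

A large statistic (here the maximum for n = 3 periods and k = 3 turbines, since the ordering is identical in every period) indicates that the turbines' power performances differ systematically.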

  4. INF and IAEA: A comparative analysis of verification strategy

    International Nuclear Information System (INIS)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  5. Verification of spectrophotometric method for nitrate analysis in water samples

    Science.gov (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu

    2017-12-01

The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were linearity, method detection limit, limit of quantitation, level of linearity, accuracy, and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression was 0.9981. The method detection limit (MDL) was determined to be 0.1294 mg/L and the limit of quantitation (LOQ) 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% level of confidence. The accuracy, determined through the recovery value, was 109.1907%. The precision, expressed as the percent relative standard deviation (%RSD) of repeatability, was 1.0886%. The tested performance criteria showed that the method was verified under the laboratory conditions.
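The verification parameters reported are simple statistics that can be sketched directly: the calibration linearity as a correlation coefficient, accuracy as percent recovery of a spike, and precision as the %RSD of repeat measurements. All numbers below are fabricated for illustration and are not the paper's data.

```python
import statistics as st

def pearson_r(x, y):
    """Pearson correlation coefficient of the calibration points."""
    mx, my = st.mean(x), st.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Fabricated calibration: standard concentrations (mg/L) vs. absorbance
conc   = [0, 10, 20, 30, 40, 50]
absorb = [0.002, 0.101, 0.198, 0.305, 0.401, 0.502]

recovery = 100 * 10.9 / 10.0                 # measured / spiked, percent
repeats = [10.1, 10.2, 10.0, 10.3, 10.1]     # repeatability series, mg/L
rsd = 100 * st.stdev(repeats) / st.mean(repeats)

print(round(pearson_r(conc, absorb), 4), round(recovery, 1), round(rsd, 2))
```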

  6. Experimental Verification Of Hyper-V Performance Isolation Level

    Directory of Open Access Journals (Sweden)

    Krzysztof Rzecki

    2014-01-01

Full Text Available The need for cost optimization in a broad sense constitutes the basis of operation of every enterprise. In the case of IT infrastructure, which is present in almost every field of activity these days, one of the most commonly applied technologies leading to a good cost-to-profit ratio is virtualization. It consists in locating several operating systems with IT systems on a single server. For such optimization to be carried out correctly, it has to be strictly controlled by means of allocating access to resources, which is known as performance isolation. Modern virtualizers allow this allocation to be configured in quantitative terms (the number of processors, size of RAM, or disk space). It appears, however, that in qualitative terms (processor time, RAM or hard disk bandwidth) the actual allocation of resources does not always correspond to this configuration. This paper provides an experimental assessment of the achievable level of performance isolation of the Hyper-V virtualizer.

  7. Verification and Performance Evaluation of Timed Game Strategies

    DEFF Research Database (Denmark)

    David, Alexandre; Fang, Huixing; Larsen, Kim Guldstrand

    2014-01-01

Control synthesis techniques, based on timed games, derive strategies to ensure a given control objective, e.g., time-bounded reachability. Model checking verifies correctness properties of systems. Statistical model checking can be used to analyse performance aspects of systems, e.g., energy consumption. In this work, we propose to combine these three techniques. In particular, given a strategy synthesized for a timed game and a given control objective, we want to make a deeper examination of the consequences of adopting this strategy. Firstly, we want to apply model checking to the timed game under the synthesized strategy in order to verify additional correctness properties. Secondly, we want to apply statistical model checking to evaluate various performance aspects of the synthesized strategy. For this, the underlying timed game is extended with relevant price and stochastic information...

  8. Performance Verification on UWB Antennas for Breast Cancer Detection

    Directory of Open Access Journals (Sweden)

    Vijayasarveswari V.

    2017-01-01

Full Text Available Breast cancer is a common disease among women, and the death toll continues to increase. Early breast cancer detection is therefore very important. Ultra wide-band (UWB) is a promising candidate for short-range communication applications. This paper presents the performance of different types of UWB antennas for breast cancer detection. Two types of antennas are used, i.e., a UWB pyramidal antenna and a UWB horn antenna. These antennas are used to transmit and receive the UWB signal. The collected signals are fed into the developed neural network module to measure the detection efficiency of each antenna. The average detection efficiency is 88.46% for the UWB pyramidal antenna and 87.55% for the UWB horn antenna. These antennas can be used to detect breast cancer at an early stage and save precious lives.

  9. Verification study of the FORE-2M nuclear/thermal-hydraulilc analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

The verification of the LMFBR core transient performance code FORE-2M was performed in two steps. First, individual component models of the computation were verified by comparison with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). Second, for verification of the integral computation method of the code, experimental data from TREAT, SEFOR, and natural circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained in both steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural circulation experiments. (orig.)

  10. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

Highlights: • Application of the PSO algorithm can improve the accuracy of the baseline model. • The M&V cloud platform automatically calculates energy performance. • The M&V cloud platform can be applied to all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • The M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, because the M&V period typically lasts several months or up to a year, a failure to immediately detect abnormal energy performance not only decreases energy performance but also prevents timely correction and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculations of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested in a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real
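The M&V arithmetic such a platform automates can be sketched minimally: fit a baseline regression of energy use against an explanatory variable during the baseline period, then report savings as the baseline prediction minus the measured post-retrofit use. The data below are fabricated, a single explanatory variable stands in for the paper's multivariate model, and the PSO tuning step is omitted.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Baseline period (fabricated): mean outdoor temperature (degC) vs. daily kWh
temp = [20, 24, 28, 32]
kwh  = [400, 480, 560, 640]
a, b = fit_line(temp, kwh)

# One post-retrofit day: savings = baseline-predicted use - measured use
post_temp, post_kwh = 30, 520
savings = (a + b * post_temp) - post_kwh
print(round(savings, 1))  # → 80.0
```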

  11. Instrument performance and simulation verification of the POLAR detector

    Science.gov (United States)

    Kole, M.; Li, Z. H.; Produit, N.; Tymieniecka, T.; Zhang, J.; Zwolinska, A.; Bao, T. W.; Bernasconi, T.; Cadoux, F.; Feng, M. Z.; Gauvin, N.; Hajdas, W.; Kong, S. W.; Li, H. C.; Li, L.; Liu, X.; Marcinkowski, R.; Orsi, S.; Pohl, M.; Rybka, D.; Sun, J. C.; Song, L. M.; Szabelski, J.; Wang, R. J.; Wang, Y. H.; Wen, X.; Wu, B. B.; Wu, X.; Xiao, H. L.; Xiong, S. L.; Zhang, L.; Zhang, L. Y.; Zhang, S. N.; Zhang, X. F.; Zhang, Y. J.; Zhao, Y.

    2017-11-01

POLAR is a new satellite-borne detector aiming to measure the polarization of an unprecedented number of Gamma-Ray Bursts in the 50-500 keV energy range. The instrument, launched on board the Tiangong-2 Chinese space lab on the 15th of September 2016, is designed to measure the polarization of the hard X-ray flux by measuring the distribution of the azimuthal scattering angles of the incoming photons. A detailed understanding of the polarimeter, and specifically of the systematic effects induced by the instrument's non-uniformity, is required for this purpose. In order to study the instrument's response to polarization, POLAR underwent a beam test at the European Synchrotron Radiation Facility in France. In this paper both the beam test and the instrument performance are described. This is followed by an overview of the Monte Carlo simulation tools developed for the instrument. Finally, a comparison of the measured and simulated instrument performance is provided and the instrument response to polarization is presented.
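The measurement principle named above can be sketched briefly: for a polarized beam, the azimuthal scattering-angle histogram follows N(phi) ∝ 1 + mu·cos(2(phi − phi0)), and the modulation mu carries the polarization information. The toy data below are generated from that model itself, so the crude peak-to-valley estimator recovers the input; a real analysis fits the full curve and corrects for instrument non-uniformity, as the abstract stresses.

```python
import math

def modulation(counts):
    """Crude (Cmax - Cmin)/(Cmax + Cmin) estimate of the modulation factor."""
    return (max(counts) - min(counts)) / (max(counts) + min(counts))

# Toy histogram: 12 azimuthal bins sampled from the modulation model
mu_true, phi0 = 0.3, 0.0
phis = [math.radians(d) for d in range(0, 360, 30)]
counts = [1000 * (1 + mu_true * math.cos(2 * (p - phi0))) for p in phis]

print(round(modulation(counts), 2))  # → 0.3
```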

  12. Research on Elemental Technology of Advanced Nuclear Fuel Performance Verification

    International Nuclear Information System (INIS)

    Kim, Yong Soo; Lee, Dong Uk; Jean, Sang Hwan; Koo, Min

    2003-04-01

Most of the current property models and fuel performance models used in performance evaluation codes are based on in-pile data up to 33,000 MWd/MtU. Therefore, international experts are investigating property changes and developing advanced prediction models for high burn-up application. The current research develops a high burn-up fission gas release model for the code and supports the code development activities by collecting data and models, reviewing and assessing them, and benchmarking the selected models against appropriate in-pile data. For high burn-up applications, a two-stage, two-step fission gas release model is developed, based on the real two-step diffusion process of the fission gases in the grain lattice and grain boundaries and on the observed accelerated release rate at high burn-up. The prediction of this model is in excellent agreement with in-pile measurement results, not only at low burn-up but also at high burn-up. This research again highlights the importance of the thermal conductivity of oxide fuel, especially at high burn-up. Even the temperature-dependent models differ from one another, and most of them overestimate the conductivity at high burn-up. An in-pile data benchmarking of a high-LHGR fuel rod shows that the difference can reach 30%∼40%, corresponding to a prediction 400 °C lower than the real fuel centerline temperature. Recent models of the thermal expansion and heat capacity of oxide fuel are found to be well defined. Irradiation swelling of the oxide fuel is now well understood; in most cases in LWRs, solid fission product swelling is dominant. Thus, the accumulation of in-pile data can enhance the accuracy of the model prediction more than theoretical modeling work. The thermo-physical properties of Zircaloy cladding are also well defined and well understood except for the thermal expansion. However, it turns out that even the

  13. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  14. A study on periodic safety verification on MOV performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Du Eon; Park, Jong Ho; Han, Jae Seob; Kang, Hyeon Taek; Lee, Jeong Min; Song, Kyu Jo; Shin, Wan Sun; Lee, Taek Sang [Chungnam National Univ., Taejon (Korea, Republic of)

    2000-03-15

The objectives of this study, therefore, are to define optimized valve diagnostic variables that detect abnormal conditions early during valve surveillance and consequently reduce radiation exposure. The major direction of the development is to detect valve degradation in advance by monitoring the motor current and power signals, which can be obtained remotely at the Motor Control Center (MCC). A series of valve operation experiments were performed under several kinds of abnormal conditions using a test apparatus which consists of a 3-inch gate valve, a motor (0.33 Hp, 460 V, 0.8 A, 1560 rpm), an actuator (SMB-000-2 type), measuring devices (power analyzer, oscilloscope, data recorder, current transformer, and AC current and voltage transducers), and connection cables.

  15. Performance verification of the CMS Phase-1 Upgrade Pixel detector

    Science.gov (United States)

    Veszpremi, V.

    2017-12-01

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in a high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that had been used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased instantaneous luminosity the LHC would reach before 2023. It was built to operate at an instantaneous luminosity of around 2×10³⁴ cm⁻²s⁻¹. The detector's new layout has an additional inner layer with respect to the previous one; it allows for more efficient tracking with a smaller fake rate at higher event pile-up. The paper focuses on the first results obtained during the commissioning of the new detector. It also covers challenges faced during the first data taking to reach the optimal measurement efficiency. Details are given on the performance at high occupancy with respect to observables such as data rate, hit reconstruction efficiency, and resolution.

  16. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  17. Measurement and Verification of Energy Savings and Performance from Advanced Lighting Controls

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-02-21

    This document provides a framework for measurement and verification (M&V) of energy savings, performance, and user satisfaction from lighting retrofit projects involving occupancy-sensor-based, daylighting, and/or other types of automatic lighting. It was developed to provide site owners, contractors, and other involved organizations with the essential elements of a robust M&V plan for retrofit projects and to assist in developing specific project M&V plans.

  18. On Demand Internal Short Circuit Device Enables Verification of Safer, Higher Performing Battery Designs

    Energy Technology Data Exchange (ETDEWEB)

    Darcy, Eric; Keyser, Matthew

    2017-05-15

    The Internal Short Circuit (ISC) device enables critical battery safety verification. With the aluminum interstitial heat sink between the cells, normal trigger cells cannot be driven into thermal runaway without excessive temperature bias of adjacent cells. With an implantable, on-demand ISC device, thermal runaway tests show that the conductive heat sinks protected adjacent cells from propagation. High heat dissipation and structural support of Al heat sinks show high promise for safer, higher performing batteries.

  19. TRACEABILITY OF COORDINATE MEASURING MACHINES – CALIBRATION AND PERFORMANCE VERIFICATION

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Savio, Enrico; Bariani, Paolo

    This document is used in connection with three exercises, each of 45 minutes duration, as part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measurement traceability: 1) Performance verification of a CMM using a ball bar; 2) Calibration of an optical coordinate measuring machine; 3) Uncertainty assessment using the ISO 15530-3 “Calibrated workpieces” procedure.

  20. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors that can arise during the use of finite-element-method-based computer programs are described in the paper. The magnitude of the errors was defined as the acceptance criterion for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). An example verification is carried out on the PAFEC-FE computer code for seismic response analyses of piping systems by the response spectrum method. (author)

  1. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Abstract Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria) and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them for the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
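
The verification idea above can be illustrated with a minimal sketch: for a simple Hebbian learning equation (a hypothetical form, dw/dt = η·a1·a2·(1−w) − ζ·w, with assumed parameter values, not necessarily the paper's exact model), the stationary point can be solved analytically and then compared against a simulation run; a mismatch would indicate an implementation error:

```python
# Hypothetical Hebbian learning equation: dw/dt = eta*a1*a2*(1 - w) - zeta*w
# Setting dw/dt = 0 gives the analytic equilibrium; the simulation should
# converge to it, otherwise the implementation of the model has an error.
eta, zeta = 0.4, 0.1          # learning and extinction rates (assumed values)
a1, a2 = 0.9, 0.8             # constant activation levels (assumed values)

w_eq = eta * a1 * a2 / (eta * a1 * a2 + zeta)   # analytic stationary point

w, dt = 0.0, 0.01             # Euler simulation from w(0) = 0
for _ in range(20000):
    w += dt * (eta * a1 * a2 * (1.0 - w) - zeta * w)

assert abs(w - w_eq) < 1e-6   # verification: simulation matches the analysis
```

The same pattern extends to the other properties listed above, e.g. checking that the simulated trajectory is monotonically increasing before reaching the equilibrium.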

  2. EPID-based verification of the MLC performance for dynamic IMRT and VMAT

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; Barnes, Michael P.; O’Connor, Daryl J.; Greer, Peter B.

    2012-01-01

    Purpose: In advanced radiotherapy treatments such as intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), verification of the performance of the multileaf collimator (MLC) is an essential part of the linac QA program. The purpose of this study is to use the existing measurement methods for geometric QA of the MLCs and extend them to more comprehensive evaluation techniques, and to develop dedicated robust algorithms to quantitatively investigate the MLC performance in a fast, accurate, and efficient manner. Methods: The behavior of leaves was investigated in the step-and-shoot mode by the analysis of integrated electronic portal imaging device (EPID) images acquired during picket fence tests at fixed gantry angles and arc delivery. The MLC was also studied in dynamic mode by the analysis of cine EPID images of a sliding gap pattern delivered in a variety of conditions including different leaf speeds, deliveries at fixed gantry angles or in arc mode, and changing the direction of leaf motion. The accuracy of the method was tested by detection of the intentionally inserted errors in the delivery patterns. Results: The algorithm developed for the picket fence analysis was able to find each individual leaf position, gap width, and leaf bank skewness in addition to the deviations from expected leaf positions with respect to the beam central axis with sub-pixel accuracy. For the three tested linacs over a period of 5 months, the maximum change in the gap width was 0.5 mm, the maximum deviation from the expected leaf positions was 0.1 mm and the MLC skewness was up to 0.2°. The algorithm developed for the sliding gap analysis could determine the velocity and acceleration/deceleration of each individual leaf as well as the gap width. There was a slight decrease in the accuracy of leaf performance with increasing leaf speeds. The analysis results were presented through several graphs. The accuracy of the method was assessed as 0.01 mm
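
The sub-pixel picket fence analysis described above can be sketched in outline: locate each open-gap "picket" in an integrated EPID profile via intensity-weighted centroids and report deviations from the expected positions. The pixel size, picket spacing, and gap widths below are illustrative assumptions on synthetic data, not the paper's parameters:

```python
import numpy as np

# Synthetic integrated profile: five "pickets" (open MLC gaps) on a dark background
pixel_mm = 0.25                          # assumed EPID pixel size at isocentre
expected_mm = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
x_mm = (np.arange(512) - 255.5) * pixel_mm
profile = np.zeros_like(x_mm)
for c in expected_mm:
    profile += np.exp(-0.5 * ((x_mm - c) / 0.8) ** 2)   # ~2 mm wide gaps

# Threshold, split into connected regions, and take intensity-weighted centroids
mask = profile > 0.5 * profile.max()
edges = np.flatnonzero(np.diff(mask.astype(int)))
regions = zip(edges[::2] + 1, edges[1::2] + 1)
measured_mm = np.array([np.average(x_mm[a:b], weights=profile[a:b])
                        for a, b in regions])
deviations = measured_mm - expected_mm   # sub-pixel picket position errors
```

A real implementation would repeat this per leaf pair and add checks for gap width and bank skewness, but the centroid step is what delivers the sub-pixel accuracy quoted above.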

  3. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  4. Tempered Water Lower Port Connector Structural Analysis Verification

    International Nuclear Information System (INIS)

    CREA, B.A.

    2000-01-01

    Structural analysis of the lower port connection of the Tempered Water System of the Cold Vacuum Drying Facility was performed. Subsequent detailed design changes to enhance operability resulted in the need to re-evaluate the bases of the original analysis to verify its continued validity. This evaluation is contained in Appendix A of this report. The original evaluation is contained in Appendix B

  5. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional hold-up measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.

  6. Distributed source term analysis, a new approach to nuclear material inventory verification

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional hold-up measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to γ-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than γ-rays

  7. Calibration And Performance Verification Of LSC Packard 1900TR AFTER REPAIRING

    International Nuclear Information System (INIS)

    Satrio; Evarista-Ristin; Syafalni; Alip

    2003-01-01

    The calibration and repeated verification of the LSC Packard 1900TR at the Hydrology Section-P3TlR have been carried out. Between mid-1997 and July 2000, the counting system of the instrument was damaged and repaired several times. After repair, the system was recalibrated and then verified. The calibration and verification were conducted using unquenched ³H and ¹⁴C standards and background. The calibration results show background count rates for ³H and ¹⁴C of 12.3 ± 0.79 cpm and 18.24 ± 0.69 cpm, respectively; FOM values for ³H and ¹⁴C of 285.03 ± 15.95 and 641.06 ± 16.45, respectively; and ³H and ¹⁴C efficiencies of 59.13 ± 0.28% and 95.09 ± 0.31%, respectively. From the verification data, the SIS and tSIE parameters for ¹⁴C are within their limits, the ³H and ¹⁴C efficiencies are still above the minimum limits, and the background fluctuation is still normal. It can be concluded that the LSC Packard 1900TR is in good working condition and can be used for counting. (author)
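
The efficiency and FOM figures quoted above are related by the standard liquid scintillation figure of merit, FOM = E²/B, where E is the counting efficiency in percent and B the background count rate. A minimal sketch, with a hypothetical standard activity chosen to roughly reproduce the ³H numbers in the record:

```python
def counting_efficiency(net_cpm, dpm):
    """Counting efficiency in percent: observed net count rate (cpm)
    over the known activity of the standard (dpm)."""
    return 100.0 * net_cpm / dpm

def figure_of_merit(eff_percent, bkg_cpm):
    """LSC figure of merit E^2/B, used to optimise counting windows."""
    return eff_percent ** 2 / bkg_cpm

# Hypothetical 3H standard: the dpm value here is an assumed illustration
eff = counting_efficiency(net_cpm=131000.0, dpm=221500.0)
fom = figure_of_merit(eff, bkg_cpm=12.3)   # background cpm from the record
```

With these assumed inputs the efficiency comes out near 59% and the FOM near 284, consistent with the verification limits described above.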

  8. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work carried out to support the basis for this standardisation effort. The work drew on experience from several national and international research projects and on contractual and field experience gained within the wind energy community. It was wide-ranging and addressed 'grey' areas of knowledge in existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)
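
Power curves under IEC 61400-12 are built with the "method of bins": 10-minute mean wind speed and power records are sorted into 0.5 m/s wind speed bins and averaged per bin. A minimal sketch on synthetic data (the logistic turbine response and noise levels are illustrative assumptions, not measured values):

```python
import numpy as np

rng = np.random.default_rng(0)
wind = rng.uniform(3.0, 15.0, 5000)        # 10-min mean wind speeds (m/s)
# Hypothetical turbine response (~3 MW rated) plus measurement noise
power = 3000.0 / (1.0 + np.exp(-(wind - 9.0))) + rng.normal(0.0, 50.0, wind.size)

# Method of bins: 0.5 m/s bins centred on multiples of 0.5 m/s
centres = np.arange(3.0, 15.01, 0.5)
curve = []
for c in centres:
    sel = np.abs(wind - c) < 0.25
    if sel.sum() >= 3:                     # minimum bin population
        curve.append((c, wind[sel].mean(), power[sel].mean()))
```

Verification of a power curve then amounts to comparing a binned curve measured on site against the warranted curve, which is exactly where the complex-terrain issues discussed above arise.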

  9. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-04-01

    Full Text Available Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners' perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of B-BBEE verification practitioners in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed strong and positive associations between technical skills, interpersonal skills, compliance with standards and ethics, managerial skills, and perceived job performance. Results of the regression analysis showed that managerial skills, compliance with standards and ethics, and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications.

  10. Automatic analysis of intrinsic positional verification films brachytherapy using MATLAB

    International Nuclear Information System (INIS)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-01-01

    One of the essential tests in the quality control of brachytherapy afterloading equipment is verification of the intrinsic positioning of the radioactive source. A classic evaluation method uses X-ray film, measuring the distance between the marks left by the autoradiograph of the source and a reference. Our centre has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in MATLAB, in order to save time and reduce the measurement uncertainty. The purpose of this paper is to describe the method developed, assess its uncertainty, and quantify its advantages over the manual method. (Author)
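
The core of such a film-based measurement is locating mark centroids on the scanned image and converting pixel distances to millimetres. A minimal sketch (in Python rather than MATLAB) on a synthetic scan; the scanner resolution, mark sizes, and positions are assumed for illustration:

```python
import numpy as np

dpi = 300.0                        # assumed scanner resolution
mm_per_px = 25.4 / dpi

# Synthetic scanned film: dark background with two exposed marks
# (a reference mark and the source autoradiograph)
img = np.zeros((200, 400))
img[95:106, 60:71] = 1.0           # reference mark
img[95:106, 300:311] = 1.0         # source mark

def centroid(img, mask):
    ys, xs = np.nonzero(mask)
    w = img[ys, xs]
    return np.average(ys, weights=w), np.average(xs, weights=w)

mask = img > 0.5
cols = np.nonzero(mask.any(axis=0))[0]
gap = np.argmax(np.diff(cols))     # split the two marks at the largest gap
left, right = cols[:gap + 1], cols[gap + 1:]
m1 = np.zeros_like(mask); m1[:, left] = mask[:, left]
m2 = np.zeros_like(mask); m2[:, right] = mask[:, right]
(y1, x1), (y2, x2) = centroid(img, m1), centroid(img, m2)
distance_mm = float(np.hypot(y2 - y1, x2 - x1)) * mm_per_px
```

Intensity-weighted centroids make the measurement insensitive to manual reading error, which is the main advantage the automated method claims over the ruler-based approach.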

  11. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
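
The Hotelling observer at the heart of the approach above applies the template w = S⁻¹(μ₁ − μ₀) to each measurement and thresholds the resulting scalar. A minimal sketch on synthetic Gaussian data standing in for the binned detector measurements (dimensions, means, and covariance are illustrative assumptions, not the simulated inspection data):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 16                          # samples per class, detector bins
mu0, mu1 = np.zeros(d), np.full(d, 0.5) # class means (illustrative)
cov = np.eye(d)
x0 = rng.multivariate_normal(mu0, cov, n)   # "not accountable" class
x1 = rng.multivariate_normal(mu1, cov, n)   # "treaty accountable" class

# Hotelling template: w = S^-1 (mu1 - mu0), estimated from training data
S = 0.5 * (np.cov(x0.T) + np.cov(x1.T))
w = np.linalg.solve(S, x1.mean(0) - x0.mean(0))

# Test statistic and empirical area under the ROC curve
t0, t1 = x0 @ w, x1 @ w
auc = float((t1[:, None] > t0[None, :]).mean())
```

The channelized variant replaces x with Tx for a channelizing matrix T before the same computation; the paper's contribution is choosing T so that sensitive information is penalized while the AUC is preserved.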

  12. Linear models to perform treaty verification tasks for enhanced information security

    Energy Technology Data Exchange (ETDEWEB)

    MacGahan, Christopher J., E-mail: cmacgahan@optics.arizona.edu [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Sandia National Laboratories, Livermore, CA 94551 (United States); Kupinski, Matthew A. [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  13. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained for instance from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we will first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification will be addressed as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  14. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well, and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway

  15. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    Science.gov (United States)

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β⁺-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles, under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a
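
The most-likely-shift search described above reduces to a one-dimensional minimisation of summed absolute differences over a distal window. A minimal sketch on idealised profiles; the sigmoid fall-off shape, bin sizes, and window are illustrative assumptions, not the paper's clinical data:

```python
import numpy as np

def most_likely_shift(measured, reference, window, max_shift=20):
    """Shift (in bins) that minimises the summed absolute difference
    between two activity depth profiles over a distal window."""
    lo, hi = window
    best, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(reference, s)
        cost = np.abs(measured[lo:hi] - shifted[lo:hi]).sum()
        if cost < best_cost:
            best, best_cost = s, cost
    return best

z = np.arange(200)                               # depth bins
ref = 1.0 / (1.0 + np.exp((z - 120) / 4.0))      # idealised distal fall-off
meas = np.roll(ref, 5)                           # simulated 5-bin range shift
shift = most_likely_shift(meas, ref, window=(100, 160))
```

The automated step in the paper is choosing the window itself from the planned dose distribution, and attaching an uncertainty estimate to the returned shift.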

  16. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  17. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze data on swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. The software is expected to be useful for researchers who are interested in swallowing motion analysis.
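
The displacement and velocity parameters validated above can be illustrated with a minimal sketch (in Python rather than MATLAB) on a hypothetical tracked hyoid trajectory; the frame rate and trajectory shape are assumptions for illustration:

```python
import numpy as np

fps = 30.0                                   # assumed video frame rate
t = np.arange(0, 1.0, 1.0 / fps)             # one second of tracking
# Hypothetical 2-D hyoid trajectory (mm): anterior-superior excursion and return
x = 5.0 * np.sin(np.pi * t)                  # anterior component
y = 10.0 * np.sin(np.pi * t)                 # superior component

disp = np.hypot(x - x[0], y - y[0])          # displacement from rest position
vel = np.gradient(disp, 1.0 / fps)           # instantaneous velocity (mm/s)
peak_disp = float(disp.max())                # peak hyoid excursion (mm)
```

A tool like the one described would compute such spatiotemporal parameters for every tracked structure, which is what the reliability testing on displacement and velocity evaluates.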

  18. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  19. Towards measurement and verification of energy performance under the framework of the European directive for energy performance of buildings

    International Nuclear Information System (INIS)

    Burman, Esfand; Mumovic, Dejan; Kimpian, Judit

    2014-01-01

    Directive 2002/91/EC of the European Parliament and Council on the Energy Performance of Buildings has led to major developments in energy policies followed by the EU Member States. The national energy performance targets for the built environment are mostly rooted in the Building Regulations that are shaped by this Directive. Article 3 of this Directive requires a methodology to calculate the energy performance of buildings under standardised operating conditions. Overwhelming evidence suggests that actual energy performance is often significantly higher than this standardised and theoretical performance. The risk is that national energy saving targets may not be achieved in practice. The UK evidence for the education and office sectors is presented in this paper. A measurement and verification plan is proposed to compare the actual energy performance of a building with its theoretical performance using calibrated thermal modelling. Consequently, the intended vs. actual energy performance can be established under identical operating conditions. This can help identify the shortcomings of the construction process and building procurement. Once the energy performance gap is determined with reasonable accuracy and root causes identified, effective measures could be adopted to remedy or offset this gap. - Highlights: • Building energy performance gap is a negative externality that must be addressed. • A method is proposed to link actual performance to building compliance calculation. • Energy performance gap is divided into procurement and operational gaps. • This framework enables policy makers to measure and address procurement gap. • Building fine-tuning by construction teams could also narrow operational gap

  20. Structural performance evaluation on aging underground reinforced concrete structures. Part 6. An estimation method of threshold value in performance verification taking reinforcing steel corrosion

    International Nuclear Information System (INIS)

    Matsuo, Toyofumi; Matsumura, Takuro; Miyagawa, Yoshinori

    2009-01-01

    This paper discusses the applicability of a material degradation model for RC box-culverts with corroded reinforcement, and an estimation method for the threshold value in performance verification that reflects reinforcing steel corrosion. First, in FEM analyses, the loss of reinforcement section area, the initial tension strain arising from reinforcing steel corrosion, and the deteriorated bond characteristics between reinforcement and concrete were considered. Full-scale loading tests using corroded RC box-culverts were numerically analyzed. As a result, the analyzed crack patterns and load-strain relationships were in close agreement with the experimental results up to a maximum corrosion ratio of 15% of the primary reinforcement. We thus showed that this modeling can estimate the load carrying capacity of corroded RC box-culverts. Second, a parametric study was carried out for corroded RC box-culverts with various sizes, reinforcement ratios, levels of steel corrosion, etc. Furthermore, as an application of the analytical results and various experimental investigations, we suggested allowable degradation ratios for modifying the threshold value, corresponding to the chloride-induced deterioration progress that is widely accepted in maintenance practice for civil engineering reinforced concrete structures. Finally, based on these findings, we developed two estimation methods for the threshold value in performance verification: 1) a structural analysis method using nonlinear FEM that includes modeling of material degradation, and 2) a practical method in which a threshold value, determined by structural analyses of RC box-culverts in sound condition, is multiplied by the allowable degradation ratio. (author)
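The second, practical estimation method described in the abstract is essentially a single multiplication of a sound-condition threshold by an allowable degradation ratio. A minimal sketch, with all numbers hypothetical rather than taken from the study:

```python
# Hedged sketch of the "practical method" for corroded RC box-culverts:
# threshold (corroded) = threshold (sound condition) * allowable degradation
# ratio. The values below are illustrative placeholders only.

def corroded_threshold(sound_threshold_kn, degradation_ratio):
    """Threshold load for a corroded member via the practical method."""
    if not 0.0 < degradation_ratio <= 1.0:
        raise ValueError("degradation ratio must lie in (0, 1]")
    return sound_threshold_kn * degradation_ratio

# Example: a 500 kN sound-condition capacity with a 0.85 allowable ratio.
threshold = corroded_threshold(500.0, 0.85)
```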

  1. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow the CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack-mount the data handling equipment offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable "cart" and use of a telephone link to a large computer center

  2. Quality verification at Arkansas Nuclear One using performance-based concepts

    International Nuclear Information System (INIS)

    Cooper, R.M.

    1990-01-01

    Performance-based auditing is beginning to make an impact within the nuclear industry. Its use provides performance assessments of the operating plant. In the past, this company, along with most other nuclear utilities, performed compliance-based audits. These audits focused on paper reviews of past activities and were completed in weeks or months. This type of audit did not provide a comprehensive assessment of the effectiveness of an activity's performance, nor was it able to identify any performance problems that may have occurred. To address this shortcoming, a comprehensive overhaul of the quality assurance (QA) assessment programs was undertaken. The first major change was to develop a technical specification (tech spec) audit program, with the objective of auditing each tech spec line item every 5 yr. To achieve performance-based results within the tech spec audit program, a tech spec surveillance program was implemented whose goal is to observe 75% of the tech-spec-required tests every 5 yr. The next major change was to develop a QA surveillance program that would provide surveillance coverage for the remainder of the plant not covered by the tech spec surveillance program. One other improvement was to merge the QA/quality control (QC) functions into one nuclear quality group. The final part of the quality verification effort is trending of the quality performance-based data (including US Nuclear Regulatory Commission (NRC) violations)

  3. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)

  4. Performance Assessment and Scooter Verification of Nano-Alumina Engine Oil

    Directory of Open Access Journals (Sweden)

    Yu-Feng Lue

    2016-09-01

    The performance assessment and vehicle verification of nano-alumina (Al2O3) engine oil (NAEO) were conducted in this study. The NAEO was produced by mixing Al2O3 nanoparticles with engine oil using a two-step synthesis method. The weight fractions of the Al2O3 nanoparticles in the four test samples were 0 (base oil), 0.5, 1.5, and 2.5 wt. %. The measurement of basic properties included: (1) density and (2) viscosity at various sample temperatures (20–80 °C). A rotary tribology testing machine with a pin-on-disk apparatus was used for the wear test. The measurement of the before-and-after difference in specimen (disk) weight in the wear test indicated that the NAEO with 1.5 wt. % Al2O3 nanoparticles (1.5 wt. % NAEO) was the chosen candidate for further study. For the scooter verification on an auto-pilot dynamometer, there were three tests: (1) the European Driving Cycle (ECE40); (2) constant speed (50 km/h); and (3) constant throttle positions (20%, 40%, 60%, and 90%). For the ECE40 driving cycle and the constant speed tests, the fuel consumption was decreased on average by 2.75%, while it was decreased by 3.57% for the constant throttle case. The experimental results prove that the engine oil with added Al2O3 nanoparticles significantly decreased the fuel consumption. In the future, experiments with property tests of other nano-engine oils and a performance assessment of the nano-engine-fuel will be conducted.
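Average reductions like the 2.75% and 3.57% figures above are means of per-test percentage changes between baseline and modified-oil runs. A toy computation of that statistic, with invented consumption figures rather than the study's data:

```python
# Toy illustration of computing an average fuel-consumption reduction from
# paired baseline / nano-oil measurements. All numbers are hypothetical.

def percent_reduction(baseline, modified):
    """Percentage decrease of `modified` relative to `baseline`."""
    return 100.0 * (baseline - modified) / baseline

runs_base = [2.00, 2.10, 1.90]   # L/100 km with base oil (hypothetical)
runs_naeo = [1.95, 2.03, 1.86]   # same test cycles with 1.5 wt.% NAEO

reductions = [percent_reduction(b, m) for b, m in zip(runs_base, runs_naeo)]
avg_reduction = sum(reductions) / len(reductions)   # mean percentage saving
```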

  5. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong

    2008-03-15

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on the previous works in KAERI for a two-phase, three-field pilot code. The input and output design, TH solver, component models, special TH models, heat structure solver, general tables, trips and controls, and on-line graphics have been implemented. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The computer language C was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter-weight than C++. The code has simple and essential features of models and correlations, special components, special TH models, and heat structure models. The input features are nevertheless able to simulate various scenarios, such as steady state, non-LOCA transients, and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to attain the physical validity of SYSTF code simulations.

  6. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    International Nuclear Information System (INIS)

    Chung, Bub Dong

    2008-03-01

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on the previous works in KAERI for a two-phase, three-field pilot code. The input and output design, TH solver, component models, special TH models, heat structure solver, general tables, trips and controls, and on-line graphics have been implemented. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The computer language C was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter-weight than C++. The code has simple and essential features of models and correlations, special components, special TH models, and heat structure models. The input features are nevertheless able to simulate various scenarios, such as steady state, non-LOCA transients, and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to attain the physical validity of SYSTF code simulations

  7. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes such as RELAP5, TRAC, and CATHARE have been developed based on the Fortran language during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; the TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility

  8. The effect of two complexity factors on the performance of emergency tasks-An experimental verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Jung, Kwangtae

    2008-01-01

    It is well known that the use of procedures is very important in securing the safety of process systems, since good procedures effectively guide human operators by providing 'what should be done' and 'how to do it', especially under stressful conditions. At the same time, it has been emphasized that the use of complicated procedures can drastically impair operators' performance. This means that a systematic approach that can properly evaluate the complexity of procedures is indispensable for minimizing the side effects of complicated procedures. For this reason, Park et al. have developed a task complexity measure called TACOM that can be used to quantify the complexity of tasks stipulated in the emergency operating procedures (EOPs) of nuclear power plants (NPPs). The TACOM measure consists of five sub-measures that cover five important factors that make the performance of emergency tasks complicated. However, the verification activity for two of these complexity factors, the level of abstraction hierarchy (AH) and engineering decision (ED), seems to be insufficient. In this study, therefore, an experiment is conducted using a low-fidelity simulator in order to clarify the appropriateness of these complexity factors. As a result, it seems that subjects' performance data are affected by the level of AH as well as ED. It is therefore anticipated that both the level of AH and the level of ED play an important role in evaluating the complexity of EOPs

  9. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for the verification of ceramic spacecraft and instrument structures. It has been written so as to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  10. Analysis of human scream and its impact on text-independent speaker verification.

    Science.gov (United States)

    Hansen, John H L; Nandwana, Mahesh Kumar; Shokouhi, Navid

    2017-04-01

    A scream is defined as a sustained, high-energy vocalization that lacks phonological structure. Lack of phonological structure is what distinguishes a scream from other forms of loud vocalization, such as a "yell." This study investigates the acoustic aspects of screams and addresses those that are known to prevent standard speaker identification systems from recognizing the identity of screaming speakers. It is well established that speaker variability due to changes in vocal effort and the Lombard effect contributes to degraded performance in automatic speech systems (i.e., speech recognition, speaker identification, diarization, etc.). However, previous research in the general area of speaker variability has concentrated on human speech production, whereas less is known about non-speech vocalizations. The UT-NonSpeech corpus is developed here to investigate speaker verification from scream samples. This study considers a detailed analysis in terms of fundamental frequency, spectral peak shift, frame energy distribution, and spectral tilt. It is shown that traditional speaker recognition based on the Gaussian mixture model-universal background model (GMM-UBM) framework is unreliable when evaluated with screams.
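The GMM-UBM framework evaluated in the study scores a trial utterance by the log-likelihood ratio between a speaker model and a universal background model. A minimal sketch using scikit-learn's GaussianMixture with random stand-in "features" (no real MFCC extraction; all data and thresholds here are synthetic assumptions, not the study's setup):

```python
# Minimal GMM-UBM verification sketch. Real systems extract acoustic features
# (e.g. MFCCs) and adapt the speaker model from the UBM; here both models are
# trained directly on synthetic 2-D data for illustration only.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ubm_feats = rng.normal(0.0, 1.0, size=(500, 2))   # pooled background "speech"
spk_feats = rng.normal(1.5, 0.5, size=(200, 2))   # target speaker enrollment
test_feats = rng.normal(1.5, 0.5, size=(50, 2))   # trial utterance features

ubm = GaussianMixture(n_components=4, random_state=0).fit(ubm_feats)
spk = GaussianMixture(n_components=4, random_state=0).fit(spk_feats)

# score() returns the average per-sample log-likelihood; a positive
# log-likelihood ratio favours the target-speaker hypothesis.
llr = spk.score(test_feats) - ubm.score(test_feats)
accept = llr > 0.0
```

Under scream-like mismatch (shifted fundamental frequency, spectral tilt), the test features drift away from the enrollment distribution and this ratio degrades, which is the failure mode the abstract reports.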

  11. Performance verification of the Gravity and Extreme Magnetism Small explorer (GEMS) x-ray polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; Marlowe, Hannah; Griffiths, Scott; Kaaret, Philip E.; Kenward, David; Khalid, Syed

    2014-07-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor >=35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, ~20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
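The modulation factor quoted above is read off the photoelectron emission-angle histogram; assuming the standard modulation curve N(phi) = A + B cos^2(phi - phi0), it is mu = (Nmax - Nmin)/(Nmax + Nmin) = B/(2A + B). A small illustration with hypothetical fit parameters, not GEMS data:

```python
# Hedged illustration of the modulation factor for a photoelectric
# polarimeter. A and B are invented fit parameters of the modulation curve
# N(phi) = A + B * cos^2(phi - phi0); they are not measured GEMS values.

A, B = 100.0, 120.0                      # hypothetical histogram fit
n_max = A + B                            # counts at phi = phi0
n_min = A                                # counts at phi = phi0 + 90 deg
mu = (n_max - n_min) / (n_max + n_min)   # modulation factor, here 0.375

# The abstract's >=35% requirement above 4 keV would be met in this example.
meets_requirement = mu >= 0.35
```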

  12. Performance verification and system parameter identification of spacecraft tape recorder control servo

    Science.gov (United States)

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.

  13. Performance Verification of the Gravity and Extreme Magnetism Small Explorer GEMS X-Ray Polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kanako, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi

    2014-01-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor greater than or equal to 35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, approximately 20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).

  14. Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification

    OpenAIRE

    Zheng Hui-Fan; Fan Xiao-Wei; Wang Fang; Liang Yao-Hua

    2013-01-01

    A theoretical analysis and experimental verification on the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rate, different refrigerant mass flow rate, and different inner tube diameter of micro-fin evaporator. The simulation results of the heat transfer coefficient are fairly in good a...

  15. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    Science.gov (United States)

    2016-05-09

    rats. The exposed hair samples were received from USAMRICD early in method development and required storage until the method was developed and validated... Because the storage of hair samples after an exposure has not been studied, it was unclear as to whether the analyte would be stable in the stored... biological matrices typically used for analysis (i.e., blood, urine, and tissues), limiting the amount of time after an exposure that verification is

  16. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode and the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis

  17. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    International Nuclear Information System (INIS)

    BRATZEL, D.R.

    2000-01-01

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks

  18. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  19. Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification

    Science.gov (United States)

    Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.

    2014-07-01

    In the context of the SAFARI instrument (SpicA FAR-infrared Instrument), SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a reimaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2×10^-19 W/√Hz and saturation power of a few femtowatts), the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm and the reproducibility 1 μm along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move with a constant speed varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and stage characterization, describing also the measurement setup. The room temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low temperature verification has been performed in a wet 4 K cryostat using a laser interferometer for measuring the linear displacements and a theodolite for measuring the angular displacements. The angular displacements can be calibrated with a precision of 4 arcsec and the position can be determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.

  20. Whole-core thermal-hydraulic transient code development and verification for LMFBR analysis

    International Nuclear Information System (INIS)

    Spencer, D.R.

    1979-04-01

    Predicted performance during both steady-state and transient reactor operation determines the steady-state operating limits on LMFBRs. Unnecessary conservatism in performance predictions will not contribute to safety, but will restrict the reactor to more conservative, less economical steady-state operation. The most general method for reducing analytical conservatism in LMFBRs without compromising safety is to develop, validate and apply more sophisticated computer models to the limiting performance analyses. The purpose of the on-going Natural Circulation Verification Program (NCVP) is to develop and validate computer codes to analyze natural circulation transients in LMFBRs, and thus replace unnecessary analytical conservatism with demonstrated calculational capability

  1. Performance Analysis of MYSEA

    Science.gov (United States)

    2012-09-01

    Services; FSD Federated Services Daemon; I&A Identification and Authentication; IKE Internet Key Exchange; KPI Key Performance Indicator; LAN Local Area Network... inspection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be... application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  2. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free-vortex design method. Then the detailed blading design is carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions by the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
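The free-vortex design step mentioned in the abstract fixes the swirl distribution so that r·c_u is constant along the blade span, and the blade angles then follow from the ideal velocity triangles. A hedged sketch with invented operating values (not the study's pump):

```python
# Free-vortex blade-angle sketch for an axial-flow pump. All operating values
# (shaft speed, axial velocity, mid-span swirl) are hypothetical.
import math

omega = 150.0               # shaft speed, rad/s
c_m = 5.0                   # axial (meridional) velocity, m/s
r_mid, cu_mid = 0.10, 3.0   # swirl prescribed at mid-span radius, m and m/s

def exit_blade_angle_deg(r):
    """Ideal exit blade angle (from axial) at radius r, free-vortex design."""
    c_u = cu_mid * r_mid / r    # free-vortex distribution: r * c_u = const
    u = omega * r               # blade speed at radius r
    w_u = u - c_u               # tangential component of relative velocity
    return math.degrees(math.atan2(w_u, c_m))

# Angles increase from hub to tip because u grows while c_u falls with radius.
angles = [exit_blade_angle_deg(r) for r in (0.08, 0.10, 0.12)]
```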

  3. Analytical analysis and experimental verification of interleaved parallelogram heat sink

    International Nuclear Information System (INIS)

    Chen, Hong-Long; Wang, Chi-Chuan

    2017-01-01

    Highlights: • A novel air-cooled heat sink profile (IPFM) is proposed to compete with the typical design. • It features two different perimeters, with odd fins being rectangular and the rest being parallelograms. • A new modified dimensionless parameter characterizing the flow length in the triangular region is proposed. • The analytical predictions are in line with the experiments for both the conventional and IPFM designs. • The IPFM design shows a much lower pressure drop and a superior performance, especially for dense fins. - Abstract: In this study, a novel air-cooled heat sink profile is proposed to compete with the conventional design. The new design is termed IPFM (Interleaved Parallelogram Fin Module) and features two different geometrical perimeter shapes of fins. This new design not only gains the advantage of a lower pressure drop for power saving, but also saves material through less fin surface area. An assessment of the flow impedance and performance of the conventional and IPFM heat sinks is analytically investigated and experimentally verified. A new modified dimensionless friction factor for the triangular region is proposed. The analytical predictions agree with experimental measurements for both the conventional and IPFM designs. In electronic cooling design, especially for cloud-server air-cooled heat sink design, the flow pattern is usually laminar, with the Reynolds number below 2000. In this regime, the IPFM design shows 8–12% less surface area than the conventional design when the flow rate is less than 10 CFM, yet the thermal performance is slightly inferior to the conventional design when the flow rate is raised towards 25 CFM. Over the test range of 5–25 CFM, a 10–15% lower flow impedance is observed; the smaller the fin spacing, the more conspicuous the reduction of flow impedance. The optimal cutting angle is around 35° at 10 CFM, and it is reduced to 15° at a larger flow rate of 20 CFM.

  4. Calibrations and verifications performed in view of the ILA reinstatement at JET

    Energy Technology Data Exchange (ETDEWEB)

    Dumortier, P., E-mail: pierre.dumortier@rma.ac.be; Durodié, F. [LPP-ERM-KMS, TEC partner, Brussels (Belgium); Helou, W. [CEA, IRFM, F-13108 St-Paul-Lez-Durance (France); Monakhov, I.; Noble, C.; Wooldridge, E.; Blackman, T.; Graham, M. [CCFE, Culham Science Centre, Abingdon (United Kingdom); Collaboration: EUROfusion Consortium

    2015-12-10

    The calibrations and verifications performed in preparation for the ITER-Like Antenna (ILA) reinstatement at JET are reviewed. A brief reminder of the ILA system layout is given. The different calibration methods and results are then discussed. They encompass the calibration of the directional couplers present in the system, the determination of the relation between the capacitor position readings and the capacitance value, the calibration of the voltage probes inside the antenna housing, the characterization of the RF cables and the calibration of the acquisition electronics circuit. Earlier experience with the ILA has shown that accurate calibrations are essential for the control of the full ILA close-packed antenna array, its protection through the S-Matrix Arc Detection and the new second-stage matching algorithm to be implemented. Finally, the voltage stand-off of the capacitors is checked and the phase range achievable with the system is verified. The system layout is modified so as to allow dipole operation over the whole operating frequency range when operating with the 3 dB combiner-splitters.

  5. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    The problem becomes more complicated when the shape and phase of the ground below the seawater are considered; therefore, different approaches are required to analyze the behavior of a tsunami precisely. This paper introduces on-going code-development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH), together with verification work on some practice simulations. The newly developed Lagrangian mesh-free SPH code so far covers the equations of motion and the heat conduction equation, and verification of each model is complete. In addition, parallel computation on GPUs is now possible, and a GUI is provided. If users change the input geometry or input values, they can run simulations for various conditions and geometries. The SPH method has large advantages and potential in modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be extended considerably, including to molten fuel behavior in severe accidents.

  6. Slideline verification for multilayer pressure vessel and piping analysis

    International Nuclear Information System (INIS)

    Van Gulick, L.A.

    1983-01-01

    Nonlinear finite element method (FEM) computer codes with slideline algorithm implementations should be useful for the analysis of prestressed multilayer pressure vessels and piping. This paper presents closed-form solutions useful for validating slideline implementations for this purpose. The solutions describe the stresses and displacements of an internally pressurized elastic-plastic sphere initially separated from an elastic outer sphere by a uniform gap. Comparison of closed-form and FEM results evaluates the usefulness of the closed-form solution and the validity of the slideline implementation used.
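
    The elastic limit of such a closed-form benchmark is the classical Lamé solution for a thick-walled sphere under internal pressure; the sketch below gives the standard textbook formulas only, not the paper's elastic-plastic, two-sphere contact solution.

```python
def lame_sphere_stresses(p, a, b, r):
    """Radial and hoop stress in an elastic thick-walled sphere with
    inner radius a, outer radius b, under internal pressure p
    (classical Lame solution).  The paper's benchmark extends this to
    an elastic-plastic inner sphere closing a gap against an outer
    elastic sphere; this sketch covers only the purely elastic case."""
    k = p * a ** 3 / (b ** 3 - a ** 3)
    sigma_r = k * (1.0 - b ** 3 / r ** 3)          # radial stress
    sigma_t = k * (1.0 + b ** 3 / (2.0 * r ** 3))  # hoop stress
    return sigma_r, sigma_t
```

    The radial stress equals −p at the bore and vanishes at the free outer surface, a quick sanity check for any FEM slideline model before the gap closes.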

  7. Using harmonic analysis for experimental verification of reactor dynamics

    International Nuclear Information System (INIS)

    Hrstka, V.

    1974-01-01

    The accuracy of the static-programming method as applied to digital harmonic analysis is discussed with regard to variation of the mean value of the analyzed signals and to the use of symmetrical trapezoidal periodic signals. The suitability of this method for determining the frequency characteristic of the SR-OA reactor is evaluated. The results obtained were applied to planning the start-up experiments of the KS-150 reactor at the A-1 nuclear power station. (author)

  8. Dynamics of railway bridges, analysis and verification by field tests

    Directory of Open Access Journals (Sweden)

    Andersson Andreas

    2015-01-01

    Full Text Available The following paper discusses different aspects of railway bridge dynamics, comprising analysis, modelling procedures and experimental testing. The importance of realistic models is discussed, especially regarding boundary conditions, load distribution and soil-structure interaction. Two theoretical case studies are presented, involving both deterministic and probabilistic assessment of a large number of railway bridges using simplified and computationally efficient models. Four experimental case studies are also introduced, illustrating different aspects and phenomena in bridge dynamics. The excitation consists of ambient vibrations, train-induced vibrations, free vibrations after train passages and controlled forced excitation.

  9. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements showed a very substantial benefit in coding efficiency for the SCC extensions and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60–90% relative to the JM and 40–80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
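
    Bjøntegaard-delta bit-rate savings of the kind quoted above come from averaging the gap between two rate-distortion curves over their common quality range. The sketch below is a simplified piecewise-linear variant; the reference method instead fits a cubic polynomial through the (PSNR, log-rate) points before integrating.

```python
import math

def bd_rate(anchor, tested):
    """Approximate Bjontegaard-delta bit-rate difference (%) between two
    rate-distortion curves, each a list of (bitrate, psnr_db) points.
    Piecewise-linear sketch of the method: average the log-rate gap
    over the overlapping PSNR range and convert back to a percentage.
    A negative result means the tested codec saves bit rate."""
    def log_rate_at(curve, q):
        pts = sorted((p, math.log(r)) for r, p in curve)
        for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
            if p0 <= q <= p1:
                t = (q - p0) / (p1 - p0)
                return l0 + t * (l1 - l0)
        raise ValueError("quality outside curve range")

    lo = max(min(p for _, p in anchor), min(p for _, p in tested))
    hi = min(max(p for _, p in anchor), max(p for _, p in tested))
    n = 100
    total = 0.0
    for i in range(n + 1):                      # trapezoidal average
        q = lo + (hi - lo) * i / n
        w = 0.5 if i in (0, n) else 1.0
        total += w * (log_rate_at(tested, q) - log_rate_at(anchor, q))
    return (math.exp(total / n) - 1.0) * 100.0
```

    A curve that reaches every PSNR level at half the anchor's bit rate yields −50%, the kind of figure reported for the SCC extensions on TGM content.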

  10. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed, along with an appropriate equation of state. The first phase of the VENUS-II/PAD comparison study was completed with the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation of state in VENUS-II. Evaluation and check-out of this new model are currently in progress.
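
    A P-V-work curve of the kind described is essentially the cumulative expansion work W = ∫p dV evaluated at increasing degrees of core expansion. The sketch below is a generic numerical illustration using the trapezoidal rule, not the VENUS-II equation-of-state model.

```python
def pv_work(pressures, volumes):
    """Expansion work W = integral of p dV along a tabulated P-V curve,
    via the trapezoidal rule.  A generic sketch of how a P-V-work value
    versus degree of core expansion could be tabulated; illustrative
    only, not the VENUS-II model."""
    w = 0.0
    for i in range(len(volumes) - 1):
        w += 0.5 * (pressures[i] + pressures[i + 1]) * (volumes[i + 1] - volumes[i])
    return w
```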

  11. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature, the textual meta-data and the numerical spectral data, to arrive at a final qualitative assessment. Results associated with the spectral data stored in the Signature Database1 (SigDB) are presented. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to extract local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security2 (text encryption/decryption), biomedical3, and marketing4 applications. The text mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
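
    The spectral-angle-mapper comparison at the heart of the numerical validation reduces to the angle between two spectra viewed as vectors; a minimal sketch (not the SigDB implementation) is:

```python
import math

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): the angle (radians) between two
    spectra treated as vectors; 0 means identical spectral shape,
    larger values mean greater dissimilarity.  A minimal sketch of the
    comparison step only."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # clamp guards against floating-point overshoot outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
```

    Because SAM depends only on direction, a spectrum and any scaled copy of it map to angle 0, which is why it suits ranking a test spectrum against the mean spectrum of a population set regardless of overall intensity.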

  12. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. It enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  13. Design, analysis, and test verification of advanced encapsulation systems

    Science.gov (United States)

    Garcia, A.; Minning, C.

    1981-01-01

    Thermal, optical, structural, and electrical isolation analyses are described. Major factors in the design of terrestrial photovoltaic modules are discussed. Mechanical defects in the different layers of an encapsulation system, it was found, would strongly influence the minimum pottant thickness required for electrical isolation. Structural, optical, and electrical properties, a literature survey indicated, are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  14. Design, analysis, and test verification of advanced encapsulation systems

    Science.gov (United States)

    Garcia, A.; Minning, C.

    1981-11-01

    Thermal, optical, structural, and electrical isolation analyses are described. Major factors in the design of terrestrial photovoltaic modules are discussed. Mechanical defects in the different layers of an encapsulation system, it was found, would strongly influence the minimum pottant thickness required for electrical isolation. Structural, optical, and electrical properties, a literature survey indicated, are heavily influenced by the presence of moisture. These items, identified as technology voids, are discussed. Analyses were based upon a 1.2 meter square module using 10.2 cm (4-inch) square cells placed 1.3 mm apart as shown in Figure 2-2. Sizing of the structural support member of a module was determined for a uniform, normal pressure load of 50 psf, corresponding to the pressure difference generated between the front and back surfaces of a module by a 100 mph wind. Thermal and optical calculations were performed for a wind velocity of 1 meter/sec parallel to the ground and a module tilt (relative to the local horizontal) of 37 deg. Placement of a module in a typical array field is illustrated.

  15. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Directory of Open Access Journals (Sweden)

    Natalia Cid

    2017-08-01

    Full Text Available Climate change and fossil fuel depletion foster interest in improving energy efficiency in buildings. There are different methods to achieve improved efficiency; one of them is the use of additives, such as phase change materials (PCMs). To prove this method's effectiveness, a building's behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings, based on Supervisory Control and Data Acquisition (SCADA) and with a 1-wire bus network as the communication system. The system is empirically tested to prove that it works properly. For this purpose, two experimental cubicles were made of self-compacting concrete panels, one of which has a PCM as an additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, yielding results that illustrate differences between the cubicles directly related to the PCM's characteristics. Data collection devices included in the system were temperature sensors, some of which were embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station and energy counters. The analysis of the results shows agreement with previous studies of PCM addition; therefore, the acquisition system is suitable for this application.

  16. Verification and benchmarking of PORFLO: an equivalent porous continuum code for repository scale analysis

    International Nuclear Information System (INIS)

    Eyler, L.L.; Budden, M.J.

    1984-11-01

    The objective of this work was to perform an assessment of prediction capabilities and features of the PORFLO code in relation to its intended use in the Basalt Waste Isolation Project. This objective was to be accomplished through a code verification and benchmarking task. Results were to be documented which either support correctness of prediction capabilities or identify areas of intended application in which the code exhibits weaknesses. A test problem set consisting of 10 problems was developed. Results of PORFLO simulations of these problems were provided for use in this work. The 10 problems were designed to test the three basic computational capabilities or categories of the code. Broken down by physical process, these are heat transfer, fluid flow, and radionuclide transport. Two verification problems were included within each of these categories. They were problems designed to test basic features of PORFLO for which analytical solutions are available for use as a known comparison basis. Hence they are referred to as verification problems. Of the remaining four problems, one repository scale problem representative of intended PORFLO use within BWIP was included in each of the three basic capabilities categories. The remaining problem was a case specifically designed to test features of decay and retardation in radionuclide transport. These four problems are referred to as benchmarking problems, because results computed with an additional computer code were used as a basis for comparison. 38 figures
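
    A verification problem of the kind described pairs a numerical solver with a known analytical solution. The sketch below is a hypothetical 1-D transient-conduction case, not one of the ten PORFLO problems: it checks an explicit finite-difference solution against the erfc solution for a semi-infinite solid whose surface temperature is stepped at t = 0.

```python
import math

def analytic_temp(x, t, alpha):
    """Semi-infinite solid, surface held at T=1 from t=0:
    T(x, t) = erfc(x / (2*sqrt(alpha*t)))."""
    return math.erfc(x / (2.0 * math.sqrt(alpha * t)))

def ftcs_temp(nx, dx, alpha, dt, steps):
    """Explicit (FTCS) finite-difference solution of the 1-D heat
    equation with the same boundary conditions; the stability limit
    r = alpha*dt/dx**2 <= 0.5 must hold."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable"
    T = [0.0] * nx
    T[0] = 1.0                      # stepped surface temperature
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        T = Tn
    return T
```

    Comparing node values against the analytic profile at the same positions and time gives the discretization error, which is exactly the quantity a verification problem reports.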

  17. Initial Clinical Experience Performing Patient Treatment Verification With an Electronic Portal Imaging Device Transit Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Sean L., E-mail: BerryS@MSKCC.org [Department of Applied Physics and Applied Mathematics, Columbia University, New York, New York (United States); Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Polvorosa, Cynthia; Cheng, Simon; Deutsch, Israel; Chao, K. S. Clifford; Wuu, Cheng-Shie [Department of Radiation Oncology, Columbia University, New York, New York (United States)

    2014-01-01

    Purpose: To prospectively evaluate a 2-dimensional transit dosimetry algorithm's performance on a patient population and to analyze the issues that would arise in a widespread clinical adoption of transit electronic portal imaging device (EPID) dosimetry. Methods and Materials: Eleven patients were enrolled on the protocol; 9 completed and were analyzed. Pretreatment intensity modulated radiation therapy (IMRT) patient-specific quality assurance was performed using a stringent local 3%, 3-mm γ criterion to verify that the planned fluence had been appropriately transferred to and delivered by the linear accelerator. Transit dosimetric EPID images were then acquired during treatment and compared offline with predicted transit images using a global 5%, 3-mm γ criterion. Results: There were 288 transit images analyzed. The overall γ pass rate was 89.1% ± 9.8% (average ± 1 SD). For the subset of images for which the linear accelerator couch did not interfere with the measurement, the γ pass rate was 95.7% ± 2.4%. A case study is presented in which the transit dosimetry algorithm was able to identify that a lung patient's bilateral pleural effusion had resolved in the time between the planning CT scan and the treatment. Conclusions: The EPID transit dosimetry algorithm under consideration, previously described and verified in a phantom study, is feasible for use in treatment delivery verification for real patients. Two-dimensional EPID transit dosimetry can play an important role in indicating when a treatment delivery is inconsistent with the original plan.

  18. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion and release of aerosols and radioactive materials. As most of these phenomena are linked together, their feedback has to be taken into account within the calculation of severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified and applied at GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code were proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena was checked. The calculation of integrated containment experiments involving natural-convection flow, structure heating and coolant condensation, as well as parallel calculations of results obtained with another code, gives detailed information on the applicability of CONTAIN. The current verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results with the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  19. Poster - 43: Analysis of SBRT and SRS dose verification results using the Octavius 1000SRS detector

    Energy Technology Data Exchange (ETDEWEB)

    Cherpak, Amanda [Nova Scotia Cancer Centre, Nova Scotia Health Authority, Halifax, NS, Department of Radiation Oncology, Dalhousie University, Halifax, NS, Department of Physics and Atmospheric Sciences, Dalhousie University, Halifax, NS (Canada)

    2016-08-15

    Purpose: The Octavius 1000SRS detector was commissioned in December 2014 and is used routinely for verification of all SRS and SBRT plans. Results of these verifications were analyzed to assess trends and limitations of the device and planning methods. Methods: Plans were delivered using a True Beam STx and results were evaluated using gamma analysis (95%, 3%/3mm) and absolute dose difference (5%). Verification results were analyzed with respect to several plan parameters, including tumour volume, degree of modulation and prescribed dose. Results: During a 12 month period, a total of 124 patient plans were verified using the Octavius detector. Thirteen plans failed the gamma criteria, while 7 plans failed based on the absolute dose difference. When binned according to degree of modulation, a significant correlation was found between MU/cGy and both mean dose difference (r=0.78, p<0.05) and gamma (r=−0.60, p<0.05). When data were binned according to tumour volume, the standard deviation of the average gamma dropped from 2.2%–3.7% for volumes less than 30 cm³ to below 1% for volumes greater than 30 cm³. Conclusions: The majority of plans and verification failures involved tumour volumes smaller than 30 cm³. This was expected, given the nature of disease treated with SBRT and SRS techniques, and did not increase the rate of failure. The correlations found with MU/cGy indicate that results deteriorated as modulation increased, but not beyond the previously set thresholds.
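
    The γ analysis used for these verifications combines a dose-difference (DD) criterion with a distance-to-agreement (DTA) criterion. The sketch below is a minimal 1-D global-gamma illustration of that idea only; the clinical software uses a full 2-D/3-D implementation.

```python
import math

def gamma_1d(ref, meas, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Global gamma index (per Low et al.) for two equally spaced 1-D
    dose profiles: each reference point takes the minimum over measured
    points of sqrt((dist/DTA)**2 + (dose_diff/DD)**2), with DD taken as
    a percentage of the reference maximum (global normalization)."""
    dd = dd_pct / 100.0 * max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(meas):
            dist = abs(i - j) * spacing_mm
            best = min(best, (dist / dta_mm) ** 2 + ((dm - dr) / dd) ** 2)
        gammas.append(math.sqrt(best))
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the '95%' in '95%, 3%/3mm')."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```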

  20. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection and test, that are applicable to satisfying the verification requirements.

  1. PIPE STRESS and VERPIP codes for stress analysis and verifications of PEC reactor piping

    International Nuclear Information System (INIS)

    Cesari, F.; Ferranti, P.; Gasparrini, M.; Labanti, L.

    1975-01-01

    To design LMFBR piping systems to ASME Sect. III requirements, unusually flexible computer codes must be adopted that consider both the piping and its guard tube. For this purpose the PIPE STRESS code, previously prepared by Southern-Service, has been modified. Subroutines for detailed stress analysis and principal-stress calculation on all sections of the piping have been written and fitted into the code. A plotter can also be used. The VERPIP code, for automatic verification of piping against Class 1 Sect. III prescriptions, has also been prepared. The results of applying the PIPE STRESS and VERPIP codes to PEC piping are given in section III of this report.

  2. Improvement and verification of fast-reactor safety-analysis techniques. Final report

    International Nuclear Information System (INIS)

    Barker, D.H.

    1981-12-01

    The work on this project took place between March 1, 1975 and December 31, 1981, and resulted in two PhD theses and one Master's thesis. Part I comprised the verification and applicability studies for the VENUS-II LMFBR disassembly code; these tests showed that the VENUS-II code closely predicted the energy release in all three tests chosen for analysis. Part II involved the chemical simulation of pool dispersion in the transition phase of an HCDA. Part III involved the reaction of an internally heated fluid with the vessel walls.

  3. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
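
    The flavor of the queueing results such a book builds on can be seen in the simplest case, the M/M/1 queue, whose mean performance measures follow directly from the utilization ρ = λ/μ:

```python
def mm1_metrics(lam, mu):
    """Mean performance of an M/M/1 queue (Poisson arrivals at rate lam,
    exponential service at rate mu): returns utilization rho, mean
    number in system L, and mean sojourn time W.  Little's law ties
    them together: L = lam * W."""
    if lam >= mu:
        raise ValueError("unstable queue: need lam < mu")
    rho = lam / mu
    L = rho / (1.0 - rho)    # mean number in system
    W = 1.0 / (mu - lam)     # mean time in system
    return rho, L, W
```

    Formulas like these underpin dimensioning rules: as ρ approaches 1, both L and W blow up, which quantifies how statistical resource sharing degrades Quality of Service.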

  4. USING PERFLUOROCARBON TRACERS FOR VERIFICATION OF CAP AND COVER SYSTEMS PERFORMANCE

    International Nuclear Information System (INIS)

    HEISER, J.; SULLIVAN, T.

    2001-01-01

    The Department of Energy (DOE) Environmental Management (EM) office has committed itself to an accelerated cleanup of its national facilities. The goal is to have much of the DOE legacy waste sites remediated by 2006. This includes closure of several sites (e.g., Rocky Flats and Fernald). With the increased focus on accelerated cleanup, there has been considerable concern about long-term stewardship issues in general, and verification and long-term monitoring (LTM) of caps and covers, in particular. Cap and cover systems (covers) are vital remedial options that will be extensively used in meeting these 2006 cleanup goals. Every buried waste site within the DOE complex will require some form of cover system. These covers are expected to last from 100 to 1000 years or more. The stakeholders can be expected to focus on system durability and sustained performance. DOE EM has set up a national committee of experts to develop a long-term capping (LTC) guidance document. Covers are subject to subsidence, erosion, desiccation, animal intrusion, plant root infiltration, etc., all of which will affect the overall performance of the cover. Very little is available in terms of long-term monitoring other than downstream groundwater or surface water monitoring. By its very nature, this can only indicate that failure of the cover system has already occurred and contaminants have been transported away from the site. This is unacceptable. Methods that indicate early cover failure (prior to contaminant release) or predict approaching cover failure are needed. The LTC committee has identified predictive monitoring technologies as a high priority need for DOE, both for new covers as well as existing covers. The same committee identified a Brookhaven National Laboratory (BNL) technology as one approach that may be capable of meeting the requirements for LTM. The Environmental Research and Technology Division (ERTD) at BNL developed a novel methodology for verifying and monitoring

  5. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    International Nuclear Information System (INIS)

    Maruyama, Soh; Fujimoto, Nozomu; Sudo, Yukio; Kiso, Yoshihiro; Murakami, Tomoyuki.

    1988-09-01

    This report presents the verification results for FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for analyzing the flow distribution among fuel-block coolant channels, determining the thermal boundary conditions for fuel-block stress analysis, and estimating fuel temperature in a fuel-block coolant-channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR to establish basic technologies for future advanced very-high-temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature frontier technologies. The code was verified by comparing analytical results with experimental results from the Helium Engineering Demonstration Loop multi-channel test section (HENDEL T1-M) with simulated fuel rods and fuel blocks. (author)

  6. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    Science.gov (United States)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results for FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for analyzing the flow distribution among fuel-block coolant channels, determining the thermal boundary conditions for fuel-block stress analysis, and estimating fuel temperature in a fuel-block coolant-channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR to establish basic technologies for future advanced very-high-temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature frontier technologies. The code was verified by comparing analytical results with experimental results from the Helium Engineering Demonstration Loop multi-channel test section (HENDEL T1-M) with simulated fuel rods and fuel blocks.

  7. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  8. Verification test for on-line diagnosis algorithm based on noise analysis

    International Nuclear Information System (INIS)

    Tamaoki, T.; Naito, N.; Tsunoda, T.; Sato, M.; Kameda, A.

    1980-01-01

    An on-line diagnosis algorithm was developed and its verification test was performed using a minicomputer. This algorithm identifies the plant state by analyzing various system noise patterns, such as power spectral densities, coherence functions etc., in three procedure steps. Each obtained noise pattern is examined by using the distances from its reference patterns prepared for various plant states. Then, the plant state is identified by synthesizing each result with an evaluation weight. This weight is determined automatically from the reference noise patterns prior to on-line diagnosis. The test was performed with 50 MW (th) Steam Generator noise data recorded under various controller parameter values. The algorithm performance was evaluated based on a newly devised index. The results obtained with one kind of weight showed the algorithm efficiency under the proper selection of noise patterns. Results for another kind of weight showed the robustness of the algorithm to this selection. (orig.)
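    The pattern-distance step described in this record, scoring each candidate plant state by weighted distances between observed and reference noise patterns, can be sketched as follows. The function and data names are illustrative and are not from the original minicomputer implementation:

    ```python
    import numpy as np

    def diagnose(observed, references, weights):
        """Identify the plant state whose reference noise patterns lie
        closest to the observed patterns, combining pattern-wise distances
        with per-pattern evaluation weights.

        observed   : dict  pattern_name -> 1-D array (e.g. a PSD estimate)
        references : dict  state_name -> {pattern_name -> 1-D array}
        weights    : dict  pattern_name -> float (derived offline from the
                     reference patterns themselves)
        """
        scores = {}
        for state, ref_patterns in references.items():
            # Weighted sum of Euclidean distances over all noise patterns.
            scores[state] = sum(
                weights[name] * np.linalg.norm(observed[name] - ref)
                for name, ref in ref_patterns.items()
            )
        # The diagnosed state minimizes the weighted distance.
        return min(scores, key=scores.get)
    ```

    In the paper the weights are computed automatically from the reference patterns before on-line use; here they are simply passed in.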

  9. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  10. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF_6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF_6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF_6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF_6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.
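    A much-reduced illustration of the Monte Carlo step is estimating an intrinsic detection efficiency for a one-dimensional slab and checking it against the analytic 1 - exp(-Σt). The geometry and cross section below are invented for the sketch and are unrelated to the actual HEVA model:

    ```python
    import math
    import random

    def intrinsic_efficiency(sigma_total, thickness_cm, n_samples=100_000, seed=1):
        """Toy Monte Carlo estimate of a slab detector's intrinsic efficiency
        for normally incident neutrons: the fraction whose first interaction
        falls inside the slab. sigma_total is the macroscopic total cross
        section in 1/cm."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_samples):
            # Free path sampled from an exponential distribution; 1 - u avoids log(0).
            path = -math.log(1.0 - rng.random()) / sigma_total
            if path < thickness_cm:
                hits += 1
        return hits / n_samples
    ```

    With sigma_total = 0.5/cm and a 2 cm slab, the estimate converges on the analytic value 1 - exp(-1) ≈ 0.632.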

  11. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF_6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF_6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF_6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF_6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.

  12. Verification and implications of the multiple pin treatment in the SASSYS-1 LMR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1994-01-01

    As part of a program to obtain realistic, as opposed to excessively conservative, analysis of reactor transients, a multiple pin treatment for the analysis of intra-subassembly thermal hydraulics has been included in the SASSYS-1 liquid metal reactor systems analysis code. This new treatment has made possible a whole new level of verification for the code. The code can now predict the steady-state and transient responses of individual thermocouples within instrumented subassemblies in a reactor, rather than just predicting average temperatures for a subassembly. Very good agreement has been achieved between code predictions and the experimental measurements of steady-state and transient temperatures and flow rates in the Shutdown Heat Removal Tests in the EBR-II Reactor. Detailed multiple pin calculations for blanket subassemblies in the EBR-II reactor demonstrate that the actual steady-state and transient peak temperatures in these subassemblies are significantly lower than those that would be calculated by simpler models.

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); Evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and Develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  14. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  15. Performance analysis in saber.

    Science.gov (United States)

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women, which uses 3 weapons: foil, épée, and saber. In general, there are few scientific studies available in international literature; they are limited to the performance analysis of fencing bouts, yet there is nothing about saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and "A." The aim of this study was to carry out a saber performance analysis to gain useful indicators for the definition of a performance model. In addition, it is expected to verify if it could be influenced by the type of competition and if there are differences between men and women. Sixty bouts: 33 FIE GP and 27 "A" competitions (35 men's and 25 women's saber bouts) were analyzed. The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is mostly used (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women, and the ratio between the action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change is carried out every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 different kinds of competitions.
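    The timing ratios quoted in this record (effective fighting time, action-to-break ratio) come from simple notational bookkeeping over the bout video. A minimal sketch of that bookkeeping, with an invented interval format, might look like:

    ```python
    def bout_statistics(actions, bout_duration):
        """Summarize a fencing bout from a list of (start, end) action
        intervals in seconds. Returns the effective fighting time as a
        fraction of the bout and the action-to-break time ratio, two of
        the indicators used in notational match analysis."""
        fight_time = sum(end - start for start, end in actions)
        break_time = bout_duration - fight_time
        return {
            "effective_fraction": fight_time / bout_duration,
            "action_to_break": fight_time / break_time if break_time else float("inf"),
        }
    ```

    For example, two 2-second actions in a 20-second bout give an effective fraction of 0.2 and an action-to-break ratio of 1:4.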

  16. Experimental verification for standard analysis procedure of 241Am in food

    International Nuclear Information System (INIS)

    Liu Qingfen; Zhu Hongda; Liu Shutian; Pan Jingshun; Yang Dating

    2005-01-01

    Objective: A brief experimental verification of 'Determination of 241Am in food' is described. Methods: The overall recovery, the minimum detectable level (MDL) of the method, and a decontamination experiment were determined following the standard analysis procedure. Results: The overall recovery is 76.26 ± 4.1%. The MDL is 3.4 x 10^-5 Bq/g ash; the decontamination factor is higher than 10^3 for Po, 10^2 for U, Th, and Pu, and 60 for 237Np. Conclusion: The results showed that the overall recovery is high and reliable, and the MDL of the method is sufficient for checking 241Am against the limit values for foods. The decontamination factors of the recommended procedure are adequate for the analysis of 241Am in food examination. The verification results of the procedure, obtained using a 243Am spike and a 241Am standard reference material, are satisfactory. (authors)

  17. Assessment and Verification of SLS Block 1-B Exploration Upper Stage State and Stage Disposal Performance

    Science.gov (United States)

    Patrick, Sean; Oliver, Emerson

    2018-01-01

    One of the SLS Navigation System's key performance requirements is a constraint on the payload system's delta-v allocation to correct for insertion errors due to vehicle state uncertainty at payload separation. The SLS navigation team has developed a Delta-Delta-V analysis approach to assess the effect on trajectory correction maneuver (TCM) design needed to correct for navigation errors. This approach differs from traditional covariance-analysis-based methods and makes no assumptions with regard to the propagation of the state dynamics, which allows non-linearity in the propagation of state uncertainties to be considered. The Delta-Delta-V analysis approach re-optimizes perturbed SLS mission trajectories by varying key mission states in accordance with an assumed state error. The state error is developed from detailed vehicle 6-DOF Monte Carlo analysis or generated using covariance analysis. These perturbed trajectories are compared to a nominal trajectory to determine the necessary TCM design. To implement this analysis approach, a tool set was developed that combines the functionality of a 3-DOF trajectory optimization tool, Copernicus, and a detailed 6-DOF vehicle simulation tool, Marshall Aerospace Vehicle Representation in C (MAVERIC). In addition to delta-v allocation constraints on SLS navigation performance, SLS mission requirements dictate successful upper stage disposal. Due to engine and propellant constraints, the SLS Exploration Upper Stage (EUS) must dispose into heliocentric space by means of a lunar fly-by maneuver. As with payload delta-v allocation, upper stage disposal maneuvers must place the EUS on a trajectory that maximizes the probability of achieving a heliocentric orbit post lunar fly-by, considering all sources of vehicle state uncertainty prior to the maneuver. To ensure disposal, the SLS navigation team has developed an analysis approach to derive optimal disposal guidance targets. This approach maximizes the state error covariance prior...
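    The Delta-Delta-V idea, re-solving the correction maneuver for each perturbed state and differencing against the nominal, can be outlined as below. Here `correct` is a purely illustrative stand-in for the Copernicus trajectory re-optimization:

    ```python
    import numpy as np

    def delta_delta_v(nominal_state, perturbed_states, correct):
        """For each perturbed insertion state, return the extra TCM delta-v
        relative to the nominal trajectory: |dv(perturbed)| - |dv(nominal)|.
        `correct(state)` must return the delta-v vector that restores the
        target trajectory (a stand-in for trajectory re-optimization)."""
        dv_nominal = np.linalg.norm(correct(nominal_state))
        return [np.linalg.norm(correct(s)) - dv_nominal for s in perturbed_states]
    ```

    A delta-v allocation could then be read off as a high percentile of these differences over a Monte Carlo population of perturbed states.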

  18. The use of the hybrid K-edge densitometer for routine analysis of safeguards verification samples of reprocessing input liquor

    International Nuclear Information System (INIS)

    Ottmar, H.; Eberle, H.

    1991-01-01

    Following successful tests of a hybrid K-edge instrument at TUI Karlsruhe and the routine use of a K-edge densitometer for safeguards verification at the same laboratory, the Euratom Safeguards Directorate of the Commission of the European Communities decided to install the first such instrument into a large industrial reprocessing plant for the routine verification of samples taken from the input accountancy tanks. This paper reports on the installation, calibration, sample handling procedure and the performance of this instrument after one year of routine operation

  19. Verification and validation of the PLTEMP/ANL code for thermal hydraulic analysis of experimental and test reactors

    International Nuclear Information System (INIS)

    Kalimullah, M.; Olson, A.O.; Feldman, E.E.; Hanan, N.; Dionne, B.

    2012-01-01

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  20. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  1. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  2. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
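    The specific convergence function verified in EHDM is described in the paper itself; as a flavor of this class of algorithms, a Welch-Lynch-style fault-tolerant midpoint (not necessarily the exact function the authors verified) can be sketched:

    ```python
    def fault_tolerant_midpoint(readings, f):
        """Fault-tolerant midpoint convergence function: discard the f
        smallest and f largest clock readings, then take the midpoint of
        the surviving extremes. Tolerates up to f Byzantine clocks among
        n >= 3f + 1 participants, because the trimmed extremes are then
        bounded by correct-clock values."""
        if len(readings) < 3 * f + 1:
            raise ValueError("need at least 3f + 1 readings")
        s = sorted(readings)
        trimmed = s[f:len(s) - f] if f else s
        return (trimmed[0] + trimmed[-1]) / 2
    ```

    A single wildly faulty reading (e.g. 1000.0 among readings near 10.1) is discarded by the trimming step and cannot drag the corrected clock value.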

  3. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    International Nuclear Information System (INIS)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook

    2016-01-01

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  4. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  5. The cumulative verification image analysis tool for offline evaluation of portal images

    International Nuclear Information System (INIS)

    Wong, John; Yan Di; Michalski, Jeff; Graham, Mary; Halverson, Karen; Harms, William; Purdy, James

    1995-01-01

    Purpose: Daily portal images acquired using electronic portal imaging devices contain important information about the setup variation of the individual patient. The data can be used to evaluate the treatment and to derive correction for the individual patient. The large volume of images also requires software tools for efficient analysis. This article describes the approach of cumulative verification image analysis (CVIA) specifically designed as an offline tool to extract quantitative information from daily portal images. Methods and Materials: The user interface, image and graphics display, and algorithms of the CVIA tool have been implemented in ANSI C using the X Window graphics standards. The tool consists of three major components: (a) definition of treatment geometry and anatomical information; (b) registration of portal images with a reference image to determine setup variation; and (c) quantitative analysis of all setup variation measurements. The CVIA tool is not automated. User interaction is required and preferred. Successful alignment of anatomies on portal images at present remains mostly dependent on clinical judgment. Predefined templates of block shapes and anatomies are used for image registration to enhance efficiency, taking advantage of the fact that much of the tool's operation is repeated in the analysis of daily portal images. Results: The CVIA tool is portable and has been implemented on workstations with different operating systems. Analysis of 20 sequential daily portal images can be completed in less than 1 h. The temporal information is used to characterize setup variation in terms of its systematic, random and time-dependent components. The cumulative information is used to derive block overlap isofrequency distributions (BOIDs), which quantify the effective coverage of the prescribed treatment area throughout the course of treatment. Finally, a set of software utilities is available to facilitate feedback of the information for
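    For a single patient and measurement axis, the decomposition of setup variation into systematic and random components mentioned in this record reduces to simple statistics over the daily offsets. A minimal sketch (the function name is invented):

    ```python
    import numpy as np

    def setup_variation(daily_offsets):
        """Decompose a patient's daily setup offsets (e.g. mm along one
        axis, one value per fraction) into a systematic component (the
        mean offset over the course) and a random component (the sample
        SD of the day-to-day scatter about that mean)."""
        x = np.asarray(daily_offsets, dtype=float)
        return {"systematic": x.mean(), "random": x.std(ddof=1)}
    ```

    Population-level summaries then aggregate these per-patient values; time-dependent trends would need the fraction-by-fraction sequence rather than these two scalars.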

  6. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on ray tracing algorithm, statistical analysis, test on real time system operation, and other technical evaluation process...

  7. Assessment and Verification of SLS Block 1-B Exploration Upper Stage and Stage Disposal Performance

    Science.gov (United States)

    Patrick, Sean; Oliver, T. Emerson; Anzalone, Evan J.

    2018-01-01

    Delta-v allocation to correct for insertion errors caused by state uncertainty is one of the key performance requirements imposed on the SLS Navigation System. Additionally, SLS mission requirements include the need for the Exploration Up-per Stage (EUS) to be disposed of successfully. To assess these requirements, the SLS navigation team has developed and implemented a series of analysis methods. Here the authors detail the Delta-Delta-V approach to assessing delta-v allocation as well as the EUS disposal optimization approach.

  8. Modeling and experimental verification of proof mass effects on vibration energy harvester performance

    International Nuclear Information System (INIS)

    Kim, Miso; Hoegen, Mathias; Dugundji, John; Wardle, Brian L

    2010-01-01

    An electromechanically coupled model for a cantilevered piezoelectric energy harvester with a proof mass is presented. Proof masses are essential in microscale devices to move device resonances towards optimal frequency points for harvesting. Such devices with proof masses have not been rigorously modeled previously; instead, lumped mass or concentrated point masses at arbitrary points on the beam have been used. Thus, this work focuses on the exact vibration analysis of cantilevered energy harvester devices including a tip proof mass. The model is based not only on a detailed modal analysis, but also on a thorough investigation of damping ratios that can significantly affect device performance. A model with multiple degrees of freedom is developed and then reduced to a single-mode model, yielding convenient closed-form normalized predictions of device performance. In order to verify the analytical model, experimental tests are undertaken on a macroscale, symmetric, bimorph, piezoelectric energy harvester with proof masses of different geometries. The model accurately captures all aspects of the measured response, including the location of peak-power operating points at resonance and anti-resonance, and trends such as the dependence of the maximal power harvested on the frequency. It is observed that even a small change in proof mass geometry results in a substantial change of device performance due not only to the frequency shift, but also to the effect on the strain distribution along the device length. Future work will include the optimal design of devices for various applications, and quantification of the importance of nonlinearities (structural and piezoelectric coupling) for device performance
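    The central role of the proof mass, shifting the device resonance toward a target harvesting frequency, is visible even in the lumped single-mode estimate f = (1/2π)√(k/(m_eff + m_tip)); the full model in the paper goes well beyond this. The numbers below are illustrative only:

    ```python
    import math

    def cantilever_resonance_hz(k_eff, m_beam_eff, m_tip):
        """Single-mode estimate of a cantilevered harvester's resonance:
        f = (1/(2*pi)) * sqrt(k_eff / (m_beam_eff + m_tip)).
        Adding tip proof mass m_tip lowers the resonance, which is how
        microscale devices are tuned toward the excitation frequency."""
        return math.sqrt(k_eff / (m_beam_eff + m_tip)) / (2.0 * math.pi)
    ```

    Quadrupling the total modal mass halves the resonance frequency, so even small proof-mass changes move the peak-power operating point noticeably.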

  9. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  10. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency were measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  11. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS) combining real time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. (1) A range of satellite data products and surface observations used to generate the land analysis products (2) Global, 1/4 deg spatial resolution (3) Model analysis generated at 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to operational benchmarking needs of AFWA's land characterization activities.
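
    Toolkits such as LVT and MET benchmark land analyses against observations using standard verification statistics. The sketch below computes a few of the usual metrics (bias, RMSE, unbiased RMSE, correlation) on synthetic soil-moisture data; the arrays are stand-ins for real LIS output and observation files, and the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for a gridded soil-moisture analysis and matching
# observations (the real toolkits ingest LIS output and observation files).
obs = rng.uniform(0.1, 0.4, size=10_000)             # volumetric soil moisture
model = obs + rng.normal(0.01, 0.03, size=obs.size)  # analysis: bias + noise

bias = np.mean(model - obs)
rmse = np.sqrt(np.mean((model - obs) ** 2))
ubrmse = np.sqrt(rmse**2 - bias**2)                  # unbiased RMSE
corr = np.corrcoef(model, obs)[0, 1]

print(f"bias={bias:.4f}  RMSE={rmse:.4f}  ubRMSE={ubrmse:.4f}  r={corr:.3f}")
```

    Separating the bias from the random error (ubRMSE) is the usual first step in benchmarking a land analysis, since a constant offset and scatter call for different model fixes.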

  12. DAS performance analysis

    International Nuclear Information System (INIS)

    Bates, G.; Bodine, S.; Carroll, T.; Keller, M.

    1984-02-01

    This report begins with an overview of the Data Acquisition System (DAS), which supports several of PPPL's experimental devices. Performance measurements which were taken on DAS and the tools used to make them are then described.

  13. Front-End Electronics for Verification Measurements: Performance Evaluation and Viability of Advanced Tamper Indicating Measures

    International Nuclear Information System (INIS)

    Smith, E.; Conrad, R.; Morris, S.; Ramuhalli, P.; Sheen, D.; Schanfein, M.; Ianakiev, K.; Browne, M.; Svoboda, J.

    2015-01-01

    The International Atomic Energy Agency (IAEA) continues to expand its use of unattended, remotely monitored measurement systems. An increasing number of systems and an expanding family of instruments create challenges in terms of deployment efficiency and the implementation of data authentication measures. A collaboration between Pacific Northwest National Laboratory (PNNL), Idaho National Laboratory (INL), and Los Alamos National Laboratory (LANL) is working to advance the IAEA's capabilities in these areas. The first objective of the project is to perform a comprehensive evaluation of a prototype front-end electronics package, as specified by the IAEA and procured from a commercial vendor. This evaluation begins with an assessment against the IAEA's original technical specifications and expands to consider the strengths and limitations over a broad range of important parameters that include: sensor types, cable types, and the spectrum of industrial electromagnetic noise that can degrade signals from remotely located detectors. A second objective of the collaboration is to explore advanced tamper-indicating (TI) measures that could help to address some of the long-standing data authentication challenges with IAEA's unattended systems. The collaboration has defined high-priority tampering scenarios to consider (e.g., replacement of sensor, intrusion into cable), and drafted preliminary requirements for advanced TI measures. The collaborators are performing independent TI investigations of different candidate approaches: active time-domain reflectometry (PNNL), passive noise analysis (INL), and pulse-by-pulse analysis and correction (LANL). The initial investigations focus on scenarios where new TI measures are retrofitted into existing IAEA UMS deployments; subsequent work will consider the integration of advanced TI methods into new IAEA UMS deployments where the detector is separated from the front-end electronics. In this paper, project progress

  14. Performance characteristics of an independent dose verification program for helical tomotherapy

    Directory of Open Access Journals (Sweden)

    Isaac C. F. Chang

    2017-01-01

    Helical tomotherapy, with its advanced method of intensity-modulated radiation therapy delivery, has been used clinically for over 20 years. The standard delivery quality assurance procedure to measure the accuracy of delivered radiation dose from each treatment plan to a phantom is time-consuming. RadCalc®, a radiotherapy dose verification software package, has released a module for tomotherapy plan dose calculations specifically for beta testing. RadCalc®'s accuracy for tomotherapy dose calculations was evaluated through examination of point doses in ten lung and ten prostate clinical plans. Doses calculated by the TomoHDA™ tomotherapy treatment planning system were used as the baseline. For lung cases, RadCalc® overestimated point doses in the lung by an average of 13%. Doses within the spinal cord and esophagus were overestimated by 10%. Prostate plans showed better agreement, with overestimations of 6% in the prostate, bladder, and rectum. The systematic overestimation likely resulted from limitations of the pencil beam dose calculation algorithm implemented by RadCalc®. Limitations were more severe in areas of greater inhomogeneity and less prominent in regions of homogeneity with densities closer to 1 g/cm³. Recommendations for RadCalc® dose calculation algorithms and anatomical representation were provided based on the results of the study.
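
    Independent point-dose verification of this kind reduces to comparing the secondary calculation against the planning-system baseline and flagging differences outside a tolerance. The sketch below uses hypothetical dose values chosen only to mirror the reported overestimates; the 5% tolerance is an assumed, site-dependent criterion, not one stated in the study.

```python
# Hypothetical point doses [Gy]: TPS baseline vs independent recalculation.
points = {
    "lung":        (2.00, 2.26),   # ~13% overestimate, as reported for lung
    "spinal cord": (1.10, 1.21),   # ~10% overestimate
    "prostate":    (2.00, 2.12),   # ~6% overestimate
}

TOLERANCE = 5.0  # percent agreement criterion (assumed, site-dependent)

results = {}
for site, (tps, indep) in points.items():
    diff = 100.0 * (indep - tps) / tps   # signed percent difference
    results[site] = diff
    flag = "OK" if abs(diff) <= TOLERANCE else "INVESTIGATE"
    print(f"{site:12s} TPS={tps:.2f} Gy  indep={indep:.2f} Gy  "
          f"diff={diff:+.1f}%  {flag}")
```

    With these illustrative numbers every site exceeds the tolerance, which is consistent with the abstract's conclusion that the pencil-beam secondary check systematically overestimates dose in inhomogeneous regions.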

  15. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  16. UR10 Performance Analysis

    DEFF Research Database (Denmark)

    Ravn, Ole; Andersen, Nils Axel; Andersen, Thomas Timm

    While working with the UR-10 robot arm, it has become apparent that some commands have undesired behaviour when operating the robot arm through a socket connection, sending one command at a time. This report is a collection of the results obtained when testing the performance of the different

  17. Comparative Analysis of Speech Parameters for the Design of Speaker Verification Systems

    National Research Council Canada - National Science Library

    Souza, A

    2001-01-01

    Speaker verification systems are basically composed of three stages: feature extraction, feature processing and comparison of the modified features from speaker voice and from the voice that should be...

  18. Halal assurance in food supply chains: Verification of halal certificates using audits and laboratory analysis

    NARCIS (Netherlands)

    Spiegel, van der M.; Fels-Klerx, van der H.J.; Sterrenburg, P.; Ruth, van S.M.; Scholtens-Toma, I.M.J.; Kok, E.J.

    2012-01-01

    The global halal market is increasing. Worldwide a large number of standardization and certification organizations has been established. This article discusses halal requirements, summarizes applied standards and certification, and explores current verification of halal certificates using audits and

  19. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  20. Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification

    Directory of Open Access Journals (Sweden)

    Zheng Hui-Fan

    2013-01-01

    A theoretical analysis and experimental verification of the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rates, refrigerant mass flow rates, and inner tube diameters of the micro-fin evaporator. The simulation results for the heat transfer coefficient are in fairly good agreement with the experimental data. The results show that the heat capacity and the heat transfer coefficient of the micro-fin evaporator increase with increasing logarithmic mean temperature difference, water mass flow rate, and refrigerant mass flow rate. The heat capacity of the micro-fin evaporator with a diameter of 9.52 mm is higher than that with a diameter of 7.00 mm when using R290 as the refrigerant. The heat capacity of the micro-fin evaporator using R717 as the refrigerant is higher than that using R290. The results of this study can provide useful guidelines for the optimal design and operation of micro-fin evaporators in their present or future applications.
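
    The heat-capacity trends above follow from the basic rating equation Q = U·A·ΔT_lm. The sketch below computes the log-mean temperature difference and the resulting heat capacity for one hypothetical operating point; U, A, and the temperatures are assumed values for illustration, not measurements from the study.

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference between two terminal differences [K]."""
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

# Hypothetical operating point: water cooled from 12 C to 7 C against a
# refrigerant evaporating at a constant 2 C (counter-flow terminal diffs).
dt_lm = lmtd(12.0 - 2.0, 7.0 - 2.0)  # K
U = 1500.0                           # overall HTC [W/(m^2 K)] (assumed)
A = 0.85                             # heat-transfer area [m^2] (assumed)
Q = U * A * dt_lm                    # heat capacity [W]
print(f"LMTD = {dt_lm:.2f} K, heat capacity Q = {Q/1000:.2f} kW")
```

    The monotonic dependence of Q on ΔT_lm in this equation is exactly the trend the abstract reports: heat capacity rises with increasing logarithmic mean temperature difference.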

  1. Waste package performance analysis

    International Nuclear Information System (INIS)

    Lester, D.H.; Stula, R.T.; Kirstein, B.E.

    1982-01-01

    A performance assessment model for multiple barrier packages containing unreprocessed spent fuel has been applied to several package designs. The resulting preliminary assessments were intended for use in making decisions about package development programs. A computer model called BARIER estimates the package life and subsequent rate of release of selected nuclides. The model accounts for temperature, pressure (and resulting stresses), bulk and localized corrosion, and nuclide retardation by the backfill after water intrusion into the waste form. The assessment model assumes a post-closure, flooded, geologic repository. Calculations indicated that, within the bounds of model assumptions, packages could last for several hundred years. Intact backfills of appropriate design may be capable of nuclide release delay times on the order of 10⁷ yr for uranium, plutonium, and americium. 8 references, 6 figures, 9 tables

  2. Verification of computed tomographic estimates of cochlear implant array position: a micro-CT and histologic analysis.

    Science.gov (United States)

    Teymouri, Jessica; Hullar, Timothy E; Holden, Timothy A; Chole, Richard A

    2011-08-01

    To determine the efficacy of clinical computed tomographic (CT) imaging to verify postoperative electrode array placement in cochlear implant (CI) patients. Nine fresh cadaver heads underwent clinical CT scanning, followed by bilateral CI insertion and postoperative clinical CT scanning. Temporal bones were removed, trimmed, and scanned using micro-CT. Specimens were then dehydrated, embedded in either methyl methacrylate or LR White resin, and sectioned with a diamond wafering saw. Histology sections were examined by 3 blinded observers to determine the position of individual electrodes relative to soft tissue structures within the cochlea. Electrodes were judged to be within the scala tympani, scala vestibuli, or in an intermediate position between scalae. The position of the array could be estimated accurately from clinical CT scans in all specimens using micro-CT and histology as a criterion standard. Verification using micro-CT yielded 97% agreement, and histologic analysis revealed 95% agreement with clinical CT results. A composite, 3-dimensional image derived from a patient's preoperative and postoperative CT images using a clinical scanner accurately estimates the position of the electrode array as determined by micro-CT imaging and histologic analyses. Information obtained using the CT method provides valuable insight into numerous variables of interest to patient performance such as surgical technique, array design, and processor programming and troubleshooting.

  3. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  4. Performance analysis of switching systems

    NARCIS (Netherlands)

    Berg, van den R.A.

    2008-01-01

    Performance analysis is an important aspect in the design of dynamic (control) systems. Without a proper analysis of the behavior of a system, it is impossible to guarantee that a certain design satisfies the system’s requirements. For linear time-invariant systems, accurate performance analyses are

  5. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    Analytical models for the evaluation of graphite oxidation were implemented in the THYTAN code, which employs mass balances and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code in terms of the graphite oxidation analysis and its verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reactions, and release from the primary circuit to the containment vessel through a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions from the THYTAN code for simple problems were compared to analytical results from hand calculations to verify the algorithms of each implemented analytical model. A representative graphite oxidation experiment was analyzed using the THYTAN code, and the results were compared to the experimental data and to computed solutions from the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), in regard to the corrosion depth of graphite and the oxygen concentration at the outlet of the test section, to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, experimental data, and GRACE code results showed good agreement. (author)
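
    The mass-balance approach described above can be illustrated on a single node: oxygen in the coolant channel is consumed by a first-order, Arrhenius-type oxidation reaction, and the consumed oxygen is bookkept as burned graphite. All parameter values below are illustrative assumptions, not THYTAN models or HTTR data.

```python
import math

# Single-node sketch of a graphite-oxidation mass balance:
#   C + O2 -> CO2, with rate r = k(T) * c_O2 (first-order kinetics).
k0, Ea = 1.0e4, 1.6e5   # pre-exponential [1/s], activation energy [J/mol] (assumed)
R_GAS = 8.314           # gas constant [J/(mol K)]
T = 1273.0              # node temperature [K]
k = k0 * math.exp(-Ea / (R_GAS * T))   # Arrhenius rate constant [1/s]

c_o2 = 8.0              # oxygen concentration in the node [mol/m^3]
burned = 0.0            # oxidised graphite inventory [mol/m^3]
dt, t_end = 0.1, 600.0  # explicit Euler over a 10-minute transient
t = 0.0
while t < t_end:
    r = k * c_o2        # oxidation rate [mol/(m^3 s)]
    c_o2 -= r * dt      # O2 consumed in the gas mixture
    burned += r * dt    # carbon consumed (1:1 stoichiometry)
    t += dt

print(f"k(T)={k:.3e} 1/s, O2 left={c_o2:.3f}, graphite burned={burned:.3f} mol/m^3")
```

    The conserved sum of remaining oxygen and burned graphite is the kind of node-level balance a V and V exercise checks against hand calculations, as the abstract describes.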

  6. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how to apply verification measures, including remote sensing, off-site environmental sampling and on-site inspections, to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find the most effective verification measure to monitor the status of the reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information, as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of a GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect with satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. In addition, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, an FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  7. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim

    2016-10-01

    The design of the Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed, and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program, with the integral effect test facility STELLA-2, is in the detailed design stage. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance tests, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. A flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core, and a dynamic characteristics test of the upper internal structure has been performed for the seismic analysis model of the PGSFR. Performance tests for control rod assemblies (CRAs) have been conducted for the control rod drive mechanism driving parts, and drop tests of the CRA under scram conditions were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR are explained with significant results.

  8. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    Science.gov (United States)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  9. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    Science.gov (United States)

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, and selected the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%). Few vendors used online age verification services (7.1% in 2013 and 8.5% in 2014) or driving licences (1.8% in 2013 and 7.4% in 2014) to verify age. Stronger regulations of online e-cigarette sales are needed, including strict age and identity verification requirements.
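
    The reported drop in online-only vendors (74.7% to 64.3%) can be checked for statistical significance with a two-proportion z-test, assuming the content-analysis samples (281 vendors in 2013 and 283 in 2014) are the denominators; that assumption, and the test itself, are ours, not the study's stated method.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)   # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = math.erfc(abs(z) / math.sqrt(2))       # two-sided p from normal CDF
    return z, p

# 74.7% of 281 sampled vendors in 2013 vs 64.3% of 283 in 2014.
z, p = two_prop_z(0.747, 281, 0.643, 283)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

    With these sample sizes the difference clears the conventional 0.05 threshold, consistent with the abstract's description of the drop as significant.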

  10. Performance Testing of Homeland Security Technologies in U.S. EPA's Environmental Technology Verification (ETV) Program

    National Research Council Canada - National Science Library

    Kelly, Thomas J; Hofacre, Kent C; Derringer, Tricia L; Riggs, Karen B; Koglin, Eric N

    2004-01-01

    ... (reports and test plans available at www.epa.gov/etv). In the aftermath of the terrorist attacks of September 11, 2001, the ETV approach has also been employed in performance tests of technologies relevant to homeland security (HS...

  11. Identification and verification of critical performance dimensions. Phase 1 of the systematic process redesign of drug distribution.

    Science.gov (United States)

    Colen, Hadewig B; Neef, Cees; Schuring, Roel W

    2003-06-01

    Worldwide patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital also suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. At the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system by using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders, and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish which drug distribution system represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point. This model can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.

  12. Analytical Performance Verification of FCS-MPC Applied to Power Electronic Converters

    DEFF Research Database (Denmark)

    Novak, Mateja; Dragicevic, Tomislav; Blaabjerg, Frede

    2017-01-01

    Since the introduction of finite control set model predictive control (FCS-MPC) in power electronics, the algorithm has been missing an important aspect that would speed up its implementation in industry: a simple method to verify the algorithm's performance. This paper proposes to use a statistical model checking (SMC) method for performance evaluation of the algorithm applied to power electronics converters. SMC is simple to implement, intuitive, and it requires only an operational model of the system that can be simulated and checked against properties. Device under test for control algorithm...
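
    Statistical model checking estimates the probability that a property holds by running many independent simulations and counting successes, with a Chernoff-Hoeffding bound fixing the number of runs for a desired accuracy. The sketch below uses a toy overshoot model in place of a real converter simulation; the property, distribution, and all numbers are assumptions for illustration, not the paper's FCS-MPC model.

```python
import math
import random

random.seed(1)

def simulate_step_response():
    """Toy stand-in for one simulated converter transient (not FCS-MPC)."""
    return random.gauss(6.0, 2.0)   # percent overshoot of one random run

def smc_estimate(prop, n):
    """Monte Carlo estimate of P(property holds) over n simulation runs."""
    return sum(prop(simulate_step_response()) for _ in range(n)) / n

# Chernoff-Hoeffding bound: n >= ln(2/delta) / (2 eps^2) runs give an
# estimate within eps of the true probability with confidence 1 - delta.
eps, delta = 0.01, 0.05
n = math.ceil(math.log(2 / delta) / (2 * eps**2))
p_hat = smc_estimate(lambda ov: ov <= 10.0, n)   # property: overshoot <= 10%
print(f"n = {n} runs, estimated P(overshoot <= 10%) = {p_hat:.3f}")
```

    The appeal noted in the abstract is visible here: SMC needs only a black-box simulator and a property check, never a closed-form model of the controller.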

  13. Improving Speaker Verification Performance in Presence of Spoofing Attacks Using Out-of-Domain Spoofed Data

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Sahidullah, Md; Tan, Zheng-Hua

    2017-01-01

    of the two systems is challenging and often leads to increased false rejection rates. Furthermore, the performance of CM severely degrades if in-domain development data are unavailable. In this study, therefore, we propose a solution that uses two separate background models – one from human speech...

  14. International Performance Measurement & Verification Protocol: Concepts and Practices for Improved Indoor Environmental Quality, Volume II (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.
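
    Under this protocol family, savings are never metered directly; they are computed as the difference between baseline-period and reporting-period energy use, with routine adjustments for changed operating conditions. A sketch of that arithmetic with hypothetical meter readings (all values assumed for illustration).

```python
# IPMVP-style avoided energy use (all numbers hypothetical):
#   savings = baseline use - reporting-period use - routine adjustments
baseline_kwh = 120_000.0          # metered use before the retrofit
reporting_kwh = 95_000.0          # metered use after the retrofit
weather_adjustment_kwh = 4_000.0  # normalises for a milder reporting year

savings = baseline_kwh - reporting_kwh - weather_adjustment_kwh
pct = 100.0 * savings / baseline_kwh
print(f"avoided energy use: {savings:.0f} kWh ({pct:.1f}% of baseline)")
```

    The adjustment term is the heart of measurement and verification: without it, a mild winter would be indistinguishable from a successful efficiency measure.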

  15. Design and performance verification of advanced multistage depressed collectors. [traveling wave tubes for ECM

    Science.gov (United States)

    Kosmahl, H.; Ramins, P.

    1975-01-01

    Design and performance of a small size, 4-stage depressed collector are discussed. The collector and a spent beam refocusing section preceding it are intended for efficiency enhancement of octave bandwidth, high CW power traveling wave tubes for use in ECM.

  16. Study and characterization of arrays of detectors for dosimetric verification of radiotherapy, analysis of business solutions

    International Nuclear Information System (INIS)

    Gago Arias, A.; Brualla Gonzalez, L.; Gomez Rodriguez, F.; Gonzalez Castano, D. M.; Pardo Montero, J.; Luna Vega, V.; Mosquera Sueiro, J.; Sanchez Garcia, M.

    2011-01-01

    This paper presents a comparative study of detector arrays developed by different commercial vendors in response to the demand for devices that speed up the verification process. We analyze the effect of the spatial response of the individual detectors on the measurement of dose distributions, model this response, and analyze the ability of the arrays to detect variations in a treatment.

  17. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  18. Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems

    NARCIS (Netherlands)

    Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio

    2006-01-01

    This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops

  19. Notes on human performance analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Pedersen, O.M.; Rasmussen, J.

    1981-06-01

    This paper contains a framework for the integration of observation and analysis of human performance in nuclear environments - real or simulated. It identifies four main sources of data, and describes the characteristic data types and methods of analysis for each source in relation to a common conceptual background. The general conclusion is that it is highly useful to combine the knowledge and experience from different contexts into a coherent picture of how nuclear operators perform under varying circumstances. (author)

  20. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. Sufficient data were collected and analyzed to satisfy the research team that the HTFOS operates properly while the engine is running. Temperature measurements made by the HTFOS during engine operation agreed with those anticipated.

  1. Description and performance characteristics for the neutron Coincidence Collar for the verification of reactor fuel assemblies

    International Nuclear Information System (INIS)

    Menlove, H.O.

    1981-08-01

    An active neutron interrogation method has been developed for the measurement of the 235U content of fresh fuel assemblies. The neutron Coincidence Collar uses neutron interrogation with an AmLi neutron source and coincidence counting of the fission neutrons induced in the 235U. This manual describes the system components, operation, and performance characteristics. Applications of the Coincidence Collar to PWR and BWR types of reactor fuel assemblies are described.

  2. Lay out, test verification and in orbit performance of HELIOS a temperature control system

    Science.gov (United States)

    Brungs, W.

    1975-01-01

    The HELIOS temperature control system is described. The main design features and the impact of interactions between experiment, spacecraft system, and temperature control system requirements on the design are discussed. The major limitations of the thermal design regarding a closer sun approach are given and related to test experience and performance data obtained in orbit. Finally, the validity of the test results achieved with the prototype and flight spacecraft is evaluated by comparing test data, orbit temperature predictions, and flight data.

  3. Presentation and verification of a simple mathematical model foridentification of the areas behind noise barrierwith the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

    Background and aims: Traffic noise barriers are the most important measure for controlling environmental noise pollution. Diffraction from the top edge of a noise barrier is the most important path by which indirect sound waves reach the receiver; most studies therefore focus on improving this edge. Methods: T-shape profile barriers are among the most successful of the many profiles studied. This investigation uses the destructive interference between the wave diffracted from the real edge of the barrier and the wave diffracted from the barrier's image, which differ in phase by π radians. First, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss is introduced, based on the destructive effect of the indirect path via the barrier image; two different profile barriers, one reflective and one absorptive, are then used to verify the introduced model. Results: The results are compared with those of a verified two-dimensional boundary element method at 1/3-octave band frequencies over a wide field behind the barriers, and very good agreement between the results has been achieved. In this method an effective height is used for any barrier profile. Conclusion: The introduced model is simple, flexible, and fast, and can be used to choose the best location of rigid and absorptive profile barriers to achieve the highest performance.

  4. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  5. Verification of LOCA/ECCS analysis codes ALARM-B2 and THYDE-B1 by comparison with RELAP4/MOD6/U4/J3

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    1982-08-01

    For a verification study of the ALARM-B2 and THYDE-B1 codes, which are components of the JAERI code system for evaluation of BWR ECCS performance, calculations for typical small- and large-break LOCAs in a BWR were performed and compared with those of the RELAP4/MOD6/U4/J3 code. This report describes the influence of differences between the analytical models incorporated in the individual codes and the problems identified by this verification study. (author)

  6. Grazing Incidence Wavefront Sensing and Verification of X-Ray Optics Performance

    Science.gov (United States)

    Saha, Timo T.; Rohrbach, Scott; Zhang, William W.

    2011-01-01

    Evaluation of interferometrically measured mirror metrology data and characterization of a telescope wavefront can be powerful tools in understanding the image characteristics of an x-ray optical system. In the development of the soft x-ray telescope for the International X-Ray Observatory (IXO), we have developed new approaches to support the telescope development process. Interferometric measurement of the optical components over all relevant spatial frequencies can be used to evaluate and predict the performance of an x-ray telescope. Typically, the mirrors are measured using a mount that minimizes the mount- and gravity-induced errors. In the assembly and mounting process the shape of the mirror segments can change dramatically. We have developed wavefront sensing techniques suitable for x-ray optical components to aid us in the characterization and evaluation of these changes. Hartmann sensing of a telescope and its components is a simple method that can be used to evaluate low-order mirror surface errors and alignment errors. Phase retrieval techniques can also be used to assess and estimate the low-order axial errors of the primary and secondary mirror segments. In this paper we describe the mathematical foundation of our Hartmann and phase retrieval sensing techniques. We show how these techniques can be used in the evaluation and performance prediction process of x-ray telescopes.

  7. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes that do not take into account the criticality of the failures used for availability analysis. This criticality should be evaluated based on concepts of reliability which consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan. Given that power plant performance can be evaluated not only with thermodynamics-related indexes such as heat rate, Thermal Power Plant Performance Analysis focuses on the presentation of reliability-based tools used to define the performance of complex systems and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including: selection of critical equipment and components, defini...

  8. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that would intervene to prevent a violation from taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  9. Confidence Intervals Verification for Simulated Error Rate Performance of Wireless Communication System

    KAUST Repository

    Smadi, Mahmoud A.

    2012-12-06

    In this paper, we derive an efficient simulation method to evaluate the error rate of a wireless communication system. A coherent binary phase-shift keying system is considered with imperfect channel phase recovery. The results presented demonstrate the system performance under realistic Nakagami-m fading and additive white Gaussian noise channel conditions. The accuracy of the obtained results is verified by running the simulation with a 95% confidence interval. We see that as the number of simulation runs N increases, the simulated error rate approaches the actual one and the confidence interval narrows. Our results are therefore expected to be of significant practical use for such scenarios. © 2012 Springer Science+Business Media New York.
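    The procedure this record describes - estimating an error rate by Monte Carlo simulation and attaching a 95% confidence interval to it - can be sketched as follows. This is a simplified illustration assuming an AWGN-only channel with perfect phase recovery (the paper additionally models Nakagami-m fading and phase error); the function names are ours, not from the paper.

    ```python
    import math
    import random

    def simulate_ber(ebn0_db, n_bits, seed=1):
        """Monte Carlo BER estimate for coherent BPSK over an AWGN channel."""
        random.seed(seed)
        ebn0 = 10 ** (ebn0_db / 10)
        sigma = math.sqrt(1 / (2 * ebn0))  # noise std dev for unit-energy bits
        errors = 0
        for _ in range(n_bits):
            # For BPSK over AWGN an all-ones sequence suffices by symmetry.
            received = 1.0 + random.gauss(0, sigma)
            if received < 0:
                errors += 1
        return errors / n_bits

    def confidence_interval(p_hat, n, z=1.96):
        """Normal-approximation 95% confidence interval for an error rate."""
        half = z * math.sqrt(p_hat * (1 - p_hat) / n)
        return (max(0.0, p_hat - half), min(1.0, p_hat + half))

    n = 200_000
    ber = simulate_ber(4.0, n)          # simulated BER at Eb/N0 = 4 dB
    lo, hi = confidence_interval(ber, n)
    # Theoretical BPSK BER is Q(sqrt(2*Eb/N0)), about 0.0125 at 4 dB
    theory = 0.5 * math.erfc(math.sqrt(2 * 10 ** 0.4) / math.sqrt(2))
    ```

    As the record notes, increasing n shrinks the interval (hi - lo) at a rate of 1/sqrt(n) while the estimate converges toward the theoretical value.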

  10. Analytical model for performance verification of liquid poison injection system of a nuclear reactor

    International Nuclear Information System (INIS)

    Kansal, Anuj Kumar; Maheshwari, Naresh Kumar; Vijayan, Pallippattu Krishnan

    2014-01-01

    Highlights: • One-dimensional modelling of shutdown system-2. • Semi-empirical correlation for poison jet progression. • Validation of the code. - Abstract: Shutdown system-2 (SDS-2) in the advanced vertical pressure tube type reactor provides rapid reactor shutdown by high-pressure injection of a neutron-absorbing liquid, called poison, into the moderator in the calandria. Poison inside the calandria is distributed by poison jets issued from holes provided in the injection tubes. The effectiveness of the system depends on the rate and spread of the poison in the moderator. In this study, a transient one-dimensional (1D) hydraulic code, COPJET, is developed to predict the performance of the system by predicting the progression of the poison jet with time. COPJET is validated against data available in the literature and then applied to the advanced vertical pressure tube type reactor.

  11. Performance test and verification of an off-the-shelf automated avian radar tracking system.

    Science.gov (United States)

    May, Roel; Steinheim, Yngve; Kvaløy, Pål; Vang, Roald; Hanssen, Frank

    2017-08-01

    Microwave radar is an important tool for observation of birds in flight and represents a tremendous increase in observation capability in terms of the amount of surveillance space that can be covered at relatively low cost. Based on off-the-shelf radar hardware, automated radar tracking systems have been developed for monitoring avian movements. However, radar used as an observation instrument in biological research has limitations that are important to be aware of when analyzing recorded radar data. This article describes a method for exploring the detection capabilities of a dedicated short-range avian radar system used inside the operational Smøla wind-power plant. The purpose of the testing described was to find the maximum detection range for various sized birds, while controlling for the effects of flight tortuosity, flight orientation relative to the radar, and ground clutter. The method was to use a dedicated test target in the form of a remotely controlled unmanned aerial vehicle (UAV) with a calibrated radar cross section (RCS), which enabled the design of virtually any test flight pattern within the area of interest. The UAV had a detection probability of 0.5 within a range of 2,340 m from the radar. The detection performance obtained by the RCS-calibrated test target (-11 dBm², 0.08 m² RCS) was then extrapolated to find the corresponding performance of differently sized birds. Detection range depends on system sensitivity, the environment within which the radar is placed, and the spatial distribution of birds. The avian radar under study enables continuous monitoring of bird activity within a maximum range of up to 2 km, depending on the size of the birds in question. While small bird species may be detected up to 0.5-1 km, larger species may be detected up to 1.5-2 km from the radar.
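    The extrapolation step described above - scaling the detection range found with the RCS-calibrated UAV to differently sized birds - follows from the free-space radar equation, under which received power falls off as RCS/R⁴, so the maximum range for a fixed detection threshold grows with the fourth root of the RCS. A minimal sketch; the bird RCS values here are illustrative assumptions, not figures from the paper:

    ```python
    def detection_range(rcs_m2, ref_rcs=0.08, ref_range=2340.0):
        """Extrapolate maximum detection range from a calibrated reference.

        Free-space radar equation: received power ~ RCS / R^4, so for a
        fixed detection threshold R_max scales as RCS**(1/4). The defaults
        are the UAV reference values reported in the record (0.08 m^2 RCS
        detected with probability 0.5 at 2,340 m).
        """
        return ref_range * (rcs_m2 / ref_rcs) ** 0.25

    # Illustrative (assumed) bird cross sections:
    small_bird = detection_range(0.001)  # passerine-sized target, ~0.8 km
    large_bird = detection_range(0.05)   # large raptor-sized target, ~2.1 km
    ```

    The fourth-root scaling explains why the record's ranges span only a factor of ~3 (0.5-1 km versus 1.5-2 km) despite bird RCS values differing by orders of magnitude.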

  12. Laboratory Testing and Performance Verification of the CHARIS Integral Field Spectrograph

    Science.gov (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier; hide

    2016-01-01

    The Coronagraphic High Angular Resolution Imaging Spectrograph (CHARIS) is an integral field spectrograph (IFS) that has been built for the Subaru telescope. CHARIS has two imaging modes; the high-resolution mode is R82, R69, and R82 in J, H, and K bands respectively, while the low-resolution discovery mode uses a second low-resolution prism with R19 spanning 1.15-2.37 microns (J+H+K bands). The discovery mode is meant to augment the low inner working angle of the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) adaptive optics system, which feeds CHARIS a coronagraphic image. The goal is to detect and characterize brown dwarfs and hot Jovian planets down to contrasts five orders of magnitude dimmer than their parent star at an inner working angle as low as 80 milliarcseconds. CHARIS constrains spectral crosstalk through several key aspects of the optical design. Additionally, the repeatability of alignment of certain optical components is critical to the calibrations required for the data pipeline. Specifically, the relative alignment of the lenslet array, prism, and detector must be highly stable and repeatable between imaging modes. We report on the measured repeatability and stability of these mechanisms, measurements of spectral crosstalk in the instrument, and the propagation of these errors through the data pipeline. Another key design feature of CHARIS is the prism, which pairs barium fluoride with Ohara L-BBH2 high-index glass. The dispersion of the prism is significantly more uniform than other glass choices, and the CHARIS prisms represent the first NIR astronomical instrument that uses L-BBH2 as the high-index material. This material choice was key to the utility of the discovery mode, so significant efforts were put into cryogenic characterization of the material. The final performance of the prism assemblies in their operating environment is described in detail. The spectrograph is going through final alignment, cryogenic cycling, and is being

  13. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening of resource acquisition and application, a large variety of network hardware appliances are deployed as communication infrastructure. Launching a new network application often implies replacing obsolete devices and providing the space and power to accommodate new ones, which increases the energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many types of network equipment onto industry-standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), with the procedure of vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  14. arXiv Performance verification of the CMS Phase-1 Upgrade Pixel detector

    CERN Document Server

    Veszpremi, Viktor

    2017-12-04

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in a high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that has been used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased inst...

  15. SU-E-T-350: Verification of Gating Performance of a New Elekta Gating Solution: Response Kit and Catalyst System

    Energy Technology Data Exchange (ETDEWEB)

    Xie, X; Cao, D; Housley, D; Mehta, V; Shepard, D [Swedish Cancer Institute, Seattle, WA (United States)

    2014-06-01

    Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertical-motion plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinal-moving platform. Clinical plans were delivered to a stationary and a moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary results. It is important to note that as one moves to tighter gating windows, delivery efficiency is impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals of lengths 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating window gets tighter, one reaches a point where the dose rate falls to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the linac's latency. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit on the gating window, largely defined by the linac's beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
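    The trade-off this record reports between gating window and beam-on latency can be illustrated with a simple duty-cycle estimate. This sketch is ours, not from the paper; it assumes the latency is lost once per breathing cycle at the start of each beam-on window, and the numbers are illustrative.

    ```python
    def effective_duty_cycle(period_s, gating_fraction, latency_s):
        """Fraction of each breathing cycle during which dose is delivered.

        The beam-on window per cycle is period * gating_fraction; the
        linac's beam-on latency (around 300 ms per the record) is lost at
        the start of every window. When the window shrinks to the latency,
        the effective dose rate falls to zero.
        """
        window = period_s * gating_fraction
        return max(0.0, window - latency_s) / period_s

    # 4 s breathing period, 30% gating window, 0.3 s latency:
    dc = effective_duty_cycle(4.0, 0.30, 0.3)  # latency eats 1/4 of the window
    ```

    With these assumed numbers the 30% window yields an effective duty cycle of 22.5%, and a window shorter than the latency delivers no dose at all, matching the record's observation of a practical lower limit near the latency.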

  16. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

    For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA® next-generation core simulation software package. ARCADIA® will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, which is the steady-state and transient core simulation part of ARCADIA®. To enable high computational performance, the SPN calculations are accelerated by applying multi-level coarse mesh re-balancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full core computations by ARCADIA®. With the entire numerical workload being highly parallelizable through domain decomposition techniques, the associated CPU-time requirements can be expected to become feasible in the near future, meeting the efficiency needs of the nuclear industry. The accuracy enhancement obtainable by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCart reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  17. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  18. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics/thermal-hydraulics code, NODAL3, has been developed based on the nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for typical pressurized water reactor (PWR) static and transient analyses, especially for reactivity initiated accidents (RIA). The spatial variables are treated using a polynomial nodal method (PNM), while the adiabatic and improved quasi-static methods are adopted for the neutron dynamics solver. A simple single-channel thermal-hydraulics module and its steam table are implemented in the code. Verification work on static and transient benchmarks is being conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected; for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and the NEA-NSC 3-D/1-D PWR Core Transient Benchmark (Uncontrolled Withdrawal of Control Rods at Zero Power) were used. Excellent agreement of the NODAL3 results with the reference solutions and other validated nodal codes was confirmed. (author)

  19. TACO: fuel pin performance analysis

    International Nuclear Information System (INIS)

    Stoudt, R.H.; Buchanan, D.T.; Buescher, B.J.; Losh, L.L.; Wilson, H.W.; Henningson, P.J.

    1977-08-01

    The thermal performance of fuel in an LWR during its operational lifetime must be described for LOCA analysis as well as for other safety analyses. The determination of stored energy in the LOCA analysis, for example, requires a conservative fuel pin thermal performance model that is capable of calculating fuel and cladding behavior, including the gap conductance between the fuel and cladding, as a function of burnup. Parameters that affect fuel and cladding performance, such as fuel densification, fission gas release, cladding dimensional changes, fuel relocation, and thermal expansion, should be accounted for in the model. Babcock and Wilcox (B&W) has submitted a topical report, BAW-10087P, December 1975, which describes their thermal performance model TACO. A summary of the elements that comprise the TACO model and an evaluation are presented.

  20. [Determinants of task preferences when performance is indicative of individual characteristics: self-assessment motivation and self-verification motivation].

    Science.gov (United States)

    Numazaki, M; Kudo, E

    1995-04-01

    The present study was conducted to examine determinants of information-gathering behavior with regard to one's own characteristics. Four tasks with different self-congruent and self-incongruent diagnosticity were presented to subjects. As self-assessment theory predicts, high-diagnosticity tasks were preferred to low-diagnosticity tasks. As self-verification theory predicts, self-congruent diagnosticity had a stronger effect on task preference than self-incongruent diagnosticity. In addition, subjects who perceived the relevant characteristics as important were more inclined to choose self-assessment behavior than those who did not, and subjects who were certain of their self-concept were more inclined to choose self-verification behavior than those who were not. These results suggest that both self-assessment and self-verification motivations play important roles in information-gathering behavior regarding one's characteristics, and that the strength of these motivations is determined by the importance of the relevant characteristics or the certainty of the self-concept.

  1. Shift Performance Test and Analysis of Multipurpose Vehicle

    Directory of Open Access Journals (Sweden)

    Can Yang

    2014-08-01

    This paper presents an analysis of the gear-shifting performance of a multipurpose vehicle transmission in driving conditions using Ricardo's Gear Shift Quality Assessment (GSQA) system. The performance measures include the travel and effort of the gear shift lever and the synchronizing time. Mathematical models of the transmission, including the gear shift mechanism and synchronizer, were developed in MATLAB. The model of the gear shift mechanism was developed to analyze the travel map of the gear shift lever, and the model of the synchronizer was developed to obtain the force-time curve of the synchronizer during the slipping time and to investigate the relationship between the performance of the transmission and the variation of parameters during gear shifting. Together, the models of the gear shift mechanism and the synchronizer provide a rapid design and verification method for a transmission with a ring spring.

  2. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the behaviour of nuclear reactor fuel rods in water-cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code, FAIR, has been developed. The code incorporates a finite-element-based thermomechanical module, a physically based fission gas release module, and relevant models for fuel-related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build-up of plutonium near the pellet surface, and pellet-clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling the thermal and mechanical solutions with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in modifying the code for modelling MOX and thorium-based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  3. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; a previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to cover the new SAPHIRE 5.0 features as well as to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified.
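Fault tree quantification of the kind IRRAS performs rests on minimal cut sets. The sketch below is not SAPHIRE code; it is a minimal illustration (with made-up basic-event names and probabilities) of the standard minimal-cut-set upper bound for a top event probability:

```python
def cutset_probability(cutset, basic_event_p):
    """Probability of one minimal cut set: product of its basic events
    (assuming independence)."""
    p = 1.0
    for event in cutset:
        p *= basic_event_p[event]
    return p

def top_event_probability(cutsets, basic_event_p):
    """Minimal-cut-set upper bound: 1 - prod(1 - P(cutset_i))."""
    q = 1.0
    for cs in cutsets:
        q *= 1.0 - cutset_probability(cs, basic_event_p)
    return 1.0 - q

# Hypothetical basic events and cut sets (redundant pumps AND-ed, a single valve)
p = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE": 5e-4}
cutsets = [{"PUMP_A", "PUMP_B"}, {"VALVE"}]
top = top_event_probability(cutsets, p)
```

The single-element cut set dominates here, which is exactly the kind of insight an importance analysis in a Level 1 PSA code surfaces.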

  4. Comparability of the performance of in-line computer vision for geometrical verification of parts, produced by Additive Manufacturing

    DEFF Research Database (Denmark)

    Pedersen, David B.; Hansen, Hans N.

    2014-01-01

    The field of Additive Manufacturing is growing at an accelerated rate, as prototyping gives way to direct manufacturing of components for industry and consumers. A consequence of mass-customization and component complexity is an adverse geometrical verification challenge. Mass...

  5. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    Science.gov (United States)

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  6. Spacecraft Multiple Array Communication System Performance Analysis

    Science.gov (United States)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high transmit power communication system. The array combining technique can improve the communication system's data rate and coverage performance without increasing the system transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with a phase coherence implementation.
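The gain from phase-coherent combining can be illustrated with a toy two-element example (this is not the CSSL simulation tooling; amplitudes and phase offsets are arbitrary). When the phase offsets are removed, voltages add rather than powers, so the received power grows as N squared instead of N:

```python
import cmath

def combine(signals, phases):
    """Sum of array element signals after applying phase corrections."""
    return sum(s * cmath.exp(-1j * p) for s, p in zip(signals, phases))

# Two equal-amplitude signals arriving with different (known) phase offsets.
offsets = [0.0, 1.2]
signals = [cmath.exp(1j * o) for o in offsets]

coherent = abs(combine(signals, offsets)) ** 2   # phases removed -> voltages add
incoherent = sum(abs(s) ** 2 for s in signals)   # power-only addition
# coherent / incoherent equals N = 2, i.e. a 3 dB combining gain
```

For N arrays the coherent/incoherent ratio is N, which is why phase coherence improves data rate without raising per-array transmit power.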

  7. Performance verification and comparison of TianLong automatic hypersensitive hepatitis B virus DNA quantification system with Roche CAP/CTM system.

    Science.gov (United States)

    Li, Ming; Chen, Lin; Liu, Li-Ming; Li, Yong-Li; Li, Bo-An; Li, Bo; Mao, Yuan-Li; Xia, Li-Fang; Wang, Tong; Liu, Ya-Nan; Li, Zheng; Guo, Tong-Sheng

    2017-10-07

    To investigate and compare the analytical and clinical performance of the TianLong automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and the Roche CAP/CTM system. Two hundred blood samples for HBV DNA testing, HBV-DNA negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, such as limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined using the TianLong automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log₁₀ IU/mL, which showed high accuracy of the system. Results of the precision, reproducibility and linearity tests showed that the multiple-test coefficient of variation (CV) of the same sample was less than 5% for 10²-10⁶ IU/mL; and for 30-10⁸ IU/mL, the linear correlation coefficient r² = 0.99. The TL system detected HBV DNA (A-H) genotypes and there was no cross-contamination during the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistical differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays, r² = 0.9774. The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference
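A Bland-Altman comparison of the kind used above can be sketched in a few lines. The paired values below are synthetic (not the study's 185 positive samples); the point is the bias and 95% limits-of-agreement computation:

```python
import math

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired assays."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic paired quantifications in log10 IU/mL (hypothetical assay names)
tl  = [2.1, 3.4, 4.0, 5.2, 6.1]
cap = [2.0, 3.5, 3.9, 5.3, 6.0]
bias, (lo, hi) = bland_altman(tl, cap)
```

In a real comparison one would then count what fraction of the differences fall inside (lo, hi); the study reports 98.9% of positive samples within the acceptable range.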

  8. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  9. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.; hide

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot

  10. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
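One of the well-known mathematical techniques such a formulation can reduce to is ordinary least squares. As an illustrative sketch (the Amdahl-style runtime model and all numbers are assumptions, not the report's model), fitting measured runtimes against inverse thread count separates serial from parallelisable work:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Runtime modelled as serial part + parallel part / threads (Amdahl-like):
threads = [1, 2, 4, 8, 16]
runtime = [10.0 + 40.0 / t for t in threads]  # synthetic measurements, seconds

a, b = fit_line([1.0 / t for t in threads], runtime)
# a estimates the serial time, b the parallelisable work
```

Real measurements are noisy and multivariate (compiler flags, file-system load, MPI layout), which is precisely why richer formulations than a single-feature fit are needed at scale.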

  11. Development and verification of local/global analysis techniques for laminated composites

    Science.gov (United States)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, from either a resource or a time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis: a global analysis is run, and the results of that analysis are applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well-known behavior when used for analysis of laminated composites.
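The zoom technique can be illustrated on a deliberately tiny 1-D Laplace problem (an assumption for illustration, far simpler than any laminated-composite model): a coarse global solve supplies boundary values for a refined local solve over a subregion of interest:

```python
def solve_laplace_1d(u_left, u_right, n_interior, sweeps=2000):
    """Gauss-Seidel solve of u'' = 0 on a uniform grid: each interior
    point relaxes to the average of its neighbours."""
    u = [u_left] + [0.0] * n_interior + [u_right]
    for _ in range(sweeps):
        for i in range(1, n_interior + 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1])
    return u

# Global (coarse) pass over [0, 1] with u(0)=0, u(1)=1.
global_u = solve_laplace_1d(0.0, 1.0, 3)   # nodes at x = 0, .25, .5, .75, 1

# Local (fine) "zoom" over [0.25, 0.5]: boundary values come from the
# global solution, exactly as in the zoom approach described above.
local_u = solve_laplace_1d(global_u[1], global_u[2], 7)
```

In a real 3-D FE setting the same pattern holds: the global displacement field is interpolated onto the boundary of a refined local mesh, and the iteration repeats until the region of interest is resolved.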

  12. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    be done by comparing simulation results with: - actual test results, - results obtained from other functionally validated simulation tools, supplemented if necessary by expert analysis. The documents used for functional validation shall be properly referenced. To ensure proper use of qualified software, a user guide shall be written. Finally, the people carrying out studies with the software shall be adequately trained, certified and supervised. Quality audits shall be performed periodically to check the validity of the tool qualification over time as well as its proper use. (author)

  13. Experiment and analysis of CASTOR type model cask for verification of radiation shielding

    Energy Technology Data Exchange (ETDEWEB)

    Hattori, Seiichi; Ueki, Kohtaro.

    1988-08-01

    The radiation shielding system of the CASTOR type cask is composed of graphite cast iron and polyethylene rods. The former forms the cylindrical body of the cask to shield gamma rays, and the latter is embedded in the body to shield neutrons. A characteristic of the radiation shielding of the CASTOR type cask is that a zigzag arrangement of the polyethylene rods is adopted to make the penetrating dose rate uniform. A three-dimensional analysis code is needed to analyse the shielding performance of a cask with such a complicated shielding system precisely. However, this takes too much time as well as too much cost. Therefore, a two-dimensional analysis is usually applied, in which the three-dimensional model is equivalently transformed into a two-dimensional calculation. This research study was conducted to verify the applicability of the two-dimensional analysis; the experiment and the analysis used a CASTOR type model cask. The model cask was manufactured by the GNS company in West Germany, and the shielding ability test facilities at CRIEPI were used. It was judged from the study that the two-dimensional analysis is a useful means for practical use.

  14. Verification of analysis methods for predicting the behaviour of seismically isolated nuclear structures. Final report of a co-ordinated research project 1996-1999

    International Nuclear Information System (INIS)

    2002-06-01

    This report is a summary of the work performed under a co-ordinated research project (CRP) entitled Verification of Analysis Methods for Predicting the Behaviour of Seismically isolated Nuclear Structures. The project was organized by the IAEA on the recommendation of the IAEA's Technical Working Group on Fast Reactors (TWGFR) and carried out from 1996 to 1999. One of the primary requirements for nuclear power plants and facilities is to ensure safety and the absence of damage under strong external dynamic loading from, for example, earthquakes. The designs of liquid metal cooled fast reactors (LMFRs) include systems which operate at low pressure and include components which are thin-walled and flexible. These systems and components could be considerably affected by earthquakes in seismic zones. Therefore, the IAEA through its advanced reactor technology development programme supports the activities of Member States to apply seismic isolation technology to LMFRs. The application of this technology to LMFRs and other nuclear plants and related facilities would offer the advantage that standard designs may be safely used in areas with a seismic risk. The technology may also provide a means of seismically upgrading nuclear facilities. Design analyses applied to such critical structures need to be firmly established, and the CRP provided a valuable tool in assessing their reliability. Ten organizations from India, Italy, Japan, the Republic of Korea, the Russian Federation, the United Kingdom, the United States of America and the European Commission co-operated in this CRP. This report documents the CRP activities, provides the main results and recommendations and includes the work carried out by the research groups at the participating institutes within the CRP on verification of their analysis methods for predicting the behaviour of seismically isolated nuclear structures

  15. Thermal design, analysis, and experimental verification for a DIII-D cryogenic pump

    International Nuclear Information System (INIS)

    Baxi, C.B.; Anderson, P.; Langhorn, A.; Schaubel, K.; Smith, J.

    1991-01-01

    As part of the advanced divertor program, it is planned to install a 50 m³/s capacity cryopump for particle removal in the DIII-D tokamak. The cryopump will be located in the outer bottom corner of the vacuum vessel. The pump will consist of a surface at liquid helium temperature (helium panel) with a surface area of about 1 m², a surface at liquid nitrogen temperature (nitrogen shield) to reduce radiation heat load on the helium panel, and a secondary shield around the nitrogen shield. The cryopump design poses a number of thermal hydraulic problems such as estimation of heat loads on helium and nitrogen panels, stability of the two-phase helium flow, performance of the pump components during high temperature bakeout, and cooldown performance of the helium panel from ambient temperatures. This paper presents the thermal analysis done to resolve these issues. A prototypic experiment performed at General Atomics verified the analysis and increased the confidence in the design. The experimental results are also summarized in this paper. (orig.)
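The radiation heat load on the helium panel, one of the thermal-hydraulic estimates mentioned above, can be approximated with a grey-body exchange formula. The emissivities and temperatures below are illustrative assumptions, not the DIII-D design values:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiation_load(area, t_hot, t_cold, eps_hot, eps_cold):
    """Grey-body radiation exchange between two close parallel surfaces:
    q = sigma * A * (Th^4 - Tc^4) / (1/eps_h + 1/eps_c - 1)."""
    eff_eps = 1.0 / (1.0 / eps_hot + 1.0 / eps_cold - 1.0)
    return SIGMA * area * eff_eps * (t_hot ** 4 - t_cold ** 4)

# Assumed: 1 m^2 helium panel at 4 K facing an 80 K nitrogen shield,
# emissivity 0.1 on both surfaces.
q = radiation_load(1.0, 80.0, 4.0, 0.1, 0.1)  # watts on the helium panel
```

Even a rough estimate like this shows why the nitrogen shield matters: replacing the 80 K surface with a 300 K vessel wall raises the T⁴ difference by roughly two orders of magnitude.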

  16. Thermal design, analysis, and experimental verification for a DIII-D cryogenic pump

    International Nuclear Information System (INIS)

    Baxi, C.B.; Anderson, P.; Langhorn, A.; Schaubel, K.; Smith, J.

    1991-08-01

    As part of the advanced divertor program, it is planned to install a 50 m³/s capacity cryopump for particle removal in the DIII-D tokamak. The cryopump will be located in the outer bottom corner of the vacuum vessel. The pump will consist of a surface at liquid helium temperature (helium panel) with a surface area of about 1 m², a surface at liquid nitrogen temperature (nitrogen shield) to reduce radiation heat load on the helium panel, and a secondary shield around the nitrogen shield. The cryopump design poses a number of thermal hydraulic problems such as estimation of heat loads on helium and nitrogen panels, stability of the two-phase helium flow, performance of the pump components during high temperature bakeout, and cooldown performance of the helium panel from ambient temperatures. This paper presents the thermal analysis done to resolve these issues. A prototypic experiment performed at General Atomics verified the analysis and increased the confidence in the design. The experimental results are also summarized in this paper. 7 refs., 5 figs., 1 tab

  17. Application of FE-analysis in Design and Verification of Bolted Joints according to VDI 2230 at CERN

    CERN Document Server

    AUTHOR|(CDS)2225945; Dassa, Luca; Welo, Torgeir

    This thesis investigates how finite element analysis (FEA) can be used to simplify and improve analysis of bolted joints according to the guideline VDI 2230. Some aspects of how FEA can be applied to aid design and verification of bolted joints are given in the guideline, but not in a streamlined way that makes it simple and efficient to apply. The scope of this thesis is to clarify how FEA and VDI 2230 can be combined in analysis of bolted joints, and to present a streamlined workflow. The goal is to lower the threshold for carrying out such combined analysis. The resulting benefits are improved analysis validity and quality, and improved analysis efficiency. A case from the engineering department at CERN, where FEA has been used in analysis of bolted joints is used as basis to identify challenges in combining FEA and VDI 2230. This illustrates the need for a streamlined analysis strategy and well described workflow. The case in question is the Helium vessel (pressure vessel) for the DQW Crab Cavities, whi...

  18. Evolution of metastable phases in silicon during nanoindentation: mechanism analysis and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Mylvaganam, K [Centre for Advanced Materials Technology, University of Sydney, NSW 2006 (Australia); Zhang, L C [School of Mechanical and Manufacturing Engineering, University of New South Wales, NSW 2052 (Australia); Eyben, P; Vandervorst, W [IMEC, Kapeldreef 75, B-3001 Leuven (Belgium); Mody, J, E-mail: k.mylvaganam@usyd.edu.a, E-mail: Liangchi.zhang@unsw.edu.a, E-mail: eyben@imec.b, E-mail: jamody@imec.b, E-mail: vdvorst@imec.b [KU Leuven, Electrical Engineering Department, INSYS, Kasteelpark Arenberg 10, B-3001 Leuven (Belgium)

    2009-07-29

    This paper explores the evolution mechanisms of metastable phases during the nanoindentation on monocrystalline silicon. Both the molecular dynamics (MD) and the in situ scanning spreading resistance microscopy (SSRM) analyses were carried out on Si(100) orientation, and for the first time, experimental verification was achieved quantitatively at the same nanoscopic scale. It was found that under equivalent indentation loads, the MD prediction agrees extremely well with the result experimentally measured using SSRM, in terms of the depth of the residual indentation marks and the onset, evolution and dimension variation of the metastable phases, such as β-Sn. A new six-coordinated silicon phase, Si-XIII, transformed directly from Si-I was discovered. The investigation showed that there is a critical size of contact between the indenter and silicon, beyond which a crystal particle of distorted diamond structure will emerge in between the indenter and the amorphous phase upon unloading.

  19. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code's work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
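Razorback's first ingredient, the point reactor kinetics equations, can be sketched with a one-delayed-group explicit-Euler integration (a deliberate simplification for illustration; the kinetics parameters are generic, not ACRR values):

```python
def point_kinetics(rho, n0, beta=0.0075, lam=0.08, Lam=1e-4, dt=1e-5, t_end=0.1):
    """Explicit-Euler integration of one-delayed-group point kinetics:
        dn/dt = (rho - beta)/Lam * n + lam * C
        dC/dt = beta/Lam * n - lam * C
    n is relative power, C the delayed-neutron precursor concentration."""
    n = n0
    c = beta * n0 / (Lam * lam)   # precursor equilibrium for the initial power
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lam * n + lam * c) * dt
        dc = (beta / Lam * n - lam * c) * dt
        n += dn
        c += dc
    return n

steady = point_kinetics(rho=0.0, n0=1.0)    # critical: power stays flat
up = point_kinetics(rho=0.001, n0=1.0)      # +100 pcm step: prompt jump, then rise
```

A production code couples these equations to fuel heat transfer and coolant conservation equations, and would use a stiff implicit solver rather than explicit Euler.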

  20. Computer model verification for seismic analysis of vertical pumps and motors

    International Nuclear Information System (INIS)

    McDonald, C.K.

    1993-01-01

    The general principles of modeling vertical pumps and motors are discussed, and two examples of verifying the models are presented in detail. The first example is a vertical pump and motor assembly. The model and computer analysis are presented, and the first four modes (frequencies) calculated are compared to the values of the same modes obtained from a shaker test. The model used for this example is a lumped-mass model in which the masses are connected by massless beams. The shaker test was performed by National Technical Services, Los Angeles, CA. The second example is a larger vertical motor. The model used for this example is a three-dimensional finite element shell model. The first frequency obtained from this model is compared to the first frequency obtained from shop tests for several different motors. The shop tests were performed by Reliance Electric, Stratford, Ontario and Siemens-Allis, Inc., Norwood, Ohio.

  1. Scalable Performance Measurement and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, Todd [Univ. of North Carolina, Chapel Hill, NC (United States)

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
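The wavelet-compression idea can be shown with a one-level Haar transform on a synthetic per-task load profile (an illustrative sketch, not Libra's implementation): near-zero detail coefficients are dropped while genuine load imbalance survives reconstruction:

```python
def haar_forward(x):
    """One level of the Haar transform: pairwise averages and details."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def compress(x, threshold):
    """Zero out small detail coefficients; smooth load-balance profiles
    compress well because most details are near zero."""
    avg, det = haar_forward(x)
    det = [d if abs(d) >= threshold else 0.0 for d in det]
    return avg, det

load = [10.0, 10.1, 10.0, 9.9, 25.0, 10.0, 10.1, 10.0]  # one overloaded task
avg, det = compress(load, threshold=1.0)
approx = haar_inverse(avg, det)  # spike preserved, noise smoothed away
```

Multi-level transforms and quantization push the ratio much further; the key property is that the hot spot (the 25.0 task) is reconstructed exactly while the sub-threshold jitter is discarded.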

  2. Computational combustion and emission analysis of hydrogen-diesel blends with experimental verification

    International Nuclear Information System (INIS)

    Masood, M.; Ishrat, M.M.; Reddy, A.S.

    2007-01-01

    The paper discusses the effect of blending hydrogen with diesel in different proportions on combustion and emissions. A comparative study was carried out to analyze the effect of direct injection of hydrogen into the combustion chamber versus induction through the inlet manifold for dual fueling. The percentage of hydrogen substitution was varied from 20% to 80%, with the diesel percentage reduced correspondingly. CFD analysis of dual fuel combustion and emissions was carried out for both methods using the CFD software FLUENT; the combustion chamber was meshed using GAMBIT. The standard combustion and emission models were used in the analysis. In the second part of the paper, the effect of the angle of injection on performance, combustion and emissions was analyzed for both methods of hydrogen admission. The experimental results were compared with the simulated values and a good agreement between them was noticed. (author)

  3. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  4. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete.

  5. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code are an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete.

  6. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  7. Analysis and Verification of Message Sequence Charts of Distributed Systems with the Help of Coloured Petri Nets

    Directory of Open Access Journals (Sweden)

    S. A. Chernenok

    2014-01-01

    Full Text Available The standard language of message sequence charts (MSC) is intended to describe scenarios of object interaction. Due to their expressiveness and simplicity, MSC diagrams are widely used in practice at all stages of system design and development. In particular, the MSC language is used for describing communication behavior in distributed systems and communication protocols. In this paper, a method for the analysis and verification of MSC and HMSC diagrams is considered. The method is based on the translation of (H)MSC into coloured Petri nets. The translation algorithms cover most standard elements of MSC, including data concepts. Size estimates of the CPN resulting from the translation are given. Properties of the resulting CPN are analyzed and verified using the CPN Tools system and a CPN verifier based on the SPIN tool. The translation method is demonstrated with an example.
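The flavour of the MSC-to-Petri-net translation can be conveyed with a minimal place/transition net for a single message exchange (an uncoloured toy, not the paper's CPN construction): a channel place enforces that a receive cannot fire before the matching send:

```python
def enabled(marking, pre):
    """A transition is enabled when every pre-place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume pre-set tokens, produce post-set tokens."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# MSC event "A sends m to B" as two transitions sharing a channel place:
transitions = {
    "send":    ({"A_ready": 1},              {"A_done": 1, "chan_m": 1}),
    "receive": ({"B_ready": 1, "chan_m": 1}, {"B_done": 1}),
}
m0 = {"A_ready": 1, "B_ready": 1}
assert not enabled(m0, transitions["receive"][0])  # no receive before send
m1 = fire(m0, *transitions["send"])
m2 = fire(m1, *transitions["receive"])
```

A coloured net additionally attaches data values to the tokens (covering MSC data concepts), which is what tools like CPN Tools then analyze for reachability and deadlock.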

  8. Finite element program ARKAS: verification for IAEA benchmark problem analysis on core-wide mechanical analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    Verification of the ''ARKAS'' code against the problems set in the International Working Group on Fast Reactors (IWGFR) Coordinated Research Programme (CRP) on the inter-comparison of liquid metal cooled fast breeder reactor (LMFBR) core mechanics codes is discussed. The CRP was co-ordinated by the IWGFR around problems set by Dr. R.G. Anderson (UKAEA) and arose from the IWGFR specialists' meeting on The Predictions and Experience of Core Distortion Behaviour (ref. 2). The problems for verification (''code against code'') and validation (''code against experiment'') were set and calculated by eleven core mechanics codes from nine countries. All the problems have been completed and were solved with the core structural mechanics code ARKAS. Predictions by ARKAS agreed very well with the other solutions for the well-defined verification problems. For the validation problems, based on Japanese ex-reactor 2-D thermo-elastic experiments, the agreement between measured and calculated values was fairly good. This paper briefly describes the numerical model of the ARKAS code and discusses some typical results. (author)

  9. A verification study and trend analysis of simulated boundary layer wind fields over Europe

    Energy Technology Data Exchange (ETDEWEB)

    Lindenberg, Janna

    2011-07-01

    Simulated wind fields from regional climate models (RCMs) are increasingly used as a surrogate for observations, which are costly and prone to homogeneity deficiencies. Compounding the problem, a lack of reliable observations makes the validation of the simulated wind fields a non-trivial exercise. While the literature shows that RCMs tend to underestimate strong winds over land, these investigations mainly relied on comparisons with near-surface measurements and extrapolated model wind fields. In this study a new approach is proposed, using measurements from high towers and a robust validation process. Tower-height wind data are smoother and thus more representative of regional winds, and this approach circumvents the need to extrapolate simulated wind fields. The performance of two models using different downscaling techniques is evaluated, and the influence of the boundary conditions on the simulation of wind statistics is investigated. Both models demonstrate reasonable performance over flat homogeneous terrain and deficiencies over complex terrain, such as the Upper Rhine Valley, due to a too-coarse spatial resolution (~50 km). When the spatial resolution is increased to 10 and 20 km, respectively, a benefit is found for the simulation of the wind direction only. A sensitivity analysis shows major deviations between international land cover data sets. A time series analysis of dynamically downscaled simulations is conducted. While the annual cycle and the interannual variability are well simulated, the models are less effective at simulating small-scale fluctuations and the diurnal cycle. The hypothesis that strong winds are underestimated by RCMs is supported by means of a storm analysis: only two-thirds of the observed storms are simulated by the model using a spectral nudging approach, and in addition ''False Alarms'' are simulated which are not detected in the observations. A trend analysis over the period 1961 - 2000 is conducted
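    The storm analysis described here amounts to a dichotomous-forecast verification, which is conventionally summarised by contingency-table scores. As an illustration only (the counts below are hypothetical, not taken from the study):

```python
def contingency_scores(hits, misses, false_alarms):
    """Standard dichotomous-forecast scores for event detection.

    hits: observed storms also present in the simulation
    misses: observed storms absent from the simulation
    false_alarms: simulated storms not found in the observations
    """
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false-alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# Illustrative numbers only: two-thirds of 90 observed storms detected,
# plus 10 spurious ("False Alarm") storms in the simulation.
pod, far, csi = contingency_scores(hits=60, misses=30, false_alarms=10)
print(round(pod, 3), round(far, 3), round(csi, 3))
```

    A model that detects two-thirds of observed storms thus has a probability of detection of about 0.67, regardless of how many false alarms it adds; the false-alarm ratio and critical success index capture that second failure mode.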

  10. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater than 10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater than 10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warning at least minutes in advance, based on both the soft X-ray flux and the integral proton flux taken by GOES. The quality of the forecast model was measured against verifications of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
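    The event-start definition above (flux of >10 MeV protons reaching 10 pfu) is a simple threshold crossing, which can be sketched as follows; the flux samples are invented for illustration:

```python
def sep_event_start(times, flux, threshold=10.0):
    """Return the first time at which the >10 MeV proton flux equals or
    exceeds the event threshold (10 pfu), or None if no event occurs.

    times: sequence of timestamps; flux: matching flux values in pfu.
    """
    for t, f in zip(times, flux):
        if f >= threshold:
            return t
    return None

# Hypothetical 5-minute flux samples in pfu:
times = [0, 5, 10, 15, 20]
flux = [0.3, 0.8, 4.2, 12.5, 40.1]
print(sep_event_start(times, flux))  # → 15
```

    A predictive model such as the one described must issue its warning before this threshold crossing; the lead time is the gap between the warning and the detected start.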

  11. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure take place is shown in Figure 1, but the phases may be performed in any order.

  12. Seismic analysis methods for LMFBR core and verification with mock-up vibration tests

    International Nuclear Information System (INIS)

    Sasaki, Y.; Kobayashi, T.; Fujimoto, S.

    1988-01-01

    This paper deals with the vibration behavior of a cluster of core elements with hexagonal cross sections in a barrel under dynamic excitation due to seismic events. When a strong earthquake excitation is applied to the core support, the cluster of core elements displaces to a geometrical limit determined by restraint rings in the barrel, and collisions can occur between adjacent elements as a result of their relative motion. For these reasons, seismic analysis of LMFBR core elements is a complicated non-linear vibration problem, which includes collisions and fluid interactions. In an actual core design, it is hard to include hundreds of elements in the numerical calculations. In order to study the seismic behavior of core elements, experiments with a single row of 29 elements (17 core fuel assemblies, 4 radial blanket assemblies, and 8 neutron shield assemblies), simulating all elements in the MONJU core central row, and experiments with 7 rows of 37 core fuel assemblies clustered in the core center, were performed in a fluid-filled tank using a large-sized shaking table. Moreover, numerical analyses of these experiments were performed for the validation of simplified and detailed analytical methods. 4 refs, 18 figs

  13. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods differ both in their operation process and in the way they achieve results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects in the code being verified with static methods: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods.
    Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and

  14. 3D DVH-based metric analysis versus per-beam planar analysis in IMRT pretreatment verification

    International Nuclear Information System (INIS)

    Carrasco, Pablo; Jornet, Núria; Latorre, Artur; Eudaldo, Teresa; Ruiz, Agustí; Ribas, Montserrat

    2012-01-01

    Purpose: To evaluate methods of pretreatment IMRT analysis, using real measurements performed with a commercial 2D detector array, for clinical relevance and accuracy by comparing clinical DVH parameters. Methods: We divided the work into two parts. The first part consisted of six in-phantom tests aimed to study the sensitivity of the different analysis methods. Beam fluences, 3D dose distribution, and DVH of an unaltered original plan were compared to those of the delivered plan, in which an error had been intentionally introduced. The second part consisted of comparing gamma analysis with DVH metrics for 17 patient plans from various sites. Beam fluences were measured with the MapCHECK 2 detector, per-beam planar analysis was performed with the MapCHECK software, and 3D gamma analysis and the DVH evaluation were performed using 3DVH software. Results: In a per-beam gamma analysis some of the tests yielded false positives or false negatives. However, the 3DVH software correctly described the DVH of the plan which included the error. The measured DVH from the plan with controlled error agreed with the planned DVH within 2% dose or 2% volume. We also found that a gamma criterion of 3%/3 mm was too lax to detect some of the forced errors. Global analysis masked some problems, while local analysis magnified irrelevant errors at low doses. Small hotspots were missed for all metrics due to the spatial resolution of the detector panel. DVH analysis for patient plans revealed small differences between treatment plan calculations and 3DVH results, with the exception of very small volume structures such as the eyes and the lenses. Target coverage (D98 and D95) of the measured plan was systematically lower than that predicted by the treatment planning system, while other DVH characteristics varied depending on the parameter and organ. 
Conclusions: We found no correlation between the gamma index and the clinical impact of a discrepancy for any of the gamma index evaluation
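    The gamma index discussed above combines a dose-difference criterion with a distance-to-agreement criterion. A simplified 1D global-gamma sketch (the dose profiles below are hypothetical; clinical implementations work on 2D/3D grids with interpolation):

```python
import math

def gamma_1d(ref_pos, ref_dose, meas_pos, meas_dose,
             dose_crit=0.03, dist_crit=3.0):
    """Simplified 1D global gamma: for each measured point, minimise the
    combined dose-difference / distance-to-agreement metric over the
    reference profile. Positions in mm; the dose criterion is a fraction
    of the reference maximum (global normalisation), e.g. 3%/3 mm."""
    d_max = max(ref_dose)
    gammas = []
    for xm, dm in zip(meas_pos, meas_dose):
        g = min(
            math.sqrt(((xr - xm) / dist_crit) ** 2
                      + ((dr - dm) / (dose_crit * d_max)) ** 2)
            for xr, dr in zip(ref_pos, ref_dose)
        )
        gammas.append(g)
    return gammas

# Hypothetical parabolic profile; measurement uniformly 2% low.
xs = [float(i) for i in range(11)]
ref = [100.0 - (x - 5.0) ** 2 for x in xs]
meas = [0.98 * d for d in ref]
passing = sum(g <= 1.0 for g in gamma_1d(xs, ref, xs, meas)) / len(xs)
print(passing)  # fraction of points with gamma <= 1
```

    A point passes when gamma <= 1; note how a uniform 2% dose error passes a 3%/3 mm global criterion entirely, which is exactly the kind of laxity the study reports.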

  15. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  16. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  17. Freight performance measures : approach analysis.

    Science.gov (United States)

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  18. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation, and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and the error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady-state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
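    The Richardson extrapolation procedure referred to above can be illustrated with a minimal sketch. The manufactured solution values below are hypothetical, not taken from the VIVID code: three grid levels with a constant refinement ratio yield an observed order of accuracy and an error estimate for the fine-grid solution.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three grids with
    constant refinement ratio r (Richardson extrapolation)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_error(f_medium, f_fine, p, r):
    """Estimated discretisation error of the fine-grid solution."""
    return (f_medium - f_fine) / (r ** p - 1)

# Manufactured example: exact value 1.0, second-order scheme,
# grid spacings h = 0.4, 0.2, 0.1 (refinement ratio r = 2).
exact = 1.0
f = [exact + 0.5 * h ** 2 for h in (0.4, 0.2, 0.1)]
p = observed_order(f[0], f[1], f[2], r=2)
err = richardson_error(f[1], f[2], p, r=2)
print(round(p, 6), round(f[2] - err, 6))  # observed order, extrapolated value
```

    Subtracting the estimated error from the fine-grid value recovers the extrapolated (approximately exact) solution; a mismatch between the observed and formal order is the classic sign of an implementation defect.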

  19. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  20. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  1. Performance evaluation and dose verification of the low dose rate permanent prostate brachytherapy system at the Korle-Bu Teaching Hospital

    International Nuclear Information System (INIS)

    Asenso, Y.A.

    2015-07-01

    .55 % respectively. That of the physical and internal grid alignment yielded a maximum discrepancy of 2.67 ± 0.01 mm at position 6A on the template. The probe retraction test produced no discrepancies between the “clicks” and the corresponding distances. For the depth of penetration and the axial and lateral resolution tests, no standard measurements were available for comparison at the time the tests were performed. The dose verification consisted of three tests: the calibration point test, the source strength verification, and the TPS dose verification. The calibration point test indicated that the distance of maximum ionization chamber sensitivity is 3 cm, so seeds can be calibrated at this point. The source strength verification results were within the tolerances recommended by ICRU Report 38 (ICRU, 1985). The average source strength measured was 0.651450 U ± 0.001052 U, deviating from the manufacturer value of 0.64989 U by 0.242 % ± 0.164 %. The TPS dose verification test produced results with significant errors, which occurred due to post-irradiation development of the film with time, but the doses obtained by the TPS and the film followed the same pattern. The outcome of the performance evaluations indicates that, for patient work, the ultrasound system and the prostate brachytherapy system can provide the mechanism for accurate positioning of the brachytherapy seeds, facilitating reliable identification of the target volume for accurate and effective treatment. (au)

  2. Analysis and experimental verification of a control scheme for unified power quality conditioner

    Energy Technology Data Exchange (ETDEWEB)

    Peng Cheng Zhu; Xun Li; Yong Kang; Jian Chen [Huazhong Univ. of Science and Technology, Wuhan (China). Dept. of Electrical Engineering]

    2005-07-01

    Improving power quality for a sensitive load by a Unified Power Quality Conditioner (UPQC) in a distributed generation system is presented in this paper. The power balance of a UPQC, consisting of back-to-back connected series and shunt Active Filters (AFs), is analysed. Based on this analysis, a novel control scheme is established in a two-phase Synchronous Rotating d-q Frame (SRF). In this control scheme, the series AF is controlled as a current source and makes the input current sinusoidal, while the shunt AF is controlled as a voltage source and keeps the load voltage at its normal value. With the proposed control strategy, the UPQC is capable of compensating not only harmonic and reactive currents of the load but also grid voltage distortion. There is no harmonic interference between harmonic-producing loads and harmonic-sensitive loads connected on the common bus. The performance of a UPQC with the proposed control scheme under nonlinear load and grid voltage distortion is investigated through simulation as well as experimental work. (Author)
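    Control in a synchronous rotating d-q frame, as used by this scheme, rests on the Park transform of three-phase quantities. A minimal amplitude-invariant sketch (the signal values are illustrative only, not from the paper):

```python
import math

def abc_to_dq(ia, ib, ic, theta):
    """Amplitude-invariant Park transform of three-phase quantities into
    the synchronous rotating d-q frame; theta is the frame angle in rad."""
    two_thirds = 2.0 / 3.0
    d = two_thirds * (ia * math.cos(theta)
                      + ib * math.cos(theta - 2 * math.pi / 3)
                      + ic * math.cos(theta + 2 * math.pi / 3))
    q = -two_thirds * (ia * math.sin(theta)
                       + ib * math.sin(theta - 2 * math.pi / 3)
                       + ic * math.sin(theta + 2 * math.pi / 3))
    return d, q

# A balanced sinusoidal set aligned with the frame maps to a constant
# d component equal to the amplitude, with q near zero.
theta = 0.7
ia = 10.0 * math.cos(theta)
ib = 10.0 * math.cos(theta - 2 * math.pi / 3)
ic = 10.0 * math.cos(theta + 2 * math.pi / 3)
d, q = abc_to_dq(ia, ib, ic, theta)
print(round(d, 6), round(q, 6))
```

    Because balanced fundamental components become DC in this frame, simple PI regulators on d and q suffice, while harmonics appear as ripple that the filters are commanded to cancel.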

  3. Thermal Analysis of MIRIS Space Observation Camera for Verification of Passive Cooling

    Directory of Open Access Journals (Sweden)

    Duk-Hang Lee

    2012-09-01

    Full Text Available We conducted thermal analyses and cooling tests of the space observation camera (SOC) of the multi-purpose infrared imaging system (MIRIS) to verify passive cooling. The thermal analyses were conducted with NX 7.0 TMG for two attitude cases of the MIRIS: the worst hot case and the normal case. Through the thermal analyses of the flight model, it was found that even in the worst case the telescope could be cooled to less than 206 K. This is similar to the results of the passive cooling test (~200.2 K). For the normal attitude case of the analysis, on the other hand, the SOC telescope was cooled to about 160 K in 10 days. Based on the results of these analyses and the test, it was determined that the telescope of the MIRIS SOC could be successfully cooled to below 200 K with passive cooling. The SOC is therefore expected to have optimal performance under cooled conditions in orbit.

  4. Results of the independent radiological verification survey of the remedial action performed at the former Alba Craft Laboratory site, Oxford, Ohio, (OXO001)

    International Nuclear Information System (INIS)

    Kleinhans, K.R.; Murray, M.E.; Carrier, R.F.

    1996-04-01

    Between October 1952 and February 1957, National Lead of Ohio (NLO), a primary contractor for the Atomic Energy Commission (AEC), subcontracted certain uranium machining operations to Alba Craft Laboratory, Incorporated, located at 10-14 West Rose Avenue, Oxford, Ohio. In 1992, personnel from Oak Ridge National Laboratory (ORNL) confirmed the presence of residual radioactive materials from the AEC-related operations in and around the facility in amounts exceeding the applicable Department of Energy (DOE) guidelines. Although the amount of uranium found on the property posed little health hazard if left undisturbed, the levels were sufficient to require remediation to bring radiological conditions into compliance with current guidelines, thus ensuring that the public and the environment are protected. A team from ORNL conducted a radiological verification survey of the former Alba Craft Laboratory property between December 1994 and February 1995. The survey was conducted at the request of DOE and included directly measured radiation levels, the collection and analysis of soil samples to determine concentrations of uranium and certain other radionuclides, and comparison of these data to the guidelines. This document reports the findings of this survey. The results of the independent verification survey of the former Alba Craft Laboratory property demonstrate that all contaminated areas have been remediated to radionuclide concentrations and activity levels below the applicable guideline limits set by DOE

  5. Synthetic spider silk sustainability verification by techno-economic and life cycle analysis

    Science.gov (United States)

    Edlund, Alan

    Major ampullate spider silk represents a promising biomaterial with diverse commercial potential ranging from textiles to medical devices due to the excellent physical and thermal properties of the protein structure. Recent advancements in synthetic biology have facilitated the development of recombinant spider silk proteins from Escherichia coli (E. coli), alfalfa, and goats. This study specifically investigates the economic feasibility and environmental impact of synthetic spider silk manufacturing. Pilot-scale data were used to validate an engineering process model that includes all of the required sub-processing steps for synthetic fiber manufacture: production, harvesting, purification, drying, and spinning. Modeling was constructed modularly to support assessment of alternative protein production methods (alfalfa and goats) as well as alternative down-stream processing technologies. The techno-economic analysis indicates a minimum sale price from pioneer and optimized E. coli plants of 761 kg-1 and 23 kg-1, with greenhouse gas emissions of 572 kg CO2-eq. kg-1 and 55 kg CO2-eq. kg-1, respectively. Spider silk sale price estimates from pioneer and optimized goat plants are 730 kg-1 and 54 kg-1, respectively, while those from pioneer and optimized alfalfa plants are 207 kg-1 and 9.22 kg-1, respectively. Elevated costs and emissions from the pioneer plant can be directly tied to its high material consumption and low protein yield. Decreased production costs for the optimized plants stem from improved protein yield, process optimization, and an Nth-plant assumption. Discussion focuses on the commercial potential of spider silk, the production performance requirements for commercialization, and the impact of alternative technologies on the sustainability of the system.
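    The minimum sale price reported by techno-economic analyses of this kind is typically the price at which discounted revenue just covers annualized capital plus operating cost. A toy sketch using a capital recovery factor; every number below is made up for illustration, none is taken from the study:

```python
def minimum_selling_price(capex, annual_opex, annual_kg, rate, years):
    """Price per kg at which annualized revenue covers annualized capital
    plus operating cost (zero-NPV condition for a constant annual cash
    flow), using the capital recovery factor (CRF)."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capex * crf + annual_opex) / annual_kg

# Entirely hypothetical plant: $40M capital, $5M/yr operating cost,
# 20,000 kg/yr silk protein, 10% discount rate, 20-year life.
msp = minimum_selling_price(40e6, 5e6, 20_000, 0.10, 20)
print(round(msp, 2))  # price per kg
```

    The sensitivity visible in the abstract (pioneer vs. optimized plants) follows directly from this structure: raising the annual yield or cutting material-driven opex lowers the minimum price roughly in proportion.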

  6. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  7. Investigation and Verification of the Aerodynamic Performance of a Fan/Booster with Through-flow Method

    Science.gov (United States)

    Liu, Xiaoheng; Jin, Donghai; Gui, Xingmin

    2018-04-01

    The through-flow method is still widely applied in the evolution of turbomachinery design, as it can provide not only the performance characteristics but also the flow field. In this study, a program based on the through-flow method was developed and verified against many other numerical examples. To improve the accuracy of the calculation, loss and deviation models dependent on the real geometry of the engine were employed, covering viscous losses, overflow in gaps, and leakage from the flow path through seals. By means of this program, the aerodynamic performance of a high through-flow commercial fan/booster was investigated. Based on the radial distributions of the relevant parameters, flow deterioration in this machine was suspected. To confirm this conjecture, 3-D numerical simulation was carried out with the help of the NUMECA software. Detailed analysis confirmed the conjecture, providing sufficient evidence for the conclusion that the through-flow method is an essential and effective method for performance prediction of the fan/booster.

  8. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, the enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed; a cost-sensitive classifier was found to produce better results. The system was evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  9. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    Full Text Available BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify predispositions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology); the latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training, and expert assessment. OBJECTIVE: The aim of our study was to verify a model of predispositions in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain the mutual relationships among the observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirmed the best structural model for talent selection in the triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in the triathlon were grouped into five items: three motor predispositions (swimming, cycling, and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0. Running predispositions were a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0

  10. Performance Analysis in Elite Sports

    NARCIS (Netherlands)

    Talsma, Bertus Gatze

    2013-01-01

    The central theme of this dissertation concerns the development of techniques for analyzing and comparing performances of elite sportsmen. When performances are delivered under varying circumstances, or are influenced by other factors than the athletes' abilities, a fair comparison, for instance

  11. Enhancing importance-performance analysis

    DEFF Research Database (Denmark)

    Eskildsen, Jacob Kjær; Kristensen, Kai

    2006-01-01

    Purpose: The interpretation of the importance/performance map is based on an assumption of independence between importance and performance, but many studies question the validity of this assumption. The aim of this research is to develop a new typology for job satisfaction attributes as well as a new importance/performance map that can be an aid for organizations when they prioritize their improvement actions based on a job satisfaction study. Design/methodology/approach: A typology for possible relationships between importance and performance in job satisfaction studies is developed based on theoretical considerations. This typology is then applied and validated on approximately 10,000 responses from the European Employee Index 2002. Ultimately a new importance/performance map for priority setting in job satisfaction studies is developed based on the new typology for possible relationships...

  12. Proposal of performance indicators/model for Operational Readiness Verification (ORV) at restart after a planned shutdown

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Nygren, Magnus

    2005-12-01

The objectives of the study reported here were to propose a model that can be used in the analysis of possible future ORV-related events and to outline a set of performance indicators that can be used by the inspectorate to assess a utility's level of readiness if an ORV-event should take place. Together the two objectives serve to improve the inspectorate's ability to ensure that the utilities maintain an adequate capability to respond. The background for the current study is the nine ORV events that occurred in Sweden between 1995 and 1998, as well as the findings of a previous project studying safety during outage and restart of nuclear power plants. This study found that the three levels or types of tests that occur in ORV were used according to need rather than according to a predefined arrangement or procedure, and that tasks were adapted relative to the different types of embedding and the degree of correspondence between nominal and actual ORV. The organisation's coping with the complexity of ORV was discussed in terms of the relation between expectations and surprises, the use of planning as control, attention to detail, and the practices of shift changes. It is a truism that accidents are analysed and interpreted relative to a commonly accepted understanding of their nature. This understanding is, however, relative rather than absolute, and has changed significantly during the last decade. In the 1990s, accidents were analysed step by step, and explanations and recommendations therefore emphasised specific rather than generic solutions. The present study illustrates this by going through the responses to the nine ORV events. Following that, the nine events are analysed anew using a contemporary understanding of accidents (a systemic model), which emphasises that incidents more often arise from context-induced performance variability than from failures of people. The alternative interpretation provided by a systemic model is illustrated by a detailed analysis of

  13. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized, based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
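The steady-state nodal energy balance behind a thermal network model can be sketched as follows. The two-node network, conductances, and heat load are invented for illustration, and a simple Gauss-Seidel sweep replaces the paper's quasi-Newton solver, which suffices here because this toy network is linear.

```python
# Minimal thermal network method (TNM) sketch: steady-state energy
# balance at each free node, sum_j G_ij*(T_j - T_i) + Q_i = 0, solved
# by Gauss-Seidel iteration.  All values are invented.

def solve_network(G, Q, T_fixed, n_nodes, tol=1e-9, max_iter=10000):
    """G: {(i, j): conductance [W/K]}, Q: {i: heat load [W]},
    T_fixed: {i: boundary temperature [K]}."""
    T = {i: 300.0 for i in range(n_nodes)}
    T.update(T_fixed)
    links = {}
    for (i, j), g in G.items():
        links.setdefault(i, []).append((j, g))
        links.setdefault(j, []).append((i, g))
    for _ in range(max_iter):
        delta = 0.0
        for i in range(n_nodes):
            if i in T_fixed:
                continue
            num = Q.get(i, 0.0) + sum(g * T[j] for j, g in links[i])
            den = sum(g for _, g in links[i])
            new = num / den
            delta = max(delta, abs(new - T[i]))
            T[i] = new
        if delta < tol:
            break
    return T

# Node 0: motor housing dissipating 10 W; node 1: gearbox;
# node 2: mounting flange held at 280 K.
G = {(0, 1): 0.5, (1, 2): 0.8}
T = solve_network(G, {0: 10.0}, {2: 280.0}, 3)
print(T)
```

With these numbers the balance gives T1 = 292.5 K and T0 = 312.5 K; a real TNM adds radiation terms, which make the system nonlinear and motivate the quasi-Newton solve.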

  14. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  15. A case study of the crack sizing performance of the Ultrasonic Phased Array combined crack and wall loss inspection tool on the Centennial pipeline, including the defect evaluation, field feature verification and tool performance validation (performed by Marathon Oil, DNV and GE Oil and Gas)

    Energy Technology Data Exchange (ETDEWEB)

    Hrncir, T.; Turner, S. [Marathon Pipe Line LLC, Findley, OH (United States); Polasik, SJ [DNV Columbus, Inc, Dublin, OH 43017 (United States); Vieth, P. [BP EandP, Houston, TX (United States); Allen, D.; Lachtchouk, I.; Senf, P.; Foreman, G. [GE Oil and Gas PII Pipeline Solutions, Stutensee (Germany)], email: geoff.foreman@ge.com

    2010-07-01

The Centennial Pipeline System is operated by Marathon Pipe Line LLC. It is 754 miles long and carries liquid products from eastern Texas to southern Illinois. Most of it was constructed in 1951 for natural gas, but it was converted in 2001 for liquid product service. GE Oil and Gas conducted an ultrasonic phased array in-line inspection (ILI) survey of this pipeline, whose primary purpose was to detect and characterize stress corrosion cracking. A dig verification was performed in 2008 to increase the level of confidence in the detection and depth-sizing capabilities of this inspection method. This paper outlines the USCD technology and experience and describes how the ILI survey results were validated, how the ILI data analysis was improved, and the impact on managing the integrity of the line section. Results indicate that the phased array technology predicted depth within a tolerance of 1 mm with approximately 90% certainty at a 95% confidence level.
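The dig-verification statistic described above (the fraction of tool-predicted depths within a ±1 mm tolerance of field measurements, with a confidence bound) can be sketched like this. The depth pairs are invented, and the one-sided Wilson score bound is a generic choice for illustration, not necessarily the statistic used in the study.

```python
# Sketch of ILI depth-sizing validation: count predictions within
# +/-1 mm of field (in-ditch) measurements and compute a one-sided
# 95% lower confidence bound (Wilson score interval, z = 1.645).
# All depth values [mm] are invented.

import math

def within_tolerance_bound(predicted, measured, tol_mm=1.0, z=1.645):
    n = len(predicted)
    hits = sum(abs(p - m) <= tol_mm for p, m in zip(predicted, measured))
    phat = hits / n
    denom = 1 + z * z / n
    centre = phat + z * z / (2 * n)
    margin = z * math.sqrt(phat * (1 - phat) / n + z * z / (4 * n * n))
    return phat, (centre - margin) / denom

pred  = [2.1, 3.4, 1.8, 4.0, 2.9, 3.1, 2.2, 3.8, 1.5, 2.7]
field = [2.4, 3.0, 1.6, 5.2, 3.1, 3.0, 2.1, 3.5, 1.9, 2.6]
phat, lower = within_tolerance_bound(pred, field)
print(f"within tolerance: {phat:.0%}, 95% lower bound: {lower:.2f}")
```

With a realistic sample of hundreds of digs the lower bound tightens toward the observed fraction, which is how a 90%-at-95%-confidence claim is supported.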

  16. Performance trending, analysis, and reporting

    International Nuclear Information System (INIS)

    Thomas, J.A.; Forsyth, M.

    1995-01-01

Improvements in power plant operations and maintenance have been made possible through the use of improved software systems and the communications capabilities provided by distributed computer systems and networks. Staff functions have been added at several operating units to improve performance. A new class of software system is also in use at South Texas Project and DC Cook. These staff activities are performed using the new software tool support, with associated improvements in operations.

  17. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  18. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

In this paper a prototype Requirements Tracking and Verification System (RTVS) for a Distributed Control System (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  19. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

In this paper a prototype Requirements Tracking and Verification System (RTVS) for a Distributed Control System (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  20. NPP Temelin instrumentation and control system upgrade and verification

    International Nuclear Information System (INIS)

    Ubra, O.; Petrlik, J.

    1998-01-01

Two VVER 1000 units of the Czech nuclear power plant Temelin, which are under construction, are being upgraded with the latest instrumentation and control system delivered by WEC. To confirm that the functional designs of the new Reactor Control and Limitation System, Turbine Control System and Plant Control System comply with the Czech customer requirements, and that these requirements are compatible with the upgraded NPP Temelin technology, verification of the control systems has been performed. The method of transient analysis has been applied. Some details of the NPP Temelin Reactor Control and Limitation System verification are presented. (author)

  1. Thermal–structural analysis of ITER triangular support for dominant load verification

    International Nuclear Information System (INIS)

    Kim, Yu-Gyeong; Hwang, Jong-Hyun; Jung, Yung-Jin; Kim, Hyun-Soo; Ahn, Hee-Jae

    2014-01-01

Highlights: • The load combination method is introduced to thermal–structural analysis for contradictory loads occurring simultaneously. • A one-way coupling analysis was also conducted for the thermal–structural analysis, and its validity was checked by comparison with the load combination. • The dominant load for the triangular support bracket is determined to be the baking condition. - Abstract: The triangular support is located on the lower inner shell of the vacuum vessel of ITER and should be designed to withstand various loads such as nuclear heat, coolant pressure and so on. The appropriateness of its design is evaluated under the dominant load, which represents the most conservative condition among the design loads. In order to determine the dominant load, a valid method for thermal–structural analysis is first verified, considering contradictory behaviors between heat and structural loads. In this paper, two approaches, one-way coupling and load combination, are introduced for thermal–structural analysis. One-way coupling is a generally used method but is limited when applied to contradictory conditions. The load combination gives a proper solution since it evaluates each load independently and then adds up the results linearly. Based on the results of each case, a structural analysis for another load case, the baking condition with an incident, is conducted to find out which load is dominant for the triangular support. Consequently, it is found that the baking condition is the dominant load for the triangular support bracket. The proposed load combination method gives a physically reasonable solution which can be used as a reference for checking the validity of other thermal–structural analyses. It is expected that these results can be applied to the manufacturing design of the triangular support under various load conditions.
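The load-combination idea described above, evaluating each load case independently and summing the results linearly, can be sketched as below. The stress components and values are invented placeholders, and the linear superposition is valid only for linear-elastic response.

```python
# Load combination by linear superposition: each load case is analysed
# on its own and the resulting stress components are added.  The stress
# values [MPa] are invented, not ITER analysis results.

def combine_load_cases(*cases):
    """Each case is a dict of component -> stress [MPa]."""
    total = {}
    for case in cases:
        for comp, s in case.items():
            total[comp] = total.get(comp, 0.0) + s
    return total

thermal_baking   = {"sigma_membrane": 85.0, "sigma_bending": 40.0}
coolant_pressure = {"sigma_membrane": 30.0, "sigma_bending": 12.0}

combined = combine_load_cases(thermal_baking, coolant_pressure)
print(combined)  # {'sigma_membrane': 115.0, 'sigma_bending': 52.0}
```

This is the property that makes the combined result a useful cross-check against a one-way coupled analysis of the same cases.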

  2. Thermal–structural analysis of ITER triangular support for dominant load verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yu-Gyeong, E-mail: aspirany@hhi.co.kr [Hyundai Heavy Industries Co., Ltd., 1000, Bangeojinsunhwando-ro, Dong-gu, Ulsan (Korea, Republic of); Hwang, Jong-Hyun; Jung, Yung-Jin [Hyundai Heavy Industries Co., Ltd., 1000, Bangeojinsunhwando-ro, Dong-gu, Ulsan (Korea, Republic of); Kim, Hyun-Soo; Ahn, Hee-Jae [National Fusion Research Institute, 113 Gwahangno, Yuseong-gu, Daejeon-si (Korea, Republic of)

    2014-12-15

Highlights: • The load combination method is introduced to thermal–structural analysis for contradictory loads occurring simultaneously. • A one-way coupling analysis was also conducted for the thermal–structural analysis, and its validity was checked by comparison with the load combination. • The dominant load for the triangular support bracket is determined to be the baking condition. - Abstract: The triangular support is located on the lower inner shell of the vacuum vessel of ITER and should be designed to withstand various loads such as nuclear heat, coolant pressure and so on. The appropriateness of its design is evaluated under the dominant load, which represents the most conservative condition among the design loads. In order to determine the dominant load, a valid method for thermal–structural analysis is first verified, considering contradictory behaviors between heat and structural loads. In this paper, two approaches, one-way coupling and load combination, are introduced for thermal–structural analysis. One-way coupling is a generally used method but is limited when applied to contradictory conditions. The load combination gives a proper solution since it evaluates each load independently and then adds up the results linearly. Based on the results of each case, a structural analysis for another load case, the baking condition with an incident, is conducted to find out which load is dominant for the triangular support. Consequently, it is found that the baking condition is the dominant load for the triangular support bracket. The proposed load combination method gives a physically reasonable solution which can be used as a reference for checking the validity of other thermal–structural analyses. It is expected that these results can be applied to the manufacturing design of the triangular support under various load conditions.

  3. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
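The verification pattern described above, re-computing a code output by hand and comparing within a tolerance, might look like this minimal sketch. The "code output" value is a made-up stand-in, and simple H-3 radioactive decay stands in for the manual's pathway equations.

```python
# Hand-calculation verification sketch: a code result is re-derived
# independently and the relative difference is checked.  "code_result"
# is a hypothetical stand-in, NOT an actual RESRAD-BUILD output.

import math

def decay(a0_bq, half_life_y, t_y):
    """Activity after t_y years: A = A0 * exp(-ln2 * t / T_half)."""
    return a0_bq * math.exp(-math.log(2.0) * t_y / half_life_y)

code_result = 5.70e3                   # hypothetical code output [Bq] after 10 y
hand_calc = decay(1.0e4, 12.32, 10.0)  # H-3 half-life = 12.32 y

rel_diff = abs(code_result - hand_calc) / hand_calc
print(f"hand calc = {hand_calc:.3e} Bq, relative difference = {rel_diff:.2%}")
assert rel_diff < 0.01, "verification failed: >1% disagreement"
```

Repeating this per pathway and per radionuclide, as the study did in Excel, is what turns a spot check into a systematic verification.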

  4. Verification of hybrid analysis concept of soil-foundation interaction by field vibration tests. Pt. 2

    International Nuclear Information System (INIS)

    Katayama, I.; Niwa, A.; Kubo, Y.; Penzien, J.

    1987-01-01

    The paper describes the outline of the hybrid analysis code for soil-structure interaction (HASSI) and the results of numerical simulation of the responses obtained at the model 2C in both cases of the forced vibration test and the natural earthquake excitation. (orig./HP)

  5. Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)

    DEFF Research Database (Denmark)

    2014-01-01

    -aware tool chains. We brought together 46 key researchers from three continents, working on quality assurance challenges that arise from introducing variability, and some who do not work with variability, but that are experts in their respective areas in the broader domain of software analysis or testing...

  6. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  7. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  8. MONJU fuel pin performance analysis

    International Nuclear Information System (INIS)

    Kitagawa, H.; Yamanaka, T.; Hayashi, H.

    1979-01-01

Monju fuel pin has almost the same properties as other LMFBR fuel pins (i.e. Phenix, PFR, CRBR), but would be irradiated under severe conditions: maximum linear heat rate of 381 W/cm, hot spot cladding temperature of 675 °C, peak burnup of 131,000 MWd/t, peak fluence (E > 0.1 MeV) of 2.3 × 10²³ n/cm². In order to understand the in-core performance of the Monju fuel pin, its thermal and mechanical behaviour was predicted using the fast-running performance code SIMPLE. The code takes into account pellet-cladding interaction due to thermal expansion and swelling, gap conductance, structural changes of fuel pellets, fission product gas release with burnup and temperature increase, swelling and creep of fuel pellets, corrosion of cladding due to sodium flow and chemical attack by fission products, and cumulative damage of the cladding due to thermal creep.

  9. Verification of hybrid analysis concept of soil-foundation interaction by field vibration tests - Analytical phase

    International Nuclear Information System (INIS)

    Katayama, I.; Niwa, A.; Kubo, Y.; Penzien, J.

    1987-01-01

    In connection with the previous paper under the same subject, which describes the results obtained by the field vibration tests of five different models, this paper describes the outline of the hybrid analysis code of soil-structure interaction (HASSI) and the results of numerical simulation of the responses obtained at the model 2C in both cases of the forced vibration test and the natural earthquake excitation

  10. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  11. In silico analysis and verification of S100 gene expression in gastric cancer

    International Nuclear Information System (INIS)

    Liu, Ji; Li, Xue; Dong, Guang-Long; Zhang, Hong-Wei; Chen, Dong-Li; Du, Jian-Jun; Zheng, Jian-Yong; Li, Ji-Peng; Wang, Wei-Zhong

    2008-01-01

The S100 protein family comprises 22 members whose protein sequences encompass at least one EF-hand Ca²⁺-binding motif. They are involved in the regulation of a number of cellular processes such as cell cycle progression and differentiation. However, the expression status of S100 family members in gastric cancer was not yet known. Combining analysis of serial analysis of gene expression, virtual Northern blot and microarray data, the expression levels of S100 family members in normal and malignant stomach tissues were systematically investigated. The expression of S100A3 was further evaluated by quantitative RT-PCR. At least 5 S100 genes were found to be upregulated in gastric cancer by in silico analysis. Among them, four genes, including S100A2, S100A4, S100A7 and S100A10, were previously reported to be overexpressed in gastric cancer. The expression of S100A3 in eighty patients with gastric cancer was further examined. The results showed that the mean expression level of S100A3 in gastric cancer tissues was 2.5 times as high as in adjacent non-tumorous tissues. S100A3 expression was correlated with tumor differentiation and TNM (Tumor-Node-Metastasis) stage of gastric cancer, and was relatively highly expressed in poorly differentiated and advanced gastric cancer tissues (P < 0.05). To our knowledge this is the first report of systematic evaluation of S100 gene expression in gastric cancer by multiple in silico analyses. The results indicated that overexpression of S100 gene family members was characteristic of gastric cancers and that S100A3 might play important roles in the differentiation and progression of gastric cancer

  12. PERFORMANCE ANALYSIS OF DISTINCT SECURED AUTHENTICATION PROTOCOLS USED IN THE RESOURCE CONSTRAINED PLATFORM

    Directory of Open Access Journals (Sweden)

    S. Prasanna

    2014-03-01

Full Text Available Most e-commerce and m-commerce applications in the current e-business world have adopted asymmetric key cryptography in their authentication protocols to provide efficient authentication of the involved parties. This paper presents a performance analysis of distinct authentication protocols implementing public key cryptosystems such as RSA, ECC and HECC. The comparison is based on the key generation, signature generation and signature verification processes. The results show that the performance achieved by the HECC-based authentication protocol is better than that of the ECC- and RSA-based authentication protocols.
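A benchmark harness in the spirit of this comparison is sketched below. An insecure textbook RSA with a tiny fixed key stands in for the real RSA/ECC/HECC primitives, so only the timing structure (private-key use in signing vs. public-key use in verification) is illustrated, not realistic performance figures.

```python
# Timing harness sketch for sign/verify comparison.  The toy RSA key
# below is NOT secure and NOT the paper's implementation; it only
# illustrates how per-operation timings are gathered.

import time, hashlib

# Toy RSA key (insecure, illustration only)
p, q = 1000003, 1000033
n = p * q
e = 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # modular inverse (Python 3.8+)

def sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)      # textbook RSA signature: h^d mod n

def verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

def bench(fn, *args, reps=1000):
    t0 = time.perf_counter()
    for _ in range(reps):
        fn(*args)
    return (time.perf_counter() - t0) / reps

msg = b"transaction payload"
sig = sign(msg)
assert verify(msg, sig)
print(f"sign:   {bench(sign, msg)*1e6:.1f} us/op")
print(f"verify: {bench(verify, msg, sig)*1e6:.1f} us/op")
```

With real key sizes RSA verification (small exponent e) is fast while signing is slow, which is one of the asymmetries such comparisons against ECC and HECC quantify.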

  13. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
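As a sketch of the kind of time-series goodness-of-fit scoring such a tool performs, the following computes two generic metrics, RMSE and Nash-Sutcliffe efficiency, on invented data. These are metrics commonly used to evaluate hydrological models, not necessarily MPESA's exact metric set.

```python
# Generic time-series model performance metrics on invented data.

import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nash_sutcliffe(obs, sim):
    """NSE: 1 is a perfect fit; <= 0 means no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

observed  = [2.0, 3.5, 5.0, 4.0, 3.0]
simulated = [2.2, 3.3, 4.6, 4.1, 3.2]
print(f"RMSE = {rmse(observed, simulated):.3f}")
print(f"NSE  = {nash_sutcliffe(observed, simulated):.3f}")
```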

  14. Verification of a neutronic code for transient analysis in reactors with Hex-z geometry

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Pintor, S.; Verdu, G. [Departamento de Ingenieria Quimica Y Nuclear, Universitat Politecnica de Valencia, Cami de Vera, 14, 46022. Valencia (Spain); Ginestar, D. [Departamento de Matematica Aplicada, Universitat Politecnica de Valencia, Cami de Vera, 14, 46022. Valencia (Spain)

    2012-07-01

Due to the geometry of the fuel bundles, simulating reactors such as VVER reactors requires methods that can deal with hexagonal prisms as the basic elements of the spatial discretization. The main features of a code based on a high order finite element method for the spatial discretization of the neutron diffusion equation and an implicit difference method for the time discretization of this equation are presented, and the performance of the code is tested by solving the first exercise of the AER transient benchmark. The obtained results are compared with the reference results of the benchmark and with the results provided by the PARCS code. (authors)

  15. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
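The Cone of Influence reduction mentioned above can be sketched as a reachability computation over a variable dependency graph: starting from the variables the property mentions, keep only what they transitively depend on, since nothing else can affect the verdict. The toy dependency relation below is invented, not taken from the CERN PLC program.

```python
# Minimal Cone of Influence (COI) reduction sketch over an invented
# variable dependency graph.

def cone_of_influence(deps, property_vars):
    """deps: dict var -> set of vars its next-state function reads.
    Returns the set of variables that may influence the property."""
    keep, frontier = set(), set(property_vars)
    while frontier:
        v = frontier.pop()
        if v in keep:
            continue
        keep.add(v)
        frontier |= deps.get(v, set()) - keep
    return keep

deps = {
    "alarm":   {"sensor", "mode"},
    "sensor":  {"input"},
    "mode":    {"mode"},
    "display": {"alarm", "clock"},   # display/clock do not feed the property
    "clock":   set(),
}
print(sorted(cone_of_influence(deps, {"alarm"})))
```

Dropping `display` and `clock` from a model checking run of a property over `alarm` is exactly the kind of state-space shrinkage that makes temporal-logic verification of large PLC programs tractable.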

  16. Fault Tree Analysis for Safety/Security Verification in Aviation Software

    Directory of Open Access Journals (Sweden)

    Andrew J. Kornecki

    2013-01-01

Full Text Available The Next Generation Air Traffic Management system (NextGen) is a blueprint of the future National Airspace System. Supporting NextGen is a nation-wide Aviation Simulation Network (ASN), which allows integration of a variety of real-time simulations to facilitate development and validation of the NextGen software by simulating a wide range of operational scenarios. The ASN system is an environment including both simulated and human-in-the-loop real-life components (pilots and air traffic controllers). Real Time Distributed Simulation (RTDS), developed at Embry Riddle Aeronautical University, a suite of applications providing low and medium fidelity en-route simulation capabilities, is one of the simulations contributing to the ASN. To support the interconnectivity with the ASN, we designed and implemented a dedicated gateway acting as an intermediary, providing logic for two-way communication and message transfer between RTDS and the ASN, and storage for the exchanged data. It has been necessary to develop and analyze safety/security requirements for the gateway software based on analysis of system assets, hazards, threats and attacks related to the ultimate real-life future implementation. Due to the nature of the system, the focus was placed on communication security and the related safety of the impacted aircraft in the simulation scenario. To support development of safety/security requirements, a well-established fault tree analysis technique was used. This fault tree model-based analysis, supported by a commercial tool, was a foundation to propose mitigations assuring the gateway system safety and security.
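Quantitative fault tree evaluation of the kind used here can be sketched as a bottom-up traversal assuming independent basic events: AND gates multiply probabilities, OR gates combine as one minus the product of complements. The gateway-related event names and probabilities below are invented.

```python
# Toy quantitative fault tree evaluation (independent basic events).
# Event names and probabilities are invented for illustration.

def evaluate(node, basic):
    if isinstance(node, str):                      # leaf: basic event
        return basic[node]
    gate, children = node[0], node[1:]
    probs = [evaluate(c, basic) for c in children]
    if gate == "AND":
        out = 1.0
        for p in probs:
            out *= p
        return out
    if gate == "OR":
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(f"unknown gate {gate!r}")

basic = {"spoofed_msg": 1e-3, "auth_bypass": 1e-2, "link_loss": 5e-3}
# Top event: unsafe simulation state if (message spoofed AND
# authentication bypassed) OR the link is silently lost.
tree = ("OR", ("AND", "spoofed_msg", "auth_bypass"), "link_loss")
print(f"P(top) = {evaluate(tree, basic):.3e}")
```

Qualitatively, the same tree structure also yields minimal cut sets, which is usually what drives the proposed mitigations.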

  17. Verification of radiation heat transfer analysis in KSTAR PFC and vacuum vessel during baking

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, S.Y. [Chungnam National University, 79 Daehak-ro, Yuseong-gu, Daejeon 34167 (Korea, Republic of); Kim, Y.J., E-mail: k43689@nfri.re.kr [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of); Kim, S.T.; Jung, N.Y.; Im, D.S.; Gong, J.D.; Lee, J.M.; Park, K.R.; Oh, Y.K. [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of)

    2016-11-01

Highlights: • A thermal network is used to analyze heat transfer from the PFC to the VV. • Three heat transfer rate equations are derived based on the thermal network. • The equations are verified using experimental data and design documents. • Most of the heat lost in the tokamak is transferred to the experimental room air. • The heat loss to the air is 101 kW of the total heat loss of 154 kW in the tokamak. - Abstract: The KSTAR PFC (Plasma Facing Component) and VV (Vacuum Vessel) did not reach their target temperatures in the bake-out phase, which are 300 °C and 110 °C, respectively. The purpose of this study is to find out why they did not reach the target temperatures. A thermal network analysis is used to investigate the radiation heat transfer from the PFC to the VV, and the thermal network is drawn up based on the actual KSTAR tokamak. The analysis model consists of three equations and is solved using the EES (Engineering Equation Solver). The heat transfer rates obtained with the analysis model are verified using the experimental data from the KSTAR bake-out phase. The analyzed radiation heat transfer rates from the PFC to the VV agree quite well with those of the experiment throughout the bake-out phase. Heat loss from the PFC to the experimental room air via the flange of the VV is also calculated and compared, and is found to be the main reason for the gap between the target temperature and the actually attained temperature of the KSTAR PFC.
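One branch of such a radiation thermal network, the net exchange between two gray, diffuse surfaces written with surface and space resistances, can be sketched as follows. The areas, emissivities and temperatures are invented round numbers for illustration, not KSTAR parameters.

```python
# Net radiative exchange between two gray, diffuse surfaces using the
# standard resistance network: two surface resistances plus one space
# resistance.  All geometry and property values are invented.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant [W/m^2 K^4]

def radiation_exchange(T1, T2, A1, A2, eps1, eps2, F12):
    """Net heat flow from surface 1 to surface 2 [W]."""
    R = (1 - eps1) / (eps1 * A1) + 1.0 / (A1 * F12) + (1 - eps2) / (eps2 * A2)
    return SIGMA * (T1**4 - T2**4) / R

# Hot PFC-like surface near 300 degC facing a cooler vessel wall
Q = radiation_exchange(T1=573.0, T2=383.0, A1=100.0, A2=120.0,
                       eps1=0.4, eps2=0.3, F12=1.0)
print(f"net radiative heat transfer ~ {Q/1000:.0f} kW")
```

Writing one such equation per radiating pair, plus conduction losses through the flanges, gives the small nonlinear system the study solves in EES.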

  18. Verification of radiation heat transfer analysis in KSTAR PFC and vacuum vessel during baking

    International Nuclear Information System (INIS)

    Yoo, S.Y.; Kim, Y.J.; Kim, S.T.; Jung, N.Y.; Im, D.S.; Gong, J.D.; Lee, J.M.; Park, K.R.; Oh, Y.K.

    2016-01-01

Highlights: • A thermal network is used to analyze heat transfer from the PFC to the VV. • Three heat transfer rate equations are derived based on the thermal network. • The equations are verified using experimental data and design documents. • Most of the heat lost in the tokamak is transferred to the experimental room air. • The heat loss to the air is 101 kW of the total heat loss of 154 kW in the tokamak. - Abstract: The KSTAR PFC (Plasma Facing Component) and VV (Vacuum Vessel) did not reach their target temperatures in the bake-out phase, which are 300 °C and 110 °C, respectively. The purpose of this study is to find out why they did not reach the target temperatures. A thermal network analysis is used to investigate the radiation heat transfer from the PFC to the VV, and the thermal network is drawn up based on the actual KSTAR tokamak. The analysis model consists of three equations and is solved using the EES (Engineering Equation Solver). The heat transfer rates obtained with the analysis model are verified using the experimental data from the KSTAR bake-out phase. The analyzed radiation heat transfer rates from the PFC to the VV agree quite well with those of the experiment throughout the bake-out phase. Heat loss from the PFC to the experimental room air via the flange of the VV is also calculated and compared, and is found to be the main reason for the gap between the target temperature and the actually attained temperature of the KSTAR PFC.

  19. Slideline verification for multilayer pressure vessel and piping analysis including tangential motion

    International Nuclear Information System (INIS)

    Van Gulick, L.A.

    1984-01-01

    Nonlinear finite element method (FEM) computer codes with slideline algorithm implementations should be useful for the analysis of prestressed multilayer pressure vessels and piping. This paper presents closed-form solutions, including the effects of tangential motion, useful for verifying slideline implementations for this purpose. The solutions describe the stresses and displacements of a long, internally pressurized elastic-plastic cylinder initially separated from an elastic outer cylinder by a uniform gap. Comparison of closed-form and FEM results evaluates the usefulness of the closed-form solution and the validity of the slideline implementation used
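
Closed-form reference solutions of this kind build on the classical Lamé thick-cylinder equations; a minimal sketch of the purely elastic part (before gap closure and yielding), with hypothetical pressure and radii:

```python
def lame_stresses(p_int, r_in, r_out, r):
    """Radial and hoop stress in a thick-walled elastic cylinder under
    internal pressure p_int (classical Lame solution)."""
    k = p_int * r_in ** 2 / (r_out ** 2 - r_in ** 2)
    sigma_r = k * (1.0 - r_out ** 2 / r ** 2)   # radial stress
    sigma_t = k * (1.0 + r_out ** 2 / r ** 2)   # hoop (tangential) stress
    return sigma_r, sigma_t

# Hypothetical inner cylinder: 100 MPa internal pressure, radii 1 m and 2 m
sr_bore, st_bore = lame_stresses(100.0, 1.0, 2.0, 1.0)  # at the bore
sr_face, _ = lame_stresses(100.0, 1.0, 2.0, 2.0)        # at the free outer face
```

The boundary conditions (radial stress equal to −p at the bore and zero at a free outer surface) are exactly the checks an FEM slideline verification would start from.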

  20. Verification of Serpent code for the fuel analysis of a PBMR

    International Nuclear Information System (INIS)

    Bastida O, G. E.; Francois L, J. L.

    2015-09-01

    In this paper the models and simulations performed with the Monte Carlo code Serpent are presented, as well as the results obtained for the different cases analyzed, in order to verify the suitability and reliability of this code for the neutronic analysis of fuel for a Pebble Bed Modular Reactor (PBMR). Comparisons were made with the results reported in an OECD/NEA report on a pebble bed high temperature reactor fueled with reactor-grade plutonium. The results show that the use of Serpent is appropriate, as they are comparable with those reported in the report. (Author)

  1. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, such as uranium dioxide thermal conductivity, and input variables, such as fuel element linear power, are accounted for through an uncertainty analysis using response surface and Monte Carlo techniques

  2. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence verifying the head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe their respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) were calculated for all verified head impacts ≥20 g. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) was logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664
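
The four-criterion validity screen described above can be sketched as a simple filter; the field names are hypothetical, not those of the GFT or X-Patch export format:

```python
def is_verified(impact):
    """All four criteria must hold for an impact to count as verified."""
    return (impact["pla_g"] >= 20.0          # (1) linear acceleration >= 20 g
            and impact["player_identified"]  # (2) player identified on the field
            and impact["in_camera_view"]     # (3) player in camera view
            and impact["mechanism_clear"])   # (4) impact mechanism clearly seen

# Hypothetical synchronized sensor/video records
impacts = [
    {"pla_g": 46.0, "player_identified": True, "in_camera_view": True, "mechanism_clear": True},
    {"pla_g": 25.0, "player_identified": True, "in_camera_view": False, "mechanism_clear": True},
    {"pla_g": 18.0, "player_identified": True, "in_camera_view": True, "mechanism_clear": True},
]
verified = [i for i in impacts if is_verified(i)]
verification_rate = len(verified) / len(impacts)
```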

  3. Analysis, Verification, and Application of Equations and Procedures for Design of Exhaust-pipe Shrouds

    Science.gov (United States)

    Ellerbrock, Herman H.; Wcislo, Chester R.; Dexter, Howard E.

    1947-01-01

    Investigations were made to develop a simplified method for designing exhaust-pipe shrouds to provide desired or maximum cooling of exhaust installations. Analysis of heat exchange and pressure drop of an adequate exhaust-pipe shroud system requires equations for predicting design temperatures and pressure drop on cooling air side of system. Present experiments derive such equations for usual straight annular exhaust-pipe shroud systems for both parallel flow and counter flow. Equations and methods presented are believed to be applicable under certain conditions to the design of shrouds for tail pipes of jet engines.

  4. Parametric Analysis and Experimental Verification of a Hybrid Vibration Energy Harvester Combining Piezoelectric and Electromagnetic Mechanisms

    Directory of Open Access Journals (Sweden)

    Zhenlong Xu

    2017-06-01

    Considering coil inductance and the spatial distribution of the magnetic field, this paper developed an approximate distributed-parameter model of a hybrid energy harvester (HEH). The analytical solutions were compared with numerical solutions. The effects of load resistances, electromechanical coupling factors, mechanical damping ratio, coil parameters and size scale on performance were investigated. A meso-scale HEH prototype was fabricated, tested and compared with a stand-alone piezoelectric energy harvester (PEH) and a stand-alone electromagnetic energy harvester (EMEH). The peak output power is 2.93% and 142.18% higher than that of the stand-alone PEH and EMEH, respectively. Moreover, its bandwidth is 1.08 and 1.227 times that of the stand-alone PEH and EMEH, respectively. The experimental results agreed well with the theoretical values. It is indicated that the linearized electromagnetic coupling coefficient is more suitable for low-level excitation acceleration. Hybrid energy harvesting contributes to widening the frequency bandwidth and improving energy conversion efficiency. However, only when the piezoelectric coupling effect is weak or medium can the HEH generate more power than a single-mechanism energy harvester. Hybrid energy harvesting can improve output power even at the microelectromechanical systems (MEMS) scale. This study presents a more effective model for the performance evaluation and structure optimization of the HEH.

  5. Radionuclide analysis and scaling factors verification for LLRW of Taipower Reactor

    International Nuclear Information System (INIS)

    King, J.-Y.; Liu, K.-T.; Chen, S.-C.; Chang, T.-M.; Pung, T.-C.; Men, L.-C.; Wang, S.-J.

    2004-01-01

    The Atomic Energy Council of the Republic of China (CAEC) final disposal policy for low-level radwaste (LLRW) was to be carried out in 1996. The Institute of Nuclear Energy Research was contracted to develop the radionuclide analysis methods and to establish the scaling factors for LLRW from Taipower reactors. The radionuclides analyzed include Co-60, Cs-137, Ce-144 and other γ nuclides, as well as the α, β and low-energy γ nuclides H-3, C-14, Fe-55, Ni-59, Ni-63, Sr-90, Nb-94, Tc-99, I-129, Pu-238, Pu-239/240, Pu-241, Am-241, Cm-242 and Cm-244. A total of 120 samples taken from 21 waste streams were analyzed, and the database was collected within 2 years. The scaling factors for the different waste streams were computed with the weighted log-mean average method. In 1993, the scaling factors for each waste stream were verified using actual station samples. (author)
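
A log-mean average of activity ratios is, in its simplest unweighted form, a geometric mean of difficult-to-measure (DTM) to key-nuclide ratios. A sketch with hypothetical Ni-63/Co-60 activities (not data from the paper):

```python
import math

def log_mean_scaling_factor(dtm_activity, key_activity):
    """Unweighted log-mean (geometric mean) of DTM-to-key-nuclide activity
    ratios, the core of a scaling-factor computation."""
    ratios = [d / k for d, k in zip(dtm_activity, key_activity)]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# Hypothetical Ni-63 (difficult to measure) and Co-60 (key nuclide) activities, Bq/g
ni63 = [4.0e2, 1.0e3, 2.5e2]
co60 = [1.0e2, 2.0e2, 1.0e2]
sf_ni63 = log_mean_scaling_factor(ni63, co60)
```

The DTM inventory of a future drum is then estimated as the measured Co-60 activity multiplied by this factor; the paper's weighted variant additionally weights each sample's contribution.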

  6. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    International Nuclear Information System (INIS)

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations

  7. Control analysis and experimental verification of a practical dc–dc boost converter

    Directory of Open Access Journals (Sweden)

    Saswati Swapna Dash

    2015-12-01

    This paper presents a detailed open-loop and closed-loop analysis of a boost dc–dc converter for both voltage-mode and current-mode control. Here the boost dc–dc converter is a practical converter considering all possible parasitic elements, such as ESR and on-state voltage drops. Open-loop control, closed-loop current-mode control and closed-loop voltage-mode control are verified. A comparative study of all control techniques is presented. The PI compensator for closed-loop current-mode control is designed using classical techniques such as the root locus and Bode diagram methods. The simulation results are validated against the experimental results of voltage-mode control for both open-loop and closed-loop operation.
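
The effect of a parasitic element such as inductor series resistance on the boost converter's dc gain can be sketched with the standard lossy-boost result M(D) = (1/D') · 1/(1 + R_ESR/(D'²·R)); the component values below are illustrative, not the paper's prototype:

```python
def boost_dc_gain(duty, r_load, r_esr=0.0):
    """DC voltage gain Vout/Vin of a boost converter whose inductor has
    series resistance r_esr (standard lossy-boost steady-state result)."""
    d_prime = 1.0 - duty
    return (1.0 / d_prime) / (1.0 + r_esr / (d_prime ** 2 * r_load))

gain_ideal = boost_dc_gain(0.5, r_load=10.0)              # lossless: 1/(1-D) = 2
gain_lossy = boost_dc_gain(0.5, r_load=10.0, r_esr=0.25)  # ESR lowers the gain
```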

  8. Verification and sensitivity analysis on the elastic stiffness of the leaf type holddown spring assembly

    International Nuclear Information System (INIS)

    Song, Kee Nam

    1998-01-01

    The elastic stiffness formula of the leaf-type holddown spring (HDS) assembly is verified by comparing the calculated elastic stiffness values with the characteristic test results of HDS specimens. The comparisons show that the derived formula reliably estimates the elastic stiffness of the leaf-type HDS assembly. The elastic stiffness sensitivity of the assembly is then analyzed using the formula and its gradient vectors obtained from the mid-point formula. As a result of the sensitivity analysis, the elastic stiffness sensitivity with respect to each design variable is quantified, and the design variables of large sensitivity are identified. Among the design variables, leaf thickness is identified as the most sensitive design variable for the elastic stiffness of the leaf-type HDS assembly. In addition, the elastic stiffness sensitivity with respect to each design variable is correlated to the base thickness of the leaf in a power-law form. (author)
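
The mid-point (central difference) formula used for the gradient vectors can be sketched generically; the stiffness function below is a hypothetical stand-in, not the paper's HDS formula:

```python
def midpoint_sensitivity(f, x, i, h=1e-6):
    """dF/dx_i by the mid-point (central difference) formula:
    (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
    x_plus, x_minus = list(x), list(x)
    x_plus[i] += h
    x_minus[i] -= h
    return (f(x_plus) - f(x_minus)) / (2.0 * h)

# Hypothetical stand-in stiffness k(t, w) = w * t**3 (leaf thickness t, width w)
stiffness = lambda x: x[1] * x[0] ** 3
s_thickness = midpoint_sensitivity(stiffness, [2.0, 5.0], 0)  # analytic: 3*w*t**2 = 60
s_width = midpoint_sensitivity(stiffness, [2.0, 5.0], 1)      # analytic: t**3 = 8
```

Because stiffness here scales with the cube of thickness but only linearly with width, the thickness sensitivity dominates, mirroring the paper's finding that leaf thickness is the most sensitive variable.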

  9. Logic analysis and verification of n-input genetic logic circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2017-01-01

    Nature is using genetic logic circuits to regulate the fundamental processes of life. These genetic logic circuits are triggered by a combination of external signals, such as chemicals, proteins, light and temperature, to emit signals to control other gene expressions or metabolic pathways accordingly. As compared to electronic circuits, genetic circuits exhibit stochastic behavior and do not always behave as intended. Therefore, there is a growing interest in being able to analyze and verify the logical behavior of a genetic circuit model, prior to its physical implementation in a laboratory. In this paper, we present an approach to analyze and verify the Boolean logic of a genetic circuit from the data obtained through stochastic analog circuit simulations. The usefulness of this analysis is demonstrated through different case studies illustrating how our approach can be used to verify the expected behavior of a genetic circuit.
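
One common way to extract Boolean behavior from stochastic analog simulation traces is to threshold steady-state levels and compare them against the intended truth table; the thresholds and AND-gate data below are hypothetical illustrations, not the paper's algorithm:

```python
def to_logic(level, low=0.2, high=0.8):
    """Threshold a normalized steady-state analog level into a Boolean;
    levels between the two thresholds are ambiguous (None)."""
    if level >= high:
        return True
    if level <= low:
        return False
    return None

def matches_truth_table(measured_levels, expected_outputs, low=0.2, high=0.8):
    """True when every thresholded simulation output equals the intended logic."""
    return all(to_logic(m, low, high) == e
               for m, e in zip(measured_levels, expected_outputs))

# Hypothetical steady-state outputs of a 2-input AND gate for inputs 00, 01, 10, 11
and_ok = matches_truth_table([0.05, 0.12, 0.08, 0.95],
                             [False, False, False, True])
```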

  10. System Verification Through Reliability, Availability, Maintainability (RAM) Analysis & Technology Readiness Levels (TRLs)

    Energy Technology Data Exchange (ETDEWEB)

    Emmanuel Ohene Opare, Jr.; Charles V. Park

    2011-06-01

    The Next Generation Nuclear Plant (NGNP) Project, managed by the Idaho National Laboratory (INL), was authorized by the Energy Policy Act of 2005 to research, develop, design, construct, and operate a prototype fourth-generation nuclear reactor to meet the needs of the 21st century. A section in this document proposes that the NGNP will provide heat for process heat applications. As with all large projects developing and deploying new technologies, the NGNP is expected to meet high performance and availability targets relative to current state-of-the-art systems and technology. One requirement for the NGNP is to provide heat for large-scale hydrogen generation, and this process heat application is required to be at least 90% available relative to other technologies currently on the market. To reach this goal, a RAM Roadmap was developed highlighting the actions to be taken to ensure that the various milestones in system development and maturation concurrently meet the required availability targets. Integral to the RAM Roadmap was the use of a RAM analytical/simulation tool, which was used to estimate the availability of the deployed system based on the current design configuration and the maturation level of the system.
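
The availability estimate behind such a RAM analysis typically starts from per-component MTBF/MTTR figures and a series-system assumption; the numbers below are hypothetical, not NGNP data:

```python
def availability(mtbf, mttr):
    """Steady-state availability of a single component."""
    return mtbf / (mtbf + mttr)

def series_availability(components):
    """A series system is available only when every component is; its
    availability is the product of the component availabilities."""
    a = 1.0
    for mtbf, mttr in components:
        a *= availability(mtbf, mttr)
    return a

# Hypothetical (MTBF, MTTR) pairs in hours for three subsystems in series
subsystems = [(8000.0, 100.0), (5000.0, 250.0), (10000.0, 50.0)]
a_system = series_availability(subsystems)
```

Because the product is always below the weakest component's availability, a 90% system target forces each subsystem well above 90%, which is why the roadmap tracks availability alongside technology maturation.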

  11. Coordinated Control Design for the HTR-PM Plant: From Theoretic Analysis to Simulation Verification

    International Nuclear Information System (INIS)

    Dong Zhe; Huang Xiaojin

    2014-01-01

    The HTR-PM plant is a two-module nuclear power plant based on the pebble-bed modular high temperature gas-cooled reactor (MHTGR), and adopts an operation scheme of two nuclear steam supply systems (NSSSs) driving one turbine. Here, an NSSS is composed of an MHTGR, a once-through steam generator (OTSG) and the connecting pipes. Due to the coupling effect induced by the two NSSSs driving one common turbine, and the coupling between the MHTGR and OTSG introduced by the common helium flow, it is necessary to design a coordinated control for the safe, stable and efficient operation of the HTR-PM plant. In this paper, the design of the feedback loops and control algorithms of the coordinated plant control law is first given. Then, the hardware-in-the-loop (HIL) system for verifying the feasibility and performance of this control strategy is introduced. Finally, some HIL simulation results are given, which preliminarily show that this coordinated control law can be implemented practically. (author)

  12. Three-electrode self-actuating self-sensing quartz cantilever: design, analysis, and experimental verification.

    Science.gov (United States)

    Chen, C Julian; Schwarz, Alex; Wiesendanger, Roland; Horn, Oliver; Müller, Jörg

    2010-05-01

    We present a novel quartz cantilever for frequency-modulation atomic force microscopy (FM-AFM) which has three electrodes: an actuating electrode, a sensing electrode, and a ground electrode. By applying an ac signal on the actuating electrode, the cantilever is set to vibrate. If the frequency of the actuation voltage closely matches one of the characteristic frequencies of the cantilever, a sharp resonance should be observed. The vibration of the cantilever in turn generates a current on the sensing electrode. The arrangement of the electrodes is such that the cross-talk capacitance between the actuating electrode and the sensing electrode is less than 10⁻¹⁶ F, thus the direct coupling is negligible. To verify the principle, a number of samples were made. Direct measurements with a Nanosurf easyPLL controller and detector showed that for each cantilever, one or more vibrational modes can be excited and detected. Using classical theory of elasticity, it is shown that such novel cantilevers with proper dimensions can provide optimized performance and sensitivity in FM-AFM with very simple electronics.
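
The characteristic frequencies mentioned above follow from classical elasticity; a sketch of the first flexural mode of a clamped-free rectangular beam, with illustrative quartz-like dimensions and properties (not the authors' design values):

```python
import math

def cantilever_first_mode_hz(length, width, thickness, youngs_mod, density):
    """First flexural resonance of a clamped-free rectangular beam
    (Euler-Bernoulli theory, first eigenvalue lambda_1 = 1.8751)."""
    inertia = width * thickness ** 3 / 12.0  # second moment of area
    area = width * thickness                 # cross-sectional area
    lam = 1.8751
    return (lam ** 2 / (2.0 * math.pi)) * math.sqrt(
        youngs_mod * inertia / (density * area * length ** 4))

# Illustrative quartz-like cantilever: 2 mm x 0.2 mm x 0.1 mm,
# E ~ 78.7 GPa, rho ~ 2650 kg/m^3 (assumed values)
f1 = cantilever_first_mode_hz(2.0e-3, 0.2e-3, 0.1e-3,
                              youngs_mod=78.7e9, density=2650.0)
```

For these assumed dimensions the first mode lands in the tens-of-kilohertz range typical of FM-AFM sensors.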

  13. Verification and validation of one-dimensional models used in subcooled flow boiling analysis

    International Nuclear Information System (INIS)

    Braz Filho, Francisco A.; Caldeira, Alexandre D.; Borges, Eduardo M.; Sabundjian, Gaiane

    2009-01-01

    Subcooled flow boiling occurs in many industrial applications and is characterized by large heat transfer coefficients. However, this efficient heat transfer mechanism is limited by the critical heat flux, beyond which the heat transfer coefficient decreases, leading to a fast heater temperature excursion and potentially to heater melting and destruction. Subcooled flow boiling is especially important in water-cooled nuclear power reactors, where the presence of vapor bubbles in the core influences the reactor system behavior at operating and accident conditions. The main purpose of this work is to verify the subcooled flow boiling calculation models of the most important nuclear reactor thermal-hydraulic computer codes, such as RELAP5, COBRA-EN and COTHA-2tp, by comparing experimental data with results from these codes in the pressure range between 15 and 45 bar. For the pressure of 45 bar the results are in good agreement, while for the lower pressures (15 and 30 bar) the results begin to diverge. In addition, as a by-product of this analysis, a comparison among the models is also presented. (author)

  14. Verification of fire and explosion accident analysis codes (facility design and preliminary results)

    International Nuclear Information System (INIS)

    Gregory, W.S.; Nichols, B.D.; Talbott, D.V.; Smith, P.R.; Fenton, D.L.

    1985-01-01

    For several years, the US Nuclear Regulatory Commission has sponsored the development of methods for improving capabilities to analyze the effects of postulated accidents in nuclear facilities; the accidents of interest are those that could occur during nuclear materials handling. At the Los Alamos National Laboratory, this program has resulted in three computer codes: FIRAC, EXPAC, and TORAC. These codes are designed to predict the effects of fires, explosions, and tornadoes in nuclear facilities. Particular emphasis is placed on the movement of airborne radioactive material through the gaseous effluent treatment system of a nuclear installation. The design, construction, and calibration of an experimental ventilation system to verify the fire and explosion accident analysis codes are described. The facility features a large industrial heater and several aerosol smoke generators that are used to simulate fires. Both injected thermal energy and aerosol mass can be controlled using this equipment. Explosions are simulated with H₂/O₂ balloons and small explosive charges. Experimental measurements of temperature, energy, aerosol release rates, smoke concentration, and mass accumulation on HEPA filters can be made. Volumetric flow rate and differential pressures also are monitored. The initial experiments involve varying parameters such as thermal and aerosol rate and ventilation flow rate. FIRAC prediction results are presented. 10 figs

  15. An analysis of depressive symptoms in stroke survivors: verification of a moderating effect of demographic characteristics.

    Science.gov (United States)

    Park, Eun-Young; Kim, Jung-Hee

    2017-04-08

    The rehabilitation of depressed stroke patients is more difficult because poststroke depression is associated with disruption of daily activities, functioning, and quality of life. However, research on depression in stroke patients is limited. The aim of our study was to evaluate the interaction of demographic characteristics including gender, age, education level, the presence of a spouse, and income status on depressive symptoms in stroke patients and to identify groups that may need more attention with respect to depressive symptoms. We completed a secondary data analysis using data from a completed cross-sectional study of people with stroke. Depression was measured using the Center for Epidemiologic Studies Depression Scale. In this study, depressive symptoms in women living with a spouse were less severe than among those without a spouse. For those with insufficient income, depressive symptom scores were higher in the above high school group than in the below high school group, but were lower in patients who were living with a spouse than in those living without a spouse. Assessing depressive symptoms after stroke should consider the interaction of gender, economic status, education level, and the presence/absence of a spouse. These results would help in comprehensive understanding of the importance of screening for and treating depressive symptoms during rehabilitation after stroke.

  16. Kinematic analysis and experimental verification of an eccentric wheel based precision alignment mechanism for LINAC

    International Nuclear Information System (INIS)

    Mundra, G.; Jain, V.; Singh, K.K.; Saxena, P.; Khare, R.K.; Bagre, M.

    2011-01-01

    An eccentric wheel based precision alignment system was designed for the remote motorized alignment of the proposed proton injector LINAC (SFDTL). As a part of the further development of the alignment and monitoring scheme, a menu-driven alignment system is being developed. The paper describes a general kinematic equation (with base-line tilt correction) based on the various parameters of the mechanism, such as the eccentricity, the wheel diameter, the distance between the wheels and the diameter of the cylindrical accelerator component. Based on this equation, the extent of the alignment range for the four degrees of freedom is evaluated, and an analysis of the variation of some of the parameters and of the theoretical accuracy/resolution is carried out. For this purpose, a computer program was written that can compute the various points for each discrete position of the two-motor combinations. The paper also describes the experimentally evaluated values of these positions (over the full extent of the range) and the matching/comparison of the two sets of data. These data can now be used to compute the movements required for alignment by the four motors (two front and two rear motors of the support structure). (author)
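
The lift produced by rotating an eccentric wheel, and the resulting height and tilt of a component carried on front and rear wheels, can be sketched as follows. This is a simplified assumed geometry, not the paper's full kinematic equation with base-line tilt correction:

```python
import math

def wheel_lift(ecc, theta):
    """Vertical rise of a load resting on an eccentric wheel when the wheel
    is rotated by theta about its offset axle (zero lift at theta = 0)."""
    return ecc * (1.0 - math.cos(theta))

def component_pose(ecc, theta_front, theta_rear, wheel_spacing):
    """Mid-point height and tilt of a component carried by front and rear
    eccentric wheels separated by wheel_spacing."""
    h_front = wheel_lift(ecc, theta_front)
    h_rear = wheel_lift(ecc, theta_rear)
    height = 0.5 * (h_front + h_rear)
    tilt = math.atan2(h_front - h_rear, wheel_spacing)
    return height, tilt

# Rotating both wheels by 180 deg lifts the component by twice the eccentricity
height, tilt = component_pose(ecc=1.0e-3, theta_front=math.pi,
                              theta_rear=math.pi, wheel_spacing=0.5)
```

Tabulating such poses over all discrete motor positions is what the menu-driven program described above does for the four motors.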

  17. Analysis of performance for centrifugal steam compressor

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seung Hwan; Ryu, Chang Kook; Ko, Han Seo [Sungkyunkwan University, Suwon (Korea, Republic of)

    2016-12-15

    In this study, mean streamline and Computational fluid dynamics (CFD) analyses were performed to investigate the performance of a small centrifugal steam compressor using a latent heat recovery technology. The results from both analysis methods showed good agreement. The compression ratio and efficiency of steam were found to be related with those of air by comparing the compression performances of both gases. Thus, the compression performance of steam could be predicted by the compression performance of air using the developed dimensionless parameters.
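
One dimensionless-parameter mapping of the kind used to relate air tests to steam performance is matching the machine Mach number between the two gases; this is a generic similarity sketch with assumed gas properties, not necessarily the authors' parameter set:

```python
import math

def speed_for_equal_mach(n_air, t_air, gamma_air, r_air, t_gas, gamma_gas, r_gas):
    """Shaft speed at which another working gas reaches the same machine
    Mach number U / sqrt(gamma * R * T) as an air test running at n_air."""
    a_air = math.sqrt(gamma_air * r_air * t_air)  # speed of sound in air
    a_gas = math.sqrt(gamma_gas * r_gas * t_gas)  # speed of sound in the gas
    return n_air * a_gas / a_air

# Air test at 288 K vs. superheated steam (approx. gamma 1.33, R 461.5 J/kg.K) at 400 K
n_steam = speed_for_equal_mach(30000.0, 288.0, 1.4, 287.0, 400.0, 1.33, 461.5)
```

At matched machine Mach number (and flow coefficient), the pressure ratio and efficiency measured with air carry over to steam, which is the basis for predicting one gas's compression performance from the other.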

  18. Analysis of performance for centrifugal steam compressor

    International Nuclear Information System (INIS)

    Kang, Seung Hwan; Ryu, Chang Kook; Ko, Han Seo

    2016-01-01

    In this study, mean streamline and Computational fluid dynamics (CFD) analyses were performed to investigate the performance of a small centrifugal steam compressor using a latent heat recovery technology. The results from both analysis methods showed good agreement. The compression ratio and efficiency of steam were found to be related with those of air by comparing the compression performances of both gases. Thus, the compression performance of steam could be predicted by the compression performance of air using the developed dimensionless parameters

  19. Proteomic analysis and qRT-PCR verification of temperature response to Arthrospira (Spirulina) platensis.

    Directory of Open Access Journals (Sweden)

    Wang Huili

    Arthrospira (Spirulina) platensis (ASP) is a representative filamentous, non-N2-fixing cyanobacterium that has great potential to enhance the food supply and possesses several valuable physiological features. ASP tolerates high and low temperatures along with highly alkaline and salty environments, and can strongly resist oxidation and irradiation. Based on genomic sequencing of ASP, we compared the protein expression profiles of this organism under different temperature conditions (15 °C, 35 °C and 45 °C) using 2-DE and peptide mass fingerprinting techniques. A total of 122 proteins having a significant differential expression response to temperature were retrieved. Of the positively expressed proteins, homologies of 116 ASP proteins were found in Arthrospira (81 proteins in Arthrospira platensis str. Paraca and 35 in Arthrospira maxima CS-328). The other 6 proteins have high homology with other microorganisms. We classified the 122 differentially expressed positive proteins into 14 functions using the COG database, and characterized their respective KEGG metabolism pathways. The results demonstrated that these differentially expressed proteins are mainly involved in post-translational modification (protein turnover, chaperones), energy metabolism (photosynthesis, respiratory electron transport), translation (ribosomal structure and biogenesis) and carbohydrate transport and metabolism. Other proteins were related to amino acid transport and metabolism, cell envelope biogenesis, coenzyme metabolism and signal transduction mechanisms. The results implied that these proteins can perform predictable roles in rendering ASP resistant to low and high temperatures. Subsequently, we determined the transcription level of 38 genes in vivo in response to temperature and identified them by qRT-PCR. We found that the 26 differentially expressed proteins, representing 68.4% of the total target genes, maintained consistency between transcription and
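
qRT-PCR transcription levels such as these are conventionally quantified with the Livak 2^-ΔΔCt method; the Ct values and reference gene below are hypothetical illustrations, not data from the paper:

```python
def fold_change_ddct(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt (Livak) method:
    ddCt = (Ct_target - Ct_ref)_test - (Ct_target - Ct_ref)_control."""
    dd_ct = (ct_target_test - ct_ref_test) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values: target gene at 45 C (test) vs. 35 C (control),
# with a 16S rRNA reference gene assumed stable at both temperatures
fc = fold_change_ddct(22.0, 15.0, 24.0, 15.0)  # ddCt = -2, i.e. 4-fold up-regulation
```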

  20. New approach to accuracy verification of 3D surface models: An analysis of point cloud coordinates.

    Science.gov (United States)

    Lee, Wan-Sun; Park, Jong-Kyoung; Kim, Ji-Hwan; Kim, Hae-Young; Kim, Woong-Chul; Yu, Chin-Ho

    2016-04-01

    The precision of two types of surface digitization devices, i.e., a contact probe scanner and an optical scanner, and the trueness of two types of stone replicas, i.e., one without an imaging powder (SR/NP) and one with an imaging powder (SR/P), were evaluated using a computer-aided analysis. A master die was fabricated from stainless steel. Ten impressions were taken, and ten stone replicas were prepared from Type IV stone (Fujirock EP, GC, Leuven, Belgium). The precision of the two types of scanners was analyzed using the root mean square (RMS), measurement error (ME), and limits of agreement (LoA) at each coordinate. The trueness of the stone replicas was evaluated using the total deviation. A Student's t-test was applied to compare the discrepancies between the CAD reference models of the master die (m-CRM) and the point clouds for the two types of stone replicas (α = .05). The RMS values for the precision were 1.58, 1.28, and 0.98 μm along the x-, y-, and z-axes for the contact probe scanner and 1.97, 1.32, and 1.33 μm along the x-, y-, and z-axes for the optical scanner, respectively. A comparison with the m-CRM revealed a trueness of 7.10 μm for SR/NP and 8.65 μm for SR/P. The precision at each coordinate (x-, y-, and z-axes) was revealed to be higher than that assessed by the previous method (overall offset differences). A comparison between the m-CRM and 3D surface models of the stone replicas revealed a greater dimensional change in SR/P than in SR/NP. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
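
The per-axis precision figures above are RMS values of signed coordinate deviations; a minimal sketch with hypothetical x-axis data (not the study's measurements):

```python
import math

def per_axis_rms(deviations):
    """Root mean square of signed coordinate deviations along one axis."""
    return math.sqrt(sum(d * d for d in deviations) / len(deviations))

# Hypothetical x-axis deviations (micrometres) between repeated scans
dx_um = [1.2, -0.8, 1.6, -1.9, 0.4]
rms_x = per_axis_rms(dx_um)
```

Computing the RMS separately per axis, as here, is what distinguishes this approach from the previous overall-offset method the abstract refers to.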

  1. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  2. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and verification methodologies for products generated throughout the software life cycle are then discussed

  3. Dynamic model based novel findings in power systems analysis and frequency measurement verification

    Science.gov (United States)

    Kook, Kyung Soo

    power system engineering and, for doing this, new models and better applications of simulation should be proposed. Conducting extensive simulation studies, this dissertation verified that the actual X/R ratio of bulk power systems is much lower than what has been known as its typical value, showed the effectiveness of ESS control in mitigating the intermittency of wind power from the perspective of the power grid using a newly proposed simulation model of an ESS connected to wind power, and found many characteristics of wide-area frequency wave propagation. Also, the possibility of using the simulated responses of the power system to replace measured data could be confirmed, which is very promising for the future application of simulation to the on-line analysis of power systems based on FNET measurements.

  4. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system, which non-nuclear-weapon states parties to the NPT are obliged to accept, are described. Verification activities and problems in Iraq and North Korea are discussed.

  5. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system, which non-nuclear-weapon states parties to the NPT are obliged to accept, are described. Verification activities and problems in Iraq and North Korea are discussed

  6. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  7. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    for the review and any actions that were taken when these items were missing are documented in Section 5 of this report. The availability and use of user experience were limited to extensive experience in performing RESRAD-BUILD calculations by the verification project manager and by participation in the RESRAD-BUILD workshop offered by the code developers on May 11, 2001. The level of a posteriori verification that was implemented is defined in Sections 2 through 4 of this report. In general, a rigorous verification review plan addresses program requirements, design, coding, documentation, test coverage, and evaluation of test results. The scope of the RESRAD-BUILD verification is to focus primarily on program requirements, documentation, testing and evaluation. Detailed program design and source code review would be warranted only in those cases when the evaluation of test results and user experience revealed possible problems in these areas. The verification tasks were conducted in three parts and were applied to version 3.1 of the RESRAD-BUILD code and the final version of the user's manual, issued in November 2001 (Yu et al. 2001). These parts include the verification of the deterministic models used in RESRAD-BUILD (Section 2), the verification of the uncertainty analysis model included in RESRAD-BUILD (Section 3), and recommendations for improvement of the RESRAD-BUILD user interface, including evaluations of the user's manual, code design, and calculation methodology (Section 4). Any verification issues that were identified were promptly communicated to the RESRAD-BUILD development team, in particular those that arose from the database and parameter verification tasks. This allowed the developers to start implementing necessary database or coding changes well before this final report was issued.

  8. Modelling, Verification, and Comparative Performance Analysis of the B.A.T.M.A.N. Protocol

    NARCIS (Netherlands)

    Chaudhary, Kaylash; Fehnker, Ansgar; Mehta, Vinay; Hermanns, Holger; Höfner, Peter

    2017-01-01

    This paper considers a network routing protocol known as Better Approach to Mobile Ad hoc Networks (B.A.T.M.A.N.). The protocol serves two aims: first, to discover all bidirectional links, and second, to identify the best-next-hop for every other node in the network. A key element is that each
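As a rough illustration only (not the verified model from the paper), B.A.T.M.A.N.'s best-next-hop selection can be sketched as counting, per originator, how many recent originator messages (OGMs) arrived via each neighbour inside a sliding window. The class name, window size, and tie-breaking below are simplifying assumptions:

```python
from collections import defaultdict, deque

class OriginatorTable:
    """Minimal sketch of best-next-hop selection in the spirit of
    B.A.T.M.A.N.: for each originator, count how many of its recent
    OGMs were received via each neighbour, and route toward the
    neighbour with the highest count."""

    def __init__(self, window=8):
        self.window = window
        # originator -> neighbour -> recent sequence numbers
        self.seen = defaultdict(lambda: defaultdict(deque))

    def receive_ogm(self, originator, neighbour, seqno):
        q = self.seen[originator][neighbour]
        q.append(seqno)
        while len(q) > self.window:  # keep only the sliding window
            q.popleft()

    def best_next_hop(self, originator):
        counts = self.seen[originator]
        return max(counts, key=lambda n: len(counts[n])) if counts else None

# Example: 6 recent OGMs from destination D arrived via neighbour n1,
# only 3 via n2, so n1 is selected as the best next hop toward D.
tbl = OriginatorTable()
for s in range(6):
    tbl.receive_ogm("D", "n1", s)
for s in range(3):
    tbl.receive_ogm("D", "n2", s)
```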

  9. Verification, Performance Analysis and Controller Synthesis for Real-Time Systems

    DEFF Research Database (Denmark)

    Fahrenberg, Uli; Larsen, Kim Guldstrand; Thrane, Claus Rørbæk

    2009-01-01

    This note aims at providing a concise and precise Travellers Guide, Phrase Book or Reference Manual to the timed automata modeling formalism introduced by Alur and Dill [7, 8]. The note gives comprehensive definitions of timed automata, priced (or weighted) timed automata, and timed games...

  10. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process, and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management rather than hardware problems. Some general conclusions are then presented as the final results of the whole work.

  11. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  12. Review of the technical basis and verification of current analysis methods used to predict seismic response of spent fuel storage racks

    International Nuclear Information System (INIS)

    DeGrassi, G.

    1992-10-01

    This report presents the results of a literature review on spent fuel rack seismic analysis methods and modeling procedures. The analysis of the current generation of free standing high density spent fuel racks requires careful consideration of complex phenomena such as rigid body sliding and tilting motions; impacts between adjacent racks, between fuel assemblies and racks, and between racks and pool walls and floor; fluid coupling and frictional effects. The complexity of the potential seismic response of these systems raises questions regarding the levels of uncertainty and ranges of validity of the analytical results. BNL has undertaken a program to investigate and assess the strengths and weaknesses of current fuel rack seismic analysis methods. The first phase of this program involved a review of technical literature to identify the extent of experimental and analytical verification of the analysis methods and assumptions. Numerous papers describing analysis methods for free standing fuel racks were reviewed. However, the extent of experimental verification of these methods was found to be limited. Based on the information obtained from the literature review, the report provides an assessment of the significance of the issues of concern and makes recommendations for additional studies

  13. The Innovative Design and Prototype Verification of Wheelchair with One Degree of Freedom to Perform Lifting and Standing Functions

    Science.gov (United States)

    Hsieh, Long-Chang; Chen, Tzu-Hsia

    2017-12-01

    Traditionally, the mechanism of a wheelchair with lifting and standing functions has 2 degrees of freedom and uses 2 power sources to perform these 2 motion functions. The purpose of this paper is to invent a new wheelchair with 1 degree of freedom to perform these 2 motion functions. Hence, we can use only 1 power source to drive the mechanism to achieve the lifting and standing motions. The new design has the advantages of simple operation, more stability, and more safety. For a traditional standing wheelchair, its centre of gravity moves forward when standing up, and it needs 2 auxiliary wheels to prevent tipping. In this paper, by using the checklist method of Osborn, a wheelchair with 1 DOF is invented to perform the lifting and standing functions. The centre of gravity of this new wheelchair after standing up is still located between the front and rear wheels, so no auxiliary wheels are needed. Finally, the prototype is manufactured to verify the theoretical results.

  14. Prediction and experimental verification of performance of box type solar cooker. Part II: Cooking vessel with depressed lid

    International Nuclear Information System (INIS)

    Reddy, Avala Raji; Rao, A.V. Narasimha

    2008-01-01

    Our previous article (Part I) discussed the theoretical and experimental study of the performance boost obtained by a cooking vessel with central cylindrical cavity on lugs when compared to that of a conventional cylindrical vessel on floor/lugs. This article compares the performance of the cooking vessel with depressed lid on lugs with that of the conventional vessel on lugs. A mathematical model is presented to understand the heat flow process to the cooking vessel and, thereby, to the food material. It is found from the experiments that the cooking vessel with depressed lid results in higher temperature of the thermic fluid loaded in the cooking vessel compared to that of the thermic fluid kept in the conventional vessel when both are placed on lugs. Similar results were obtained by modeling the process mathematically. The average improvement of performance of the vessel with depressed lid is found to be 8.4% better than the conventional cylindrical vessel
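The record describes a mathematical model of heat flow to the cooking vessel but does not reproduce its equations. As a generic illustration only, a one-node lumped-capacitance energy balance of the form m·c·dT/dt = η·A·G − UA·(T − T_amb) captures the kind of model involved; all symbols and parameter values below are assumptions for illustration, not the authors' equations:

```python
def fluid_temperature(t_init, t_amb, G, area, eta, UA, m_c, dt, steps):
    """Explicit-Euler integration of a one-node energy balance for the
    thermic fluid loaded in the vessel:
        m*c*dT/dt = eta*A*G - UA*(T - t_amb)
    eta lumps the optical gains of the double glazing and vessel
    geometry; UA lumps all thermal losses. Illustrative values only."""
    T = t_init
    for _ in range(steps):
        dT = (eta * area * G - UA * (T - t_amb)) / m_c
        T += dT * dt
    return T

# Illustrative run: 0.25 m^2 aperture, 800 W/m^2 insolation, one hour
# of heating with dt = 10 s time steps (360 steps).
T_end = fluid_temperature(t_init=30.0, t_amb=30.0, G=800.0, area=0.25,
                          eta=0.4, UA=1.2, m_c=4200.0, dt=10.0, steps=360)
```

Comparing two vessel geometries in such a model amounts to changing eta and UA; the vessel with the higher effective gain-to-loss ratio reaches the higher fluid temperature, which is what the experiments report.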

  15. Verification and Analysis of Implementing Virtual Electric Devices in Circuit Simulation of Pulsed DC Electrical Devices in the NI MULTISIM 10.1 Environment

    Directory of Open Access Journals (Sweden)

    V. A. Solov'ev

    2015-01-01

    Full Text Available The paper presents the analysis results of the implementation potential and an evaluation of the reliability of virtual electric devices when conducting circuit simulation of pulsed DC electrical devices in the NI Multisim 10.1 environment. It analyses the metrological properties of the electric measuring devices and sensors of the NI Multisim 10.1 environment. Mathematical expressions have been defined to calculate reliable parameters of periodic non-sinusoidal electrical values based on their physical feasibility. To verify the virtual electric devices, a circuit model of the power section of a buck DC converter, with the devices under consideration enabled at its input and output, is used as a consumer of pulsed current of trapezoidal or triangular form. It serves as an example to show a technique for verifying the readings of virtual electric measuring devices in the NI Multisim 10.1 environment. It is found that when simulating pulsed DC electric devices, it is advisable to use the probe to measure the average and RMS values of supply voltage and consumed current. The electric device power consumption read from the virtual power meter is equal to its average value, and the displayed power factor is inversely proportional to the input current form factor. To determine the RMS pulsed DC current with an ammeter or multimeter, it is necessary to measure the current with these devices in DC and AC modes, and then determine the RMS value from the measurement results. Verification of the virtual electric devices has proved the possibility of their application to determine the energy performance of transistor converters for various purposes in circuit simulation in the NI Multisim 10.1 environment, thus saving design time.
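The measurement procedure described above (separate DC-mode and AC-mode readings combined into one RMS value) follows from the identity I_rms² = I_dc² + I_ac,rms² for a current consisting of a mean component plus ripple. A minimal sketch, with illustrative values:

```python
import math

def total_rms(i_dc, i_ac_rms):
    """Combine a DC-mode reading (mean value) with an AC-mode reading
    (RMS of the ripple alone) into the true RMS of the pulsed current:
    I_rms = sqrt(I_dc^2 + I_ac_rms^2)."""
    return math.sqrt(i_dc ** 2 + i_ac_rms ** 2)

def form_factor(i_rms, i_avg):
    """Form factor of the waveform: RMS value over average value."""
    return i_rms / i_avg

# Example: a pulsed current with 2.0 A average and 1.5 A of ripple RMS
i_rms = total_rms(2.0, 1.5)   # sqrt(4 + 2.25) = 2.5 A
kf = form_factor(i_rms, 2.0)  # 1.25
```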

  16. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  17. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  18. International Performance Measurement and Verification Protocol: Concepts and Options for Determining Energy and Water Savings, Volume I (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.
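The protocol's savings determinations rest on a simple accounting identity: savings are computed, not measured directly, as the difference between baseline-period and reporting-period use, with adjustments that bring both periods to a common set of operating conditions. A schematic sketch (the function name and example figures are illustrative, not taken from the protocol text):

```python
def avoided_energy_use(baseline_use, reporting_use,
                       routine_adj=0.0, non_routine_adj=0.0):
    """Savings = (baseline-period use - reporting-period use)
                 +/- adjustments.
    Routine adjustments cover expected variations (e.g. weather
    normalization); non-routine adjustments cover one-off changes
    in facility use. Units are whatever the meter reports."""
    return (baseline_use - reporting_use) + routine_adj + non_routine_adj

# Example: 120 MWh baseline, 95 MWh after the retrofit, plus a
# 3 MWh routine adjustment for a milder reporting-period winter.
savings = avoided_energy_use(120.0, 95.0, routine_adj=3.0)  # 28.0 MWh
```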

  19. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  20. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  1. Development of a FBR fuel bundle-duct interaction analysis code-BAMBOO. Analysis model and verification by Phenix high burn-up fuel subassemblies

    International Nuclear Information System (INIS)

    Uwaba, Tomoyuki; Ito, Masahiro; Ukai, Shigeharu

    2005-01-01

    The bundle-duct interaction analysis code ''BAMBOO'' has been developed for the purpose of predicting deformation of a wire-wrapped fuel pin bundle of a fast breeder reactor (FBR). The BAMBOO code calculates helical bowing and oval-distortion of all the fuel pins in a fuel subassembly. We developed deformation models in order to precisely analyze the irradiation induced deformation by the code: a model to analyze fuel pin self-bowing induced by circumferential gradient of void swelling as well as thermal expansion, and a model to analyze dispersion of the orderly arrangement of a fuel pin bundle. We made deformation analyses of high burn-up fuel subassemblies in Phenix reactor and compared the calculated results with the post irradiation examination data of these subassemblies for the verification of these models. From the comparison we confirmed that the calculated values of the oval-distortion and bowing reasonably agreed with the PIE results if these models were used in the analysis of the code. (author)

  2. Prediction and experimental verification of performance of box type solar cooker - Part I. Cooking vessel with central cylindrical cavity

    International Nuclear Information System (INIS)

    Reddy, Avala Raji; Rao, A.V. Narasimha

    2007-01-01

    The performance of conventional box type solar cookers can be improved by better designs of cooking vessels with proper understanding of the heat flow to the material to be cooked. An attempt has been made in this article to arrive at a mathematical model to understand the heat flow process to the cooking vessel and thereby to the food material. The mathematical model considers a double glazed hot box type solar cooker loaded with two different types of vessels, kept either on the floor of the cooker or on lugs. The performance of the cooking vessel with a central cylindrical cavity is compared with that of a conventional cylindrical cooking vessel. It is found from the experiments and modeling that the cooking vessel with a central cylindrical cavity on lugs results in a higher temperature of the thermic fluid than that of a conventional vessel on the floor or on lugs. The average improvement of performance of the vessel with a central cylindrical cavity kept on lugs is found to be 5.9% and 2.4% more than that of a conventional cylindrical vessel on the floor and on lugs, respectively

  3. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D. [Partnership for Home Innovation, Upper Marlboro, MD (United States); Wiehagen, J. [Partnership for Home Innovation, Upper Marlboro, MD (United States)

    2014-07-01

    Winchester/Camberley Homes collaborated with the Building America team Partnership for Home Innovation to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in the mixed humid climate zone and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long term durability and repeatable successful implementation with few call-backs, the project team demonstrated through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the HVAC systems to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs, evaluate material selections and installation requirements, changes to work scopes and contractor learning curves, as well as to compare theoretical performance characteristics with measured results.

  4. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D.; Wiehagen, J.

    2014-07-01

    Winchester/Camberley Homes with the Building America program and its NAHB Research Center Industry Partnership collaborated to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in the mixed humid climate zone four and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long term durability and repeatable successful implementation with few call-backs, this report demonstrates through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the heating, cooling, air distribution, and ventilation systems intended to optimize the equipment size and configuration to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs, evaluate material selections and installation requirements, changes to work scopes and contractor learning curves, as well as to compare theoretical performance characteristics with measured results.

  5. Mathematical Verification for Transmission Performance of Centralized Lightwave WDM-RoF-PON with Quintuple Services Integrated in Each Wavelength Channel

    Directory of Open Access Journals (Sweden)

    Shuai Chen

    2015-01-01

    Full Text Available Wavelength-division-multiplexing passive-optical-network (WDM-PON) has been recognized as a promising solution for the "last mile" access as well as multibroadband data services access for end users, and WDM-RoF-PON, which employs the radio-over-fiber (RoF) technique in WDM-PON, is an even more attractive approach for future broadband fiber and wireless access, given its strong capability for centralized multiservices transmission operation and its transparency to bandwidth and signal modulation formats. As for multiservices development in WDM-RoF-PON, various system designs have been reported and verified via simulation or experiment, and the scheme with multiple services transmitted in each single wavelength channel is believed to be the one with the highest bandwidth efficiency; however, the corresponding mathematical verification is still hard to find in state-of-the-art literature. In this paper, the system design and data transmission performance of a quintuple-services-integrated WDM-RoF-PON, which jointly employs carrier multiplexing and orthogonal modulation techniques, have been theoretically analyzed and verified in detail; moreover, the system design has been duplicated and verified experimentally, and the theoretical framework of such a WDM-RoF-PON scheme has thus been established.

  6. Results of the independent radiological verification survey of the remedial action performed at 525 S. Main Street, Oxford, Ohio, (OXO002)

    International Nuclear Information System (INIS)

    Kleinhans, K.R.; Rice, D.E.; Murray, M.E.; Carrier, R.F.

    1996-04-01

    Between October 1952 and February 1957, National Lead of Ohio (NLO), a primary contractor for the Atomic Energy Commission (AEC), subcontracted certain uranium machining operations to Alba Craft Laboratory, Incorporated, located at 10-14 West Rose Avenue, Oxford, Ohio. In 1992, personnel from Oak Ridge National Laboratory (ORNL) confirmed the presence of residual radioactive materials from the AEC-related operations in and around the facility in amounts exceeding the applicable Department of Energy (DOE) guidelines. Above-guideline radiation levels were also found both indoors and outdoors at 525 S. Main Street, a private residential property in the immediate vicinity of the Alba Craft site. This document reports the findings at this private residence. Although the amount of uranium found on the properties posed little health hazard if left undisturbed, the levels were sufficient to require remediation to bring radiological conditions into compliance with current guidelines, thus ensuring that the public and the environment are protected. A team from ORNL conducted a radiological verification survey of the property at 525 S. Main Street, between November 1993 and December 1994. The survey was conducted at the request of DOE and included directly measured radiation levels, the collection and analysis of soil samples to determine concentrations of uranium and certain other radionuclides, and comparison of these data to the guidelines

  7. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  8. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  9. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2015-01-01

    Full Text Available Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on correlation of present-to-previous limb ECG leads: I (rI), II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory with healthy people, and there was not a significant decrease with nonhealthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
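The two scenarios described above, verification (one-to-one, thresholded correlation) and identification (one-to-many, maximal correlation), can be sketched as follows. The threshold value and the synthetic templates in the example are illustrative, not taken from the paper:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two single-lead ECG templates."""
    return float(np.corrcoef(a, b)[0, 1])

def verify(probe, enrolled, threshold=0.9):
    """One-to-one scenario: accept the claimed identity if the present
    recording correlates with the enrolled one above a threshold."""
    return pearson(probe, enrolled) >= threshold

def identify(probe, database):
    """One-to-many scenario: return the ID whose stored template has
    the maximal correlation with the probe recording."""
    return max(database, key=lambda pid: pearson(probe, database[pid]))

# Synthetic stand-ins for ECG templates: a rescaled copy of a waveform
# keeps correlation ~1, while an orthogonal waveform correlates ~0.
t = np.linspace(0.0, 1.0, 200)
alice = np.sin(2 * np.pi * 5 * t)
bob = np.cos(2 * np.pi * 5 * t)
probe = 1.1 * alice + 0.05  # new acquisition from "alice"
```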

  10. Structured Performance Analysis for Component Based Systems

    OpenAIRE

    Salmi , N.; Moreaux , Patrice; Ioualalen , M.

    2012-01-01

    International audience; The Component Based System (CBS) paradigm is now largely used to design software systems. In addition, performance and behavioural analysis remains a required step for the design and the construction of efficient systems. This is especially the case of CBS, which involve interconnected components running concurrent processes. This paper proposes a compositional method for modeling and structured performance analysis of CBS. Modeling is based on Stochastic Well-formed...

  11. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.
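Paramedir's actual configuration language is not shown in this record; as a generic illustration of the idea of programmable metric calculation, a derived metric can be expressed as an arithmetic formula evaluated over named trace counters (the function and counter names are hypothetical):

```python
def evaluate_metric(expr, counters):
    """Evaluate a user-programmable performance metric written as an
    arithmetic expression over named counters -- a hypothetical
    stand-in for a tool-specific metric definition. Builtins are
    disabled so only counter names and arithmetic are available."""
    return eval(expr, {"__builtins__": {}}, counters)

# Derived metrics computed from raw per-thread trace counters:
counters = {"instructions": 8.0e9, "cycles": 4.0e9, "l2_misses": 2.0e7}
ipc = evaluate_metric("instructions / cycles", counters)
mpki = evaluate_metric("l2_misses / (instructions / 1e3)", counters)
```

Encoding the analyst's formulas as data rather than code is what lets an expert's intuition be packaged once and reused by novice users, which is the point the abstract makes.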

  12. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V ) process for safety software of Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of DRPS V and V process are preparation of software planning documentation, verification of Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated system test plan, software safety analysis, and software configuration management. Also, SDS V and V of RPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of software integration and system integration phase. Software safety analysis at SRS phase uses Hazard Operability (HAZOP) method, at SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at implementation phase it uses FTA. Finally, software configuration management is performed using Nu-SCM (Nuclear Software Configuration Management) tool developed by KNICS project. 
Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V
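
The software safety analysis described above quantifies fault trees at the design and implementation phases. As a minimal illustration of how an FTA result is evaluated, the sketch below combines basic-event probabilities through AND/OR gates under the usual independence assumption; the event names and probabilities are invented for illustration, not taken from the KNICS project.

```python
# Minimal fault-tree evaluation: an AND gate multiplies basic-event
# probabilities, an OR gate combines them as 1 - prod(1 - p). Both
# assume statistically independent basic events.

def and_gate(*probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: trip failure if (sensor fault AND voter fault)
# OR software fault. All numbers are illustrative.
p_top = or_gate(and_gate(1e-3, 1e-2), 1e-5)
print(p_top)  # roughly 2e-5 for these inputs
```

In a real analysis the same gate logic is applied to a much larger tree, and minimal cut sets rather than direct gate evaluation are typically reported.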

  13. Testing and Performance Verification of a High Bypass Ratio Turbofan Rotor in an Internal Flow Component Test Facility

    Science.gov (United States)

    VanZante, Dale E.; Podboy, Gary G.; Miller, Christopher J.; Thorp, Scott A.

    2009-01-01

    A 1/5 scale model rotor representative of a current technology, high bypass ratio, turbofan engine was installed and tested in the W8 single-stage, high-speed, compressor test facility at NASA Glenn Research Center (GRC). The same fan rotor was tested previously in the GRC 9x15 Low Speed Wind Tunnel as a fan module consisting of the rotor and outlet guide vanes mounted in a flight-like nacelle. The W8 test verified that the aerodynamic performance and detailed flow field of the rotor as installed in W8 were representative of the wind tunnel fan module installation. Modifications to W8 were necessary to ensure that this internal flow facility would have a flow field at the test package that is representative of flow conditions in the wind tunnel installation. Inlet flow conditioning was designed and installed in W8 to lower the fan face turbulence intensity to less than 1.0 percent in order to better match the wind tunnel operating environment. Also, inlet bleed was added to thin the casing boundary layer to be more representative of a flight nacelle boundary layer. On the 100 percent speed operating line the fan pressure rise and mass flow rate agreed with the wind tunnel data to within 1 percent. Detailed hot film surveys of the inlet flow, inlet boundary layer and fan exit flow were compared to results from the wind tunnel. The effect of inlet casing boundary layer thickness on fan performance was quantified. Challenges and lessons learned from testing this high flow, low static pressure rise fan in an internal flow facility are discussed.

  14. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors with 57 items affecting construction performance. The findings further reveal the items constituting the ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multi-dimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important that a multi-dimensional performance evaluation framework include all key factors affecting the construction performance of a company, so that management can plan and implement a performance development plan that matches the mission and vision of the company.
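
A core step in EFA of this kind is deciding how many factors to retain; one common rule (the Kaiser criterion) keeps factors whose eigenvalue of the item correlation matrix exceeds 1. The sketch below finds the leading eigenvalue by power iteration; the 3x3 correlation matrix is illustrative, not the study's survey data.

```python
# Factor-retention step of an exploratory factor analysis, sketched
# with pure Python: compute the leading eigenvalue of a (symmetric)
# correlation matrix and apply the Kaiser criterion (eigenvalue > 1).

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

def power_iteration(m, steps=200):
    """Leading eigenvalue of a symmetric non-negative matrix."""
    v = [1.0] * len(m)
    lam = 0.0
    for _ in range(steps):
        w = mat_vec(m, v)
        lam = max(abs(x) for x in w)     # inf-norm normalisation
        v = [x / lam for x in w]
    return lam

# Illustrative correlations: items 1 and 2 load on one factor,
# item 3 is nearly independent of them.
corr = [[1.0, 0.8, 0.1],
        [0.8, 1.0, 0.1],
        [0.1, 0.1, 1.0]]
lead = power_iteration(corr)
print(lead > 1.0)  # the leading factor passes the Kaiser criterion
```

A full EFA would extract all eigenvalues, rotate the loadings, and interpret the retained factors against the questionnaire items.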

  15. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with 'sensors' collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better-performing ALICE analysis.
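
One basic use of such job "sensor" data is separating I/O-bound from CPU-bound workloads by their CPU efficiency (CPU time over wall-clock time). The sketch below shows that classification step; the record layout, threshold and sample values are assumptions, not the actual ALICE monitoring schema.

```python
# Classify analysis jobs as CPU-bound or I/O-bound from monitoring
# records. A low cpu_time/wall_time ratio usually means the job spent
# most of its wall-clock time waiting on (remote) data access.

def cpu_efficiency(cpu_time_s, wall_time_s):
    return cpu_time_s / wall_time_s if wall_time_s > 0 else 0.0

def classify(job, threshold=0.5):
    eff = cpu_efficiency(job["cpu_s"], job["wall_s"])
    return "cpu-bound" if eff >= threshold else "io-bound"

jobs = [
    {"id": 1, "cpu_s": 3500.0, "wall_s": 3600.0},  # tight analysis loop
    {"id": 2, "cpu_s": 400.0, "wall_s": 3600.0},   # waiting on remote data
]
for job in jobs:
    print(job["id"], classify(job))
```

In production, aggregating this ratio per site or per dataset is what points at bottlenecks such as slow storage or inefficient merging.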

  16. Overcoming urban GPS navigation challenges through the use of MEMS inertial sensors and proper verification of navigation system performance

    Science.gov (United States)

    Vinande, Eric T.

    This research proposes several means to overcome challenges in the urban environment to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment as they simplify vehicle testing requirements.
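
The dissertation's receiver evaluation metric combines accuracy, availability, and maximum error into a single score. The sketch below is one hedged way such a combination could look; the weights, normalisation, and drive-test numbers are assumptions, not the author's actual formulation.

```python
# Combine position-error statistics and solution availability into one
# comparable score (lower is better). Weights are illustrative.

import math

def rms(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def receiver_score(errors_m, availability, w_acc=0.5, w_avail=0.3, w_max=0.2):
    """Error terms enter directly; availability enters as a penalty."""
    return (w_acc * rms(errors_m)
            + w_max * max(abs(e) for e in errors_m)
            + w_avail * (1.0 - availability) * 100.0)

# Two hypothetical urban drive tests: GPS-only vs GPS + MEMS inertial.
gps_only = receiver_score([5.0, 8.0, 40.0, 6.0], availability=0.80)
gps_mems = receiver_score([4.0, 5.0, 9.0, 4.0], availability=0.99)
print(gps_mems < gps_only)  # the integrated system scores better (lower)
```

The point of a combined metric is exactly this kind of ranking: a receiver with slightly worse average accuracy but far better availability and bounded maximum error can still win.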

  17. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with estimation of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
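
The compartmental mechanics behind such a PBPK model can be illustrated with a deliberately reduced sketch: passive diffusion between one plasma and one tissue compartment, integrated with explicit Euler. The rate constants, volumes, and initial concentration are illustrative, not the paper's fitted heart-model parameters.

```python
# Two-compartment passive-diffusion sketch of PBPK-style kinetics.
# Net flux into tissue: k_in * C_plasma - k_out * C_tissue.
# At equilibrium the net flux vanishes, so C_tissue/C_plasma -> k_in/k_out.

def simulate(c_plasma=10.0, c_tissue=0.0, k_in=0.3, k_out=0.1,
             dt=0.01, t_end=50.0):
    t = 0.0
    while t < t_end:
        flux = k_in * c_plasma - k_out * c_tissue   # net uptake
        c_plasma -= flux * dt                       # mass-conserving step
        c_tissue += flux * dt
        t += dt
    return c_plasma, c_tissue

cp, ct = simulate()
print(round(cp, 2), round(ct, 2))  # tissue/plasma ratio approaches 3
```

The real model adds metabolism, active efflux and uptake, five heart sub-compartments, and sampled inter-individual parameter variability on top of this exchange scheme.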

  18. Structural Performance Optimization and Verification of an Improved Thin-Walled Storage Tank for a Pico-Satellite

    Directory of Open Access Journals (Sweden)

    Lai Teng

    2017-11-01

    This paper presents an improved mesh storage tank structure obtained using 3D metal printing. The storage tank structure is optimized using a multi-objective uniform design method. Each parameter influencing the storage tank is considered as an optimization factor, and the compression stress (σ), volume utilization ratio (v), and weight (m) are considered as the optimization objectives. Regression equations were established between the optimization factors and targets, the order of influence of the six factors on the three target values was analyzed, and the relative deviations between the regression-equation and calculation results for σ, v, and m were 9.72%, 4.15%, and 2.94%, respectively. The optimization results showed that the regression equations can predict the structural performance of the improved storage tank and that the values of the influence factors obtained through the optimization are effective. In addition, the compression stress was improved by 24.98%, the volume utilization ratio was increased by 26.86%, and the weight was reduced by 26.83%. The optimized storage tank was fabricated through 3D metal printing; the compressive stress was improved by 58.71%, the volume utilization ratio was increased by 24.52%, and the weight was reduced by 11.67%.
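
The paper's check of its regression equations boils down to a relative-deviation comparison between predicted and computed values of σ, v and m. The sketch below reproduces that comparison scheme; the predicted/computed pairs are invented so that the deviations land near the reported percentages, they are not the study's data.

```python
# Relative deviation between a regression prediction and a reference
# calculation, expressed as a percentage of the reference value.

def relative_deviation(predicted, computed):
    return abs(predicted - computed) / abs(computed) * 100.0

# Illustrative pairs (predicted, computed) for the three objectives.
pairs = {"sigma": (109.72, 100.0), "v": (95.85, 100.0), "m": (102.94, 100.0)}
for name, (pred, calc) in pairs.items():
    print(name, round(relative_deviation(pred, calc), 2), "%")
```

A small relative deviation across all three objectives is what justifies using the regression surrogate inside the uniform-design optimization loop.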

  19. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  20. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they have been considered as the acceptance framework to be provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; the New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA), and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  1. System Reliability Analysis Considering Correlation of Performances

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Saekyeol; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of); Lim, Woochul [Mando Corporation, Seongnam (Korea, Republic of)

    2017-04-15

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.
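
The paper's central point, that system reliability P(all performances satisfied) differs from the product of marginal reliabilities when performances are correlated, can be checked with a small Monte Carlo sketch. Here the correlation is induced through a bivariate normal (a special case of a Gaussian copula); the correlation, threshold, and sample size are illustrative, not the paper's truss example.

```python
# Compare joint reliability P(G1 ok AND G2 ok) against the product of
# marginal reliabilities for two correlated performance functions.

import math
import random

random.seed(0)

def bivariate_normal(rho):
    """One correlated standard-normal pair via Box-Muller + mixing."""
    u1 = 1.0 - random.random()          # avoid log(0)
    u2 = random.random()
    z1 = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    z2 = math.sqrt(-2.0 * math.log(u1)) * math.sin(2.0 * math.pi * u2)
    return z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2

n, rho, limit = 200_000, 0.8, 1.0       # performance "fails" above limit
joint = marg1 = marg2 = 0
for _ in range(n):
    g1, g2 = bivariate_normal(rho)
    ok1, ok2 = g1 < limit, g2 < limit
    joint += ok1 and ok2
    marg1 += ok1
    marg2 += ok2

r_system = joint / n
r_independent = (marg1 / n) * (marg2 / n)
print(r_system > r_independent)  # positive correlation raises joint reliability
```

This is exactly the gap the copula-based joint PDF is meant to capture: treating the performances as independent here would understate the system reliability.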

  2. System Reliability Analysis Considering Correlation of Performances

    International Nuclear Information System (INIS)

    Kim, Saekyeol; Lee, Tae Hee; Lim, Woochul

    2017-01-01

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.

  3. Performance Analysis using Coloured Petri Nets

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    Performance is often a central issue in the design, development, and configuration of systems. It is not always enough to know that systems work properly; they must also work effectively. Numerous studies, e.g. in the areas of computer and telecommunication systems, manufacturing, military, health care, and transportation, have shown that time, money, and even lives can be saved if the performance of a system is improved. Performance analysis studies are conducted to evaluate existing or planned systems, to compare alternative configurations, or to find an optimal configuration of a system. There are three alternative techniques for analysing the performance of a system: measurement, analytical models, and simulation models. This dissertation focuses on the use of coloured Petri nets for simulation-based performance analysis of industrial-sized systems.

  4. Comprehensive analysis of transport aircraft flight performance

    Science.gov (United States)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the roles of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, and payload-range performance.
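
One of the quantities charted by such a code, specific air range (SAR), is simply the air distance flown per unit mass of fuel: SAR = V / W_f, with V the true airspeed and W_f the fuel flow. The sketch below evaluates it; the airspeed and fuel-flow values are rough illustrative figures for a large twin turbofan, not validated B-777 data.

```python
# Specific air range: air distance per kilogram of fuel burned.

def specific_air_range(tas_m_s, fuel_flow_kg_s):
    """SAR in km per kg, from true airspeed (m/s) and fuel flow (kg/s)."""
    return tas_m_s / fuel_flow_kg_s / 1000.0

sar = specific_air_range(tas_m_s=250.0, fuel_flow_kg_s=2.0)
print(round(sar, 3), "km/kg")  # 0.125 km/kg
```

Sweeping this function over altitude and weight is what produces the specific-air-range charts mentioned in the abstract, with the cruise optimum at the SAR maximum.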

  5. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  6. Design for rock grouting based on analysis of grout penetration. Verification using Aespoe HRL data and parameter analysis

    International Nuclear Information System (INIS)

    Kobayashi, Shinji; Stille, Haakan

    2007-01-01

    Grouting as a method to reduce the inflow of water into underground facilities will be important in both the construction and operation of the deep repository. SKB has been studying grouting design based on characterization of fractured rock and prediction of grout spread. However, as in other Scandinavian tunnels, stop criteria have been empirically set so that grouting is completed when the grout flow is less than a certain value at maximum pressure or the grout take is above a certain value. Since empirically based stop criteria are determined without a theoretical basis and are not related to grout penetration, the grouting result may be inadequate or uneconomical. In order to permit the choice of adequate and cost-effective grouting methods, stop criteria can be designed based on a theoretical analysis of grout penetration. The relationship between grout penetration and grouting time has been studied at the Royal Institute of Technology and Chalmers University of Technology. Based on these studies, the theory has been further developed in order to apply it to real grouting work. Another aspect is using the developed method for parameter analysis. The purpose of parameter analysis is to evaluate the influence of different grouting parameters on the result. Since the grouting strategy is composed of many different components, the selection of a grouting method is complex. Even if the theoretically most suitable grouting method is selected, it is difficult to carry out grouting exactly as planned because grouting parameters such as grout properties can easily vary during the grouting operation. In addition, knowing the parameters precisely beforehand is impossible because there are uncertainties inherent in the rock mass. Therefore, it is important to assess the effects of variations in grouting parameters. The parameter analysis can serve as a guide in choosing an effective grouting method.
The objectives of this report are to: Further develop the theory concerning
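
The penetration theory referred to here is, for a Bingham-type grout in a planar fracture, commonly summarised by a maximum penetration length of the form I_max = Δp·b / (2·τ0), with Δp the grouting overpressure, b the fracture aperture, and τ0 the grout yield stress (a relation associated with Gustafson and Stille's work). The sketch below evaluates that relation; the input values are illustrative, not Aespoe HRL data.

```python
# Maximum penetration length of a Bingham grout in a planar fracture:
# I_max = dp * b / (2 * tau0). Units: Pa, m, Pa -> metres.

def max_penetration(dp_pa, aperture_m, tau0_pa):
    return dp_pa * aperture_m / (2.0 * tau0_pa)

# Illustrative case: 1 MPa overpressure, 100 micron aperture,
# 5 Pa yield stress.
i_max = max_penetration(dp_pa=1.0e6, aperture_m=100e-6, tau0_pa=5.0)
print(round(i_max, 6), "m")  # about 10 m
```

Because penetration scales linearly with aperture and inversely with yield stress, this is also a convenient starting point for the parameter analysis the report describes: varying τ0 or b directly shows how sensitive the grouting result is.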

  7. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  8. Performance Analysis: Control of Hazardous Energy

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, Connie E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Freeman, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kerr, Christine E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  9. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
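
Criticality validation of this kind is usually summarised as the difference between calculated and reference eigenvalues, quoted in pcm (1 pcm = 1e-5 in k-eff). The sketch below shows that comparison step; the benchmark names and eigenvalues are invented for illustration, not Shift or CASL results.

```python
# Report calculated-minus-reference k-eff differences in pcm for a set
# of criticality benchmarks.

def diff_pcm(k_calc, k_ref):
    return (k_calc - k_ref) * 1.0e5

# Hypothetical (calculated, reference) eigenvalue pairs.
benchmarks = {"case-a": (1.00123, 1.00098), "case-b": (0.99871, 0.99902)}
for name, (k_calc, k_ref) in benchmarks.items():
    print(name, round(diff_pcm(k_calc, k_ref)), "pcm")
```

In practice these differences are also compared against the combined statistical and benchmark uncertainties before declaring agreement "very good".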

  10. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
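
A key practical advantage of a filter-bank representation is that matching reduces to comparing fixed-length feature vectors, typically by a distance against a threshold, rather than aligning variable-length minutiae sets. The sketch below shows that final verification step; the feature vectors and threshold are toy values, not output of an actual Gabor filter bank.

```python
# Verification with fixed-length feature vectors: accept when the
# Euclidean distance between template and probe is below a threshold.

import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(template, probe, threshold=1.0):
    return euclidean(template, probe) < threshold

enrolled = [0.20, 0.55, 0.80, 0.35]
genuine  = [0.22, 0.50, 0.78, 0.36]   # same finger, slight variation
impostor = [0.90, 0.10, 0.25, 0.70]
print(verify(enrolled, genuine), verify(enrolled, impostor))
```

The threshold trades false accepts against false rejects, and tuning it on genuine/impostor score distributions is how such a system's overall performance is reported.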

  11. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  12. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting, whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  13. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare the results with those of the MASTER code. We improved the parallel computing module and developed the error analysis module of the MCNAP code. We analyzed the mechanism of error propagation through depletion computation and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, considered the variation of material compositions caused by control rod operation, and performed depletion analysis for the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed core reactivity coefficients (MTC and FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  14. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    Science.gov (United States)

    Geiss, S; Einax, J W

    2001-07-01

    Detection limit, reporting limit and limit of quantitation are analytical parameters that describe the power of analytical methods. These parameters are used internally for quality assurance and externally when competing, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms has been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose an order on this variety of terms, this paper aims to provide a practical proposal for answering the analysts' main questions concerning the quality measures above. These main questions and the related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps needed to obtain reliable measures. A rule for practical verification is given in a table, from which the analyst can read what to measure, what to estimate, and which criteria have to be fulfilled. Verified in this manner, the parameters detection limit, reporting limit and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
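
One widely used way to estimate such limits from a calibration line (e.g. in the ICH validation guideline) is detection limit ≈ 3.3·s/slope and limit of quantitation ≈ 10·s/slope, where s is the standard deviation of the blank or of calibration residuals. The sketch below applies those factors; the calibration values are illustrative, and this is only one of the many conventions whose diversity the paper criticises.

```python
# Calibration-based estimates of detection limit (LOD) and limit of
# quantitation (LOQ), using the common 3.3 and 10 multipliers.

def detection_limit(sd_blank, slope):
    return 3.3 * sd_blank / slope

def limit_of_quantitation(sd_blank, slope):
    return 10.0 * sd_blank / slope

sd, slope = 0.02, 0.5                 # signal units; signal per ug/L
lod = detection_limit(sd, slope)
loq = limit_of_quantitation(sd, slope)
print(round(lod, 3), round(loq, 3))   # concentrations in ug/L
```

The verification step the paper proposes then checks experimentally, with spiked samples near these concentrations, that the estimated limits actually hold before they are reported.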

  15. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement.

  16. Solar diffusers in Earth observation instruments with an illumination angle of up to 70°: design and verification of performance in BRDF

    NARCIS (Netherlands)

    Gür, B.; Bol, H.; Xu, P.; Li, B.

    2015-01-01

    The present paper describes the challenging diffuser design and verification activities of TNO under contract of a customer for an earth observation instrument with observation conditions that require feasible BRDF under large angles of incidence of up to 70° with respect to the surface normal. Not

  17. Verification of in-core thermal and hydraulic analysis code FLOWNET/TRUMP for the high temperature engineering test reactor (HTTR) at JAERI

    International Nuclear Information System (INIS)

    Maruyama, Soh; Sudo, Yukio; Saito, Shinzo; Kiso, Yoshihiro; Hayakawa, Hitoshi

    1989-01-01

    The FLOWNET/TRUMP code consists of a flow network analysis code 'FLOWNET' for calculations of coolant flow distribution and coolant temperature distribution in the core with a thermal conduction analysis code 'TRUMP' for calculation of temperature distribution in solid structures. The verification of FLOWNET/TRUMP was made by the comparison of the analytical results with the results of steady state experiments by the HENDEL multichannel test rig, T1-M, which consisted of twelve simulated fuel rods heated electrically and eleven hexagonal graphite fuel blocks. The T1-M simulated the one fuel column in the core. The analytical results agreed well with the results of the experiment in which the HTTR operating conditions were simulated. (orig.)

  18. Principal Component Analysis as an Efficient Performance ...

    African Journals Online (AJOL)

This paper uses principal component analysis (PCA) to examine the possibility of using a few explanatory variables (X's) to explain the variation in Y. It applies PCA to assess the performance of students in Abia State Polytechnic, Aba, Nigeria. This was done by estimating the coefficients of eight explanatory variables in a ...
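The idea of replacing eight correlated explanatory variables with a few principal components can be sketched as follows. The data are synthetic stand-ins for the student scores, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scores of 50 students on 8 strongly correlated assessments (the X's).
base = rng.normal(size=(50, 1))
X = base + 0.3 * rng.normal(size=(50, 8))

# Standardize, then diagonalize the correlation structure via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance share of each principal component

print("explained variance ratios:", np.round(explained, 3))
# With strongly correlated X's, the first component captures most of the
# variance, so a few components can stand in for all eight variables.
```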

  19. Using Ratio Analysis to Evaluate Financial Performance.

    Science.gov (United States)

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)
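A minimal sketch of ratio analysis on institutional figures; the numbers and the particular ratios chosen (one liquidity, one operating, one leverage measure) are illustrative, not the article's worked example:

```python
# Hypothetical year-end figures for an institution (all in $000s).
balance_sheet = {"current_assets": 1200, "current_liabilities": 800,
                 "total_debt": 2500, "total_assets": 6000}
operations = {"revenues": 4000, "expenditures": 3800}

# Balance sheet ratio: short-term liquidity.
current_ratio = balance_sheet["current_assets"] / balance_sheet["current_liabilities"]
# Net operating ratio: margin of revenues over expenditures.
net_operating_margin = (operations["revenues"] - operations["expenditures"]) / operations["revenues"]
# Leverage: share of assets financed by debt.
debt_ratio = balance_sheet["total_debt"] / balance_sheet["total_assets"]

print(f"current ratio {current_ratio:.2f}, "
      f"operating margin {net_operating_margin:.1%}, "
      f"debt ratio {debt_ratio:.1%}")
```

Tracked year over year, movements in ratios like these are what flag emerging financial difficulties for long-range planning.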

  20. Performance analysis of opportunistic nonregenerative relaying

    KAUST Repository

    Tourki, Kamel; Alouini, Mohamed-Slim; Qaraqe, Khalid A.; Yang, Hongchuan

    2013-01-01

Opportunistic relaying in cooperative communication depends on careful relay selection. However, the traditional centralized method used for opportunistic amplify-and-forward protocols requires precise measurements of channel state information at the destination. In this paper, we adopt the max-min criterion as a relay selection framework for opportunistic amplify-and-forward cooperative communications, which has been used exhaustively for the decode-and-forward protocol, and offer an accurate performance analysis based on exact statistics of the local signal-to-noise ratios of the best relay. Furthermore, we evaluate the asymptotic performance and deduce the diversity order of our proposed scheme. Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over Rayleigh fading channels, and we compare max-min relay selection with its centralized channel state information-based and partial relay selection counterparts.
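The max-min selection rule itself is simple to simulate. A sketch over Rayleigh fading (which makes the instantaneous per-hop SNRs exponentially distributed), with illustrative parameters rather than the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n_relays, n_trials = 4, 100_000
mean_snr = 10.0  # average per-hop SNR (linear scale), assumed equal on both hops

# Rayleigh fading -> exponentially distributed instantaneous SNRs per hop.
snr_sr = rng.exponential(mean_snr, size=(n_trials, n_relays))  # source -> relay
snr_rd = rng.exponential(mean_snr, size=(n_trials, n_relays))  # relay -> destination

# Max-min criterion: pick the relay whose weaker hop is strongest.
bottleneck = np.minimum(snr_sr, snr_rd)
best = np.argmax(bottleneck, axis=1)
selected = bottleneck[np.arange(n_trials), best]

# Baseline: always using a fixed relay (equivalent to random selection
# when the channels are i.i.d.).
fixed_pick = bottleneck[:, 0]
print(f"mean bottleneck SNR: max-min {selected.mean():.2f} vs fixed {fixed_pick.mean():.2f}")
```

The selection gain shown here is what the paper quantifies exactly via the statistics of the best relay's local SNRs.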

  1. Probabilistic Analysis of Gas Turbine Field Performance

    Science.gov (United States)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
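The probabilistic evaluation can be sketched as a Monte Carlo sweep over uncertain cycle parameters. The distributions and the simplified Brayton-cycle relations below are assumptions for illustration, not the paper's engine model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Hypothetical uncertain performance parameters (normal distributions).
pr    = rng.normal(15.0, 0.5, n)    # compressor pressure ratio
eta_c = rng.normal(0.86, 0.01, n)   # compressor isentropic efficiency
eta_t = rng.normal(0.90, 0.01, n)   # turbine isentropic efficiency
gamma = 1.4
t_ratio = 5.0                       # turbine inlet / ambient temperature ratio

# Simple Brayton-cycle thermal efficiency with component losses
# (nondimensional temperatures).
tau = pr ** ((gamma - 1.0) / gamma)
w_net = eta_t * t_ratio * (1 - 1 / tau) - (tau - 1) / eta_c
q_in = t_ratio - 1 - (tau - 1) / eta_c
eta_th = w_net / q_in

# Empirical distribution summary: mean and a 95% interval.
lo, hi = np.percentile(eta_th, [2.5, 97.5])
print(f"thermal efficiency: mean {eta_th.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

Sensitivity factors of the kind the abstract mentions follow from correlating each sampled input with `eta_th`, identifying which uncertainty drives the spread.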

  2. Underground verification of the large deflection performance of fibre reinforced shotcrete subjected to high stresses and convergence and to dynamic loading.

    CSIR Research Space (South Africa)

    Joughin, WC

    2002-04-01

…and polypropylene fibre reinforced shotcrete compared to mesh reinforced shotcrete in tunnels subject to high stresses and convergence and, possibly, to dynamic loading. In particular: • A direct comparison of the in situ performance of mesh reinforced shotcrete with that of steel and polypropylene fibre reinforced shotcrete; • Confirmation that the performance of fibre reinforced shotcrete matches the performance of mesh reinforced shotcrete under large deformation; • A comparative basis for theoretical analysis…

  3. ANALYSIS FRAMEWORKS OF THE COLLABORATIVE INNOVATION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2014-12-01

Time management is one of the resources through which improved innovation performance can be achieved. This resource-management perspective on process efficiency (reducing the incubation time of ideas, selecting profitable innovations and turning them into added value) relates to absolute time, the time specific to human existence. In this article I try to show that high performance in inter-organizational innovation can be achieved mainly by manipulating context and knowledge rather than the arbitrary concept of "time". The article presents research results suggesting a sequential model for analyzing and evaluating performance, based on a rational and refined process of selecting performance indicators, aimed at providing the shortest and most relevant list of criteria.

  4. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.

  5. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  6. Open Source Analysis in Support to Nonproliferation Monitoring and Verification Activities: Using the New Media to Derive Unknown New Information

    International Nuclear Information System (INIS)

    Pabian, F.; Renda, G.; Jungwirth, R.; Kim, L.; Wolfart, E.; Cojazzi, G.G.M.; )

    2015-01-01

This paper will describe evolving techniques that leverage freely available open source social media venues, sometimes referred to as the "New Media," together with geospatial tools and commercial satellite imagery (with its ever improving spatial, spectral, and temporal resolutions), to expand the existing nuclear non-proliferation knowledge base by way of a review of some recent exemplar cases. The application of such techniques can enhance more general data mining, as those techniques can be more directly tailored to IAEA Safeguards monitoring and other non-proliferation verification activities to improve the possibility of the remote detection of undeclared nuclear related facilities and/or activities. As part of what might be called the new "Societal Verification" regime, these techniques have enlisted either the passive or active involvement of interested parties (NGOs, academics, and even hobbyists) using open sources and collaboration networks together with previously highlighted geospatial visualization tools and techniques. This paper will show how new significant, and unprecedented, information discoveries have already been made (and published in open source) in the last four years, i.e., since the last IAEA Safeguards Symposium. With respect to the possibility of soliciting active participation (e.g., "crowd-sourcing") via social media, one can envision scenarios (one example from open source will be provided) whereby a previously unknown nuclear related facility could be identified or located through the online posting of reports, line drawings, and/or ground photographs. Nonetheless, these techniques should not be viewed as a panacea, as examples of both deception and human error will also be provided. This paper will highlight the use of these remote-means of discovery techniques, and how they have shed entirely new light on important nuclear non-proliferation relevant issues in

  7. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  8. Building America House Performance Analysis Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Farrar-Nagy, S.; Anderson, R.; Judkoff, R.

    2001-10-29

As the Building America Program has grown to include a large and diverse cross section of the home building industry, accurate and consistent analysis techniques have become more important to help all program partners as they perform design tradeoffs and calculate energy savings for prototype houses built as part of the program. This document illustrates some of the analysis concepts proven effective and reliable for analyzing the transient energy usage of advanced energy systems as well as entire houses. The analysis procedure described here provides a starting point for calculating the energy savings of a prototype house relative to two base cases: builder standard practice and regional standard practice. It also provides building simulation analysis to calculate annual energy savings based on side-by-side short-term field testing of a prototype house.

  9. Performance analysis and prediction in triathlon.

    Science.gov (United States)

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
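A minimal sketch of deriving target split times from race data. The splits below are synthetic stand-ins; the study itself applies machine learning to a large database of actual Olympic-distance results:

```python
import numpy as np

rng = np.random.default_rng(3)
n_athletes = 200

# Synthetic Olympic-distance race splits in seconds: the five components
# the study analyses (three legs plus two transitions).
splits = np.column_stack([
    rng.normal(1150, 60, n_athletes),   # swim
    rng.normal(45, 8, n_athletes),      # transition 1
    rng.normal(3500, 90, n_athletes),   # bike
    rng.normal(35, 6, n_athletes),      # transition 2
    rng.normal(2050, 120, n_athletes),  # run
])
total = splits.sum(axis=1)
order = np.argsort(total)

# Target splits for a top-10 finish: the component times actually posted
# by the best ten athletes in this (synthetic) field.
top10 = splits[order[:10]]
targets = top10.mean(axis=0)
for name, t in zip(["swim", "T1", "bike", "T2", "run"], targets):
    print(f"{name}: target ~ {t:.0f} s")
```

Targets of this kind are what feed the design of training plans and race pacing strategies described in the abstract.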

  10. Analysis of human performance in KHNP NPPs

    International Nuclear Information System (INIS)

    Tae, Sung Eun

    2004-01-01

The most important thing in the management of a nuclear power plant is safety. One of the key factors in enhancing safety is to analyze human performance and to reflect the results in practical plant operation. KHNP NPPs have experienced human errors in the fields of operation and maintenance. These human errors need to be analyzed, and necessary corrective actions according to their causes should be taken to prevent the same or similar events. We therefore introduce the procedure of K-HPES (KHNP Human Performance Enhancement System) and the results of the analysis of HPES reports produced in 2002 and 2003.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  12. Secure Hardware Performance Analysis in Virtualized Cloud Environment

    Directory of Open Access Journals (Sweden)

    Chee-Heng Tan

    2013-01-01

The main obstacle to mass adoption of cloud computing for database operations is data security. In this paper, it is shown that IT services, particularly hardware performance evaluation in a virtual machine, can be carried out effectively without IT personnel gaining access to real data for diagnostic and remediation purposes. The proposed mechanisms use the TPC-H benchmark to achieve two objectives. First, the underlying hardware performance and consistency are supervised via a control system constructed from a combination of TPC-H queries, linear regression, and machine learning techniques. Second, linear programming techniques provide input to the algorithms that construct stress-testing scenarios in the virtual machine using combinations of TPC-H queries. These stress-testing scenarios serve two purposes. They provide boundary resource-threshold verification to the control system, so that periodic training of the synthetic data sets for performance evaluation is not constrained by hardware inadequacy, particularly when the resources in the virtual machine are scaled up or down and the utilization threshold changes. They also provide a platform for response-time verification of critical transactions, so that the expected Quality of Service (QoS) from these transactions is assured.
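The linear-regression part of such a control system can be sketched as drift detection on repeated benchmark runtimes. The data, threshold, and alert rule below are illustrative assumptions, not the paper's actual mechanism:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical daily runtimes (seconds) of a fixed TPC-H-style benchmark
# query on the same virtual machine; a steady upward drift signals degrading
# hardware performance or a mis-sized resource allocation.
days = np.arange(30)
runtime = 12.0 + 0.05 * days + rng.normal(0, 0.1, 30)

# Fit runtime vs. day; a slope above the tolerable drift triggers an alert.
slope, intercept = np.polyfit(days, runtime, 1)
threshold = 0.02  # assumed tolerable drift, seconds/day

alert = slope > threshold
print(f"drift {slope:.3f} s/day -> {'ALERT' if alert else 'ok'}")
```

Because only benchmark queries over synthetic data are measured, the supervision works without anyone touching the tenant's real data.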

  13. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the functional safety of TTDMS and to evaluate its performance. The paper makes three main contributions. First, it proposes a formalized TTDMS model whose correctness is validated using state space analysis and simulation-based verification. Second, corresponding checking queries are proposed for functional safety verification, and TTDMS performance is further evaluated by applying parameters in the formal model. Third, the reliability of a functional TTDMS prototype is estimated. The procedure can accompany system development, and both formal and simulation-based verifications are performed. Compared to executable code and purely mathematical methods, evaluating and verifying a system this way is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach to verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  14. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

This paper presents an automatic verification process focused on scheduling analysis parameters. The proposal is part of a Model Driven Engineering process that automates verification and validation of software on board satellites, and it is implemented in the software control unit of an energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.

  15. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of keff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Sood, Avnet; Forster, R. Arthur; Parsons, D. Kent

    2001-01-01

Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75-problem multigroup criticality verification analytic test set. MCNP was verified with these problems at the 10^-4 to 10^-5 statistical error level using 40 000 histories per cycle and 2000 active cycles. In all cases, the final boxed combined keff answer was given with the standard deviation and three confidence intervals that contained the analytic keff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary, thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined keff confidence intervals for these deliberately ill-posed problems did not include the analytic keff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that
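The combined keff estimate and the three confidence intervals that MCNP boxes in its output can be sketched from per-cycle estimates. The cycle data here are synthetic, and MCNP's actual statistical checks (on both keff and the fission source distribution) are far more extensive:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical per-cycle keff estimates from the active cycles of a
# well-converged Monte Carlo criticality run.
n_cycles = 2000
k_cycles = rng.normal(1.0000, 0.005, n_cycles)

k_mean = k_cycles.mean()
k_sd = k_cycles.std(ddof=1) / np.sqrt(n_cycles)   # std. dev. of the mean

# 68%, 95%, 99% confidence intervals (normal-theory factors), assuming the
# cycle estimates are independent -- cycle-to-cycle correlation in a real
# run would widen these.
for conf, z in [(68, 1.0), (95, 1.96), (99, 2.576)]:
    print(f"{conf}%: {k_mean:.5f} +/- {z * k_sd:.5f}")
```

Running too few histories per cycle or too few inactive cycles, as in the deliberately ill-posed problems above, violates the independence and convergence assumptions and makes such intervals unreliable.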

  16. Performance analysis of LMFBR control rods

    International Nuclear Information System (INIS)

    Pitner, A.L.; Birney, K.R.

    1975-01-01

Control rods in the FFTF and LMFBRs will consist of pin bundles of stainless steel-clad boron carbide pellets. In the FFTF reference design, a control rod comprises sixty-one pins of 0.474-inch diameter, each containing a 36-inch stack of 0.362-inch diameter boron carbide pellets. Reactivity control is provided by the ¹⁰B(n,α)⁷Li reaction in the boron carbide. This reaction is accompanied by an energy release of 2.8 MeV, and heating from this reaction typically approaches 100 watts/cm³ for natural boron carbide pellets in an LMFBR flux. Performance analysis of LMFBR control rods must include an assessment of the thermal performance of control pins. In addition, irradiation performance with regard to helium release, pellet swelling, and reactivity worth depletion as a function of service time must be evaluated.

  17. Analysis of performance limitations for superconducting cavities

    International Nuclear Information System (INIS)

    J. R. Delayen; L. R. Doolittle; C. E. Reece

    1998-01-01

The performance of superconducting cavities in accelerators can be limited by several factors, such as field emission, quenches, arcing, and rf power; the maximum gradient at which a cavity can operate is determined by the lowest of these limitations for that particular cavity. The CEBAF accelerator operates with over 300 cavities and, for each of them, the authors have determined the maximum operating gradient and its limiting factor. They have developed a model that allows them to determine the distribution of gradients that could be achieved for each of these limitations independently of the others. The result of this analysis can guide an R&D program to achieve the best overall performance improvement. The same model can be used to relate the performance of single-cell and multi-cell cavities.
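The census-of-limitations idea can be sketched as follows: each cavity operates at the lowest of several latent limits, and only that lowest limit (and its cause) is observed. The distributions below are hypothetical, not CEBAF data, and the paper's model handles the resulting censoring more carefully than this forward simulation does:

```python
import numpy as np

rng = np.random.default_rng(6)
n_cavities = 300

# Hypothetical latent gradient limits (MV/m) for three independent mechanisms.
limits = {
    "field emission": rng.normal(9.0, 1.5, n_cavities),
    "quench":         rng.normal(11.0, 1.0, n_cavities),
    "rf power":       rng.normal(10.0, 0.5, n_cavities),
}
names = list(limits)
table = np.column_stack([limits[k] for k in names])

# Each cavity operates at the lowest of its limits; only that limit shows up.
operating = table.min(axis=1)
limiter = table.argmin(axis=1)

print(f"fleet mean operating gradient: {operating.mean():.2f} MV/m")
for i, name in enumerate(names):
    print(f"  limited by {name}: {np.mean(limiter == i):.0%} of cavities")

# Payoff estimate for an R&D program: mean gradient if one mechanism
# (here field emission) were eliminated entirely.
without_fe = table[:, 1:].min(axis=1)
print(f"mean gradient without field emission: {without_fe.mean():.2f} MV/m")
```

Comparing such "what if this limitation were removed" gradients across mechanisms is what lets the analysis rank R&D targets by overall performance improvement.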

  18. Performance measurement in transport sector analysis

    Directory of Open Access Journals (Sweden)

    M. Išoraitė

    2004-06-01

… Alternatives analysis. Numerous courses of action could be taken to improve performance; however, due to limited resources, choices have to be made. This requires prioritizing.

  19. Insight analysis of biplane Wells turbine performance

    International Nuclear Information System (INIS)

    Shaaban, S.

    2012-01-01

Highlights: ► Downstream rotor reduces overall turbine efficiency during normal operation. ► Recirculation behind the downstream rotor significantly reduces the torque delivered by the turbine. ► Upstream rotor significantly affects downstream rotor performance even at high gap-to-chord ratios. ► Downstream rotor produces only 10–30% of the turbine power despite its feasible exergy level. ► The downstream rotor significantly delays turbine start-up. - Abstract: Wells turbines are very promising in converting wave energy. Improving the design and performance of Wells turbines requires a deep understanding of the energy conversion process and the loss mechanisms of these energy converters. The performance of a biplane Wells turbine having a 45° stagger angle between rotors is numerically investigated. The turbine performance is simulated by solving the steady 3D incompressible Reynolds-Averaged Navier–Stokes (RANS) equations. The present numerical investigation shows that the upstream rotor significantly affects the downstream rotor performance even at a high gap-to-chord ratio (G/c = 1.4). The contribution of the downstream rotor to the overall biplane Wells turbine performance is limited. The downstream rotor torque represents 10–30% of the total turbine torque, and the upstream rotor efficiency is 1.5–5 times the downstream rotor efficiency at normal operating conditions. Exergy analysis shows that the downstream rotor is the main component that reduces the turbine's second-law efficiency. The blade exergy increases from hub to tip and decreases from leading edge to trailing edge. Therefore, 3D blade profile optimization is essential for substantial improvement of the energy conversion process. Improving the design of the inter-rotor zone can significantly improve biplane Wells turbine performance. Future biplane Wells turbine designs should focus essentially on improving the downstream rotor performance.

  20. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  2. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  3. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  4. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS)

    Science.gov (United States)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond

    2012-01-01

    The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). 
Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or

  5. Performance Analysis of the Romanian Administration

    Directory of Open Access Journals (Sweden)

    Marius Constantin PROFIROIU

    2013-10-01

    Full Text Available The performance of public administration is one of the top priorities of national governments worldwide, not only for Romania. The role of a performing management system at the level of public administration is to ensure high quality and efficiency of the adopted policies and strategies, of the provided public services and of the administrative act itself, and to guarantee the advantage of a competitive and efficient administration both in relation to its own citizens and in competition with other cities and countries throughout Europe and around the world. Following these considerations, and based upon empirical research conducted with the aid of a survey regarding 'The analysis of the performance level of the Romanian public administration', the article aims to (1) identify modern management tools that determine and influence the performance of Romanian public institutions, (2) analyze the effects of using project management as an organizational capacity development instrument by public administration in Romania, and (3) determine the influence and effects of external factors on the performance and development of Romanian public administration.

  6. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  7. Performance management in healthcare: a critical analysis.

    Science.gov (United States)

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  8. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed description of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.
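As the abstract states, the BDCFs are used "for calculating annual doses for a given concentration of radionuclides in groundwater". A minimal sketch of that use follows; the BDCF and concentration values below are invented for illustration (the real parameter values live in the referenced ERMYN reports).

```python
# Hypothetical BDCF values (Sv/yr per Bq/m^3) and groundwater
# concentrations (Bq/m^3); purely illustrative numbers.
bdcf = {"I-129": 2.0e-9, "Np-237": 5.0e-8, "Tc-99": 1.0e-10}
conc = {"I-129": 3.0e2,  "Np-237": 1.0e1,  "Tc-99": 5.0e3}

# Annual dose is the concentration-weighted sum over radionuclides.
annual_dose = sum(conc[n] * bdcf[n] for n in bdcf)
print(f"annual dose = {annual_dose:.3e} Sv/yr")
```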

  9. A new approach in performing microdiffraction analysis

    International Nuclear Information System (INIS)

    Winter, D.J.; Squires, B.A.

    1995-01-01

    Microdiffraction is defined as the x-ray diffraction analysis performed on small samples or small areas of large samples. Since smallness is a relative term, microdiffraction is considered the technique of choice when samples are too small for the optics and precision of conventional instrumentation. The limit on the size of the sample is dependent upon the accuracy of the instrumentation, which is measured by such variables as the diameter of the incident beam and the sphere of confusion of the goniometer (accuracy of the circle centers). If the sample area of interest is part of a multiphase material, it is necessary for the diameter of the incident x-ray beam to be smaller than the sample area in order to assure that the diffraction pattern produced is from the sample area of interest only. Today, microdiffraction is being performed on samples as small as a few microns in diameter. Common applications for microdiffraction include composite materials such as wafers and pads used in the semiconductor industry, inclusions on laser disks and forensic studies. The analysis is often complicated by the fact that the sample areas can be a few grains or even a single crystal. Conventional powder diffractometers are very well suited for analyzing large volumes of polycrystalline material; however, they require much longer counting times when the sample volume is very small. Ideally, what is needed is the optics of a single crystal diffractometer with the performance of a conventional powder diffractometer. 6 figs

  10. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  11. Financial Performance Analysis Of Financial Service Cooperative

    Directory of Open Access Journals (Sweden)

    Eyo Asro Sasmita

    2015-08-01

    Full Text Available This research is aimed to test and identify empirical evidence regarding the effect of capital structure and loans on the financial performance of cooperatives, where the relationship between loans and financial performance is moderated by non-performing loans. The population of this research is 257 Financial Service Cooperatives (hereinafter referred to as KJK, the abbreviation for Koperasi Jasa Keuangan) of the Urban Village Community Economic Empowerment program (hereinafter referred to as PEMK, the abbreviation for Pemberdayaan Ekonomi Masyarakat Kelurahan) in Jakarta, 2011 to 2013. The sample is determined by using the purposive sampling method. The data are secondary data obtained from the Revolving Fund Management Unit (hereinafter referred to as UPDB, the abbreviation for Unit Pengelola Dana Bergulir) Jakarta. Hypotheses are tested by using multiple linear regression analysis with SPSS 20.00. The number of samples used in this research is 120. The research findings explain that (1) capital structure (SM, the abbreviation for Struktur Modal) has a positive and significant impact on financial performance (KIN, the abbreviation for Kinerja Keuangan), because the probability value of 0.000 is smaller than α = 0.05; calculation shows that if the capital structure rises by 1, assuming that the loan and non-performing loan variables remain the same, then financial performance will increase by 0.017. (2) Loans given (PIN, the abbreviation for Pinjaman) have a positive and significant impact on KIN, because the probability value of 0.001 is smaller than α = 0.05; if loans rise by 1, assuming that the capital structure and non-performing loan variables remain the same, then KIN will increase by 0.013. (3) Non-performing loans have a negative and significant effect on KIN, because the probability value of 0.000 is smaller than α = 0.05. If the PBR variable increases by 1, assuming that the loan and capital structure variables

  12. Performance Analysis of Photovoltaic Water Heating System

    Directory of Open Access Journals (Sweden)

    Tomas Matuska

    2017-01-01

    Full Text Available Performance of solar photovoltaic water heating systems with direct coupling of PV array to DC resistive heating elements has been studied and compared with solar photothermal systems. An analysis of optimum fixed load resistance for different climate conditions has been performed for simple PV heating systems. The optimum value of the fixed load resistance depends on the climate, especially on annual solar irradiation level. Use of maximum power point tracking compared to fixed optimized load resistance increases the annual yield by 20 to 35%. While total annual efficiency of the PV water heating systems in Europe ranges from 10% for PV systems without MPP tracking up to 15% for system with advanced MPP trackers, the efficiency of solar photothermal system for identical hot water load and climate conditions is more than 3 times higher.
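The fixed-load optimization described above can be sketched numerically: intersect the load line V = IR with a PV I-V curve and sweep R for maximum delivered power. The I-V model, its parameters (isc, voc, the shape exponent), and the sweep range below are illustrative assumptions, not data from the paper.

```python
# Crude illustrative PV I-V curve: near-constant current with a sharp
# knee near open-circuit voltage.  Parameters are assumptions.
def i_pv(v, isc=8.0, voc=40.0, m=20):
    return isc * (1.0 - (v / voc) ** m)

def operating_point(r, voc=40.0, tol=1e-9):
    """Intersect the load line V = I*R with the I-V curve by bisection."""
    lo, hi = 0.0, voc
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if i_pv(mid) - mid / r > 0:   # curve still above the load line
            lo = mid
        else:
            hi = mid
    return lo  # voltage at the operating point

def power(r):
    v = operating_point(r)
    return v * v / r

# Sweep candidate fixed load resistances and keep the best one.
best_r = max((0.5 + 0.1 * k for k in range(200)), key=power)
print(best_r, power(best_r))
```

For this curve the optimum sits near V_mpp / I_mpp, which is the intuition behind matching a fixed resistance to the expected maximum power point; an MPP tracker effectively re-solves this matching continuously as irradiance changes.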

  13. Performance measurement with fuzzy data envelopment analysis

    CERN Document Server

    Tavana, Madjid

    2014-01-01

    The intensity of global competition and ever-increasing economic uncertainties has led organizations to search for more efficient and effective ways to manage their business operations.  Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of the conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of the DEA and fuzzy logic concepts.
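For intuition about what DEA scores mean: in the degenerate single-input, single-output case, the crisp CCR-DEA efficiency of a unit reduces to its output/input ratio normalized by the best ratio in the set (the general multi-input/multi-output model, and its fuzzy extension, require solving linear programs). A sketch with invented unit data:

```python
# Degenerate single-input, single-output case of crisp CCR-DEA:
#   efficiency_i = (y_i / x_i) / max_j (y_j / x_j)
# Unit names and (input, output) values are invented for illustration.
units = {"U1": (10.0, 20.0), "U2": (15.0, 45.0), "U3": (8.0, 12.0)}

ratios = {k: y / x for k, (x, y) in units.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
print(efficiency)  # U2 defines the efficient frontier
```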

  14. Comparative performances analysis of neonatal ventilators.

    Science.gov (United States)

    Baldoli, Ilaria; Tognarelli, Selene; Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Cecchi, Francesca; Gentile, Marzia; Sigali, Emilio; Ghirri, Paolo; Boldrini, Antonio; Menciassi, Arianna; Laschi, Cecilia; Cuttano, Armando

    2015-02-08

    Mechanical ventilation is a therapeutic action for newborns with respiratory diseases but may have side effects. Correct equipment knowledge and training may limit human errors. We aimed to test different neonatal mechanical ventilators' performances by an acquisition module (a commercial pressure sensor plus an isolated chamber and dedicated software). The differences (ΔP) between peak pressure values and end-expiration pressure were investigated for each ventilator. We focused on discrepancies between measured and imposed pressure data. A statistical analysis was performed. We investigated the measured/imposed ΔP relation. The ΔP values do not reveal univocal trends related to ventilation setting parameters, and the data distributions were non-Gaussian. Measured ΔP represents a significant parameter in newborns' ventilation, due to the typical small volumes. The investigated ventilators showed different tendencies. Therefore, deep, specific knowledge of intensive care devices is mandatory for caregivers to correctly exploit their operating principles.

  15. Diversity Performance Analysis on Multiple HAP Networks

    Science.gov (United States)

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  16. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    Full Text Available One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
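The kind of ASER result described above can be approximated semi-analytically: draw Rician channel gains and average the conditional AWGN BPSK error rate Q(√(2·SNR·|h|²)) over them. The sketch below omits the shadowing component of the paper's channel model, and the SNR and K-factor values are assumptions.

```python
import math
import random

def bpsk_aser_rician(snr_db, k_factor, n=20000, seed=1):
    """Average BPSK error rate over (unshadowed) Rician fading,
    averaging the conditional AWGN error rate over channel draws."""
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)
    # Line-of-sight mean and scatter std-dev for unit mean power.
    mu = math.sqrt(k_factor / (k_factor + 1.0))
    sigma = math.sqrt(1.0 / (2.0 * (k_factor + 1.0)))
    total = 0.0
    for _ in range(n):
        hr = mu + rng.gauss(0.0, sigma)
        hi = rng.gauss(0.0, sigma)
        g = hr * hr + hi * hi                         # power gain |h|^2
        x = math.sqrt(2.0 * snr * g)
        total += 0.5 * math.erfc(x / math.sqrt(2.0))  # Q(x)
    return total / n

print(bpsk_aser_rician(10.0, k_factor=1.0))
print(bpsk_aser_rician(10.0, k_factor=10.0))  # stronger LOS, fewer errors
```

Raising the Rician K factor (a stronger line-of-sight component) pushes the error rate toward the pure-AWGN value, which is the qualitative trend the paper's ASER analysis quantifies.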

  17. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  18. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  19. Performance Analysis of GSM Networks in Minna

    African Journals Online (AJOL)

    eobe

    in terms of key performance indicators (KPI) based on statistical performance indicators ... in this study. Keywords: GSM Network, Drive Test, KPI and Radio Frequency Network Optimization. ... message (SMS) traffic or in scenarios where so...

  20. Z-2 Architecture Description and Requirements Verification Results

    Science.gov (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    , partial pressure relief valve, purge valve, donning stand and ISS Body Restraint Tether (BRT). Examples of manned requirements include verification of anthropometric range, suit self-don/doff, secondary suit exit method, donning stand self-ingress/egress and manned mobility covering eight functional tasks. The eight functional tasks include kneeling with object pick-up, standing toe touch, cross-body reach, walking, reach to the SIP and helmet visor. This paper will provide an overview of the Z-2 design. Z-2 requirements verification testing was performed with NASA at the ILC Houston test facility. This paper will also discuss pre-delivery manned and unmanned test results as well as analysis performed in support of requirements verification.

  1. Idaho National Laboratory Quarterly Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE AC07 051D14517.

  2. An exergy method for compressor performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, J A; Harte, S [Trinity Coll., Dublin (Ireland)

    1995-07-01

    An exergy method for compressor performance analysis is presented. The purpose of this is to identify and quantify defects in the use of a compressor's shaft power. This information can be used as the basis for compressor design improvements. The defects are attributed to friction, irreversible heat transfer, fluid throttling, and irreversible fluid mixing. They are described, on a common basis, as exergy destruction rates and their locations are identified. The method can be used with any type of positive displacement compressor. It is most readily applied where a detailed computer simulation program is available for the compressor. An analysis of an open reciprocating refrigeration compressor that used R12 refrigerant is given as an example. The results that are presented consist of graphs of the instantaneous rates of exergy destruction according to the mechanisms involved, a pie chart of the breakdown of the average shaft power wastage by mechanism, and a pie chart with a breakdown by location. (author)
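The accounting idea behind such a method is the Gouy-Stodola relation, E_dest = T0 · S_gen, applied per loss mechanism so that each defect is expressed on the common basis of destroyed exergy. A minimal sketch follows; the entropy-generation numbers and shaft power are invented for illustration (the paper derives them from a detailed compressor simulation).

```python
# Gouy-Stodola: exergy destruction rate = T0 * entropy generation rate.
T0 = 298.15            # dead-state temperature, K (assumed)
shaft_power = 1500.0   # compressor shaft input, W (assumed)

s_gen = {              # entropy generation rate by mechanism, W/K (invented)
    "friction":      0.004,
    "heat transfer": 0.007,
    "throttling":    0.003,
    "mixing":        0.001,
}

e_dest = {mech: T0 * s for mech, s in s_gen.items()}

for mech, ed in e_dest.items():
    print(f"{mech:14s} {ed:6.2f} W  ({100 * ed / shaft_power:4.2f}% of shaft power)")
print(f"{'total wasted':14s} {sum(e_dest.values()):6.2f} W")
```

The per-mechanism breakdown is exactly what the paper's pie charts present, just computed here from made-up inputs.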

  3. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Anderson, R.; Judkoff, R.; Christensen, C.; Eastment, M.; Norton, P.; Reeves, P.; Hancock, E.

    2004-06-01

    To measure progress toward multi-year Building America research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques that use test data to "calibrate" energy simulation models. This report summarizes the guidelines for reporting such analytical results using the Building America Research Benchmark (Version 3.1) in studies that also include consideration of current Regional and Builder Standard Practice. Version 3.1 of the Benchmark is generally consistent with the 1999 Home Energy Rating System (HERS) Reference Home, with additions that allow evaluation of all home energy uses.

  4. Performing data analysis using IBM SPSS

    CERN Document Server

    Meyers, Lawrence S; Guarino, A J

    2013-01-01

    This book is designed to be a user's guide for students and other interested readers to perform statistical data analysis with IBM SPSS, which is a major statistical software package used extensively in academic, government, and business settings. This book addresses the needs, level of sophistication, and interest in introductory statistical methodology on the part of undergraduate and graduate students in social and behavioral science, business, health-related, and education programs.  Each chapter covers a particular statistical procedure and has the following format: an example pr

  5. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
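The verification mechanism the algorithm incorporates can be sketched with its two classic rules on a toy 0/1 measurement matrix: a zero measurement verifies all of its (nonnegative) neighbors as zero, and a check with exactly one unverified neighbor assigns that neighbor the residual. This shows only the verification part, not the Interval-Passing refinement on top of it; the matrix and signal below are hand-built so the decoder terminates with exact recovery.

```python
# Toy verification decoder for a nonnegative sparse signal with a 0/1
# measurement matrix (rows listed by their support).  Hand-built example.
rows = [
    [0, 1], [2, 3], [3, 4], [4, 5], [1, 3], [5, 0],
]
x_true = [0, 0, 3, 0, 2, 0]                     # nonnegative sparse signal
y = [sum(x_true[j] for j in r) for r in rows]   # measurements y = A x

x_hat = {}                                      # verified entries
changed = True
while changed:
    changed = False
    for r, yr in zip(rows, y):
        unv = [j for j in r if j not in x_hat]
        residual = yr - sum(x_hat[j] for j in r if j in x_hat)
        if not unv:
            continue
        if residual == 0:          # zero rule: nonnegativity forces all to 0
            for j in unv:
                x_hat[j] = 0
            changed = True
        elif len(unv) == 1:        # single unknown: it must equal the residual
            x_hat[unv[0]] = residual
            changed = True

recovered = [x_hat.get(j) for j in range(6)]
print(recovered)
```

Note that the zero rule is only sound because the signal is nonnegative, which is exactly the setting of the abstract.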

  6. Performance Analysis of Microfinance Institutions of India

    Directory of Open Access Journals (Sweden)

    Muhammad Azhar Ikram Ahmad

    2014-12-01

    Full Text Available This is a study of microfinance institutions (MFIs) of India, analyzing their performance in both financial and non-financial terms. Performance of microfinance institutions is measured using four parameters: sustainability/profitability, outreach, operational efficiency, and financial efficiency. Data on 99 Microfinance Institutions of India were taken from the Microfinance Information Exchange for a period of 11 years. The variables of this study are expressed in both absolute and relative terms. The endogenous variables are Return on Assets and Return on Equity for sustainability, Number of Borrowers per Staff Member for operational efficiency, Cost per Borrower for financial efficiency, and Number of Active Borrowers for outreach. Panel data analysis was performed after checking the assumptions of the model. The Hausman test was applied to determine the suitability of the fixed- or random-effects model; both random and fixed effects were found suitable for application. In addition, descriptive analysis of the variables was also performed. The results show that most of the variables used in the study are significant in the outreach model; apart from rank, the financial revenue to assets ratio, portfolio at risk, deposits, and the capital to assets ratio, all other variables are significant in the sustainability (ROA) model, and the same variables are found insignificant in the ROE model except the financial expense to assets ratio; in the financial efficiency model both significant and insignificant variables are found; and in the operational efficiency model all variables are found significant.
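The fixed-effects estimation named above can be illustrated with the within (demeaning) transformation, which wipes out each entity's intercept before an ordinary least-squares slope fit. The panel below is fabricated with a known slope of 2.0 so the estimator can be checked; a real study would use a panel econometrics package rather than this hand-rolled sketch.

```python
# Toy within (demeaning) estimator for the entity fixed-effects model
#   y_it = a_i + beta * x_it + e_it,  here with a_i in {5, 9}, beta = 2.
panel = {   # entity -> list of (x, y) observations
    "mfi1": [(1.0, 7.0), (2.0, 9.0), (3.0, 11.0)],
    "mfi2": [(1.0, 11.0), (2.0, 13.0), (4.0, 17.0)],
}

xd, yd = [], []
for obs in panel.values():
    mx = sum(x for x, _ in obs) / len(obs)
    my = sum(y for _, y in obs) / len(obs)
    for x, y in obs:          # demeaning within each entity removes a_i
        xd.append(x - mx)
        yd.append(y - my)

# OLS slope on the demeaned data.
beta = sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)
print(beta)  # recovers the true slope, 2.0
```

The Hausman test mentioned in the abstract compares this fixed-effects estimate with its random-effects counterpart; a large discrepancy favors the fixed-effects specification.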

  7. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  8. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  9. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  10. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  11. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  12. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V ampersand V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes in compliance with verification and validation requirements

  13. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    Science.gov (United States)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  14. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Full Text Available Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. Because of the complexity of asynchronous communications and of handling large, dynamically concurrent processes, multimedia conferencing poses a relevant challenge for achieving sufficient correctness guarantees, and providing effective verification methods for multimedia conferencing service orchestration is an extremely difficult problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which automatically translates the BPEL-based service orchestration into a corresponding Petri net model in the Petri Net Markup Language (PNML); we also present the BPEL service net reduction rules and the correctness verification algorithms for multimedia conferencing service orchestration. We perform the correctness analysis and verification using the service net properties of safeness, reachability and deadlock freedom, and provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give a comparison and evaluation.
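
    The safeness, reachability and deadlock properties used in the verification above all reduce to questions about the reachability graph of the derived Petri net. As an illustrative sketch (not the paper's tool; the net encoding and the exploration bound are assumptions), a bounded breadth-first search can enumerate reachable markings and flag deadlocked ones:

```python
from collections import deque

def reachable_markings(transitions, m0, limit=10000):
    """Bounded BFS over the reachability graph of a place/transition net.

    transitions: list of (pre, post) pairs, each a dict place -> tokens.
    m0: initial marking as a dict place -> tokens.
    Returns (seen, deadlocks): the set of reachable markings (as sorted
    tuples) and the list of markings with no enabled transition.
    """
    def enabled(m, pre):
        return all(m.get(p, 0) >= n for p, n in pre.items())

    def fire(m, pre, post):
        m2 = dict(m)
        for p, n in pre.items():
            m2[p] -= n
        for p, n in post.items():
            m2[p] = m2.get(p, 0) + n
        return m2

    seen = {tuple(sorted(m0.items()))}
    queue = deque([m0])
    deadlocks = []
    while queue and len(seen) < limit:
        m = queue.popleft()
        fired = False
        for pre, post in transitions:
            if enabled(m, pre):
                fired = True
                m2 = fire(m, pre, post)
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
        if not fired:
            deadlocks.append(m)
    return seen, deadlocks
```

    Safeness can then be checked by asserting that every reachable marking carries at most one token per place; an unexpected deadlocked marking signals an orchestration error.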

  15. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  16. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  17. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  18. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    Wasiolek, M.

    2000-01-01

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of each exposure pathway's contribution to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain
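
    The final step described above, combining BDCFs with groundwater radionuclide concentrations to obtain all-pathway annual doses, is a linear combination over radionuclides. The sketch below illustrates it with hypothetical radionuclide names and values, not data from the report:

```python
def annual_dose(concentrations, bdcfs):
    """All-pathway annual dose: for each radionuclide, multiply its
    groundwater concentration by its biosphere dose conversion factor,
    then sum over radionuclides.

    Any consistent unit pair works (e.g. activity concentration times
    dose per unit concentration); the values here are illustrative only.
    """
    return sum(concentrations[r] * bdcfs[r] for r in concentrations)
```

    For example, with two radionuclides the dose is simply the sum of the two concentration-times-BDCF products.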

  19. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  20. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over recent years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of the Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables, and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.
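
    The three core indicators named above are, in effect, ratios of operational effort to waste collected. A minimal sketch of computing per-circuit means from monitoring records follows; the field names and the per-tonne normalisation are assumptions for illustration, not the paper's exact definitions:

```python
from statistics import mean

def circuit_indicators(records):
    """Per-circuit means of three collection performance indicators.

    records: dicts with keys 'circuit', 'distance_km', 'time_h',
    'fuel_l' and 'waste_t' (hypothetical field names), one per
    monitored collection run.
    """
    # Group monitoring records by collection circuit.
    by_circuit = {}
    for r in records:
        by_circuit.setdefault(r['circuit'], []).append(r)
    # Normalise each run by the tonnes of waste collected, then average.
    out = {}
    for circuit, rs in by_circuit.items():
        out[circuit] = {
            'eff_distance_km_per_t': mean(r['distance_km'] / r['waste_t'] for r in rs),
            'eff_time_h_per_t': mean(r['time_h'] / r['waste_t'] for r in rs),
            'eff_fuel_l_per_t': mean(r['fuel_l'] / r['waste_t'] for r in rs),
        }
    return out
```

    Comparing these per-tonne means across circuits is one way to surface the strengths and weaknesses the indicators are meant to reveal.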

  1. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate various quantities required in the backend field. For verification of the code system, analyses have been performed using nuclide compositions measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel rods of fuel assemblies irradiated in a commercial BWR in the Netherlands. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margins assumed in present criticality analyses for LWR spent fuel. (J.P.N.)

  2. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator

    Science.gov (United States)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections. The model was then used to evaluate the potential performance of these test sections under the critical temperature-load condition in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface, obtained by milling, cleaning and water-spraying the asphalt surface. This method produced excellent bonding at the interface, with shear strengths of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement mainly by considering material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and the calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single axle loads at the critical thermal condition were computed for the nine test sections based on maximum tensile stresses

  3. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    Science.gov (United States)

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  4. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  5. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through performance verification.

  6. Improvement and verification of fast reactor safety analysis techniques. Annual summary, March 1, 1975--February 29, 1976

    International Nuclear Information System (INIS)

    Jackson, J.F.; Bott, T.F.

    1976-01-01

    Analyses of the Kiwi-TNT and SNAPTRAN-2 experiments have been performed with the VENUS-II fast-reactor disassembly code. The results show that VENUS-II provides an adequate characterization of these experiments. As is the case for LMFBRs, the excursions were initially turned over by temperature feedback effects, with ultimate shutdown coming from core disassembly. The calculated fission energies agree with the experimental values to within about 50 percent for the Kiwi excursion and 10 percent for the SNAPTRAN-2 experiment. The results of the analyses are being evaluated to determine the reasons for the remaining differences. It appears that part of the difference observed in the Kiwi-TNT analysis could relate to not explicitly treating the heat-transfer from the beaded fuel (a problem not present in LMFBR calculations). Both analyses also have uncertainties associated with the new equation-of-state that had to be added to VENUS-II to allow treatment of the core materials not used in fast reactors. Finally, there are uncertainties in the temperature feedback coefficients being used. In general, the uncertainties associated with applying VENUS-II to LMFBR excursions should be even smaller than those encountered in these experimental comparisons. This is because the temperature (Doppler) coefficients and core material equations-of-state are better known, and the complications associated with heat transfer from the beaded fuel are not present

  7. Time-domain simulation and nonlinear analysis on ride performance of four-wheel vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y S; He, H; Geng, A L [School of Automobile and Traffic Engineering, Liaoning University of Technology, Jinzhou 121001 (China)], E-mail: jzwbt@163.com

    2008-02-15

    A nonlinear dynamic model with eight DOFs of a four-wheel vehicle is established in this paper. After detaching the nonlinear characteristics of the leaf springs and shock absorbers, a multi-step linearizing method is used to simulate the vehicle vibration in the time domain, under a correlated four-wheel road roughness model. Experimental verifications suggest that the newly built vehicle model and simulation procedure are reasonable and feasible for use in vehicle vibration analysis. Furthermore, some nonlinear factors of the leaf springs and shock absorbers, which affect the vehicle ride performance (or comfort), are investigated under different vehicle running speeds. Some substantial rules of the nonlinear vehicle vibrations are revealed in this paper.

  8. Time-domain simulation and nonlinear analysis on ride performance of four-wheel vehicles

    International Nuclear Information System (INIS)

    Wang, Y S; He, H; Geng, A L

    2008-01-01

    A nonlinear dynamic model with eight DOFs of a four-wheel vehicle is established in this paper. After detaching the nonlinear characteristics of the leaf springs and shock absorbers, a multi-step linearizing method is used to simulate the vehicle vibration in the time domain, under a correlated four-wheel road roughness model. Experimental verifications suggest that the newly built vehicle model and simulation procedure are reasonable and feasible for use in vehicle vibration analysis. Furthermore, some nonlinear factors of the leaf springs and shock absorbers, which affect the vehicle ride performance (or comfort), are investigated under different vehicle running speeds. Some substantial rules of the nonlinear vehicle vibrations are revealed in this paper

  9. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  10. Safety Injection Tank Performance Analysis Using CFD

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Lee, Jeong Ik; Nietiadi Yohanes Setiawan [KAIST, Daejeon (Korea, Republic of); Addad Yacine [KUSTAR, Abu Dhabi (United Arab Emirates); Bang, Young Seok; Yoo, Seung Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    This may affect the core cooling capability and threaten the fuel integrity during LOCA situations. However, information on the nitrogen flow rate during discharge is very limited due to the associated experimental measurement difficulties, and these phenomena are hardly reflected in current 1D system codes. In the current study, a CFD analysis is presented which should allow a more realistic prediction of the SIT performance that can then be reflected in 1D system codes to simulate various accident scenarios. Computational Fluid Dynamics (CFD) calculations to date have had limited success in predicting the fluid flow accurately. This study aims to find a better CFD prediction and more accurate modeling to predict the system performance during accident scenarios. The safety injection tank with fluidic device was analyzed using a commercial CFD code. A fine-resolution grid was used to capture the vortex of the fluidic device. The calculations so far have shown good consistency with the experiment; they should be complete by the conference date and will then be thoroughly analyzed for discussion. Once the detailed CFD computation is finished, a small-scale experiment will be conducted for the given conditions. Using the experimental results and the CFD model, physical models can be validated to give more reliable results. The data from CFD and experiments will provide a more accurate K-factor of the fluidic device, which can later be applied in system code inputs.

  11. Availability Performance Analysis of Thermal Power Plants

    Science.gov (United States)

    Bhangu, Navneet Singh; Singh, Rupinder; Pahuja, G. L.

    2018-03-01

    This case study presents the availability evaluation method of thermal power plants for conducting performance analysis in the Indian environment. A generic availability model has been proposed for a maintained system (thermal plants) using reliability block diagrams and fault tree analysis. The availability indices have been evaluated under a realistic working environment using the inclusion-exclusion principle. A four-year failure database has been used to compute availability for different combinations of plant capacity, that is, full working state, reduced capacity or failure state. Availability is found to be very low even at full rated capacity (440 MW), which is not acceptable, especially in the prevailing energy scenario. One of the probable reasons for this may be the difference in the age/health of existing thermal power plants, which requires special attention to each unit on a case-by-case basis. The maintenance techniques being used are conventional (50 years old) and inadequate in the context of modern equipment, which further aggravates the problem of low availability. This study highlights a procedure for finding critical plants/units/subsystems and helps in deciding a preventive maintenance program.
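
    The inclusion-exclusion principle mentioned above applies directly to reliability block diagrams: a parallel block is up when at least one component is up, so its availability is the probability of a union of events. A small sketch follows; the component availabilities are hypothetical, not plant data:

```python
from itertools import combinations

def series_availability(avails):
    # Series RBD: the system is up only if every component is up.
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel_availability(avails):
    # Parallel RBD via inclusion-exclusion on the events "component i up":
    # P(A1 u ... u An) = sum P(Ai) - sum P(Ai n Aj) + ... (independence assumed).
    total = 0.0
    for r in range(1, len(avails) + 1):
        for combo in combinations(avails, r):
            term = 1.0
            for a in combo:
                term *= a
            total += (-1) ** (r + 1) * term
    return total
```

    For independent components the result agrees with the complement form 1 - prod(1 - a_i); the inclusion-exclusion form generalises to arbitrary path-set structures.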

  12. Radio-science performance analysis software

    Science.gov (United States)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
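
    The frequency stability quantity mentioned above, the Allan deviation, is the square root of the two-sample variance of averaged fractional-frequency data. A minimal sketch (not the STBLTY implementation) for data sampled at interval tau0:

```python
import math

def allan_deviation(freq, tau0, m):
    """Standard (non-overlapping) Allan deviation at tau = m * tau0.

    freq: fractional-frequency samples taken at interval tau0.
    m: averaging factor.
    Returns (tau, adev).
    """
    # Average the data over consecutive, non-overlapping blocks of m samples.
    n_blocks = len(freq) // m
    y = [sum(freq[i * m:(i + 1) * m]) / m for i in range(n_blocks)]
    # Two-sample (Allan) variance of successive block averages.
    diffs = [(y[k + 1] - y[k]) ** 2 for k in range(len(y) - 1)]
    avar = sum(diffs) / (2 * len(diffs))
    return m * tau0, math.sqrt(avar)
```

    Sweeping m over powers of two and plotting adev against tau gives the familiar stability curve used to characterise oscillators such as a USO.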

  13. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...

  14. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  15. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  16. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 ×106 states in one verification round, limiting the forging probability to 10-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  17. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    Science.gov (United States)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized according to the international CMOR standard, using the meta-information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system, so plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests using results already produced.
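
    The generic API described in this record can be pictured as a small plugin-registration layer: tools register under one common interface, every run is logged, and a run whose configuration matches an earlier one reuses the stored result. The sketch below is hypothetical Python illustrating that pattern; the class and method names are assumptions, not the actual MiKlip system's API.

```python
import time

# Hypothetical sketch of a plugin-style evaluation system: tools are
# registered under one common interface, every run is recorded (standing
# in for the MySQL history database), and a run whose configuration
# matches an earlier one reuses the stored result.
class EvaluationSystem:
    def __init__(self):
        self.tools = {}
        self.history = []

    def register(self, name, func, defaults=None):
        self.tools[name] = (func, defaults or {})

    def run(self, name, **config):
        func, defaults = self.tools[name]
        full_config = {**defaults, **config}
        for entry in self.history:  # reuse results already produced
            if entry["tool"] == name and entry["config"] == full_config:
                return entry["result"]
        result = func(**full_config)
        self.history.append({"tool": name, "config": full_config,
                             "time": time.time(), "result": result})
        return result

# A toy "verification tool": mean bias between model and reference series.
def mean_bias(model, reference):
    return sum(m - r for m, r in zip(model, reference)) / len(model)

system = EvaluationSystem()
system.register("bias", mean_bias)
print(system.run("bias", model=[1.0, 2.0, 3.0], reference=[0.5, 2.5, 2.0]))
```

    Because each run's configuration and result are stored, a second call with an identical configuration is answered from the history rather than recomputed, which is the transparency and reproducibility behavior the abstract describes.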

  18. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

The work deals with performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Error quantification and optimization of the CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  19. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  20. Performance Analysis of Depleted Oil Reservoirs for Underground Gas Storage

    Directory of Open Access Journals (Sweden)

    Dr. C.I.C. Anyadiegwu

    2014-02-01

The performance of underground gas storage in a depleted oil reservoir was analysed with reservoir Y-19, a depleted oil reservoir in the southern region of the Niger Delta. Information on the geologic and production history of the reservoir was obtained from the available field data. Verification of the inventory was done to establish the storage capacity of the reservoir. The plot of the well flowing pressure (Pwf) against the flow rate (Q) gives the deliverability of the reservoir at various pressures. The estimated properties indicate that reservoir Y-19 is a good candidate owing to its storage capacity and its flow rate (Q) of 287.61 MMscf/d at a flowing pressure of 3900 psig.
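
    The deliverability relationship behind a plot of Pwf against Q is conventionally modeled with the back-pressure equation Q = C(Pr^2 - Pwf^2)^n. A minimal sketch of that relation follows; the coefficient C, exponent n and reservoir pressure are illustrative assumptions, not values fitted to reservoir Y-19.

```python
# Back-pressure deliverability equation for a gas storage well:
#   Q = C * (Pr**2 - Pwf**2) ** n
# C, n and the reservoir pressure Pr below are illustrative assumptions,
# not data from reservoir Y-19.

def deliverability(pr_psia, pwf_psia, c, n):
    """Gas flow rate (MMscf/d) at a given flowing bottom-hole pressure."""
    return c * (pr_psia**2 - pwf_psia**2) ** n

pr = 4500.0          # assumed average reservoir pressure, psia
c, n = 5.8e-4, 0.85  # assumed performance coefficient and exponent

# Deliverability shrinks as the flowing pressure approaches Pr.
for pwf in (3900.0, 3000.0, 1000.0):
    print(f"Pwf = {pwf:6.0f} psia -> Q = {deliverability(pr, pwf, c, n):8.2f} MMscf/d")
```

    The same curve read the other way gives the flowing pressure sustainable at a target withdrawal rate, which is how a deliverability plot is used in storage design.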

  1. Safety assessment and verification for nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2001-01-01

    This publication supports the Safety Requirements on the Safety of Nuclear Power Plants: Design. This Safety Guide was prepared on the basis of a systematic review of all the relevant publications including the Safety Fundamentals, Safety of Nuclear Power Plants: Design, current and ongoing revisions of other Safety Guides, INSAG reports and other publications that have addressed the safety of nuclear power plants. This Safety Guide also provides guidance for Contracting Parties to the Convention on Nuclear Safety in meeting their obligations under Article 14 on Assessment and Verification of Safety. The Safety Requirements publication entitled Safety of Nuclear Power Plants: Design states that a comprehensive safety assessment and an independent verification of the safety assessment shall be carried out before the design is submitted to the regulatory body. This publication provides guidance on how this requirement should be met. This Safety Guide provides recommendations to designers for carrying out a safety assessment during the initial design process and design modifications, as well as to the operating organization in carrying out independent verification of the safety assessment of new nuclear power plants with a new or already existing design. The recommendations for performing a safety assessment are suitable also as guidance for the safety review of an existing plant. The objective of reviewing existing plants against current standards and practices is to determine whether there are any deviations which would have an impact on plant safety. The methods and the recommendations of this Safety Guide can also be used by regulatory bodies for the conduct of the regulatory review and assessment. Although most recommendations of this Safety Guide are general and applicable to all types of nuclear reactors, some specific recommendations and examples apply mostly to water cooled reactors. Terms such as 'safety assessment', 'safety analysis' and 'independent

  2. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy is discussed in the paper

  3. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  4. Sigma metric analysis for performance of creatinine with fresh frozen serum.

    Science.gov (United States)

    Kang, Fengfeng; Zhang, Chuanbao; Wang, Wei; Wang, Zhiguo

    2016-01-01

Six sigma provides an objective and quantitative methodology for describing laboratory testing performance. In this study, we conducted a national trueness verification scheme with fresh frozen serum (FFS) for serum creatinine to evaluate its performance in China. Two different concentration levels of FFS, value-assigned with a reference method, were sent to 98 laboratories in China. The imprecision and bias of the measurement procedure were calculated for each participant to derive its sigma value. Quality goal index (QGI) analysis was used to investigate the cause of unacceptable performance for laboratories with σ < 6; the high concentration of creatinine had preferable sigma values. For the enzymatic method, 7.0% (5/71) to 45.1% (32/71) of the laboratories need to improve their measurement procedures (σ < 6), with most of the unacceptable performance attributable to poor trueness (QGI > 1.2). Only 3.1-5.3% of the laboratories should improve both precision and trueness. Sigma metric analysis of the serum creatinine assays is disappointing, mainly because of unacceptable analytical bias according to the QGI analysis. Further effort is needed to enhance the trueness of creatinine measurement.
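
    The sigma metric and quality goal index used in this record follow standard laboratory-medicine definitions: sigma = (TEa - |bias|)/CV and QGI = |bias|/(1.5 * CV), with a high QGI read as a trueness problem and a low QGI as an imprecision problem. A minimal sketch; the allowable total error (TEa) below is an illustrative assumption, not the study's value.

```python
# Standard sigma-metric and Quality Goal Index (QGI) formulas; the
# allowable total error (TEa) used here is illustrative, not the
# study's value.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """sigma = (TEa - |bias|) / CV, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV): < 0.8 -> imprecision problem,
    > 1.2 -> trueness problem, 0.8-1.2 -> both."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Example: a laboratory with 3% bias and 2% CV against an assumed
# TEa of 12% for serum creatinine.
sigma = sigma_metric(12.0, 3.0, 2.0)  # (12 - 3) / 2 = 4.5
qgi = quality_goal_index(3.0, 2.0)    # 3 / (1.5 * 2) = 1.0
```

    A sigma below 6 flags a procedure for improvement, and the QGI then points at whether bias, imprecision, or both should be addressed first.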

  5. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

The Department of Energy has used Artificial Intelligence (''AI'') concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  6. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  7. An investigation and comparison on network performance analysis

    OpenAIRE

    Lanxiaopu, Mi

    2012-01-01

This thesis is generally about network performance analysis. It contains two parts. The theory part summarizes what network performance is and introduces the methods of doing network performance analysis. To answer what network performance is, a study of what network services are is done. Based on that background research, two important network performance metrics, network delay and throughput, should be included in network performance analysis. Among the methods of network a...

  8. EFL LEARNERS REPAIR SEQUENCE TYPES ANALYSIS AS PEER- ASSESSMENT IN ORAL PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Novia Trisanti

    2017-04-01

There are certain concerns that an EFL teacher needs to observe in assessing students' oral performance, such as the number of words the learners utter, the grammatical errors they make, and the hesitations and certain expressions they produce. This paper attempts to give an overview of research results, obtained using a qualitative method, which show the impacts of repair sequence type analysis on those elements as students' peer- and self-assessment to enhance their speaking ability. The subjects were tertiary-level learners of the English Department, State University of Semarang, Indonesia, in 2012. Concerning the repair types, there are four repair sequences as reviewed by Buckwalter (2001): Self-Initiated Self-Repair (SISR), Self-Initiated Other-Repair (SIOR), Other-Initiated Self-Repair (OISR), and Other-Initiated Other-Repair (OIOR). Using the repair sequence type analysis, the students investigated the repair sequences of their peers while they performed in class conversation. The modified peer-assessment guideline proposed by Brown (2004) was used in identifying, categorizing and classifying the types of repair sequences in their peers' oral performance. Peer-assessment can be a valuable additional means to improve students' speaking, since it is one of the motives that drive peer-evaluation, along with peer-verification and peer- and self-enhancement. The analysis results were then interpreted to see whether there was a significant finding related to the students' oral performance enhancement.

  9. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  10. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  11. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  12. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. Verification is usually performed on the basis of results of tests of a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of a thermal-hydraulic model of a high-pressure turbine blade with cooling allowing asymmetrical heat removal from the pressure and suction sides was carried out using the developed method. An analysis of heat transfer coefficients confirmed the high level of heat transfer in the leading edge, whose value is comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge to the pressure side.

  13. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  14. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  15. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  16. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  17. Sampling and Analysis Plan for Verification Sampling of LANL-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-30

Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) is a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sampling plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.

  18. Verification of the FBR fuel bundle-duct interaction analysis code BAMBOO by the out-of-pile bundle compression test with large diameter pins

    Science.gov (United States)

    Uwaba, Tomoyuki; Ito, Masahiro; Nemoto, Junichi; Ichikawa, Shoichi; Katsuyama, Kozo

    2014-09-01

    The BAMBOO computer code was verified by results for the out-of-pile bundle compression test with large diameter pin bundle deformation under the bundle-duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.

  19. Verification of the FBR fuel bundle–duct interaction analysis code BAMBOO by the out-of-pile bundle compression test with large diameter pins

    Energy Technology Data Exchange (ETDEWEB)

    Uwaba, Tomoyuki, E-mail: uwaba.tomoyuki@jaea.go.jp [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ito, Masahiro; Nemoto, Junichi [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ichikawa, Shoichi [Japan Atomic Energy Agency, 2-1, Shiraki, Tsuruga-shi, Fukui 919-1279 (Japan); Katsuyama, Kozo [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan)

    2014-09-15

    The BAMBOO computer code was verified by results for the out-of-pile bundle compression test with large diameter pin bundle deformation under the bundle–duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.

  20. and application to autopilot performance analysis

    Directory of Open Access Journals (Sweden)

    Daniel E. Davison

    2000-01-01

This paper deals with the notion of disturbance model uncertainty. The disturbance is modeled as the output of a first-order filter which is driven by white noise and whose bandwidth and gain are uncertain. An analytical expression for the steady-state output variance as a function of the uncertain bandwidth and gain is derived, and several properties of this variance function are analyzed. Two notions, those of disturbance bandwidth margin and disturbance gain margin, are also introduced. These tools are then applied to the analysis of a simple altitude-hold autopilot system in the presence of turbulence, where the turbulence scale is treated as an uncertain parameter. It is shown that the autopilot, which is satisfactory for the nominal turbulence scale, may be inadequate when the uncertainty is taken into account. Moreover, it is proven that, in order to obtain a design that provides robust performance in the face of turbulence scale uncertainty, it is necessary to substantially increase the controller bandwidth, even if one is willing to sacrifice the autopilot's holding ability and stability robustness.
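
    For one common parametrization of such a disturbance model, a first-order filter G(s) = g / (s/w + 1) driven by unit-intensity white noise, the steady-state output variance has the closed form g^2 * w / 2 (the integral of |G(jv)|^2 over all frequencies, divided by 2*pi). The sketch below checks that closed form against a truncated numerical frequency integral; this parametrization is an assumption and may differ from the paper's exact model.

```python
import numpy as np

# Steady-state output variance of a first-order disturbance filter
#   G(s) = g / (s/w + 1)
# driven by unit-intensity white noise.  Closed form: g**2 * w / 2.
# The parametrization (gain g, bandwidth w) is an assumption; the
# paper's exact model may differ.

def variance_closed_form(g, w):
    return g**2 * w / 2.0

def variance_numeric(g, w, n=400_001, span=1000.0):
    """(1 / 2*pi) * integral of |G(j*nu)|**2 d nu, truncated at +/- span*w."""
    nu = np.linspace(-span * w, span * w, n)
    dnu = nu[1] - nu[0]
    psd = g**2 / (1.0 + (nu / w) ** 2)
    return psd.sum() * dnu / (2.0 * np.pi)

g, w = 1.5, 4.0
print(variance_closed_form(g, w))  # 4.5
print(variance_numeric(g, w))      # close to 4.5 (small truncation error)
```

    The variance grows linearly in the bandwidth and quadratically in the gain, which is why both uncertain parameters matter for the autopilot analysis.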

  1. Weak lensing magnification in the Dark Energy Survey Science Verification data

    Science.gov (United States)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshifts. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.
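    The quantity being estimated can be illustrated with a toy counts-in-cells version of the cross-correlation; this is not the estimator used in the DES analysis (which works from catalogue positions with survey masks), only a sketch of the underlying statistic w = ⟨δ_fg δ_bg⟩:

```python
def counts_in_cells_cross_corr(counts_fg, counts_bg):
    """Toy estimate of the angular cross-correlation amplitude between a
    foreground and a background galaxy sample: compute the overdensity
    delta = n/<n> - 1 in each sky cell and average the product."""
    mean_fg = sum(counts_fg) / len(counts_fg)
    mean_bg = sum(counts_bg) / len(counts_bg)
    pairs = zip(counts_fg, counts_bg)
    return sum((f / mean_fg - 1.0) * (b / mean_bg - 1.0)
               for f, b in pairs) / len(counts_fg)

# Magnification by foreground structure modulates the background counts,
# producing a non-zero cross-correlation between disjoint redshift bins.
fg = [1.0, 2.0, 3.0, 4.0]
print(counts_in_cells_cross_corr(fg, fg))        # positive
print(counts_in_cells_cross_corr(fg, fg[::-1]))  # negative
```

    Because the two samples are disjoint in redshift, an intrinsic clustering signal is suppressed, and a residual cross-correlation of this kind is attributable to magnification (plus the systematics the paper analyses).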

  2. Sensitivity analysis for thermo-hydraulics model of a Westinghouse type PWR. Verification of the simulation results

    Energy Technology Data Exchange (ETDEWEB)

    Farahani, Aref Zarnooshe [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Nuclear Engineering, Science and Research Branch; Yousefpour, Faramarz [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Basic Sciences; Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Young Researchers and Elite Club

    2017-07-15

    Development of a steady-state model is the first step in nuclear safety analysis. The developed model should first be analyzed qualitatively; then a sensitivity analysis on the number of nodes is required for the models of different systems to ensure the reliability of the obtained results. This contribution aims to show, through sensitivity analysis, that the results of a qualified MELCOR model for a Westinghouse-type pressurized water reactor plant are independent of the number of nodes. For this purpose, and to minimize user error, the nuclear analysis software SNAP is employed. Different sensitivity cases were developed by modifying the existing model and refining the nodes of the simulated systems, including the steam generators, the reactor coolant system, and the reactor core with its connecting flow paths. Comparing the obtained results to those of the original model shows no significant difference, which indicates that the model is independent of finer nodalization.

  3. Performance analysis of nuclear materials accounting systems

    International Nuclear Information System (INIS)

    Cobb, D.D.; Shipley, J.P.

    1979-01-01

    Techniques for analyzing the level of performance of nuclear materials accounting systems in terms of the four performance measures, total amount of loss, loss-detection time, loss-detection probability, and false-alarm probability, are presented. These techniques are especially useful for analyzing the expected performance of near-real-time (dynamic) accounting systems. A conservative estimate of system performance is provided by the CUSUM (cumulative summation of materials balances) test. Graphical displays, called performance surfaces, are developed as convenient tools for representing systems performance, and examples from a recent safeguards study of a nuclear fuels reprocessing plant are given. 6 refs
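    The CUSUM test named above can be sketched in a few lines. This is a hypothetical minimal version; a real accounting system would derive the alarm threshold from measurement-error models so as to balance the loss-detection and false-alarm probabilities:

```python
def cusum_alarm(balances, threshold):
    """One-sided CUSUM test over a sequence of periodic materials
    balances: raise an alarm at the first period where the cumulative
    sum exceeds the alarm threshold; return None if no alarm occurs."""
    cumulative = 0.0
    for period, balance in enumerate(balances, start=1):
        cumulative += balance
        if cumulative > threshold:
            return period
    return None

# A protracted diversion too small to trip any single materials balance
# accumulates in the CUSUM and is eventually detected.
print(cusum_alarm([0.2] * 10, threshold=0.9))  # alarm at period 5
```

    The trade-off among total loss, detection time, and false-alarm probability in the paper corresponds to the choice of threshold here: a lower threshold detects sooner but raises more false alarms from measurement noise.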

  4. VeriClick: an efficient tool for table format verification

    Science.gov (United States)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  5. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.
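    The PTrie idea, sharing common prefixes of stored configurations, can be illustrated with a toy prefix tree. The actual PTrie packs prefixes into compact byte arrays; this dictionary-based sketch only shows the sharing principle:

```python
class Trie:
    """Toy prefix tree storing explored configurations as tuples. States
    that share a common prefix share nodes, which is the intuition behind
    PTries (the real structure packs prefixes into compact byte arrays)."""

    def __init__(self):
        self.root = {}

    def insert(self, config):
        """Store a configuration; return True if it was new."""
        node = self.root
        for part in config:
            node = node.setdefault(part, {})
        is_new = "__end__" not in node
        node["__end__"] = True
        return is_new

    def __contains__(self, config):
        node = self.root
        for part in config:
            if part not in node:
                return False
            node = node[part]
        return "__end__" in node
```

    During explicit exploration, `insert` doubles as the visited-set check: a `False` return means the configuration was already explored and the search need not expand it again.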

  6. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among others: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
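    Two of the listed verification items translate directly into simple formulas. The sketch below assumes a common protocol (three high-sample runs followed by three low-sample runs for carryover); the acceptance limits themselves remain, as the paper notes, at the discretion of the laboratory specialist:

```python
import statistics

def cv_percent(replicates):
    """Within-run precision expressed as coefficient of variation (%)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def carryover_percent(high_runs, low_runs):
    """Carryover per a common three-high/three-low protocol:
    carryover (%) = (low1 - low3) / (high3 - low3) * 100."""
    l1, l3 = low_runs[0], low_runs[2]
    h3 = high_runs[2]
    return 100.0 * (l1 - l3) / (h3 - l3)

print(cv_percent([9, 10, 11]))                        # 10.0
print(carryover_percent((100, 100, 100), (2, 1, 1)))  # ~1.01
```

    A verification plan would compare each computed value against the limit chosen by the laboratory (for example, a maximum CV per parameter and a maximum carryover percentage).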

  7. Verification of the astrometric performance of the Korean VLBI network, using comparative SFPR studies with the VLBA at 14/7 mm

    Energy Technology Data Exchange (ETDEWEB)

    Rioja, María J.; Dodson, Richard; Jung, TaeHyun; Sohn, Bong Won; Byun, Do-Young; Cho, Se-Hyung; Lee, Sang-Sung; Kim, Jongsoo; Kim, Kee-Tae; Oh, Chung Sik; Han, Seog-Tae; Je, Do-Heung; Chung, Moon-Hee; Wi, Seog-Oh; Kang, Jiman; Lee, Jung-Won; Chung, Hyunsoo; Kim, Hyo Ryoung; Kim, Hyun-Goo [Korea Astronomy and Space Science Institute, Daedeokdae-ro 776, Yuseong-gu, Daejeon 305-348 (Korea, Republic of); Agudo, Iván, E-mail: maria.rioja@icrar.org [Joint Institute for VLBI in Europe, Postbus 2, NL-7990 AA Dwingeloo (Netherlands); and others

    2014-11-01

    The Korean VLBI Network (KVN) is a new millimeter VLBI dedicated array with the capability to simultaneously observe at multiple frequencies, up to 129 GHz. The innovative multi-channel receivers present significant benefits for astrometric measurements in the frequency domain. The aim of this work is to verify the astrometric performance of the KVN using a comparative study with the VLBA, a well-established instrument. For that purpose, we carried out nearly contemporaneous observations with the KVN and the VLBA, at 14/7 mm, in 2013 April. The KVN observations consisted of simultaneous dual frequency observations, while the VLBA used fast frequency switching observations. We used the Source Frequency Phase Referencing technique for the observational and analysis strategy. We find that having simultaneous observations results in superior compensation for all atmospheric terms in the observables, in addition to offering other significant benefits for astrometric analysis. We have compared the KVN astrometry measurements to those from the VLBA. We find that the structure blending effects introduce dominant systematic astrometric shifts, and these need to be taken into account. We have tested multiple analytical routes to characterize the impact of the low-resolution effects for extended sources in the astrometric measurements. The results from the analysis of the KVN and full VLBA data sets agree within 2σ of the thermal error estimate. We interpret the discrepancy as arising from the different resolutions. We find that the KVN provides astrometric results with excellent agreement, within 1σ, when compared to a VLBA configuration that has a similar resolution. Therefore, this comparative study verifies the astrometric performance of the KVN using SFPR at 14/7 mm, and validates the KVN as an astrometric instrument.

  8. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  9. Design verification for reactor head replacement

    International Nuclear Information System (INIS)

    Dwivedy, K.K.; Whitt, M.S.; Lee, R.

    2005-01-01

    must be negotiated. This paper does not describe the massive efforts required by the NSSS and manufacturer's engineering groups, nor does it include the challenges of construction in developing the mechanical handling of heavy and large components, or the effort of providing adequate access for the head replacement and restoring the containment structure. The paper outlines the analysis and design efforts needed to support reactor head replacement. It concludes that the verification efforts performed by the utility design group not only provide increased assurance of design adequacy, but also make that group an important member of the strong team required for a successful head replacement. (authors)

  10. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d′, and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  11. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility with respect to changes in system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved safe by means of state-space analysis: the dead markings are correct, there are no dead transitions, and the model is fair. Further analysis results have been obtained using a combined formal and simulation-based verification approach. The timed models for the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also account for relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time needed to establish a safety connection. The time for establishment of a safety connection in the normal state is verified formally, and the time for establishment under different packet-loss probabilities is then simulated. The verification shows that the time for establishment of a safety connection of the safety communication protocol satisfies the safety requirements.
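    The state-space properties checked above (correct dead markings, no dead transitions) rest on explicit exploration of the reachable markings. A toy breadth-first exploration that collects dead markings might look like the following; CPN tools perform this over the full coloured net, which is abstracted here to a successor function:

```python
from collections import deque

def dead_markings(initial, successors):
    """Explicit state-space exploration: breadth-first search over the
    reachable markings, collecting those with no enabled transitions."""
    seen = {initial}
    dead = set()
    queue = deque([initial])
    while queue:
        marking = queue.popleft()
        succs = successors(marking)
        if not succs:
            dead.add(marking)  # no enabled transition: a dead marking
        for s in succs:
            if s not in seen:
                seen.add(s)
                queue.append(s)
    return dead

# A toy chain 0 -> 1 -> 2 -> 3 terminates in the single dead marking 3;
# a protocol model is judged safe when all such markings are intended
# terminal states (e.g. "connection established" or "connection closed").
print(dead_markings(0, lambda m: [m + 1] if m < 3 else []))
```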

  12. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    …additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50…). …were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam, and Paradox. Verification tools and games were integrated to verify…

  13. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    Full Text Available This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  14. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    International Nuclear Information System (INIS)

    Eyler, L.L.; Trent, D.S.; Budden, M.J.

    1983-09-01

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs

  15. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    Science.gov (United States)

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C18 stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and the ¹H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. Quantification is based on the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the NMR detector showed a lower limit of detection of up to 16 μg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged, and efficient.

  16. Portal verification for breast cancer radiotherapy

    International Nuclear Information System (INIS)

    Petkovska, Sonja; Pejkovikj, Sasho; Apostolovski, Nebojsha

    2013-01-01

    At the University Clinic in Skopje, breast cancer irradiation is planned and performed using a mono-isocentric method, which means that a unique isocenter (IC) is used for all irradiation fields. The goal of this paper is to present the patient's position in all coordinates before the first treatment session, relative to the position determined during the CT simulation. Deviation of up to 5 mm is allowed. The analysis was made using portal verification. Sixty randomly selected female patients were reviewed. The matching results show that for each patient a deviation exists on at least one axis. The largest deviations are in the longitudinal direction (head-feet), up to 4 mm, mean 1.8 mm. In 60 out of 85 analysed fields, the deviation is towards the head. In the lateral direction, the median deviation is 1.1 mm, and in 65% of the analysed portals those deviations are in the medial direction, towards the contralateral breast, which can increase the dose in the lung and in the contralateral breast. For the supraclavicular field, this deviation can increase the dose in the spinal cord. Although these doses are well below the limit, this fact should be taken into account when setting the treatment fields. The final conclusion of the research is that, even though the deviations are small, when positioning accuracy is verified with portal imaging, portal verification needs to be performed in the following weeks of treatment as well, not only before the first treatment. This provides information on intra-fractional set-up deviation. (Author)
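    The 5 mm acceptance criterion described above amounts to a per-axis comparison of the portal-image position against the CT-simulation position. A hypothetical helper (the axis ordering and names are assumptions for illustration):

```python
def setup_deviation_ok(planned_mm, measured_mm, tolerance_mm=5.0):
    """Check a portal-verification set-up: the measured isocentre position
    must be within tolerance of the CT-simulation position on every axis
    (here assumed ordered longitudinal, lateral, vertical, in mm)."""
    return all(abs(p - m) <= tolerance_mm
               for p, m in zip(planned_mm, measured_mm))

# Mean deviations on the order of those reported above (1.8 mm
# longitudinal, 1.1 mm lateral) pass the 5 mm criterion.
print(setup_deviation_ok((0.0, 0.0, 0.0), (1.8, 1.1, 0.0)))  # True
```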

  17. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method applies state-of-the-art uncertainty analysis with different turbulence models and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  18. Operational Modal Analysis and the Performance Assessment of Vehicle Suspension Systems

    Directory of Open Access Journals (Sweden)

    L. Soria

    2012-01-01

    Full Text Available Comfort, road holding, and safety of passenger cars are mainly influenced by an appropriate design of the suspension system. Improvements of the dynamic behaviour can be achieved by implementing semi-active or active suspension systems. In these cases, the correct design of a well-performing suspension control strategy is of fundamental importance for obtaining satisfying results. Operational Modal Analysis allows experimental structural identification under real operating conditions: it works from output-only data, leads to modal models linearised around the working points of interest, and, in the case of controlled systems, provides the information needed for the optimal design and verification of controller performance. All of these characteristics are needed for the experimental assessment of vehicle suspension systems. In the paper, two suspension architectures equipping the same car type are considered. The former is a commercial semi-active system, the latter a novel prototype active system. For the assessment of suspension performance, two different kinds of tests have been considered: proving-ground tests on different road profiles and laboratory four-poster rig tests. By OMA-processing the signals acquired in the different testing conditions and comparing the results, it is shown how this tool can be effectively utilised to verify the operation and the performance of those systems by carrying out only a simple, cost-effective road test.

  19. Cost and performance analysis of physical security systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Yates, D.; Jago, W.H.; Phillips, A.W.

    1998-04-01

    Analysis of cost and performance of physical security systems can be a complex, multi-dimensional problem. There are a number of point tools that address various aspects of cost and performance analysis. Increased interest in cost tradeoffs of physical security alternatives has motivated development of an architecture called Cost and Performance Analysis (CPA), which takes a top-down approach to aligning cost and performance metrics. CPA incorporates results generated by existing physical security system performance analysis tools, and utilizes an existing cost analysis tool. The objective of this architecture is to offer comprehensive visualization of complex data to security analysts and decision-makers

  20. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.