WorldWideScience

Sample records for model verification studies

  1. Results of the Independent Verification and Validation Study for the D2-Puff Model

    National Research Council Canada - National Science Library

    Bowers, J

    1999-01-01

    .... The independent verification and validation (IV&V) study of D2-Puff Version 2.0.6 focused on these accreditation requirements and the implicit requirement that the model provide safe-sided hazard estimates...

  2. Study on formal modeling and verification of safety computer platform

    Directory of Open Access Journals (Sweden)

    Xi Wang

    2016-05-01

    Full Text Available With the development of automatic control and communication technology, communication-based train control systems are adopted by more and more urban mass transit systems to automatically supervise train speed so that it follows a desired trajectory. Taking reliability, availability, maintainability, and safety into consideration, a 2 × 2-out-of-2 safety computer platform is usually used as the hardware platform of the safety-critical subsystems in communication-based train control. To achieve the required safety integrity level of the safety computer platform, safety-related logic has to be verified before it is integrated into practical systems. A central problem in developing a safety computer platform is therefore how to guarantee that system behavior satisfies the functional requirements while responding to external events and processes within the required time limits. Based on qualitative and quantitative analysis of functional and timing characteristics, this article introduces a formal modeling and verification approach for this real-time system. In the proposed method, a timed automata network model of the 2 × 2-out-of-2 safety computer platform is built, and system requirements are specified and formalized as computation tree logic (CTL) properties, which can be verified by the UPPAAL model checker.
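
A minimal sketch of the kind of check a model checker such as UPPAAL performs: exhaustively exploring a finite state graph and verifying a temporal property by fixpoint computation. The state names and transition structure below are hypothetical (a toy 2-out-of-2 channel pair), not the authors' actual platform model, and timing is omitted.

```python
# Toy CTL-style check over an explicit state graph (no clocks, unlike UPPAAL).
# States and transitions are an invented, simplified 2-out-of-2 channel pair.

def states_satisfying_EF(transitions, goal):
    """States from which some path reaches a goal state (CTL: EF goal),
    computed as a backward-reachability fixpoint."""
    reached = set(goal)
    changed = True
    while changed:
        changed = False
        for state, succs in transitions.items():
            if state not in reached and any(t in reached for t in succs):
                reached.add(state)
                changed = True
    return reached

transitions = {
    "both_ok":       ["both_ok", "one_failed"],
    "one_failed":    ["safe_shutdown"],
    "safe_shutdown": ["safe_shutdown"],
}

# Safety-style property: from every state, a safe shutdown stays reachable
# (CTL: AG EF safe_shutdown).
ef_safe = states_satisfying_EF(transitions, {"safe_shutdown"})
print("AG EF safe_shutdown holds:", set(transitions) <= ef_safe)  # prints True
```

Real checkers handle the timed-automata clocks and the CTL subset symbolically; the fixpoint idea, however, is the same.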

  3. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Fingerprint changes associated with hand dermatitis are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation group and a model validation group. The predictive model was derived using multiple logistic regression; validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of one major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
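
The decision rule stated in the abstract (one major criterion, two minor criteria) can be sketched directly. The function name and risk labels are paraphrases for illustration; only the ≥ 25% threshold and the criteria themselves come from the abstract.

```python
# Sketch of the published decision rule: major criterion = dystrophy area >= 25%,
# minor criteria = long horizontal lines, long vertical lines.

def verification_failure_risk(dystrophy_area_pct, long_horizontal, long_vertical):
    if dystrophy_area_pct >= 25:
        return "almost always fails"        # major criterion met
    n_minor = int(long_horizontal) + int(long_vertical)
    if n_minor == 2:
        return "high risk of failure"       # both minor criteria present
    if n_minor == 1:
        return "low risk of failure"        # one minor criterion present
    return "almost always passes"           # no criteria met

print(verification_failure_risk(30, False, False))  # almost always fails
print(verification_failure_risk(10, True, True))    # high risk of failure
```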

  4. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    There have been very few mesoscale modelling studies of the Indian monsoon that focus on the verification and intercomparison of operational real-time forecasts. With the exception of Das et al (2008), most studies in the literature are either case studies of tropical cyclones and thunderstorms or sensitivity ...

  5. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Science.gov (United States)

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  6. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  7. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated against these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  8. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  9. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  10. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    In this paper, we seek to present a hybrid method for model validation and verification of data mining from the knowledge workers' productivity approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented where this model measures and validates at the ...

  11. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to identify their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for freeform sources, and the metrics used in source verification. Several pattern selection algorithms are compared, and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with quality metrics of depth of focus (DOF) and mask error enhancement factor (MEEF) is examined.

  12. TU electric reactor model verification

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1989-01-01

    Power reactor benchmark calculations using the code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles include gadolinia as a burnable absorber, natural uranium axial blankets, and increased water-to-fuel ratio. The calculated results for both low-power physics tests (boron end points, control rod worths, and isothermal temperature coefficients) and full-power operation (power distributions and boron letdown) are compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important physics parameters for power reactors

  13. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, illustrating how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
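
The core idea of a hydrostatic column model is that wellhead pressure equals bottom-hole pressure minus the hydrostatic head of each fluid layer in the well. The sketch below makes strong simplifying assumptions the real HCM does not (incompressible fluids, fixed densities — nitrogen in particular is compressible), and all numbers are hypothetical.

```python
# First-order hydrostatic column: p_wellhead = p_bottom - sum(rho_i * g * h_i).
# Densities, heights, and bottom-hole pressure below are invented for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_bottom_pa, layers):
    """Subtract the hydrostatic head of each (density kg/m^3, height m) layer."""
    p = p_bottom_pa
    for density, height in layers:
        p -= density * G * height
    return p

# Hypothetical column: 300 m nitrogen over 500 m crude oil over 200 m brine.
layers = [(150.0, 300.0), (850.0, 500.0), (1200.0, 200.0)]
print(round(wellhead_pressure(9.0e6, layers) / 1e6, 3), "MPa")  # prints 2.035 MPa
```

A small leak shows up as a slow drift in the predicted-versus-measured wellhead pressure, which is what such a model helps discriminate.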

  14. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of a dynamic model of a drive system with gears. Tests were carried out on the real object in different operating conditions, and simulation studies were carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped to determine the suitability of the model and to verify the possibility of replacing experimental research with simulations that use the dynamic model.

  15. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
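
The abstract does not specify how IVSEM combines the technologies, so the following is only a first-order sketch of sensor synergy: if each technology independently detects an event with probability p_i, the integrated system detects it unless every technology misses. All probabilities below are invented.

```python
# Naive synergy model (independence assumed; IVSEM's actual method may differ):
# P(integrated detection) = 1 - product(1 - p_i).

def integrated_detection_probability(p_by_technology):
    miss = 1.0
    for p in p_by_technology.values():
        miss *= (1.0 - p)       # probability that every technology misses
    return 1.0 - miss

p = {"seismic": 0.90, "infrasound": 0.50, "radionuclide": 0.40, "hydroacoustic": 0.30}
print(round(integrated_detection_probability(p), 4))  # prints 0.979
```

Even this crude combination shows why integrating technologies matters: the integrated probability (0.979) exceeds the best single technology (0.90).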

  16. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  17. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available The Multifunction Vehicle Bus (MVB) is a critical component of the Train Communication Network (TCN), which is widely used in most modern train technology in the transportation system. How to ensure the security of the MVB has become an important issue, as traditional testing cannot ensure system correctness. This paper is concerned with modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchical Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of the MVB, and synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.

  18. Verification study on technology for preliminary investigation for HLW geological disposal. Part 2. Verification of surface geophysical prospecting through establishing site descriptive models

    International Nuclear Information System (INIS)

    Kondo, Hirofumi; Suzuki, Koichi; Hasegawa, Takuma; Goto, Keiichiro; Yoshimura, Kimitaka; Muramoto, Shigenori

    2012-01-01

    The Yokosuka demonstration and validation project at the Yokosuka CRIEPI site has been conducted since FY 2006 as cooperative research between NUMO (Nuclear Waste Management Organization of Japan) and CRIEPI. The objectives of this project are to examine and refine the basic methodology for investigating and assessing the properties of the geological environment in the Preliminary Investigation stage of HLW geological disposal. Among Preliminary Investigation technologies, surface geophysical prospecting is an important means of obtaining information on the deep geological environment for planning borehole surveys. In FY 2010, both seismic prospecting (seismic reflection and vertical seismic profiling methods), to obtain information about the geological structure, and electromagnetic prospecting (magneto-telluric and time-domain electromagnetic methods), to obtain information about the resistivity structure reflecting the distribution of the salt-water/fresh-water boundary to depths of over several hundred meters, were conducted at the Yokosuka CRIEPI site. Through these surveys, the contribution of geophysical prospecting methods in the surface survey stage to improving the reliability of site descriptive models was confirmed. (author)

  19. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a topic of great current interest to both government and industry. In response to a ban on the production of new strategic weapons and on nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide the credible models needed for stockpile certification via numerical simulation. There has recently been a significant increase in activity to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  20. Columbia River Estuary Hybrid Model Studies. Report 1. Verification of Hybrid Modeling of the Columbia River Mouth.

    Science.gov (United States)

    1983-09-01


  1. Control and verification of industrial hybrid systems using models specified with the formalism χ

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor belt and of a biochemical plant for the production of ethanol are specified in the formalism χ. A verification of the closed-loop systems for those examples,

  2. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  3. Hybrid Enrichment Verification Array: Module Characterization Studies

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  4. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, presents model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to surface wave heights that are significant compared with the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to surface wave heights that are small compared with the depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  5. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Science.gov (United States)

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  6. How reliable are satellite precipitation estimates for driving hydrological models: a verification study over the Mediterranean area

    Science.gov (United States)

    Camici, Stefania; Ciabatta, Luca; Massari, Christian; Brocca, Luca

    2017-04-01

    Floods are among the most common and dangerous natural hazards, causing thousands of casualties and extensive damage worldwide every year. The main tool for assessing flood risk and reducing damage is the hydrologic early warning system, which forecasts flood events by using real-time data obtained through ground monitoring networks (e.g., raingauges and radars). However, the use of such data, mainly rainfall, presents some issues, related first to network density and to the limited spatial representativeness of local measurements. A way to overcome these issues may be the use of satellite-based rainfall products (SRPs), which nowadays are available on a global scale at ever-increasing spatial/temporal resolution and accuracy. However, despite the wide availability and increased accuracy of SRPs (e.g., the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA); the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF); and the recent Global Precipitation Measurement (GPM) mission), remotely sensed rainfall data are scarcely used in hydrological modelling, and only a small number of studies have been carried out to outline guidelines for using satellite data as input for hydrological modelling. The reasons may be related to: 1) the large bias characterizing satellite precipitation estimates, which depends on rainfall intensity and season; 2) the spatial/temporal resolution; 3) the timeliness, which is often insufficient for operational purposes; and 4) a general (often unjustified) skepticism within the hydrological community about the use of satellite products for land applications. The objective of this study is to explore the feasibility of using SRPs in a lumped hydrologic model (MISDc, "Modello Idrologico Semi-Distribuito in continuo", Masseroni et al., 2017) over 10 basins in the Mediterranean area with different sizes and physiographic characteristics. Specifically

  7. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and cost-effective. Particularly, we use UML statecharts to represent the dynamics of guidelines and, based on this manually defined guideline specifications, we use a MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulted model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, particularly, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. 
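
One of the most common property specification patterns is "response": whenever event P occurs, event Q eventually follows. The sketch below checks this over a single event trace (a model checker verifies it over all paths of the model, not one trace); the clinical event names are hypothetical.

```python
# Trace-level check of the "response" pattern: every p is eventually followed by q.
# A model checker proves this for all executions; this only tests one trace.

def satisfies_response(trace, p, q):
    awaiting = False            # True while a p is still waiting for its q
    for event in trace:
        if event == p:
            awaiting = True
        elif event == q:
            awaiting = False
    return not awaiting

trace = ["order_test", "review", "notify_patient", "order_test", "notify_patient"]
print(satisfies_response(trace, "order_test", "notify_patient"))  # prints True
```

Patterns like this are what let non-experts state requirements without writing raw temporal logic.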
Copyright 2010 Elsevier Inc
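The MDD-plus-model-checking idea above can be illustrated in miniature: once a statechart is flattened into a transition relation, many guideline checks reduce to exhaustive state exploration. A minimal sketch (the states and the reachability check are invented for illustration; this is not the GBDSSGenerator tool chain):

```python
from collections import deque

# Hypothetical miniature "guideline" statechart flattened to a transition
# relation. The states and transitions are made up for illustration only.
transitions = {
    "start":     ["assess"],
    "assess":    ["treat_a", "treat_b"],
    "treat_a":   ["review"],
    "treat_b":   ["review"],
    "review":    ["discharge", "assess"],
    "discharge": [],
}

def reachable(initial):
    """Breadth-first exploration of all states reachable from `initial`."""
    seen, queue = {initial}, deque([initial])
    while queue:
        for nxt in transitions[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# One typical consistency check: every guideline step is reachable from the
# initial state, i.e. the specification contains no dead steps.
states = reachable("start")
assert states == set(transitions), "unreachable guideline steps detected"
print(sorted(states))  # → ['assess', 'discharge', 'review', 'start', 'treat_a', 'treat_b']
```

A real model checker explores such a state space symbolically and checks full temporal-logic properties, but the reachability core is the same idea.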

  8. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J. Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that traditionally have been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  9. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    Full Text Available This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  10. User verification of the FRBR conceptual model

    OpenAIRE

    Pisanski, Jan; Žumer, Maja

    2015-01-01

Purpose - The paper aims to build on a previous study of mental models of the bibliographic universe, which found that the Functional Requirements for Bibliographic Records (FRBR) conceptual model is intuitive. Design/methodology/approach - A total of 120 participants were presented with a list of bibliographic entities and six graphs each. They were asked to choose the graph they thought best represented the relationships between the entities described. Findings - The graph based on the FRBR ...

  11. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  12. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  13. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  14. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book: discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; introduces Verilog from both an implementation and a verification point of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; describes a new ver...

  15. Learner Verification: A Publisher's Case Study.

    Science.gov (United States)

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  16. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing the domain knowledge of legislation and specifying it in a generic way using the commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic part (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for evaluation purposes of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  17. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language, which have been prototypically implemented in the Dymola and OpenModelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  18. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  19. Verification of Orthogrid Finite Element Modeling Techniques

    Science.gov (United States)

    Steeve, B. E.

    1996-01-01

The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  20. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    The systematic error in the 850 hPa temperature indicates that largely the WRF model forecasts feature warm bias and the MM5 model forecasts feature cold bias. Features common to all the three models include warm bias over northwest India and cold bias over southeast peninsula. The 850 hPa specific humidity forecast ...

  1. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  2. Model of supercapacitor with laboratory verification

    Directory of Open Access Journals (Sweden)

    Simić Mitar S.

    2014-01-01

Full Text Available In this paper the historical development and the most important production technologies of supercapacitors are presented. A simple model of a supercapacitor, based on parameters obtained from the manufacturer's datasheet, is also realized. The proposed model is verified with charge, discharge and self-discharge tests, with results close to the experimental ones.
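A first-order model of this kind can be reproduced from datasheet parameters alone. The sketch below covers the discharge and self-discharge behaviour the record mentions; the component values are illustrative assumptions, not those of the paper:

```python
import math

# Minimal first-order supercapacitor model built only from typical datasheet
# numbers (capacitance, ESR, leakage). Values are illustrative assumptions.
C = 10.0          # farads
ESR = 0.05        # ohms, equivalent series resistance
R_LEAK = 5000.0   # ohms, models self-discharge

def self_discharge(v0, t):
    """Open-circuit voltage after t seconds of self-discharge (RC decay)."""
    return v0 * math.exp(-t / (R_LEAK * C))

def discharge_into_load(v0, r_load, t):
    """Voltage across a resistive load, discharging through the ESR."""
    r_total = ESR + r_load
    return v0 * math.exp(-t / (r_total * C)) * r_load / r_total

# After one RC time constant the open-circuit voltage drops to ~36.8%.
tau = R_LEAK * C
print(round(self_discharge(2.7, tau) / 2.7, 3))  # → 0.368
```

Comparing such analytic curves against measured charge/discharge data is exactly the kind of laboratory verification the record describes.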

  3. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour of a re...

  4. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) core is in progress. Newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite block) is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data in the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to only steady-state conditions of pin-in-hole fuel blocks. There exist no available experimental data regarding a heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study were performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data such as control rod withdrawal tests and loss-of-forced convection tests.

  5. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

    Full Text Available The results of nitrogen reduction in combustion process with application of primary method are presented in paper. The reduction of NOx emission, by the recirculation of combustion gasses, staging of fuel and of air was investigated, and than the reduction of NOx emission by simultaneous usage of the mentioned above primary method with pulsatory disturbances.The investigations contain numerical modeling of NOx reduction and experimental verification of obtained numerical calculation results.

  6. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  7. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...... as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practical experiments with the robot, the UKF is demonstrated to improve the reliability of the sensor signals...
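The unscented transform at the heart of the UKF can be shown in a few lines. The scalar sketch below uses standard sigma-point weights (kappa = 2 is an assumed tuning value, not taken from the paper):

```python
import math

# Minimal illustration of the unscented transform, the core step of the UKF
# used above for sensor fusion. Scalar case; kappa is an assumed tuning value.
def unscented_transform(mean, var, f, kappa=2.0):
    """Propagate a Gaussian (mean, var) through a nonlinearity f via sigma points."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    sigma = [mean, mean + spread, mean - spread]
    w0 = kappa / (n + kappa)
    wi = 1.0 / (2 * (n + kappa))
    weights = [w0, wi, wi]
    y = [f(x) for x in sigma]
    y_mean = sum(w * v for w, v in zip(weights, y))
    y_var = sum(w * (v - y_mean) ** 2 for w, v in zip(weights, y))
    return y_mean, y_var

# Sanity check: for a linear map the transform is exact
# (mean doubles, variance quadruples).
m, v = unscented_transform(1.0, 0.5, lambda x: 2 * x)
print(round(m, 6), round(v, 6))  # → 2.0 2.0
```

In the full filter this transform replaces the Jacobian linearization of the EKF, which is why the UKF copes better with the friction and slip nonlinearities mentioned above.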

  8. A verification system survival probability assessment model test methods

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

Subject to limitations of funding and test conditions, large complex systems are often tested with only a small number of sub-samples. Under single-sample conditions, making an accurate evaluation of performance is important for the reinforcement of complex systems. The technical maturity of an assessment model can be significantly improved if the model can be experimentally validated. This paper presents a test method for verifying a system survival probability assessment model: using the sample test results of the test system, the method verifies the correctness of the assessment model and of the a priori information. (authors)
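One common way to combine a priori information with a single full-system trial, in the spirit of this record, is a conjugate Beta-Binomial update. The prior parameters below are invented for illustration and are not from the paper:

```python
# Combine prior information with a sparse test outcome via a conjugate
# Beta-Binomial update. Prior parameters are illustrative assumptions.
def beta_update(alpha, beta, successes, failures):
    """Posterior Beta parameters after observing test results."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Posterior mean estimate of the survival probability."""
    return alpha / (alpha + beta)

# Prior from component-level data: roughly 9 successes in 10 "pseudo-trials".
alpha0, beta0 = 9.0, 1.0
# One full-system test, which succeeded:
a, b = beta_update(alpha0, beta0, successes=1, failures=0)
print(round(beta_mean(a, b), 3))  # → 0.909
```

The verification question the record raises is then whether observed test outcomes are plausible under the prior; a single failure when the prior predicts 0.9 survival is weak evidence, but repeated failures would indict the prior information.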

  9. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    , the configuration of the execution platform and the mapping of the application onto this platform. The computational model provides a basis for formal analysis of systems. The model is translated to timed automata and a tool for system verification and simulation has been developed using Uppaal as backend. We...... a discrete model of computation for such systems and characterize the size of the computation tree it suffices to consider when checking for schedulability. Analysis of multiprocessor system on chips is a major challenge due to the freedom of interrelated choices concerning the application level...

  10. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  11. Fossil Fuel Emission Verification Modeling at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Cameron-Smith, P; Kosovic, B; Guilderson, T; Monache, L D; Bergmann, D

    2009-08-06

We have an established project at LLNL to develop the tools needed to constrain fossil fuel carbon dioxide emissions using measurements of the carbon-14 isotope in atmospheric samples. In Figure 1 we show the fossil fuel plumes from Los Angeles and San Francisco for two different weather patterns. Obviously, a measurement made at any given location is going to depend on the weather leading up to the measurement. Thus, in order to determine the GHG emissions from some region using in situ measurements of those GHGs, we use state-of-the-art global and regional atmospheric chemistry-transport codes to simulate the plumes: the LLNL-IMPACT model (Rotman et al., 2004) and the WRFCHEM community code (http://www.wrf-model.org/index.php). Both codes can use observed (i.e., assimilated) meteorology in order to recreate the actual transport that occurred. The measured concentration of each tracer at a particular spatio-temporal location is a linear combination of the plumes from each region at that location (for non-reactive species). The challenge is to calculate the emission strengths for each region that fit the observed concentrations. In general this is difficult because there are errors in the measurements and modeling of the plumes. We solve this inversion problem using the strategy illustrated in Figure 2. The Bayesian Inference step combines the a priori estimates of the emissions, and their uncertainty, for each region with the results of the observations, and their uncertainty, and an ensemble of model predicted plumes for each region, and their uncertainty. The result is the mathematically best estimate of the emissions and their errors. In the case of non-linearities, or if we are using a statistical sampling technique such as a Markov Chain Monte Carlo technique, then the process is iterated until it converges (i.e., reaches stationarity). 
For the Bayesian inference we can use both a direct inversion capability, which is fast but requires assumptions of linearity and
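The linear-Gaussian version of such an inversion is compact enough to sketch. The plume sensitivities and covariances below are made-up numbers, not the LLNL configuration:

```python
import numpy as np

# Toy linear Bayesian inversion of the kind described above: observed tracer
# concentrations y are a linear mix, H @ x, of per-region plumes. All numbers
# are made up for illustration; this is not the LLNL configuration.
H = np.array([[0.8, 0.1],   # sensitivity of three observation sites
              [0.2, 0.9],   # to emissions from two regions
              [0.5, 0.5]])
x_prior = np.array([10.0, 20.0])   # a priori emission estimates
P = np.diag([25.0, 25.0])          # prior covariance (uncertainty)
R = np.diag([0.1, 0.1, 0.1])       # observation-error covariance
x_true = np.array([12.0, 18.0])
y = H @ x_true                     # perfect synthetic observations

# Standard Gaussian update (the direct, linear, non-iterative case):
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
x_post = x_prior + K @ (y - H @ x_prior)
print(np.round(x_post, 1))  # → [12. 18.]
```

With informative observations the posterior pulls strongly away from the prior toward the true emissions; MCMC replaces the closed-form update when the plume response is nonlinear.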

  12. A program for verification of phylogenetic network models.

    Science.gov (United States)

    Gunawan, Andreas D M; Lu, Bingxin; Zhang, Louxin

    2016-09-01

    Genetic material is transferred in a non-reproductive manner across species more frequently than commonly thought, particularly in the bacteria kingdom. On one hand, extant genomes are thus more properly considered as a fusion product of both reproductive and non-reproductive genetic transfers. This has motivated researchers to adopt phylogenetic networks to study genome evolution. On the other hand, a gene's evolution is usually tree-like and has been studied for over half a century. Accordingly, the relationships between phylogenetic trees and networks are the basis for the reconstruction and verification of phylogenetic networks. One important problem in verifying a network model is determining whether or not certain existing phylogenetic trees are displayed in a phylogenetic network. This problem is formally called the tree containment problem. It is NP-complete even for binary phylogenetic networks. We design an exponential time but efficient method for determining whether or not a phylogenetic tree is displayed in an arbitrary phylogenetic network. It is developed on the basis of the so-called reticulation-visible property of phylogenetic networks. A C-program is available for download on http://www.math.nus.edu.sg/∼matzlx/tcp_package matzlx@nus.edu.sg Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
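The brute-force baseline for the tree containment problem (exponential in the number of reticulations, which is what makes the reticulation-visible structure worth exploiting) can be sketched as follows. The toy network is invented and this is not the authors' program:

```python
from itertools import product

# Brute-force tree containment: resolve each reticulation (node with two
# parents) every possible way and compare leaf clusters with the input tree.
# Toy illustration only, not the algorithm or C program of the paper.
def clusters(edges, root):
    """Non-trivial leaf clusters (as frozensets) below each node of a tree."""
    children = {}
    for p, c in edges:
        children.setdefault(p, []).append(c)
    def below(n):
        if n not in children:
            return frozenset([n])
        return frozenset().union(*(below(c) for c in children[n]))
    out = set()
    def walk(n):
        out.add(below(n))
        for c in children.get(n, []):
            walk(c)
    walk(root)
    return {c for c in out if len(c) > 1}

def displays(net_edges, root, tree_edges):
    """True if the network displays the tree (same non-trivial clusters)."""
    target = clusters(tree_edges, root)
    parents = {}
    for p, c in net_edges:
        parents.setdefault(c, []).append(p)
    retics = [c for c, ps in parents.items() if len(ps) > 1]
    for choice in product(*(parents[r] for r in retics)):
        keep = dict(zip(retics, choice))
        kept = [(p, c) for p, c in net_edges
                if c not in keep or keep[c] == p]
        if clusters(kept, root) == target:
            return True
    return False

# A network on leaves {a, b, c} with one reticulation h above leaf b.
net = [("r", "u"), ("r", "v"), ("u", "a"), ("u", "h"),
       ("v", "h"), ("v", "c"), ("h", "b")]
print(displays(net, "r", [("r", "x"), ("x", "a"), ("x", "b"), ("r", "c")]))  # → True
```

Each reticulation doubles the number of candidate trees, hence the exponential cost that motivates the reticulation-visible analysis in the paper.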

  13. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

Full Text Available Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as a UPPAAL model. The model will be used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
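For comparison with such a model, the classical closed-form estimate of a double-spend succeeding is Nakamoto's calculation, which such probabilistic model checking refines:

```python
import math

# Nakamoto-style estimate of a double-spend succeeding against z
# confirmations, given the attacker controls fraction q of the hash power.
# This is the classical closed form; the UPPAAL model checks related
# properties mechanically.
def attacker_success(q, z):
    p = 1.0 - q
    lam = z * (q / p)
    s = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        s -= poisson * (1.0 - (q / p) ** (z - k))
    return s

# With 10% of the hash power and 6 confirmations, success is very unlikely.
print(round(attacker_success(0.1, 6), 6))  # → 0.000243
```

An attacker with a majority (q >= 0.5) succeeds with probability 1, which is exactly the "force a majority" scenario described in the abstract.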

  14. Verification of the NWP models operated at ICM, Poland

    Science.gov (United States)

    Melonek, Malgorzata

    2010-05-01

Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since then, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office, in a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present two non-hydrostatic NWP models are running in quasi-operational regime. The main new UM model, with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is running four times a day. The second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System) developed by the US Naval Research Laboratory, configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is running twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e. COAMPS computed on the 13 km mesh resolution and UM, are verified against observations from the Polish synoptic stations. Verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and wind direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. In the case of precipitation, contingency tables for different thresholds are computed and some of the verification scores such as FBI, ETS, POD and FAR are graphically presented. The verification sample covers nearly one year.
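The verification statistics named in this record have standard definitions, sketched below with illustrative sample numbers:

```python
import math

# Standard forecast-verification statistics: continuous scores (ME, MAE,
# RMSE) and categorical scores (FBI, POD, FAR, ETS) from a contingency table.
def continuous_scores(forecast, observed):
    diffs = [f - o for f, o in zip(forecast, observed)]
    me = sum(diffs) / len(diffs)                      # mean error (bias)
    mae = sum(abs(d) for d in diffs) / len(diffs)     # mean absolute error
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return me, mae, rmse

def contingency_scores(hits, false_alarms, misses, correct_negatives):
    n = hits + false_alarms + misses + correct_negatives
    fbi = (hits + false_alarms) / (hits + misses)     # frequency bias index
    pod = hits / (hits + misses)                      # probability of detection
    far = false_alarms / (hits + false_alarms)        # false alarm ratio
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return fbi, pod, far, ets                         # ets: equitable threat score

# Illustrative 2 m temperature errors (degrees C) at three stations:
me, mae, rmse = continuous_scores([2.0, 3.0, 1.0], [1.0, 3.0, 2.0])
print(round(me, 3), round(mae, 3), round(rmse, 3))  # → 0.0 0.667 0.816
```

An unbiased forecast (ME = 0) can still have a large RMSE, which is why both families of scores are reported side by side in verification reports like this one.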

  15. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
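A core ingredient of the grid-resolution studies advocated here is the observed order of accuracy obtained from three systematically refined grids; a minimal sketch with synthetic second-order data:

```python
import math

# Observed order of accuracy from three systematically refined grids
# (f1 finest, f3 coarsest, constant refinement ratio r). A consistent
# implementation should reproduce the scheme's formal order as grids refine.
def observed_order(f1, f2, f3, r=2.0):
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

# Synthetic example: a quantity converging at 2nd order, so the error
# quarters on each halving of the grid spacing.
f1, f2, f3 = 1.0 + 0.01, 1.0 + 0.04, 1.0 + 0.16
print(round(observed_order(f1, f2, f3), 6))  # → 2.0
```

Demonstrating that two independent codes recover the same observed order for the same turbulence model version is exactly the consistency evidence the database collects.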

  16. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

Most of CERN's industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to transform PLC programs automatically into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematically based method to check formalized requirements automatically against the system.
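The essence of the transformation can be illustrated in miniature: a PLC scan cycle becomes a transition function, and a requirement becomes a check over all reachable states. The start/stop latch below is a generic textbook example, not a UNICOS module:

```python
from itertools import product

# A PLC scan cycle modeled as a transition function, and a requirement
# checked by exhaustive exploration of reachable states. Generic start/stop
# latch logic for illustration; not a UNICOS module.
def scan(state, start, stop):
    """One PLC scan: the motor latches on with start, drops out with stop."""
    motor = (start or state["motor"]) and not stop
    return {"motor": motor}

def reachable_states(initial):
    """All states reachable under every possible input combination."""
    seen, frontier = {tuple(initial.items())}, [initial]
    while frontier:
        state = frontier.pop()
        for start, stop in product([False, True], repeat=2):
            nxt = scan(state, start, stop)
            key = tuple(nxt.items())
            if key not in seen:
                seen.add(key)
                frontier.append(nxt)
    return [dict(s) for s in seen]

# Safety requirement: if stop is pressed, the motor is off after the next
# scan, from every reachable state and for every start input.
for state in reachable_states({"motor": False}):
    for start in (False, True):
        assert scan(state, start, stop=True)["motor"] is False
print("requirement holds")  # → requirement holds
```

A model checker does this exploration symbolically over the full variable space of a real program, which is what makes the approach scale beyond toy latches.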

  17. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  18. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    Science.gov (United States)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF) which aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope; they target a wide range of scientific questions: from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) package, which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.

  19. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the Functional Mock-up Interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
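The core SMC idea the record relies on can be sketched independently of the PLASMA/DESYRE tool-chain: estimate the probability that simulated traces satisfy a contract, with a Chernoff-Hoeffding bound fixing the number of runs. The queue dynamics and the contract below are illustrative assumptions, not the paper's SoS model.

```python
# Monte Carlo statistical model checking sketch: estimate the probability
# that a stochastic system satisfies a contract, using a Chernoff-Hoeffding
# bound to choose the number of simulated traces.
import math
import random

def chernoff_samples(eps, delta):
    """Traces needed so the estimate is within eps of the true probability
    with confidence at least 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def simulate_trace(rng, steps=50):
    """Toy stochastic component: a queue with random arrivals and services."""
    queue = 0
    for _ in range(steps):
        queue += rng.random() < 0.5          # arrival with probability 0.5
        if queue and rng.random() < 0.6:     # service with probability 0.6
            queue -= 1
    return queue

def contract_holds(final_queue, bound=10):
    """Contract: at the end of the horizon the backlog stays below `bound`."""
    return final_queue < bound

rng = random.Random(42)
n = chernoff_samples(eps=0.05, delta=0.01)   # 1060 traces for this precision
hits = sum(contract_holds(simulate_trace(rng)) for _ in range(n))
estimate = hits / n
print(f"{n} traces, estimated satisfaction probability {estimate:.3f}")
```

Tighter precision or confidence raises the trace count quadratically in 1/eps, which is the trade-off that makes SMC scale where exhaustive checking cannot.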

  20. Experimental phantom verification studies for simulations of light interactions with skin: liquid phantoms

    CSIR Research Space (South Africa)

    Karsten, A

    2010-09-01

    Full Text Available Verification comparison: the layered structure of skin can be modelled, and solid or liquid phantoms can be used for verification. Solid phantoms, prepared from resin with absorbing and scattering particles, offer the advantage that multiple layers are possible and the phantoms are stable and durable for repeatability studies. Liquid samples are made from Intralipid® and black ink; the optical properties of Intralipid® are well documented in the literature. Phantoms are manufactured using the documented phantom parameters...

  1. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  2. A case study in pathway knowledgebase verification

    Directory of Open Access Journals (Sweden)

    Shah Nigam H

    2006-04-01

    Full Text Available Abstract Background Biological databases and pathway knowledgebases are proliferating rapidly. We are developing software tools for computer-aided hypothesis design and evaluation, and we would like our tools to take advantage of the information stored in these repositories. But before we can reliably use a pathway knowledgebase as a data source, we need to proofread it to ensure that it can fully support computer-aided information integration and inference. Results We design a series of logical tests to detect potential problems we might encounter using a particular knowledgebase, the Reactome database, with a particular computer-aided hypothesis evaluation tool, HyBrow. We develop an explicit formal language from the language implicit in the Reactome data format and specify a logic to evaluate models expressed using this language. We use the formalism of finite model theory in this work. We then use this logic to formulate tests for desirable properties (such as completeness, consistency, and well-formedness) for pathways stored in Reactome. We apply these tests to the publicly available Reactome releases (releases 10 through 14) and compare the results, which highlight Reactome's steady improvement in terms of decreasing inconsistencies. We also investigate and discuss Reactome's potential for supporting computer-aided inference tools. Conclusion The case study described in this work demonstrates that it is possible to use our model theory based approach to identify problems one might encounter using a knowledgebase to support hypothesis evaluation tools. The methodology we use is general and is in no way restricted to the specific knowledgebase employed in this case study. Future application of this methodology will enable us to compare pathway resources with respect to the generic properties such resources will need to possess if they are to support automated reasoning.
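The flavor of these logical tests can be sketched on a toy pathway record format (an assumed format, not Reactome's actual schema): well-formedness means every referenced species is defined, and a simple completeness check flags reaction inputs that no reaction produces and no external source supplies.

```python
# Sketch of logical well-formedness and completeness tests over a toy
# pathway record format. The dict schema here is illustrative only.
def check_pathway(reactions, species):
    """Return a list of problems: undefined species (well-formedness),
    and reaction inputs with no producing reaction or external source
    (a simple completeness check)."""
    problems = []
    produced = {s for r in reactions for s in r["outputs"]}
    for r in reactions:
        for s in r["inputs"] + r["outputs"]:
            if s not in species:
                problems.append(f"{r['id']}: undefined species {s!r}")
        for s in r["inputs"]:
            if s not in produced and not species.get(s, {}).get("external"):
                problems.append(f"{r['id']}: input {s!r} has no source")
    return problems

species = {"A": {"external": True}, "B": {}, "C": {}}
reactions = [
    {"id": "R1", "inputs": ["A"], "outputs": ["B"]},
    {"id": "R2", "inputs": ["B", "X"], "outputs": ["C"]},  # 'X' is undefined
]
print(check_pathway(reactions, species))
```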

  3. A case study in pathway knowledgebase verification.

    Science.gov (United States)

    Racunas, Stephen A; Shah, Nigam H; Fedoroff, Nina V

    2006-04-08

    Biological databases and pathway knowledge-bases are proliferating rapidly. We are developing software tools for computer-aided hypothesis design and evaluation, and we would like our tools to take advantage of the information stored in these repositories. But before we can reliably use a pathway knowledge-base as a data source, we need to proofread it to ensure that it can fully support computer-aided information integration and inference. We design a series of logical tests to detect potential problems we might encounter using a particular knowledge-base, the Reactome database, with a particular computer-aided hypothesis evaluation tool, HyBrow. We develop an explicit formal language from the language implicit in the Reactome data format and specify a logic to evaluate models expressed using this language. We use the formalism of finite model theory in this work. We then use this logic to formulate tests for desirable properties (such as completeness, consistency, and well-formedness) for pathways stored in Reactome. We apply these tests to the publicly available Reactome releases (releases 10 through 14) and compare the results, which highlight Reactome's steady improvement in terms of decreasing inconsistencies. We also investigate and discuss Reactome's potential for supporting computer-aided inference tools. The case study described in this work demonstrates that it is possible to use our model theory based approach to identify problems one might encounter using a knowledge-base to support hypothesis evaluation tools. The methodology we use is general and is in no way restricted to the specific knowledge-base employed in this case study. Future application of this methodology will enable us to compare pathway resources with respect to the generic properties such resources will need to possess if they are to support automated reasoning.

  4. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification analysis techniques, it is necessary to develop plant models in order to describe the plant behavior of those systems. However, developing a plant model implies that the designer makes decisions concerning the granularity and level of abstraction of the models, the modeling approach to consider (global or modular), and the definition of strategies for simulation and formal verification tasks. This paper intends to highlight some aspects that can be taken into account when making those decisions. For this purpose, a case study is presented, and very important aspects concerning the issues exposed above are illustrated and discussed.

  5. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    In this paper, we combine formal modeling and analysis of infrastructures of organizations with sociological explanation to provide a framework for insider threat analysis. We use the higher order logic (HOL) proof assistant Isabelle/HOL to support this framework. In the formal model, we exhibit and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider theory constructed in Isabelle that implements this process of social explanation. To validate that the social explanation is generally useful for the analysis of insider threats and to demonstrate our framework, we model and verify the insider threat patterns of Entitled Independent and Ambitious Leader.

  6. Extension and validation of an analytical model for in vivo PET verification of proton therapy--a phantom and clinical study

    NARCIS (Netherlands)

    Attanasi, F; Knopf, A; Parodi, K.; Paganetti, Harald; Bortfeld, Thomas; Rosso, V; Del Guerra, Alberto

    2011-01-01

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner

  7. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. Besides, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
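As a sketch of this post-processing idea (ordinary rather than extended logistic regression, on synthetic data instead of radar-calibrated observations and NWP predictors), one can fit a probability-of-precipitation curve to a single spatial predictor by gradient descent:

```python
# Toy logistic-regression post-processing: fit P(rain) = sigmoid(a + b*x)
# to synthetic (predictor, observation) pairs by batch gradient descent.
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += (p - y) / n          # gradient of log-loss w.r.t. intercept
            gb += (p - y) * x / n      # gradient w.r.t. slope
        a -= lr * ga
        b -= lr * gb
    return a, b

rng = random.Random(0)
# Predictor: e.g. areal model precipitation around a station; outcome: rain yes/no.
xs = [rng.uniform(0, 5) for _ in range(400)]
ys = [1 if rng.random() < 1 / (1 + math.exp(-(x - 2.5))) else 0 for x in xs]
a, b = fit_logistic(xs, ys)
prob = lambda x: 1 / (1 + math.exp(-(a + b * x)))
print(round(prob(0.5), 2), round(prob(4.5), 2))
```

The extended (ELR) variant adds the precipitation threshold itself as a predictor so a single equation covers all predictand categories.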

  8. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units consider sulfate attack on concrete and utilize degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.
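As an illustration of the kind of back-of-envelope check such a verification can start from: a diffusion-limited attack front advances roughly as the square root of time. The effective diffusivity below is a hypothetical value for illustration only, not the SRNL characterization data or the CBP simulation result.

```python
# Diffusion-limited penetration depth scales as sqrt(D_eff * t).
# D_EFF is an assumed illustrative value, not measured concrete data.
import math

def penetration_depth_cm(d_eff_cm2_s, years):
    seconds = years * 365.25 * 24 * 3600.0
    return math.sqrt(d_eff_cm2_s * seconds)

D_EFF = 1.0e-12  # cm^2/s, hypothetical effective diffusivity
for t in (100, 1000, 10000):
    print(t, "years ->", round(penetration_depth_cm(D_EFF, t), 3), "cm")
```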

  9. The bakery protocol: a comparative case-study in formal verification

    NARCIS (Netherlands)

    W.O.D. Griffioen; H.P. Korver

    1995-01-01

    A comparative case-study in formal verification: Groote and the second author verified (a version of) the Bakery Protocol in μCRL. Their process-algebraic verification is rather complex compared to the protocol. Now the question is: how do other verification techniques perform on this

  10. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually manually constructed by analysts, a review by other analysts is widely used for verifying constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, the modeling of a system for a system reliability and unavailability analysis becomes very intuitive and easy. The main idea of the development of the RBDGG methodology is similar to that of the development of the RGGG (Reliability Graph with General Gates) methodology. The difference between the two is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. However, it is also known that an RGGG model can be converted to an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.
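The equivalence the record mentions can be illustrated numerically: for independent failures, a small fault tree evaluated gate by gate must give the same top-event probability as its equivalent reliability block diagram (an OR gate corresponds to blocks in series, an AND gate to blocks in parallel). A minimal sketch, with illustrative probabilities:

```python
# Cross-check: top-event probability of a fault tree vs. the unreliability
# of the equivalent reliability block diagram, assuming independence.
def or_gate(*ps):       # top event occurs if ANY input fails
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*ps):      # top event occurs only if ALL inputs fail
    q = 1.0
    for p in ps:
        q *= p
    return q

# Fault tree: TOP = OR(A, AND(B, C)), with independent failure probabilities
pA, pB, pC = 0.01, 0.1, 0.2
ft_top = or_gate(pA, and_gate(pB, pC))

# Equivalent RBD: block A in series with the parallel pair (B, C)
rbd_reliability = (1 - pA) * (1 - pB * pC)
rbd_unreliability = 1 - rbd_reliability
print(ft_top, rbd_unreliability)
```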

  11. Verification of rubidium-82 for heart studies

    International Nuclear Information System (INIS)

    Budinger, T.F.; Yano, Y.; Twitchell, J.A.; Brennan, K.M.

    1985-01-01

    Whereas 82Rb has been shown to reflect heart blood flow under normal circumstances and has the great benefit of being available from a noncyclotron source, there remains a question with regard to the physiology of rubidium transport into the heart muscle. The fraction of the rubidium tracer that goes into the heart varies with flow, and the amount that accumulates in the muscle will therefore, unfortunately, not be proportional to flow. Over the past three years, the authors have re-evaluated this question and determined that the uptake of rubidium in the myocardium follows a simple model of conservation of mass wherein the amount that is present is equal to the product of flow times extraction.
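The flow-dependence of extraction can be illustrated with the Renkin-Crone form E = 1 - exp(-PS/F), a standard textbook relation (the PS value below is an assumed illustrative number, not a measured one): uptake F·E then grows sublinearly with flow, which is exactly why accumulation is not proportional to flow.

```python
# Renkin-Crone sketch: extraction falls as flow rises, so uptake = flow *
# extraction is sublinear in flow. PS is a hypothetical illustrative value.
import math

PS = 1.0  # permeability-surface-area product, mL/min/g (assumed)

def extraction(flow):
    return 1.0 - math.exp(-PS / flow)

def uptake(flow):
    return flow * extraction(flow)

for f in (0.5, 1.0, 2.0, 4.0):
    print(f, round(extraction(f), 3), round(uptake(f), 3))
```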

  12. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Full Text Available Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple and highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where the threat of toxic contamination with the hydraulic fluid contents must be avoided, for example, the food or pharmacy industries. A solution here can be a Pneumatic Adaptive Absorber (PAA), which is characterized by a high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be defined as polytropic, characterized by constant-in-time thermodynamic coefficients. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.
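A minimal sketch of the modelling idea, assuming an isothermal ideal gas and a linearized valve law (the paper's actual model updates thermodynamic coefficients during the analysis; all values below are illustrative):

```python
# Two gas chambers coupled by a controllable valve in the piston:
# ideal-gas pressures, mass migration driven by the pressure difference,
# explicit Euler integration. Values are illustrative, not identified.
R, T = 287.0, 293.0          # J/(kg K), K (isothermal sketch)
V1, V2 = 1e-4, 1e-4          # chamber volumes, m^3
m1, m2 = 2e-4, 1e-4          # gas masses, kg (chamber 1 starts at higher p)
C_VALVE = 1e-9               # valve flow coefficient (the controllable input)
DT = 1e-4                    # time step, s

for _ in range(5000):        # simulate 0.5 s
    p1, p2 = m1 * R * T / V1, m2 * R * T / V2
    mdot = C_VALVE * (p1 - p2)        # linearized valve mass flow, kg/s
    m1 -= mdot * DT
    m2 += mdot * DT

p1, p2 = m1 * R * T / V1, m2 * R * T / V2
print(round(p1), round(p2))           # pressures relaxing toward each other
```

Opening the valve (larger C_VALVE) speeds the pressure equalization, which is the adaptive dissipation mechanism.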

  13. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  14. 3D Modelling with Photogrammetry by UAVs and Model Quality Verification

    Science.gov (United States)

    Barrile, V.; Bilotta, G.; Nunnari, A.

    2017-11-01

    This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria) concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years, this area was affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images was acquired through a UAV equipped with a digital camera and GPS. Subsequently, processing for the production of a 3D georeferenced model was performed using the commercial software Agisoft PhotoScan. Similarly, the use of a terrestrial laser scanning technique allowed the production of dense clouds and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between image- and range-based methods was performed.

  15. Development and verification of an agent-based model of opinion leadership.

    Science.gov (United States)

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
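The parameter-sweep verification step can be sketched with a deliberately tiny stand-in model (attribute names and values are illustrative, not those of the published model): sweep the credibility parameter and check that the rate of provided opinions responds as posited.

```python
# Minimal seek/provide opinion model with a parameter sweep over source
# credibility. The model is a toy stand-in used to illustrate verification
# by sweeping an input parameter and inspecting the output.
import random

def run_unit(credibility, n_nurses=20, steps=200, seed=1):
    rng = random.Random(seed)
    provided = 0
    for _ in range(steps):
        # Pick an uncertain seeker and a potential opinion source
        # (the pair is chosen only to mimic the interaction structure).
        seeker, source = rng.sample(range(n_nurses), 2)
        # The source provides an opinion only if perceived as credible enough.
        if rng.random() < credibility:
            provided += 1
    return provided / steps

sweep = {c: run_unit(c) for c in (0.2, 0.5, 0.8)}
print(sweep)
```

With a fixed seed, the same random stream is replayed for each parameter value, so the sweep isolates the effect of credibility, mirroring the systematic-adjustment idea in the record.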

  16. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
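A fine-grained unit test of a single numerical kernel, of the kind advocated here, might look like the following sketch: test a centered finite difference against an analytic derivative, with a tolerance chosen from the scheme's truncation error, instead of inspecting output from a full model run.

```python
# Unit test of one numerical kernel: a second-order centered difference,
# checked against the known analytic derivative of sin(x).
import math

def ddx_centered(f, x, h=1e-5):
    """Second-order centered-difference approximation of df/dx."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def test_ddx_on_sine():
    # d/dx sin(x) = cos(x); the tolerance reflects the O(h^2) truncation
    # error plus floating-point roundoff.
    for x in (0.0, 0.5, 1.3):
        err = abs(ddx_centered(math.sin, x) - math.cos(x))
        assert err < 1e-8, (x, err)

test_ddx_on_sine()
print("unit test passed")
```

Such tests run in milliseconds, so defects in individual kernels can be isolated long before a full simulation is assembled.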

  17. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Abstract Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are:
    - whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables
    - whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables
    - whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity)
    - how fast a convergence to a limit value takes place (convergence speed)
    - whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle)
    Such properties found in an analytic mathematical manner can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
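As a concrete sketch of this verification-by-mathematical-analysis, take a common Hebbian learning rule with decay (a textbook form used here as an illustration; parameter values are assumptions): the stationary point can be derived by hand and then checked against a simulation.

```python
# Hebbian growth with decay:  dw/dt = eta*a1*a2*(1 - w) - zeta*w.
# Setting dw/dt = 0 gives the equilibrium w* = eta*a1*a2 / (eta*a1*a2 + zeta),
# which the Euler simulation below should converge to.
eta, zeta = 0.4, 0.1
a1, a2 = 0.9, 0.8                # constant activation levels (illustrative)
hebb = eta * a1 * a2
w_star = hebb / (hebb + zeta)    # analytic equilibrium

w, dt = 0.0, 0.01
for _ in range(20000):           # integrate to t = 200, well past transients
    w += dt * (hebb * (1.0 - w) - zeta * w)
print(round(w, 6), round(w_star, 6))
```

If the simulated value failed to match the analytic equilibrium, that would point to an implementation error, which is precisely the verification use described above.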

  18. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
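A minimal sketch of the integrity/authenticity layer, with an HMAC standing in for the PKI digital signature and the trusted time-stamp (a real PKIDEV-style deployment would use asymmetric signatures and a time-stamping authority; the key and data below are placeholders):

```python
# Seal digital evidence by hashing it, binding the digest to a timestamp,
# and authenticating the record. HMAC here is a stand-in for a PKI signature.
import hashlib
import hmac

SECRET = b"examiner-key"  # placeholder for the examiner's private key

def seal_evidence(data: bytes, timestamp: str) -> dict:
    digest = hashlib.sha256(data).hexdigest()
    record = f"{digest}|{timestamp}".encode()
    return {"digest": digest, "timestamp": timestamp,
            "tag": hmac.new(SECRET, record, hashlib.sha256).hexdigest()}

def verify_evidence(data: bytes, sealed: dict) -> bool:
    digest = hashlib.sha256(data).hexdigest()
    record = f"{digest}|{sealed['timestamp']}".encode()
    expected = hmac.new(SECRET, record, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])

evidence = b"disk image bytes ..."
sealed = seal_evidence(evidence, "2008-06-01T12:00:00Z")
print(verify_evidence(evidence, sealed), verify_evidence(b"tampered", sealed))
```

Binding the timestamp into the authenticated record is what prevents after-the-fact substitution of either the evidence or its capture time.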

  19. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. 
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  20. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They have an extremely complex structure, defined by a slow, laminar porous medium and small fissures, together with usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high heterogeneity of karst, full hydraulic (distributive) models have been developed, exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure; these improve our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of knowledge of the heterogeneity (especially the conduits), the resolution of different spatial/temporal scales, the connectivity between matrix and conduits, the setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and drainage plastic pipes (1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as the conduit locations and geometry. Moreover, we know the geometry of the conduit perforations, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge flow rates from both phases can be measured very accurately. These possibilities are not present at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  1. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used more and more actively to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting the river during the passage of a flood. Calculations were performed by means of the domestic software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model used. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of a hydrodynamic model based on a comparison of the actual flooded area, which in turn is determined on the basis of automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the North Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower limits of the model, respectively, it is possible to calculate the flooded area by means of the program STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
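The unsupervised water/land classification step can be sketched as a tiny 1-D K-means over pixel brightness (synthetic values standing in for actual SPOT-5 or Radarsat-2 imagery): dark water pixels and bright land pixels separate into two clusters, and the water cluster gives the flooded-area mask.

```python
# 1-D K-means with two clusters over pixel brightness, as a stand-in for
# the unsupervised flood-mask classification. Pixel values are synthetic.
def kmeans_1d(values, iters=20):
    c_lo, c_hi = min(values), max(values)   # two initial cluster centers
    for _ in range(iters):
        lo = [v for v in values if abs(v - c_lo) <= abs(v - c_hi)]
        hi = [v for v in values if abs(v - c_lo) > abs(v - c_hi)]
        c_lo, c_hi = sum(lo) / len(lo), sum(hi) / len(hi)
    return c_lo, c_hi

# Dark water pixels around 30, bright land pixels around 180
pixels = [28, 30, 33, 29, 31, 175, 182, 178, 185, 180]
c_water, c_land = kmeans_1d(pixels)
flooded = [v for v in pixels if abs(v - c_water) <= abs(v - c_land)]
print(round(c_water, 1), round(c_land, 1), len(flooded))
```

Counting the pixels in the water cluster and multiplying by the pixel footprint gives the flooded area that is then compared with the model output.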

  2. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.
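One of the tomographic-reconstruction families such a framework typically evaluates can be illustrated with a minimal maximum-likelihood expectation maximization (MLEM) loop; the 2-pixel, 3-detector system matrix and noiseless counts below are purely hypothetical, not the GET instrument model:

```python
# Sketch of MLEM reconstruction for emission data on a toy system.
# The system matrix A, "true" activities, and counts are illustrative.

A = [[1.0, 0.2],   # detector 0 sensitivity to pixels 0 and 1
     [0.5, 0.5],
     [0.2, 1.0]]
true_x = [4.0, 2.0]                       # hypothetical pixel activities
y = [sum(a * t for a, t in zip(row, true_x)) for row in A]  # noiseless counts

x = [1.0, 1.0]                            # flat initial estimate
for _ in range(200):                      # MLEM update: x *= A^T(y/Ax) / A^T 1
    proj = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    for j in range(len(x)):
        num = sum(A[i][j] * y[i] / proj[i] for i in range(len(y)))
        den = sum(A[i][j] for i in range(len(y)))
        x[j] *= num / den
```

With consistent, noiseless data the iterates approach the true activities; real spent-fuel data would add noise, attenuation, and far larger image and detector spaces.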

  3. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  4. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Science.gov (United States)

    Chukbar, B. K.

    2015-12-01

Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT LASER TOUCH AND TECHNOLOGIES, LLC LASER TOUCH MODEL LT-B512

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Laser Touch model LT-B512 targeting device manufactured by Laser Touch and Technologies, LLC, for manual spray painting operations. The relative transfer efficiency (TE) improved an avera...

  6. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    Energy Technology Data Exchange (ETDEWEB)

    Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Tachibana, R; Uchida, Y [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Sugawara, Y; Kotabe, K [National Center for Global Health and Medicine, Shinjuku, Tokyo (Japan); Kamima, T [Cancer Institute Hospital Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Ishibashi, S [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using an individual linac's beam data. Methods: Six institutions participated, and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three beam data sets were registered to the independent verification software program at each institute. Subsequently, patient plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated with the three beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation of the Model-GBD-based and the All-GBD-based calculations was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In plans with a variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
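A minimal sketch of the statistics behind the reported confidence limit, assuming made-up per-plan percent deviations and the common |mean| + 2SD convention (both are assumptions for illustration):

```python
# Sketch: per-plan percent deviations between secondary-check doses
# computed with golden beam data and with an institution's own beam data.
# The sample values are invented; the confidence limit uses |mean| + 2SD.
from statistics import mean, stdev

deviations = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, -0.3, 0.1, 0.0, -0.1]  # percent

mu = mean(deviations)
sd = stdev(deviations)                # sample standard deviation
confidence_limit = abs(mu) + 2 * sd   # percent
```

A plan whose deviation falls outside the confidence limit would be flagged for manual review in such a secondary-check workflow.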

  7. Extended verification of the model of dynamic near-surface layer of the atmosphere

    Science.gov (United States)

    Polnikov, V. G.

    2013-07-01

    This paper formulates the most general principles for verifying models of the dynamic near-water layer of the atmosphere (DNWLA) and performs an advanced verification of the model proposed by the author earlier [6]. Based on empirical wave spectra from the studies by Donelan [15], Elfouhaily [14], and Kudryavtsev [13] and well-known empirical laws describing the wave-age dependence of the friction coefficient, we adjusted the original version of the model. It was shown that the improvement of model reliability is most dependent on the adequacy of the parameterization of the tangential portion of the total momentum flux to the wavy surface. Then the new version of the model was verified on the basis of field data from two different groups of authors. It was found that the new version of the model is consistent with empirical data with an error not exceeding the measurement error of near-water layer parameters.

  8. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

The 2002 modeling of the Amchitka underground nuclear tests is verified, and the uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input-parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions.
Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin, where the high-resolution bathymetric data collected by CRESP
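The Bayesian MCMC conditioning described in this record can be sketched with a random-walk Metropolis sampler for a single hypothetical parameter; the prior, likelihood, data values, and step size below are illustrative assumptions, not the Amchitka model:

```python
# Sketch of random-walk Metropolis MCMC: sample the posterior of one model
# parameter (say, a porosity-like quantity) given noisy "observations".
# Prior N(0.25, 0.1), Gaussian likelihood with sigma = 0.02 — all invented.
import math
import random

random.seed(1)
data = [0.21, 0.19, 0.23, 0.18, 0.20]   # hypothetical observations
sigma = 0.02                             # assumed measurement error

def log_posterior(theta):
    log_prior = -0.5 * ((theta - 0.25) / 0.1) ** 2
    log_like = sum(-0.5 * ((d - theta) / sigma) ** 2 for d in data)
    return log_prior + log_like

theta, samples = 0.25, []
for _ in range(5000):
    prop = theta + random.gauss(0.0, 0.01)                  # random-walk proposal
    if math.log(random.random()) < log_posterior(prop) - log_posterior(theta):
        theta = prop                                        # accept
    samples.append(theta)

posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # drop burn-in
```

The chain "propagates uncertainty backward from data onto parameters": the posterior mean is pulled from the prior value toward the data, with a spread narrower than either alone.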

  9. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-12-10

…built a software development environment used to support the development and verification of on-board software for spacecraft. Tanizaki et al. … Technologies, Piscataway, 2011. [16] Hiroaki Tanizaki, Toshiaki Aoki, and Takuya Katayama, "A Variability Management Method for Software…

  10. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

ABSTRACT: Improving validation and verification (V&V) has been one of the most important tasks of the century. However, we have few measures or management interventions to make such improvement possible, and it is difficult to identify patterns that should be followed by developers because systems and processes in an…

  11. Evaluation and verification of thermal stratification models for was

    African Journals Online (AJOL)

    USER

…form bacteria and verification. INTRODUCTION … bottom. The thermal stratification can be stable, persisting for months, or intermittent, appearing for a few hours in the day (Dor et al., 1993; Pedahzur et al., 1993; Torres et al., 1997). In waste stabilization ponds the … water depth of 0.2 and a thick sludge deposit. The WSP …

  12. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in calculating the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters, and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to the appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five methods presented, one, Gaussian mixture reduction as proposed by Runnalls, easily outperformed the others. This method iteratively reduces the size of a GMM by successively merging pairs of component densities; pairs are selected for merger using a Kullback-Leibler-based metric. Using Runnalls' method of reduction, we
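Runnalls' reduction, as summarized above, can be sketched in one dimension: a moment-preserving merge of two weighted Gaussians plus a KL-based merge cost (the published method handles the general multivariate case; the weights, means, and variances below are illustrative):

```python
# Sketch of the moment-preserving merge and Runnalls' KL-based cost for
# 1-D Gaussian mixture components.
import math

def merge(w1, m1, v1, w2, m2, v2):
    """Merge two weighted 1-D Gaussians, preserving mixture moments."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def runnalls_cost(w1, m1, v1, w2, m2, v2):
    """Upper bound on the KL discrepancy incurred by merging the pair."""
    w, _, v = merge(w1, m1, v1, w2, m2, v2)
    return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

# Merging two nearby components is much cheaper than merging distant ones,
# so greedy min-cost merging preserves the mixture's shape.
near = runnalls_cost(0.5, 0.0, 1.0, 0.5, 0.1, 1.0)
far  = runnalls_cost(0.5, 0.0, 1.0, 0.5, 5.0, 1.0)
```

A full reduction would repeatedly evaluate this cost over all component pairs and merge the cheapest pair until the target hash-GMM size is reached.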

  13. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
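A minimal sketch of the Hotelling observer described above, on a hypothetical 2-bin detector with made-up class means and covariance, scored by the area under the ROC curve (AUC):

```python
# Sketch: the Hotelling template w = S^-1 (mean1 - mean0) is applied to each
# binned measurement; the resulting scalar test statistic is thresholded,
# and performance is summarized by the AUC. All numbers are illustrative.
import random

random.seed(0)
mean0, mean1 = [1.0, 2.0], [2.0, 3.0]            # class-mean detector data
S = [[0.5, 0.1], [0.1, 0.5]]                     # assumed shared covariance

det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
Sinv = [[S[1][1] / det, -S[0][1] / det],
        [-S[1][0] / det, S[0][0] / det]]
d = [mean1[0] - mean0[0], mean1[1] - mean0[1]]
w = [Sinv[0][0] * d[0] + Sinv[0][1] * d[1],
     Sinv[1][0] * d[0] + Sinv[1][1] * d[1]]      # Hotelling template

def draw(mean):                                  # crude independent-noise sample
    return [random.gauss(mean[0], 0.7), random.gauss(mean[1], 0.7)]

t0 = [w[0] * g[0] + w[1] * g[1] for g in (draw(mean0) for _ in range(500))]
t1 = [w[0] * g[0] + w[1] * g[1] for g in (draw(mean1) for _ in range(500))]

# AUC = probability a class-1 statistic exceeds a class-0 statistic.
auc = sum(a > b for a in t1 for b in t0) / (len(t0) * len(t1))
```

The channelized variant in the paper inserts a channelizing matrix before this template, trading a little AUC for dimensionality reduction and information control.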

  14. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

Waterway and canal systems are particularly cost-effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level, multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for performance assessment and formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.
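The kind of formal verification applied to such a lock-system case study can be sketched as explicit-state reachability checking of a toy lock model; the state encoding, transition rules, and safety property below are simplifications for illustration, not the paper's actual model:

```python
# Sketch: exhaustively explore the reachable states of a toy one-chamber
# lock and check the safety property "both gates are never open at once".
from collections import deque

# State: (upper_gate_open, lower_gate_open, chamber_level),
# with chamber_level in {"low", "high"}.
def successors(state):
    upper, lower, level = state
    nxt = []
    if not lower and level == "high":
        nxt.append((True, lower, level))          # open upper gate
    if not upper and level == "low":
        nxt.append((upper, True, level))          # open lower gate
    if upper:
        nxt.append((False, lower, level))         # close upper gate
    if lower:
        nxt.append((upper, False, level))         # close lower gate
    if not upper and not lower:                   # fill or drain sealed chamber
        nxt.append((upper, lower, "high" if level == "low" else "low"))
    return nxt

def check_safety(initial):
    """Breadth-first search of the state space; False if an unsafe
    state (both gates open) is reachable."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if s[0] and s[1]:
            return False
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True

safe = check_safety((False, False, "low"))
```

Industrial model checkers perform essentially this search (with far richer state and temporal-logic properties) over the formal models the methodology produces.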

  15. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  16. Demonstrations and verification of debris bed models in the MEDICI reactor cavity code

    International Nuclear Information System (INIS)

    Trebilcock, W.R.; Bergeron, K.D.; Gorham-Bergeron, E.D.

    1984-01-01

    The MEDICI reactor cavity model is under development at Sandia National Laboratories to provide a simple but realistic treatment of ex-vessel severe accident phenomena. Several demonstration cases have been run and are discussed as illustrations of the model's capabilities. Verification of the model with experiments has supplied confidence in the model

  17. Compensation methods to support generic graph editing: A case study in automated verification of schema requirements for an advanced transaction model

    NARCIS (Netherlands)

    Even, S.J.; Spelt, D.

    Compensation plays an important role in advanced transaction models, cooperative work, and workflow systems. However, compensation operations are often simply written as a^−1 in transaction model literature. This notation ignores any operation parameters, results, and side effects. A schema designer

  18. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold/fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. This means that some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  19. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  20. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the finite volume method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to verify that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside a top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the top wall in the top-wall-driven square cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: results were compared with the numerical results of Ghia et al. (1982), and good agreement was found. Next, CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection; good agreement was found with the experimental results of Khalili et al. (2008). After the ability of the CFD solver was proved through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
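The grid-convergence portion of such a solution-verification study can be sketched as follows; the three solution values and the refinement ratio are illustrative, and the GCI form follows Roache's three-grid procedure:

```python
# Sketch: observed order of accuracy from three systematically refined
# grids, and the Grid Convergence Index (GCI) for the fine-grid pair.
# Solution values f1 (fine), f2 (medium), f3 (coarse) are made up.
import math

f1, f2, f3 = 1.00, 1.02, 1.08   # fine, medium, coarse grid solutions
r = 2.0                         # grid refinement ratio
Fs = 1.25                       # safety factor for three-grid studies

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order
e21 = abs((f2 - f1) / f1)                            # relative error, fine pair
gci_fine = Fs * e21 / (r ** p - 1.0)                 # fractional uncertainty
```

A small GCI on the finest pair, together with an observed order close to the scheme's formal order, is the usual evidence that the solution is in the asymptotic range.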

  1. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
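Classical Guyan Reduction, the method that MGR and HR refine, can be sketched on a toy 3-DOF stiffness matrix; the partitioning and values below are illustrative, not from any SLS model:

```python
# Sketch of Classical Guyan (static) Reduction: slave DOFs are condensed
# out with the transformation T = [I; -Kss^-1 Ksm], giving K_red = T^T K T.
# For symmetric K this equals the Schur complement Kmm - Kms Kss^-1 Ksm.

K = [[ 2.0, -1.0,  0.0],      # DOFs 0,1 = masters, DOF 2 = slave
     [-1.0,  2.0, -1.0],
     [ 0.0, -1.0,  1.0]]

Kmm = [[K[0][0], K[0][1]], [K[1][0], K[1][1]]]
Kms = [K[0][2], K[1][2]]      # coupling column, masters-to-slave
Kss = K[2][2]                 # single slave DOF, so a scalar here

# Static condensation: phi_s = -Kss^-1 * Ksm * phi_m (slave row of T).
t = [-Kms[0] / Kss, -Kms[1] / Kss]
K_red = [[Kmm[i][j] + Kms[i] * t[j] for j in range(2)] for i in range(2)]
```

The reduced 2x2 matrix reproduces the static stiffness seen at the master DOFs exactly; the dynamic inaccuracy of this purely static transformation is what motivates the MGR and HR corrections discussed above.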

  2. Validation, Verification, and Testing Techniques Throughout the Life Cycle of a Simulation Study

    OpenAIRE

    1994-01-01

    Life cycle validation, verification, and testing (VV&T) is extremely important for the success of a simulation study. This paper surveys current software VV&T techniques and current simulation model VV&T techniques and describes how they can all be applied throughout the life cycle of a simulation study. The processes and credibility assessment stages of the life cycle are described and the applicability of the VV&T techniques for each stage is stated. A glossary is provided to explicitly ...

  3. Channel Verification Results for the SCME models in a Multi-Probe Based MIMO OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; S. Ashta, Jagjit

    2013-01-01

…where the focus is on comparing results from various proposed methods. Channel model verification is necessary to ensure that the target channel models are correctly implemented inside the test area. This paper shows that all the key parameters of the SCME models, i.e., power delay profile, temporal…

  4. Efficiency of material accountability verification procedures: A case study

    International Nuclear Information System (INIS)

    Avenhaus, R.

    1976-01-01

    In the model agreement INFCIRC/153 the international nuclear materials safeguards system has been established such that the material accountability principle is the main safeguards tool, with containment and surveillance as complementary measures. In addition, it has been agreed that the plant operator generates all data necessary for the material balance establishment and reports them to the safeguards authority and furthermore, that these data are verified by representatives of the safeguards authority with the help of independent measurements. In this paper, the problem of the determination of the efficiency of the combined system - data verification and material balance establishment - is analysed. Here, the difficulty arises that the two statistical procedures used are not independent because part of the operator's data are used in both cases. It is the purpose of this paper to work out the procedure for calculating the systems efficiency, i.e. the overall guaranteed probability of detection for the whole system for an assumed diversion and a given false alarm rate as a function of the safeguards effort spent over a given interval of time. Simplified formulae are derived which allow for a quick determination of the whole system efficiency: it is shown that the correlation between the two parts of the total system can be neglected. Therefore, the total systems efficiency can be represented as the product of the efficiencies of the two subsystems. The method developed is applied to a concrete case of a chemical reprocessing plant for irradiated fuels on the basis of data collected earlier. (author)
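The product structure of the combined efficiency noted above can be sketched numerically; the one-sided normal tests, diversion amount, measurement sigmas, and false-alarm rates below are illustrative assumptions, not the paper's plant data:

```python
# Sketch: detection probabilities of two independent subsystem tests
# (data verification and material balance), combined via the product of
# non-detection probabilities. All numbers are invented for illustration.
import math

def phi(x):                                   # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi_inv(p, lo=-10.0, hi=10.0):            # inverse CDF by bisection
    for _ in range(80):
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if phi(mid) < p else (lo, mid)
    return (lo + hi) / 2.0

def detection_prob(diversion, sigma, alpha):
    """One-sided normal test: P(detect) for a given diverted amount,
    test standard deviation, and false-alarm rate alpha."""
    return 1.0 - phi(phi_inv(1.0 - alpha) - diversion / sigma)

p_balance = detection_prob(diversion=8.0, sigma=3.0, alpha=0.025)
p_verify  = detection_prob(diversion=8.0, sigma=5.0, alpha=0.025)
p_total   = 1.0 - (1.0 - p_balance) * (1.0 - p_verify)   # independence assumed
```

Because the correlation between the two subsystems is negligible (the paper's conclusion), the factorized combination is a good approximation of the overall guaranteed detection probability.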

  5. Evaluation and Verification of Thermal Stratification Models for ...

    African Journals Online (AJOL)

    Stratification is a usual phenomenon occurring in waste stabilization ponds which needs to be incorporated in prediction models. The occurrence of stratification in waste stabilization pond (WSP) alters the flow pattern of the pond. Hence, its study is important for complete and accurate modeling of the WSP. In this study, two ...

  6. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    International Nuclear Information System (INIS)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.; Branney, Sean; McDonald, Benjamin S.; Webster, Jennifer B.; Zalavadia, Mital A.; Todd, Lindsay C.; Kulisek, Jonathan A.; Nordquist, Heather; Deshmukh, Nikhil S.; Stewart, Scott

    2016-01-01

In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study
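The "NDA Fingerprint" idea can be sketched as a distance test between normalized count-rate signatures from two scans of the same cylinder; the channels, rates, distance form, and threshold below are all hypothetical, not the report's method:

```python
# Sketch: compare a cylinder's normalized count-rate signature against
# its stored reference with a chi-square-like distance. Invented data.

reference = [120.0, 340.0, 560.0, 210.0, 90.0]   # stored fingerprint (counts/s)
rescan    = [118.0, 345.0, 552.0, 214.0, 91.0]   # later verification scan

def normalize(spectrum):
    """Scale a spectrum to unit sum so absolute count rate cancels out."""
    total = sum(spectrum)
    return [x / total for x in spectrum]

def fingerprint_distance(a, b):
    na, nb = normalize(a), normalize(b)
    return sum((x - y) ** 2 / (x + y) for x, y in zip(na, nb))

THRESHOLD = 1e-3                                  # assumed acceptance threshold
consistent = fingerprint_distance(reference, rescan) < THRESHOLD
```

A rescan whose distance exceeds the threshold would indicate that the cylinder contents may have changed since the previous verification.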

  7. Stream flow simulation and verification in ungauged zones by coupling hydrological and hydrodynamic models: a case study of the Poyang Lake ungauged zone

    Directory of Open Access Journals (Sweden)

    L. Zhang

    2017-11-01

    To solve the problem of estimating and verifying stream flow without direct observation data, we estimated stream flow in ungauged zones by coupling a hydrological model with a hydrodynamic model, using the Poyang Lake basin as a test case. To simulate the stream flow of the ungauged zone, we built a Soil and Water Assessment Tool (SWAT) model for the entire catchment area covering the upstream gauged area and the ungauged zone, and then calibrated the SWAT model using the data in the gauged area. To verify the results, we built two hydrodynamic scenarios (the original and adjusted scenarios) for Poyang Lake using the Delft3D model. In the original scenario, the upstream boundary condition is the observed stream flow from the upstream gauged area, while in the adjusted scenario it is the sum of the observed stream flow from the gauged area and the simulated stream flow from the ungauged zone. The experimental results showed a stronger correlation and lower bias (R² = 0.81, PBIAS = 10.00 %) between the observed and simulated stream flow in the adjusted scenario than in the original scenario (R² = 0.77, PBIAS = 20.10 %), suggesting that the simulated stream flow of the ungauged zone is reasonable. Using this method, we estimated the stream flow of the Poyang Lake ungauged zone as 16.4 ± 6.2 billion m³ a⁻¹, representing ∼11.24 % of the annual total water yield of the entire watershed. Of the annual water yield, 70 % (11.48 billion m³ a⁻¹) is concentrated in the wet season, while 30 % (4.92 billion m³ a⁻¹) comes from the dry season. The ungauged stream flow significantly improves the water balance, with the closing error decreased by 13.48 billion m³ a⁻¹ (10.10 % of the total annual water resource) from 30.20 ± 9.1 billion m³ a⁻¹ (20.10 % of the total annual water resource) to 16.72 ± 8.53 billion m³ a⁻¹ (10.00 % of the total
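The scenario comparison above rests on two standard goodness-of-fit statistics. A minimal sketch of how they are commonly computed, assuming R² denotes the squared Pearson correlation and PBIAS the percent bias; the series passed in would be the observed and simulated stream flow:

```python
# Hedged sketch of the two goodness-of-fit metrics named in the abstract.

def r_squared(obs, sim):
    """Squared Pearson correlation between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_s = sum((s - ms) ** 2 for s in sim)
    return cov * cov / (var_o * var_s)

def pbias(obs, sim):
    """Percent bias; positive values indicate the model underestimates."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```

For example, a simulation that is uniformly 10 % low relative to the observations gives `pbias = 10.0` and `r_squared = 1.0`, which is why both statistics are reported together.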

  8. BProVe: A formal verification framework for business process models

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    Business Process Modelling has acquired increasing relevance in software development. Available notations, such as BPMN, make it possible to describe the activities of complex organisations. On the one hand, this shortens the communication gap between domain experts and IT specialists; on the other hand, it clarifies the characteristics of software systems introduced to provide automatic support for such activities. Nevertheless, the lack of formal semantics hinders the automatic verification of relevant properties. This paper presents a novel verification framework for BPMN 2.0, called BProVe. It is based on an operational semantics, implemented using MAUDE, devised to make the verification general and effective. A complete tool chain, based on the Eclipse modelling environment, allows for rigorous modelling and analysis of Business Processes. The approach has been validated using more than one...

  9. Verification study of the FORE-2M nuclear/thermal-hydraulic analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

    The verification of the LMFBR core transient performance code, FORE-2M, was performed in two steps. Different components of the computation (individual models) were verified by comparing with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). For verification of the integral computation method of the code, experimental data in TREAT, SEFOR and natural circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained for both of these steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural circulation experiments. (orig.)

  10. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate these uncertainties through the model so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These models are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification
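Bayesian model calibration of the kind discussed above is often carried out with Markov chain Monte Carlo sampling. A minimal Metropolis sketch follows; the linear "model" `theta * x`, the flat prior, the noise level, and the data are illustrative placeholders, not the HIV or heat models of the dissertation:

```python
import math
import random

# Hedged sketch: Metropolis sampling for Bayesian calibration of a single
# model parameter theta, with a flat prior and Gaussian likelihood.
# Model and data are illustrative (roughly y = 2x plus noise).

def log_posterior(theta, xs, ys, sigma=0.5):
    # Flat prior, so the log posterior is the Gaussian log likelihood
    # (up to an additive constant) around model(x) = theta * x.
    return -sum((y - theta * x) ** 2 for x, y in zip(xs, ys)) / (2 * sigma ** 2)

def metropolis(xs, ys, n_samples=5000, step=0.2, seed=1):
    random.seed(seed)
    theta = 0.0
    lp = log_posterior(theta, xs, ys)
    samples = []
    for _ in range(n_samples):
        cand = theta + random.gauss(0.0, step)       # random-walk proposal
        lp_cand = log_posterior(cand, xs, ys)
        if math.log(random.random()) < lp_cand - lp:  # accept/reject
            theta, lp = cand, lp_cand
        samples.append(theta)
    return samples

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
post = metropolis(xs, ys)
mean = sum(post[1000:]) / len(post[1000:])  # posterior mean after burn-in
```

Verification in the sense of the dissertation would then check, e.g., that the sampled posterior concentrates around the parameter value that generated synthetic data.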

  11. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    Science.gov (United States)

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, have not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709

  12. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw the attention to the need for the verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed the paper presents models for internal freeze/thaw damage (internal cracking including...

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  14. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

    Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes due to cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (present: Japan Atomic Energy Agency), uses a variable puff trajectory model to calculate atmospheric transport and dispersion of released radionuclides. In order to investigate uncertainties associated with the structure of the atmospheric dispersion/deposition model in OSCAAR, we introduced more sophisticated computer codes, the regional meteorological model RAMS and the atmospheric transport model HYPACT, developed by Colorado State University, and performed comparative analyses between OSCAAR and RAMS/HYPACT. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long-term atmospheric diffusion experiments carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments showed relatively good agreement, and the performance of OSCAAR proved comparable to that of RAMS/HYPACT. (author)
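Straight-line plume models of the kind contrasted with OSCAAR's puff model are typically built on the Gaussian plume formula with ground reflection. A sketch follows; in practice the dispersion parameters σy and σz are functions of downwind distance and stability class, while here they are passed in directly as illustrative inputs:

```python
import math

# Hedged sketch of the ground-reflected Gaussian plume formula underlying
# simple straight-line dispersion models. All parameter values illustrative.

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration at crosswind offset y and height z for a source of
    strength Q (per unit time), wind speed u, and effective release height H.
    sigma_y, sigma_z: dispersion parameters at the downwind distance of interest."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))  # image source
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: ground-level centerline concentration for an elevated release.
centerline = plume_concentration(Q=1.0, u=2.0, y=0.0, z=0.0,
                                 H=20.0, sigma_y=30.0, sigma_z=15.0)
```

Puff-trajectory models such as OSCAAR's replace this single steady plume with a sequence of Gaussian puffs advected by time-varying wind fields, which is what the comparison above probes.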

  15. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to surface wave heights that are significant compared with mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to surface wave heights that are small compared with depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  16. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    At the Environmental Radioactivity Research Institute, verification and investigation of the wind velocity field forecast model EXPRESS-1 have been carried out since 1991. In fiscal year 1994, as part of the general analysis, the validity of the weather observation data, the local features of the wind field, and the validity of the positions of the monitoring stations were investigated. EXPRESS, which had previously used a 500 m mesh, was refined to a 250 m mesh, improvements in forecast accuracy were examined, and a comparison with another wind velocity field forecast model, SPEEDI, was carried out. The results showed places where the correlation with other measurement points is high and places where it is low, and it was found that forecast accuracy improves when the data from points with low correlation are excluded, or when simplified observation stations are installed to supply additional data. The outline of the investigation, the general analysis of the weather observation data, and the improvements to the wind velocity field forecast model and its forecast accuracy are reported. (K.I.)

  17. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using the decision procedure which is ba...

  18. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View Texas A&M]; DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    processes. These models were built on a numerical framework for solving conservation law problems in one-dimensional geometries such as spheres, cylinders, and lines. Coupled with the framework are specific models for adsorption in commercial adsorbents, such as zeolites and mordenites. Utilizing this modeling approach, the authors were able to accurately describe and predict adsorption kinetic data obtained from experiments at a variety of different temperatures and gas phase concentrations. A demonstration of how these models and the framework can be used to simulate adsorption in fixed-bed columns is provided. The CO2 absorption work involved modeling with supportive experimental information. A dynamic model was developed to simulate CO2 absorption using high alkaline content water solutions. The model is based upon transient mass and energy balances for chemical species commonly present in CO2 absorption. A computer code was developed to implement CO2 absorption with a chemical reaction model. Experiments were conducted in a laboratory-scale column to determine the model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. Continuing work could employ the model to control column operation and predict the absorption behavior under various input conditions and other prescribed experimental perturbations. The value of the validated models and numerical frameworks developed in this project is that they can be used to predict the sorption behavior of off-gas evolved during the reprocessing of nuclear waste and thus reduce the cost of the experiments. They can also be used to design sorption processes based on concentration limits and flow-rates determined at the plant level.
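The fixed-bed column simulations mentioned above couple one-dimensional transport with an adsorption kinetics model. A minimal explicit finite-difference sketch is given below; the advection-plus-linear-driving-force formulation, the linear isotherm, and all parameter values are illustrative assumptions, not the report's fitted models for zeolites or mordenites:

```python
# Hedged sketch: 1-D fixed-bed adsorption breakthrough, solved explicitly.
# Gas phase: dc/dt = -v dc/dz - uptake; solid phase: dq/dt = k (K c - q),
# a linear-driving-force model with linear isotherm q_eq = K c.

def breakthrough(n_cells=50, n_steps=4000, dt=0.01,
                 v=1.0, length=1.0, k=2.0, K=5.0, c_in=1.0):
    dz = length / n_cells
    c = [0.0] * n_cells          # gas-phase concentration along the bed
    q = [0.0] * n_cells          # adsorbed-phase loading along the bed
    for _ in range(n_steps):
        c_new = c[:]
        for i in range(n_cells):
            upstream = c_in if i == 0 else c[i - 1]   # upwind advection
            uptake = k * (K * c[i] - q[i])            # linear driving force
            c_new[i] = c[i] + dt * (v * (upstream - c[i]) / dz - uptake)
            q[i] += dt * uptake
        c = c_new
    return c  # profile; the outlet concentration is c[-1]
```

Run long enough, the bed saturates and the outlet concentration approaches the feed; the time at which `c[-1]` rises is the breakthrough time that column designs are sized against.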

  19. Verification of Occupants’ Behaviour Models in Residential Buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm

    During the last decade, studies of stochastic models of occupants’ behaviour in relation to control of the indoor environment have been published. Often the overall aim of these models is to enable more reliable predictions of building performance using building performance simulations (BPS)... to determine if the event takes place or not. Finally, the simulated window positions are compared to the measured ones, and the True Positive Rate and False Positive Rate, along with other metrics, can be calculated and compared. The method evaluates the models’ abilities to predict the position of the window... based on the dataset. In each time step, the probabilities are subtracted from the observed transitions to find the residuals. Finally, the residuals can be averaged and compared. The validation by simulation relies on detailed Building Performance Simulations (BPS) using the models under evaluation...
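The True Positive Rate / False Positive Rate comparison described in this record can be sketched directly, treating the window state per time step as a binary sequence (1 = open). The observed and simulated sequences below are illustrative, not measured data:

```python
# Hedged sketch of the TPR/FPR comparison between observed and simulated
# window states, evaluated per time step.

def rates(observed, simulated):
    tp = sum(1 for o, s in zip(observed, simulated) if o == 1 and s == 1)
    fn = sum(1 for o, s in zip(observed, simulated) if o == 1 and s == 0)
    fp = sum(1 for o, s in zip(observed, simulated) if o == 0 and s == 1)
    tn = sum(1 for o, s in zip(observed, simulated) if o == 0 and s == 0)
    tpr = tp / (tp + fn)  # share of open intervals correctly predicted open
    fpr = fp / (fp + tn)  # share of closed intervals wrongly predicted open
    return tpr, fpr

obs = [0, 0, 1, 1, 1, 0, 0, 1]   # measured window state per time step
sim = [0, 1, 1, 1, 0, 0, 0, 1]   # simulated window state per time step
tpr, fpr = rates(obs, sim)
```

A model that merely predicts "always closed" gets a low FPR but also a TPR of zero, which is why the two rates are reported together.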

  20. Experimental Verification of the Transient Model in an Enrichment Circuit

    International Nuclear Information System (INIS)

    Fernandino, Maria; Brasnarof, Daniel; Delmastro, Dario

    2003-01-01

    In the present work, an experimental closed loop representing a single stage of a uranium gaseous diffusion enrichment cascade is described, a loop that is used to experimentally validate an analytical model of the dynamics inside such a loop. The conditions established inside the experimental loop after a few working hours were reproduced by the analytical model, leaving the slower thermal phenomena for future studies. Two kinds of perturbations were introduced experimentally: a change in the operating range of one of the compressors, and the addition of mass into the loop. Numerical and experimental results are compared and presented in this work. The analytical model was verified against these two changes, with very good agreement in the time response and measured values. This analytical model allows us to determine the characteristic time response of the system.

  1. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker Verification

    OpenAIRE

    Sarkar, A. K.; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. ...

  2. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6, May 2017: Temperature Modeling of Applegate Lake Using CE-QUAL-W2, a report on the development, calibration, verification, and application of the model. The model and the corresponding results from the study provided CENWP with more refined estimates of water temperatures so that more defendable water

  3. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper, the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context, we studied the potential of the agent-based modelling technique for the verification of people trajectories.

  4. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    Real-time systems (RTS) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems demand high accuracy, safety, and reliability, and accurate graphical modeling and verification of such systems is genuinely challenging; formal methods make it possible to model them with greater precision. In this paper, we present a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We define Duration Calculus (DC) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers to validate timed and untimed properties with accelerated verification speed. Our results suggest that the methodology suits the modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  5. Sorption Modeling and Verification for Off-Gas Treatment

    International Nuclear Information System (INIS)

    Tavlarides, Lawrence L.; Lin, Ronghong; Nan, Yue; Yiacoumi, Sotira; Tsouris, Costas; Ladshaw, Austin; Sharma, Ketki; Gabitto, Jorge; DePaoli, David

    2015-01-01

    uptake data. Two parallel approaches have been explored for integrating the kernels described above into a mass-transport model for adsorption in fixed beds. In one, the GSTA isotherm kernel has been incorporated into the MOOSE framework; in the other approach, a focused finite-difference framework and PDE kernels have been developed. Issues, including oscillatory behavior in MOOSE solutions to advection-diffusion problems, and opportunities have been identified for each approach, and a path forward has been identified toward developing a stronger modeling platform. Experimental systems were established for collection of microscopic kinetics and equilibria data for single and multicomponent uptake of gaseous species on solid sorbents. The systems, which can operate at ambient temperature to 250°C and dew points from -69 to 17°C, are useful for collecting data needed for modeling performance of sorbents of interest. Experiments were conducted to determine applicable models and parameters for isotherms and mass transfer for water and/or iodine adsorption on MS3A. Validation experiments were also conducted for water adsorption on fixed beds of MS3A. For absorption, work involved modeling with supportive experimentation. A dynamic model was developed to simulate CO2 absorption with chemical reaction using high alkaline content water solutions. A computer code was developed to implement the model based upon transient mass and energy balances. Experiments were conducted in a laboratory-scale column to determine model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. This project has resulted in 7 publications, with 3 manuscripts in preparation. Also, 15 presentations were given at national meetings of ANS and AIChE and at Material Recovery and Waste Forms Campaign Working Group meetings.

  6. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence L. [Syracuse Univ., NY (United States); Lin, Ronghong [Syracuse Univ., NY (United States); Nan, Yue [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Sharma, Ketki [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-04-29

    uptake data. Two parallel approaches have been explored for integrating the kernels described above into a mass-transport model for adsorption in fixed beds. In one, the GSTA isotherm kernel has been incorporated into the MOOSE framework; in the other approach, a focused finite-difference framework and PDE kernels have been developed. Issues, including oscillatory behavior in MOOSE solutions to advection-diffusion problems, and opportunities have been identified for each approach, and a path forward has been identified toward developing a stronger modeling platform. Experimental systems were established for collection of microscopic kinetics and equilibria data for single and multicomponent uptake of gaseous species on solid sorbents. The systems, which can operate at ambient temperature to 250°C and dew points from -69 to 17°C, are useful for collecting data needed for modeling performance of sorbents of interest. Experiments were conducted to determine applicable models and parameters for isotherms and mass transfer for water and/or iodine adsorption on MS3A. Validation experiments were also conducted for water adsorption on fixed beds of MS3A. For absorption, work involved modeling with supportive experimentation. A dynamic model was developed to simulate CO2 absorption with chemical reaction using high alkaline content water solutions. A computer code was developed to implement the model based upon transient mass and energy balances. Experiments were conducted in a laboratory-scale column to determine model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. This project has resulted in 7 publications, with 3 manuscripts in preparation. Also, 15 presentations were given at national meetings of ANS and AIChE and at Material Recovery and Waste Forms Campaign Working Group meetings.

  7. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  8. The Parametric Model for PLC Reference Chanells and its Verification in Real PLC Environment

    Directory of Open Access Journals (Sweden)

    Rastislav Roka

    2008-01-01

    For the expansion of PLC systems, it is necessary to have detailed knowledge of the properties of the PLC transmission channel. This contribution briefly discusses characteristics of the PLC environment and a classification of PLC transmission channels. The main part is focused on the parametric model for PLC reference channels and its verification in the real PLC environment by means of experimental measurements.

  9. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  10. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  11. Model of PE-CVD Apparatus: Verification and Simulations

    Directory of Open Access Journals (Sweden)

    J. Geiser

    2010-01-01

    In this paper, we present a simulation of chemical vapor deposition with metallic bipolar plates. In chemical vapor deposition, a delicate optimization between temperature, pressure and plasma power is important to obtain homogeneous deposition. The aim is to reduce the number of real-life experiments in a given CVD plasma reactor. Given the large physical parameter space, there is a huge number of possible experiments. A detailed study of the physical experiments in a CVD plasma reactor allows the problem to be reduced to an approximate mathematical model, the underlying transport-reaction model. Significant regions of the CVD apparatus are approximated and physical parameters are transferred to the mathematical parameters. Such an approximation reduces the mathematical parameter space to a realistic number of numerical experiments. The numerical results are discussed in the light of the physical experiments to give a valid model for the assumed growth, allowing us to reduce the number of expensive physical experiments.

  12. Modelling, property verification and behavioural equivalence of lactose operon regulation.

    Science.gov (United States)

    Pinto, Marcelo Cezar; Foss, Luciana; Mombach, José Carlos Merino; Ribeiro, Leila

    2007-02-01

    Understanding biochemical pathways is one of the biggest challenges in the field of molecular biology nowadays. Computer science can contribute in this area by providing formalisms and tools to simulate and analyse pathways. One formalism that is suited for modelling concurrent systems is Milner's Calculus of Communicating Systems (CCS). This paper shows the viability of using CCS to model and reason about biochemical networks. As a case study, we describe the regulation of lactose operon. After describing this operon formally using CCS, we validate our model by automatically checking some known properties for lactose regulation. Moreover, since biological systems tend to be very complex, we propose to use multiple descriptions of the same system at different levels of abstraction. The compatibility of these multiple views can be assured via mathematical proofs of observational equivalence.

  13. Verification of geological/engineering model in waterflood areas

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, B.; Szpakiewicz, M.; Honarpour, M.; Schatzinger, R.A.; Tillman, R.

    1988-12-01

    The construction of a detailed geological/engineering model is the basis for development of the methodology for characterizing reservoir heterogeneity. The NIPER geological/engineering model is the subject of this report. The area selected for geological and production performance studies is a four-section area within the Powder River Basin which includes the Tertiary Incentive Project (TIP) pilot. Log, well test, production, and core data were acquired for construction of the geological model of a barrier island reservoir. In this investigation, emphasis was on the synthesis and quantification of the abundant geological information acquired from the literature and field studies (subsurface and outcrop) by mapping the geological heterogeneities that influence fluid flow. The geological model was verified by comparing it with the exceptionally complete production data available for Bell Creek field. This integration of new and existing information from various geological, geophysical, and engineering disciplines has enabled better definition of the heterogeneities that influence production during different recovery operations. 16 refs., 26 figs., 6 tabs.

  14. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the method of IMA system testing needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, it is difficult to isolate failures in an IMA system, so the critical problem facing IMA system verification is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a complex one it is hard to completely test a large, integrated avionics system. This paper therefore proposes applying compositional-verification theory to IMA system testing, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  15. Verification of a Quality Management Theory: Using a Delphi Study

    OpenAIRE

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A model of quality management called the Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from ...

  16. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects. PMID:26217586

  17. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  18. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
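    The "traditional accuracy-based measures" mentioned above can be sketched in a few lines. The numbers below are invented, and this standalone fragment only illustrates the metrics themselves, not the toolkit's implementation.

    ```python
    # Sketch of basic model-vs-observation accuracy metrics (hypothetical data):
    # bias, root-mean-square error, and Pearson correlation.
    import math

    model = [290.1, 291.4, 289.8, 292.0, 290.7]   # e.g. surface temperature (K)
    obs   = [289.6, 291.0, 290.2, 291.2, 290.4]

    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)

    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    corr = cov / math.sqrt(
        sum((m - mm) ** 2 for m in model) * sum((o - mo) ** 2 for o in obs)
    )

    print(f"bias={bias:+.2f}  rmse={rmse:.2f}  r={corr:.2f}")
    ```

    In a full system such metrics are computed per variable, per station and per time scale; the uncertainty, ensemble and information-theory diagnostics listed in the abstract go well beyond this.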

  19. Structured Programming Series. Volume 15. Validation and Verification Study

    Science.gov (United States)

    1975-05-22

    System for FORTRAN," February 1975, pp 1-16. Gruenberger, F., "Program Testing and Validation," Datamation, July 1968, pp 39-47. Holland, J. G. ... Keywords: verification, validation, inspection, testing, certify, prove. The majority of software projects rely almost entirely on computer-based testing as the method of verifying and validating software. Second, structured

  20. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    Science.gov (United States)

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  2. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for the automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying, from the images, buildings that have been demolished or changed since the models were constructed, as well as identifying wrong models. The models verified are of CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory: corresponding variables between building models and oblique images are used to derive mutual information for individual edges, faces or whole buildings, combined over all perspective images available for the building. The wireframe model edges are projected to the images and verified using low-level image features, namely the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images, which are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of Microsoft Bing Maps. The results show that nearly all buildings are correctly categorised as existing or demolished. Because we currently concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, especially, can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results for individual edges can be used to improve the 3D building models.
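    The information-theoretic core of such an approach is a mutual-information score between the model's predictions and the image evidence. The sketch below uses invented, coarsely binned data (a binary "model edge present" label against binned gradient directions); the paper's actual variables and combination scheme are richer.

    ```python
    # Sketch: mutual information between a model-edge label and binned image
    # gradient directions. The paired observations are hypothetical.
    import math
    from collections import Counter

    pairs = [("edge", 0), ("edge", 0), ("edge", 1), ("edge", 0),
             ("none", 2), ("none", 3), ("none", 2), ("none", 1)]

    n = len(pairs)
    p_xy = Counter(pairs)                    # joint counts
    p_x = Counter(x for x, _ in pairs)       # marginal: edge label
    p_y = Counter(y for _, y in pairs)       # marginal: gradient bin

    mi = sum(
        (c / n) * math.log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
        for (x, y), c in p_xy.items()
    )
    print(f"mutual information = {mi:.3f} bits")
    ```

    A high score means the image gradients carry information about where the model predicts edges, supporting the model; near-zero scores flag demolished or wrong buildings.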

  3. Weather forecast in north-western Greece: RISKMED warnings and verification of MM5 model

    Directory of Open Access Journals (Sweden)

    A. Bartzokas

    2010-02-01

    Full Text Available The meteorological model MM5 is applied operationally for the area of north-western Greece over a one-year period (1 June 2007–31 May 2008), and the model output is used for daily weather forecasting over the area. An early warning system is developed by dividing the study area into 16 sub-regions and defining specific thresholds for issuing alerts for adverse weather phenomena. The verification of the model is carried out by comparing the model results with observations from three automatic meteorological stations. For air temperature and wind speed, correlation coefficients and biases are calculated, revealing a significant overestimation of the early-morning air temperature. For precipitation amount, yes/no contingency tables are constructed for 4 specific thresholds and categorical statistics are applied, showing that the prediction of precipitation in the area under study is generally satisfactory. Finally, the thunderstorm warnings issued by the system are verified against the observed lightning activity.
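    The categorical statistics derived from a yes/no contingency table can be sketched as follows. The counts are invented; the paper's actual scores per threshold are not reproduced here.

    ```python
    # Sketch: standard categorical scores from a 2x2 forecast contingency table.
    # a = hits, b = false alarms, c = misses, d = correct negatives
    a, b, c, d = 42, 18, 11, 294          # hypothetical tally for one threshold

    pod = a / (a + c)                     # probability of detection
    far = b / (a + b)                     # false alarm ratio
    csi = a / (a + b + c)                 # critical success index (threat score)
    freq_bias = (a + b) / (a + c)         # frequency bias

    print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f} bias={freq_bias:.2f}")
    ```

    A frequency bias above 1 indicates the model forecasts rain more often than it is observed, which is exactly the kind of behaviour such verification exposes per threshold.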

  4. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and for the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial-defect level before final disposal into the geologic repository; the rods from the research reactor may be verified at the gross-defect level. Developing a measurement system for partial-defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with gamma spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; e.g., criteria governing the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places are considered in this report. One option for a verification measurement place is the intermediate storage; the other is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  5. Modelling and Verification of Multiple UAV Mission Using SMV

    Directory of Open Access Journals (Sweden)

    Gopinadh Sirigineedi

    2010-03-01

    Full Text Available Model checking has been used to verify the correctness of digital circuits, security protocols and communication protocols, as these can be modelled by means of a finite state transition model. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.
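    What a tool like SMV automates can be illustrated with an explicit-state check of the CTL reachability property EF goal on a tiny Kripke structure. The UAV states and transitions below are invented and far simpler than the paper's cooperative-search model.

    ```python
    # Toy Kripke structure (hypothetical UAV mission states) and a least
    # fixed-point computation of the CTL property "EF goal".
    KRIPKE = {                      # state -> set of successor states
        "search":  {"search", "detect"},
        "detect":  {"confirm", "search"},
        "confirm": {"attack"},
        "attack":  {"search"},
    }

    def ef(goal_states):
        """States satisfying EF goal: least fixed point of backward reachability."""
        sat = set(goal_states)
        changed = True
        while changed:
            changed = False
            for state, succs in KRIPKE.items():
                if state not in sat and succs & sat:
                    sat.add(state)
                    changed = True
        return sat

    # Every state can eventually reach "attack", so EF attack holds everywhere.
    print(ef({"attack"}) == set(KRIPKE))  # True
    ```

    Real model checkers evaluate such fixed points symbolically (e.g. with BDDs) over state spaces far too large to enumerate, for the full CTL operator set rather than EF alone.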

  6. Game Theory Models for the Verification of the Collective Behaviour of Autonomous Cars

    OpenAIRE

    Varga, László Z.

    2017-01-01

    The collective of autonomous cars is expected to generate almost optimal traffic. In this position paper we discuss the multi-agent models and the verification results of the collective behaviour of autonomous cars. We argue that non-cooperative autonomous adaptation cannot guarantee optimal behaviour. The conjecture is that intention aware adaptation with a constraint on simultaneous decision making has the potential to avoid unwanted behaviour. The online routing game model is expected to b...

  7. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity. The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  8. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still, covering both the evaporation and the condensation process, is built in ANSYS CFD. Simulation results such as water temperature, vapor temperature and distillate output were compared with actual experimental results for the climate conditions of Mehsana (latitude 23° 59′, longitude 72° 38′). Water temperature and distillate output were in good agreement with the actual experimental results. The study shows that ANSYS CFD is a powerful and efficient tool for the design and comparison of hemispherical solar stills.

  9. Verification of a Quality Management Theory: Using a Delphi Study

    Directory of Open Access Journals (Sweden)

    Ali Mohammad Mosadeghrad

    2013-11-01

    Full Text Available Background: A model of quality management called the Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared, including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence.

  10. Verification of a quality management theory: using a delphi study.

    Science.gov (United States)

    Mosadeghrad, Ali Mohammad

    2013-11-01

    A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence.

  11. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  12. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS (Registered Trademark) finite element model, a verified MSC.Nastran (Trademark) model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran (Trademark) model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran (Trademark) single-bay model to Abaqus (Trademark) is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  13. Verification of aseismic design model by using experimental results

    International Nuclear Information System (INIS)

    Mizuno, N.; Sugiyama, N.; Suzuki, T.; Shibata, Y.; Miura, K.; Miyagawa, N.

    1985-01-01

    A lattice model is applied as the analysis model for the aseismic design of the Hamaoka nuclear reactor building. In order to verify the validity of this design model, two reinforced concrete blocks were constructed on the ground and forced vibration tests were carried out. The test results are well reproduced by simulation analysis using the lattice model. The damping value of the ground obtained from the test is more conservative than the design value. (orig.)

  14. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer which takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles to experimentally verify the model were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed

  15. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach; the formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, the standard and dominant notation for modelling business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitive which enables the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
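    The pattern-to-formula translation the abstract describes can be sketched mechanically. The pattern names and formula shapes below are simplified assumptions, not the authors' actual pattern catalogue or their tableaux machinery.

    ```python
    # Sketch: workflow patterns as "logical primitives" that emit temporal
    # logic formulas (G = always, F = eventually). Patterns are hypothetical.

    def sequence(a, b):
        # task a is always eventually followed by task b
        return f"G({a} -> F({b}))"

    def exclusive_choice(cond, a, b):
        # branching on a condition enables exactly one of the two branches
        return f"G(({cond} -> F({a})) & (!{cond} -> F({b})))"

    # Compose a logical specification for a small, invented approval workflow:
    spec = [
        sequence("receive_order", "check_credit"),
        exclusive_choice("credit_ok", "ship_goods", "reject_order"),
    ]
    print(" & ".join(spec))
    ```

    The generated conjunction is the logical specification; a deductive engine (here, semantic tableaux) would then attempt to prove the desired correctness properties from it.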

  16. Are Australian and New Zealand trauma service resources reflective of the Australasian Trauma Verification Model Resource Criteria?

    Science.gov (United States)

    Leonard, Elizabeth; Curtis, Kate

    2014-01-01

    The Australasian Trauma Verification Program was developed in 2000 to improve the quality of care provided at services in Australia and New Zealand. The programme outlines resources required for differing levels of trauma services. This study compares the human resources in Australia and New Zealand trauma services with those recommended by the Australasian College of Surgeons Trauma Verification Program. In September 2011, all trauma nurse coordinators in Australia and New Zealand were invited to participate in an electronic survey endorsed by the Australasian Trauma Society. This study expands on previous bi-national research and aimed to identify demographic and trauma service human resource levels. Fifty-three surveys (78%) were completed and all 27 Level 1 trauma centres represented. Of the Level 1 trauma centres, a trauma director and fellow were available at 16 (51.8%) and 14 (40.7%) centres, respectively. The majority (93%) had a full-time trauma coordinator although a trauma case manager was only available at 14 (48.1%) of Level 1 trauma centres. Despite the large amount of data collection and extraction required, trauma services had limited access to a data manager (50.9%) or clerical staff (36.9%). Human resources in Australian and NZ trauma services are not reflective of those recommended by the Australasian Trauma Verification Program. This impacts on the ability to coordinate trauma monitoring and performance improvement. Review of the Australasian Trauma Verification Model Resource Criteria is required. Injury surveillance in Australia and NZ is hampered by insufficient trauma registry resources. © 2014 Royal Australasian College of Surgeons.

  17. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

Model checking has been applied successfully to the verification of security protocols, but the modeling process is tedious and requires proficient knowledge of formal methods, even though the final verification can be automated by specific tools. At the same time, with the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to perform WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on security mechanisms. A thorough formal verification of a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort to employ model checking for automatic analysis of an authentication protocol for WBAN.

  18. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations.

  19. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services: formal verification methods are needed to ensure the correctness of composed services. A few research works have been carried out in the literature on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties like dead transitions, deadlock, reachability, and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties like dead transitions, deadlock, safety, liveness, and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool.
The model is verified using the SPIN tool, and the results revealed better performance in terms of finding dead transitions and deadlock in contrast to the
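The properties this record targets (dead transitions, deadlock, reachability) can be illustrated on a toy finite transition system. The following is a minimal sketch of the exhaustive-exploration idea behind tools like SPIN, not the paper's ESAM construction; the composed-service states and actions are hypothetical:

```python
from collections import deque

def explore(initial, transitions):
    """Breadth-first exploration of a finite transition system.

    `transitions` maps a state to a dict {action: next_state}.
    Returns the reachable states, the set of actions ever fired
    (whose complement is the dead transitions), and reachable states
    with no outgoing transition (deadlocks).
    """
    reachable, fired, deadlocks = set(), set(), set()
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if state in reachable:
            continue
        reachable.add(state)
        succ = transitions.get(state, {})
        if not succ:
            deadlocks.add(state)   # no enabled action: deadlock
        for action, nxt in succ.items():
            fired.add(action)
            queue.append(nxt)
    return reachable, fired, deadlocks

# Hypothetical composed service: 'pay' is a dead transition because
# its source state is unreachable; 'aborted' is a deadlock state.
ts = {
    "start":   {"invoke": "running"},
    "running": {"reply": "done", "fault": "aborted"},
    "done":    {"restart": "start"},
    "aborted": {},
    "orphan":  {"pay": "done"},   # never reached from "start"
}
reachable, fired, deadlocks = explore("start", ts)
all_actions = {a for succ in ts.values() for a in succ}
dead_transitions = all_actions - fired
```

Real model checkers additionally verify temporal-logic properties (liveness, safety) over this state space; the sketch only shows the reachability layer.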

  20. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    Science.gov (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
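A statistical comparison of simulated against measured responses of the kind this record advocates might be sketched as follows. This is illustrative only, not the paper's procedure; the function name and the example data are hypothetical:

```python
import math
import statistics

def load_model_fit_stats(measured, simulated):
    """Quantify load-model accuracy against field measurements.

    Returns the mean bias, the RMSE, and an approximate 95%
    confidence interval for the bias (normal approximation),
    giving model users a quantitative confidence statement
    instead of a visual check.
    """
    residuals = [s - m for m, s in zip(measured, simulated)]
    n = len(residuals)
    bias = statistics.mean(residuals)
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    sem = statistics.stdev(residuals) / math.sqrt(n)
    ci95 = (bias - 1.96 * sem, bias + 1.96 * sem)
    return bias, rmse, ci95

# Hypothetical measured vs. simulated active power (MW) samples
measured = [10.0, 9.5, 9.8, 10.2, 9.9]
simulated = [10.1, 9.4, 9.9, 10.4, 10.0]
bias, rmse, ci = load_model_fit_stats(measured, simulated)
```

If the confidence interval for the bias excludes zero, the residuals are systematic and the load model is a candidate for recalibration.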

  1. Verification and improvement of a predictive model for radionuclide migration

    International Nuclear Information System (INIS)

    Miller, C.W.; Benson, L.V.; Carnahan, C.L.

    1982-01-01

    Prediction of the rates of migration of contaminant chemical species in groundwater flowing through toxic waste repositories is essential to the assessment of a repository's capability of meeting standards for release rates. A large number of chemical transport models, of varying degrees of complexity, have been devised for the purpose of providing this predictive capability. In general, the transport of dissolved chemical species through a water-saturated porous medium is influenced by convection, diffusion/dispersion, sorption, formation of complexes in the aqueous phase, and chemical precipitation. The reliability of predictions made with the models which omit certain of these processes is difficult to assess. A numerical model, CHEMTRN, has been developed to determine which chemical processes govern radionuclide migration. CHEMTRN builds on a model called MCCTM developed previously by Lichtner and Benson

  2. A verification procedure for MSC/NASTRAN Finite Element Models

    Science.gov (United States)

    Stockwell, Alan E.

    1995-01-01

    Finite Element Models (FEM's) are used in the design and analysis of aircraft to mathematically describe the airframe structure for such diverse tasks as flutter analysis and actively controlled landing gear design. FEM's are used to model the entire airplane as well as airframe components. The purpose of this document is to describe recommended methods for verifying the quality of the FEM's and to specify a step-by-step procedure for implementing the methods.

  3. Predictions and Verification of an Isotope Marine Boundary Layer Model

    Science.gov (United States)

    Feng, X.; Posmentier, E. S.; Sonder, L. J.; Fan, N.

    2017-12-01

A one-dimensional (1D), steady-state isotope marine boundary layer (IMBL) model is constructed. The model includes meteorologically important features absent in Craig and Gordon type models, namely height-dependent diffusion/mixing and convergence of subsiding external air. Kinetic isotopic fractionation results from this height-dependent diffusion, which starts as pure molecular diffusion at the air-water interface and increases linearly with height due to turbulent mixing. The convergence permits dry, isotopically depleted air subsiding adjacent to the model column to mix into ambient air. In δD-δ18O space, the model results fill a quadrilateral, of which three sides represent (1) vapor in equilibrium with various sea surface temperatures (SSTs) (the high-δ18O boundary of the quadrilateral); (2) mixtures of vapor in equilibrium with seawater and vapor in the subsiding air (the lower boundary, depleted in both D and 18O); and (3) vapor that has experienced the maximum possible kinetic fractionation (the high-δD upper boundary). The results can also be plotted in d-excess vs. δ18O space, indicating that these processes all cause variations in the d-excess of MBL vapor. In particular, due to the relatively high d-excess in the descending air, mixing of this air into the MBL causes an increase in d-excess, even without kinetic isotope fractionation. The model is tested by comparison with seven datasets of marine vapor isotopic ratios, with excellent correspondence; >95% of observational data fall within the quadrilateral area predicted by the model. The distribution of observations also highlights the significant influence of vapor from the nearby converging descending air on isotopic variations in the MBL. At least three factors may affect the isotopic composition of precipitation. The model can be applied to modern as well as paleoclimate conditions.
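The d-excess quantity this record works with is Dansgaard's deuterium excess, d = δD − 8·δ18O. A one-line sketch, with hypothetical example values (not data from the study):

```python
def d_excess(delta_D: float, delta_18O: float) -> float:
    """Deuterium excess d = dD - 8 * d18O (both deltas in permil)."""
    return delta_D - 8.0 * delta_18O

# Hypothetical MBL vapor sample: dD = -80 permil, d18O = -11.5 permil
d = d_excess(-80.0, -11.5)   # 12.0 permil
```

In δD-δ18O space, lines of constant d-excess have slope 8, which is why mixing in of high-d-excess subsiding air shifts MBL vapor off the meteoric line even without kinetic fractionation.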

  4. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from the elastic results, even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  5. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach to reinforce the need for utilizing compatible components to provide user friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search included present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums

  6. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being used successfully to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple input multiple output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  7. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    Linares Rosales, H. M.; Alfonso Laguardia, R.; Lara Mas, E.; Popescu, T.

    2015-01-01

The analysis of some parameters of interest in radiotherapy medical physics, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator, was performed for 6 and 15 MV photon beams. The simulations were performed using the EGSnrc code. As reference for the simulations, the optimal beam parameter values (energy and FWHM) previously obtained were used. Deposited dose calculations in water phantoms were done for typical complex geometries commonly used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between them were analyzed from calculations in water. Similarly, simulations were performed on phantoms obtained from CT studies of real patients, comparing the dose distribution calculated with EGSnrc with the dose distribution obtained from the computerized treatment planning systems (TPS) used in routine clinical plans. All the results showed great agreement with measurements, all of them falling within tolerance limits. These results open the possibility of using the developed model as a robust verification tool for validating calculations in very complex situations, where the accuracy of the available TPS could be questionable. (Author)

  8. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater-than-10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater-than-10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warning at least minutes in advance, based on both the soft X-ray flux and the integral proton flux taken by GOES. The quality of the forecast model was measured against verification of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
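The verification measures this record names (accuracy, discrimination, skill) are commonly computed from a 2x2 contingency table of warnings versus observed events. A minimal illustrative sketch, not the paper's actual procedure; the warning/observation record below is hypothetical:

```python
def forecast_skill(events):
    """Categorical verification from (forecast, observed) pairs.

    Builds the 2x2 contingency table and returns the probability of
    detection (POD), false alarm ratio (FAR), and critical success
    index (CSI) -- standard accuracy and discrimination measures.
    """
    hits = sum(1 for f, o in events if f and o)
    misses = sum(1 for f, o in events if not f and o)
    false_alarms = sum(1 for f, o in events if f and not o)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    csi = hits / (hits + misses + false_alarms)
    return pod, far, csi

# Hypothetical record of SEP warnings vs. >=10 pfu observations:
# 8 hits, 2 false alarms, 2 misses, 38 correct rejections
events = ([(True, True)] * 8 + [(True, False)] * 2 +
          [(False, True)] * 2 + [(False, False)] * 38)
pod, far, csi = forecast_skill(events)
```

Reliability, by contrast, requires probabilistic forecasts and is assessed with reliability diagrams rather than a 2x2 table.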

  9. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

Based on conventional mass transfer models developed for oxygen, the use of the non-linear ASCE method, the 2-point method, and a one-parameter linear-regression method was evaluated for carbon dioxide stripping data. For values of KLaCO2 down at higher values of KLaCO2. How to correct KLaCO2 for gas-phase enrichment remains to be determined. The one-parameter linear regression model was used to vary C*CO2 over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed C*CO2 assumptions.
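The 2-point method mentioned in this record follows from the first-order transfer model dC/dt = KLa·(C* − C), whose concentration deficit decays exponentially. A sketch on synthetic data; the numbers are illustrative, not from the study:

```python
import math

def kla_two_point(c_star, t1, c1, t2, c2):
    """Two-point estimate of the overall transfer coefficient KLa.

    From dC/dt = KLa * (C* - C), the deficit (C* - C) decays
    exponentially, so KLa = ln((C* - c1) / (C* - c2)) / (t2 - t1).
    """
    return math.log((c_star - c1) / (c_star - c2)) / (t2 - t1)

# Synthetic stripping data generated with KLa = 0.5 per minute:
# C(t) = C* + (C0 - C*) * exp(-KLa * t), C0 = 20 mg/L, C* = 1 mg/L
c_star, c0, kla_true = 1.0, 20.0, 0.5
c = lambda t: c_star + (c0 - c_star) * math.exp(-kla_true * t)
kla_est = kla_two_point(c_star, 1.0, c(1.0), 3.0, c(3.0))
```

The gas-phase enrichment issue the abstract raises is precisely that C* is not constant during CO2 stripping, which is why a fixed-C* two-point estimate can bias KLaCO2.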

  10. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report, the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the DWR 5 experiment is calculated by means of vertical and horizontal dynamic models using the CESHOCK code. The equations of motion are resolved by direct integration. (orig./RW)

  11. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
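One distribution-oriented metric of the kind this record borrows from the forecast verification community is the verification rank histogram. A minimal sketch with hypothetical ensembles (not necessarily one of the paper's metrics):

```python
def rank_histogram(ensembles, observations, n_members):
    """Verification rank histogram for ensemble simulations.

    For each time step, count how many ensemble members fall below
    the observation; tally those ranks over all steps. A flat
    histogram over the n_members + 1 possible ranks indicates the
    observation is statistically indistinguishable from the ensemble,
    i.e., the spread (here, from parameter uncertainty) is reliable.
    """
    counts = [0] * (n_members + 1)
    for members, obs in zip(ensembles, observations):
        rank = sum(1 for m in members if m < obs)
        counts[rank] += 1
    return counts

# Hypothetical 3-member streamflow ensembles vs. observed flows
ens = [[1.0, 2.0, 3.0], [5.0, 6.0, 7.0], [2.0, 4.0, 9.0]]
obs = [2.5, 4.0, 10.0]
hist = rank_histogram(ens, obs, 3)
```

A U-shaped histogram signals an under-dispersive ensemble; a dome shape signals over-dispersion. Either diagnosis needs no data beyond what standard model validation already collects.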

  12. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.

  13. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  14. Verification of Context-Dependent Channel-Based Service Models

    NARCIS (Netherlands)

    N. Kokash (Natallia); , C. (born Köhler, , C.) Krause (Christian); E.P. de Vink (Erik Peter)

    2010-01-01

The paradigms of service-oriented computing and model-driven development are becoming of increasing importance in the field of software engineering. According to these paradigms, new systems are composed with added value from existing stand-alone services to support business processes

  15. Verification of the Skorohod-Olevsky Viscous Sintering (SOVS) Model

    Energy Technology Data Exchange (ETDEWEB)

    Lester, Brian T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-16

Sintering refers to a manufacturing process through which mechanically pressed bodies of ceramic (and sometimes metal) powders are heated to drive densification, thereby removing the inherent porosity of green bodies. As the body densifies through the sintering process, the ensuing material flow leads to macroscopic deformations of the specimen, and as such the final configuration differs from the initial one. Therefore, as with any manufacturing step, there is substantial interest in understanding and being able to model the sintering process to predict deformation and residual stress. Efforts in this regard have been pursued for face seals, gear wheels, and consumer products like wash-basins. To understand the sintering process, a variety of modeling approaches have been pursued at different scales.

  16. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    coastal ocean sufficiently to have a complete picture of the flow. The analysis will thus consist of comparing these incomplete pictures of the current...50 cm. This would suggest that tidal flats would exist at synoptic scales but not daily because there are expanses of the lagoon that are < 50 cm...historical daily data from the correct time of year but not from the correct day. This indicates that the model flow is generally correct at synoptic

  17. Design and verification of the 'GURI 01' bundle model

    International Nuclear Information System (INIS)

    Benito, G.D.

    1990-01-01

This work presents a general description of the 'GURI 01' bundle model, designed by INVAP S.E. under international radioactive material transportation regulations as a Type B(U) package for international transportation of up to a maximum of 350,000 Ci of Co-60. Moreover, the methodologies used and the results obtained from the structural evaluation of the mechanical tests and from the evaluation of the thermal behaviour under normal or accident conditions are briefly discussed. (Author)

  18. Specification, Model Generation, and Verification of Distributed Applications

    OpenAIRE

    Madelaine, Eric

    2011-01-01

Since 2001, in the Oasis team, I have developed research on the semantics of applications based on distributed objects, applying my previous research in the field of process algebras in the context of a real language and of applications of realistic size. The various aspects of this work naturally include behavioral semantics and the definition of procedures for model generation, taking into account the different concepts of distributed applications, but also, upstream, static code analysis a...

  19. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model's validity, both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern's entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As a hypothesized structural sign pattern has lower entropy, it is more subject to Type I error and less subject to Type II error. Three cases illustrate the approach taken here.

  20. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1992-01-01

Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modelling techniques and a knowledge of the incident radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well chosen measurements are required to confirm the theoretical models. Neutron doses and dose equivalents were measured in a RANDO phantom at specific locations using thermoluminescence dosemeters, etched track dosemeters, and a 1.27 cm (1/2 in) tissue-equivalent proportional counter. The phantom was exposed to a bare and a D2O-moderated 252Cf neutron source at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and to calculate the organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared with the calculations. (author)
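The effective dose equivalent referred to in this record is a weighted sum over organ dose equivalents, H_E = Σ_T w_T·H_T. A sketch assuming the ICRP Publication 26 tissue weighting factors in use in that era (the weights below are an assumption of this sketch, and the organ doses are hypothetical):

```python
# Assumed ICRP-26 tissue weighting factors (sum to 1.0)
W_T = {
    "gonads": 0.25,
    "breast": 0.15,
    "red_bone_marrow": 0.12,
    "lung": 0.12,
    "thyroid": 0.03,
    "bone_surfaces": 0.03,
    "remainder": 0.30,
}

def effective_dose_equivalent(organ_H):
    """H_E = sum over tissues T of w_T * H_T (doses in Sv or mSv)."""
    return sum(W_T[t] * h for t, h in organ_H.items())

# Hypothetical organ dose equivalents (mSv) from a neutron field:
# a uniform 1 mSv to every tissue yields H_E = 1 mSv, since the
# weighting factors sum to unity.
H = {t: 1.0 for t in W_T}
h_e = effective_dose_equivalent(H)
```

Because H_E weights doses the phantom measurements can only sample at discrete locations, a code like MCNP is needed to fill in the organ-averaged H_T values that enter the sum.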

  1. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1991-10-01

Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modeling techniques and a knowledge of the radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well-chosen measurements are required to confirm the theoretical models. Neutron measurements were performed in a RANDO phantom using thermoluminescent dosemeters, track etch dosemeters, and a 1/2-in. (1.27-cm) tissue equivalent proportional counter in order to estimate neutron doses and dose equivalents within the phantom at specific locations. The phantom was exposed to bare and D2O-moderated 252Cf neutrons at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and calculate organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared to the calculations. 8 refs., 3 figs., 3 tabs

  2. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of rooms under the working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of formation of temperature fields in rooms heated by gas infrared heater systems.

  3. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  4. TU Electric reactor physics model verification: Power reactor benchmark

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1988-01-01

    Power reactor benchmark calculations using the advanced code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles included gadolinia as a burnable absorber, natural uranium axial blankets and an increased water-to-fuel ratio. The calculated results for both startup reactor physics tests (boron endpoints, control rod worths, and isothermal temperature coefficients) and full-power depletion were compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important measured parameters for power reactors.

  5. Pyrolysis of biomass briquettes, modelling and experimental verification

    NARCIS (Netherlands)

    van der Aa, B; Lammers, G; Beenackers, AACM; Kopetz, H; Weber, T; Palz, W; Chartier, P; Ferrero, GL

    1998-01-01

    Carbonisation of biomass briquettes was studied using a dedicated single briquette carbonisation reactor. The reactor enabled continuous measurement of the briquette mass and continuous measurement of the radial temperature profile in the briquette. Furthermore pyrolysis gas production and

  6. Double piezoelectric energy harvesting cell: modeling and experimental verification

    Science.gov (United States)

    Wang, Xianfeng; Shi, Zhifei

    2017-06-01

    In this paper, a novel energy transducer named the double piezoelectric energy harvesting cell (DPEHC), consisting of two flex-compressive piezoelectric energy harvesting cells (F-C PEHCs), is proposed. Initially, the two F-C PEHCs, a kind of cymbal-type energy transducer, were assembled sharing the same end simply so that the device could be placed steadily. However, in an open-circuit voltage test, additional energy harvesting performance of the DPEHC prototype appeared. Taking the interaction between the two F-C PEHCs into account, a mechanical model for analyzing the DPEHC is established. The electric output of the DPEHC under harmonic excitation is obtained theoretically and verified experimentally, and good agreement is found. In addition, as an inverse problem, a method for identifying the key mechanical parameters of the DPEHC is recommended. Finally, the additional energy harvesting performance of the DPEHC is quantitatively discussed. Numerical results show that this additional performance is correlated with the key mechanical parameters of the DPEHC. For the present DPEHC prototype, the energy harvesting gain is over 400% compared with two independent F-C PEHCs under the same load condition.

  7. Experimental verification of discharge sediment model at incipient

    African Journals Online (AJOL)

    user

    1983-09-01

    … armour on cessation of sediment feed. The study is being conducted in a laboratory flume because the required tests for the necessary hydraulic quantities, such as discharge, can be scaled down, avoiding the large capital outlay for equipment and personnel that would have been required in the field.

  8. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scales using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
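
The two traditional skill scores named in this record have standard 2x2 contingency-table definitions; a minimal sketch, with made-up counts rather than values from this study, is:

```python
import math

# Contingency-table skill scores for a yes/no precipitation forecast:
# a = hits, b = false alarms, c = misses, d = correct negatives.

def equitable_threat_score(a, b, c, d):
    """ETS: threat score corrected for hits expected by chance."""
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n      # chance hits
    return (a - a_random) / (a + b + c - a_random)

def log_odds_ratio(a, b, c, d):
    """Natural log of the odds ratio (a*d)/(b*c)."""
    return math.log((a * d) / (b * c))

# Illustrative counts for one threshold over a month of daily forecasts.
a, b, c, d = 50, 20, 30, 900
print(round(equitable_threat_score(a, b, c, d), 3))
print(round(log_odds_ratio(a, b, c, d), 3))
```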

  9. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components: a deformation model involving elastic and plastic deformations, a damage model, and a failure model. The model is driven by tabular data that are generated either using laboratory tests or via virtual testing. A unidirectional composite, T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is still under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  10. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time, which is quite understandable given the size and complexity of such systems. Three important aspects of earthquake-soil-structure-interaction (ESSI) modeling are the consistent tracking of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate ESSI dynamics, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called ESSI Simulator, together with an extensive verification and validation suite for it. Verification and validation are important for high fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain-Reduction-Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The damping of waves at the boundary of the finite element models is studied using different damping patterns, applied in the layer of elements outside the Domain-Reduction-Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is performed on the dynamic soil-structure interaction of a complex system, and results of different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models with high computational efficiency is developed and implemented in the ESSI Simulator.

  11. Verification and validation of an actuator disc model

    DEFF Research Database (Denmark)

    Réthoré, Pierre-Elouan; Laan, van der, Paul Maarten; Troldborg, Niels

    2014-01-01

    Wind turbine wake can be studied in computational fluid dynamics with the use of permeable body forces (e.g. actuator disc, line and surface). This paper presents a general flexible method to redistribute wind turbine blade forces as permeable body forces in a computational domain. The method can take any kind of shape discretization, determine the intersectional elements with the computational grid and use the size of these elements to redistribute the forces proportionally. This method can potentially reduce the need for mesh refinement in the region surrounding the rotor and, therefore, also reduce the computational cost of large wind farm wake simulations. The special case of the actuator disc is successfully validated with an analytical solution for heavily loaded turbines and with a full-rotor computation in computational fluid dynamics. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real-world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification, validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that are continuing to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling, so further work on the continuum mechanical theory and mathematical analysis of multiphase flow is still required. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we will present a publicly available verification and validation test set for air-water-structure interaction problems as well as computational and physical model results, including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats will also be presented in order to facilitate future comparative studies and reproduction of scientific results.

  13. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system. A PBM is derived from a text-independent background model through adaptation, and the selected PBM is then used for the log likelihood ratio (LLR) calculation with respect to the claimant model. The proposed method thus incorporates the pass-phrase identification step in the LLR calculation, which is not considered in conventional standalone TD-SV systems. The performance of the proposed method is compared to conventional text-independent background model based TD-SV systems using either Gaussian mixture model (GMM)-universal background model (UBM) or hidden Markov model (HMM)-UBM or i-vector paradigms. In addition, we consider two approaches to build PBMs: speaker-independent and speaker-dependent.
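
The LLR scoring described in this record can be sketched as follows. Toy one-dimensional Gaussians stand in for the GMM/HMM models, and all frame values and model parameters are invented for illustration; a real TD-SV system would score MFCC frame sequences against adapted GMM-UBMs.

```python
import math
from statistics import NormalDist

# Sketch: score a test utterance against a claimant model and the
# best-matching pass-phrase dependent background model (PBM).

def log_likelihood(frames, model):
    """Sum of per-frame log densities under a 1-D Gaussian model."""
    return sum(math.log(model.pdf(x)) for x in frames)

def llr_score(frames, claimant, pbms):
    """Pass-phrase identification (max-likelihood PBM), then LLR."""
    best_pbm = max(pbms, key=lambda m: log_likelihood(frames, m))
    return log_likelihood(frames, claimant) - log_likelihood(frames, best_pbm)

claimant = NormalDist(1.0, 1.0)                    # toy claimant model
pbms = [NormalDist(0.0, 1.0), NormalDist(3.0, 1.0)]  # one PBM per pass-phrase
frames = [0.9, 1.1, 1.0]                           # target-like "utterance"
print(llr_score(frames, claimant, pbms) > 0)       # -> True (accept region)
```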

  14. Adjusting for Differential-verification Bias in Diagnostic-accuracy Studies A Bayesian Approach

    NARCIS (Netherlands)

    de Groot, Joris A. H.; Dendukuri, Nandini; Janssen, Kristel J. M.; Reitsma, Johannes B.; Bossuyt, Patrick M. M.; Moons, Karel G. M.

    2011-01-01

    In studies of diagnostic accuracy, the performance of an index test is assessed by verifying its results against those of a reference standard. If verification of index-test results by the preferred reference standard can be performed only in a subset of subjects, an alternative reference test could

  15. Optimal metering plan for measurement and verification on a lighting case study

    International Nuclear Information System (INIS)

    Ye, Xianming; Xia, Xiaohua

    2016-01-01

    M&V (Measurement and Verification) has become an indispensable process in various incentive EEDSM (energy efficiency and demand side management) programmes to accurately and reliably measure and verify the project performance in terms of energy and/or cost savings. Due to the uncertain nature of the unmeasurable savings, there is an inherent trade-off between the M&V accuracy and M&V cost. In order to achieve the required M&V accuracy cost-effectively, we propose a combined spatial and longitudinal MCM (metering cost minimisation) model to assist the design of optimal M&V metering plans, which minimises the metering cost whilst satisfying the required measurement and sampling accuracy of M&V. The objective function of the proposed MCM model is the M&V metering cost that covers the procurement, installation and maintenance of the metering system whereas the M&V accuracy requirements are formulated as the constraints. Optimal solutions to the proposed MCM model offer useful information in designing the optimal M&V metering plan. The advantages of the proposed MCM model are demonstrated by a case study of an EE lighting retrofit project and the model is widely applicable to other M&V lighting projects with different population sizes and sampling accuracy requirements. - Highlights: • A combined spatial and longitudinal optimisation model is proposed to reduce M&V cost. • The combined optimisation model handles M&V sampling uncertainty cost-effectively. • The model exhibits a better performance than the separate spatial or longitudinal models. • The required 90/10 criterion sampling accuracy is satisfied for each M&V report.
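
The 90/10 criterion in the highlights is a standard sampling-accuracy constraint (90% confidence, 10% precision). A sketch of the usual IPMVP-style sample-size calculation follows; the coefficient of variation of 0.5 is the customary planning default, an assumption here rather than a value from this study.

```python
import math

# Initial sample size for a 90/10 sampling target, with finite-population
# correction: n0 = (z * cv / p)^2, n = n0 * N / (n0 + N).

def sample_size_90_10(population, cv=0.5, z=1.645, precision=0.10):
    """Lamps/fixtures to meter for 90% confidence, 10% precision."""
    n0 = (z * cv / precision) ** 2        # infinite-population estimate
    return math.ceil(n0 * population / (n0 + population))

# Example: a lighting retrofit with 500 installed fixtures.
print(sample_size_90_10(500))    # -> 60
print(sample_size_90_10(10000))  # -> 68
```

The correction matters for small populations: the required sample grows toward the ~68-unit infinite-population value as the project gets larger.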

  16. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
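
The maximum-likelihood concatenation step of the PEP representation can be sketched as below. The spherical-Gaussian parts and the 2-D "descriptors" are toy stand-ins for a trained GMM over LBP/SIFT-plus-location features; only the selection-and-concatenate logic is illustrated.

```python
import math

# For each spherical Gaussian part, keep the descriptor with maximum
# likelihood under that part, then concatenate the winners in order.

def log_gauss_spherical(x, mean, var):
    """Log density of a spherical Gaussian with shared variance `var`."""
    d = len(x)
    sq = sum((xi - mi) ** 2 for xi, mi in zip(x, mean))
    return -0.5 * (d * math.log(2 * math.pi * var) + sq / var)

def pep_representation(descriptors, components):
    rep = []
    for mean, var in components:
        best = max(descriptors, key=lambda x: log_gauss_spherical(x, mean, var))
        rep.extend(best)   # sequential concatenation, one winner per part
    return rep

parts = [((0.0, 0.0), 1.0), ((5.0, 5.0), 1.0)]        # two toy "parts"
descs = [(0.1, -0.2), (4.8, 5.1), (2.0, 2.0)]          # toy descriptors
print(pep_representation(descs, parts))  # -> [0.1, -0.2, 4.8, 5.1]
```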

  17. Software verification, model validation, and hydrogeologic modelling aspects in nuclear waste disposal system simulations. A paradigm shift

    International Nuclear Information System (INIS)

    Sheng, G.M.

    1994-01-01

    This work reviewed the current concept of nuclear waste disposal in stable, terrestrial geologic media with a system of natural and man-made multi-barriers. Various aspects of this concept and supporting research were examined, with the emphasis on the Canadian Nuclear Fuel Waste Management Program. Several of the crucial issues and challenges facing the current concept were discussed. These include: the difficulties inherent in a concept that centres around lithologic studies; the unsatisfactory state of software quality assurance in the present computer simulation programs; and the lack of a standardized, comprehensive, and systematic procedure to carry out a rigorous process of model validation and assessment of simulation studies. An outline of such an approach was presented and some of the principles, tools and techniques for software verification were introduced and described. A case study involving an evaluation of the Canadian performance assessment computer program is presented. A new paradigm for nuclear waste disposal was advocated to address the challenges facing the existing concept. The RRC (Regional Recharge Concept) was introduced and its many advantages were described and shown through a modelling exercise. (orig./HP)

  18. Pneumatic Muscles Actuated Lower-Limb Orthosis Model Verification with Actual Human Muscle Activation Patterns

    Directory of Open Access Journals (Sweden)

    Dzahir M.A.M

    2017-01-01

    A review study was conducted on existing lower-limb orthosis systems for rehabilitation that implement pneumatic muscle type actuators, with the aim of clarifying the current and ongoing research in this field. The implementation of pneumatic artificial muscles will play an important role in the development of advanced robotic systems. In this research, a derivation model for the antagonistic mono- and bi-articular muscles of a lower-limb orthosis using pneumatic artificial muscles is verified against actual human muscle activity models. A healthy 29-year-old male subject (height 174 cm, weight 68 kg) served as the test subject. Two mono-articular muscles, Vastus Medialis (VM) and Vastus Lateralis (VL), were selected to verify the mono-articular muscle models and the muscle synergy between anterior muscles. Two bi-articular muscles, Rectus Femoris (RF) and Biceps Femoris (BF), were selected to verify the bi-articular muscle models and the muscle co-contraction between anterior and posterior muscles. The test was carried out on a treadmill at a speed of 4.0 km/h, approximately 1.25 m/s, for completing one cycle of walking motion. Data were collected for about one minute on the treadmill, and 20 complete cycles of walking motion were successfully recorded. For the evaluation, the mathematical model obtained from the derivation and the actual human muscle activation patterns obtained using a surface electromyography (sEMG) system were compared and analysed. The results showed that high correlation values, ranging from 0.83 up to 0.93, were obtained between the derivation model and the actual human muscle model for both mono- and bi-articular muscles. In conclusion, based on the verification with the sEMG muscle activity data and the correlation values, the proposed derivation models of the antagonistic mono- and bi-articular muscles are suitable for simulating and controlling the pneumatic-muscle-actuated lower-limb orthosis.
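
The comparison step reported here is a correlation between the derived activation model and the measured sEMG envelope over the gait cycle. A sketch of the Pearson correlation used for such comparisons follows; the two signals are synthetic placeholders, not data from this study.

```python
import math

# Pearson correlation between a modelled activation curve and a measured
# sEMG envelope sampled over one (toy) gait cycle.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

model = [0.1, 0.4, 0.9, 0.6, 0.2]        # derived activation (synthetic)
semg  = [0.12, 0.38, 0.85, 0.63, 0.25]   # measured envelope (synthetic)
print(round(pearson_r(model, semg), 3))  # close to 1 for matching shapes
```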

  19. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevent model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  20. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  1. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  2. Verification of a coupled atmosphere-ocean model using satellite observations over the Adriatic Sea

    Directory of Open Access Journals (Sweden)

    V. Djurdjevic

    2008-07-01

    Verification of the EBU-POM regional atmosphere-ocean coupled model (RAOCM) was carried out using satellite observations of SST and surface winds over the Adriatic Sea. The atmospheric component has a horizontal resolution of 0.125 degrees (approximately 10 km) and 32 vertical levels, while the ocean component has a horizontal resolution of approximately 4 km with 21 sigma vertical levels.

    Verification of the forecasted SST was performed for 15 forecasts during 2006, each of them seven days long. These forecasts coincide with the operating cycle of the Adriatic Regional Model (AREG), which provided the initial fields and boundary conditions for the ocean component of EBU-POM. Two sources of data were used for the initial and boundary conditions of the atmosphere: primary data were obtained from the European Centre for Medium-Range Weather Forecasts (ECMWF), while data from the National Centers for Environmental Prediction (NCEP) were used to test the sensitivity to boundary conditions.

    Forecast skill was expressed in terms of BIAS and root mean square error (RMSE). During most of the verification period, the model had a negative BIAS of approximately −0.3°, while RMSE varied between 1.1° and 1.2°. Interestingly, these errors did not increase over time, which means that the forecast skill did not decline during the integrations.

    The 10-m wind verification was conducted for one period of 17 days in February 2007, during a strong bora episode, for which satellite estimates of surface winds were available. During the same period, SST measurements were conducted twice a day, which enabled us to verify diurnal variations of SST simulated by the RAOCM model. Since ECMWF's deterministic forecasts do not cover such a long period, we decided to use the ECMWF analysis, i.e. we ran the model in hindcast mode. The winds simulated in this analysis were weaker than the satellite estimates, with a mean BIAS of −0.8 m/s.
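
The BIAS and RMSE measures used in this verification have their usual definitions (mean error and root mean square error of forecast minus observation). A sketch with illustrative SST values, not data from this study:

```python
import math

# Forecast verification measures: BIAS (mean forecast-minus-observed
# error) and RMSE, over paired forecast/observation samples.

def bias(forecast, observed):
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed))
                     / len(forecast))

fcst = [14.8, 15.1, 15.9, 16.2]   # illustrative forecast SST (deg C)
obs  = [15.2, 15.5, 16.1, 16.6]   # illustrative observed SST (deg C)
print(round(bias(fcst, obs), 2), round(rmse(fcst, obs), 2))  # -> -0.35 0.36
```

A negative BIAS, as in the record above, means the model runs systematically cooler than the observations.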

  4. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was done of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. Results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and also against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the

  5. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing forecasts from INTW and the NWP models at 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
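
The multi-categorical Heidke skill score used for this evaluation has a standard contingency-table form: (proportion correct minus chance agreement) over (one minus chance agreement). A sketch with a made-up 3-category table:

```python
# Multi-category Heidke skill score from a k-by-k contingency table
# (rows = forecast category, columns = observed category).

def heidke_skill_score(table):
    n = sum(sum(row) for row in table)
    pc = sum(table[i][i] for i in range(len(table))) / n        # proportion correct
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2  # chance agreement
    return (pc - pe) / (1 - pe)

# Illustrative 3-category table (e.g. light/moderate/heavy precipitation).
table = [[30, 5, 2],
         [4, 25, 6],
         [1, 3, 24]]
print(round(heidke_skill_score(table), 3))  # -> 0.684
```

A perfect forecast scores 1, a chance-level forecast scores 0, and worse-than-chance forecasts score negative.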

  6. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.

  7. Additional Model Datasets and Results to Accelerate the Verification and Validation of RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-11-01

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  8. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    at the various specified incident angles provide model verification for the investigation into causes of ray attenuation and provide accounts for rays that escape. Two fourteen tube modules were tested on Sandia National Laboratory’s two-axis tracking (AZTRAK) platform. By adjusting the tracking of the platform...... at the corresponding specified incident angles are compared to the Sandia results. A 100 m2 336 Novel ICPC evacuated tube solar collector array has been in continuous operation at a demonstration project in Sacramento California since 1998. Data from the initial operation of the array are used to further validate...

  9. Separating stages of arithmetic verification: An ERP study with a novel paradigm.

    Science.gov (United States)

    Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes

    2015-08-01

    In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Experimental verification of mathematical model of the heat transfer in exhaust system

    Directory of Open Access Journals (Sweden)

    Petković Snežana

    2011-01-01

    Full Text Available A catalytic converter reaches maximal efficiency only at its working temperature. In the cold-start phase the efficiency of the catalyst is low and exhaust emissions contain high levels of air pollutants. Optimizing the exhaust system to shorten the time needed to reach the catalyst working temperature therefore reduces total vehicle emissions. Using mathematical models in the development of exhaust systems decreases total costs and development time. A mathematical model has to be experimentally verified and calibrated in order to be useful in the optimization process. Measurement installations have been developed and used for verification of the mathematical model of unsteady heat transfer in exhaust systems. Comparisons between the experimental results and the mathematical model are presented in this paper. Based on the obtained results, it can be concluded that there is good agreement between the model and the experimental results.

  11. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Karen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garner, James R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Branney, Sean [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Todd, Lindsay C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nordquist, Heather [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stewart, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-31

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field

  12. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
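The convergence analysis behind code verification is typically performed by computing the observed order of accuracy from solutions on systematically refined grids. A hedged sketch of that standard calculation (the manufactured numbers are illustrative, not ASC results):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three solutions on grids refined by
    a constant factor r: p = log(|f1 - f2| / |f2 - f3|) / log(r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Manufactured second-order behaviour: f(h) = exact + C * h**2
exact, C = 1.0, 0.5
f = {h: exact + C * h ** 2 for h in (0.4, 0.2, 0.1)}
print(round(observed_order(f[0.4], f[0.2], f[0.1], r=2), 2))  # → 2.0
```

If the observed order matches the scheme's formal order, the implementation is behaving as designed on that problem.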

  14. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    Full Text Available This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs that are considered as reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during a system design stage. This paper extends the formalism reconfigurable timed net condition/event systems (R-TNCESs in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of a software tool SESA for functional, temporal, and energy-efficient properties. This paper is illustrated by an automatic assembly system.

  15. Managing the Verification Trajectory

    NARCIS (Netherlands)

    Ruys, T.C.; Brinksma, Hendrik

    In this paper we take a closer look at the automated analysis of designs, in particular of verification by model checking. Model checking tools are increasingly being used for the verification of real-life systems in an industrial context. In addition to ongoing research aimed at curbing the

  16. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach supports obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to mitigate the state-space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains, exploiting the real-time property of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters that avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by a packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving-block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.
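As a rough illustration of the kind of safety constraint such parameter sets encode, the sketch below computes a minimal train separation from speed, braking deceleration, and reaction time. The formula and all values are generic textbook assumptions, not the paper's LHA model:

```python
def min_safe_separation(v, brake_decel, reaction_time, margin=50.0):
    """Minimal distance (m) a movement authority must leave ahead of a train
    at speed v (m/s): reaction distance + braking distance + a fixed margin."""
    return v * reaction_time + v ** 2 / (2.0 * brake_decel) + margin

# Assumed values: 300 km/h, 0.8 m/s^2 service braking, 2 s reaction time
v = 300.0 / 3.6
print(round(min_safe_separation(v, brake_decel=0.8, reaction_time=2.0), 1))  # → 4556.9
```

A verified parameter constraint set would guarantee, for all admissible values of `brake_decel` and `reaction_time`, that the MA never ends inside this distance.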

  17. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  18. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. An important impact of this factor on the pre-volitional stages of aggressive behavior was also identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not of negative emotions. These results correspond with the theory of self-regulation, in which behavior focused on goal attainment is accompanied by positive emotions (see, for example, Cacioppo, Gardner & Berntson, 1999; Carver, 2004). The results confirmed not only a sufficient model fit but also explained 53% of the variance in behavioral desire, 68% in intention and 37% in behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample; some results could also be affected by the modest sample size. For future work, we recommend verifying the EMGB on other types of aggressive behavior and incorporating inhibition into the model more comprehensively. Finally, this study is correlational in character, so further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  19. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    International Nuclear Information System (INIS)

    Yamashita, M; Kokubo, M; Takahashi, R; Takayama, K; Tanabe, H; Sueoka, M; Okuuchi, N; Ishii, M; Iwamoto, Y; Tachibana, H

    2016-01-01

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement is the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for the general-purpose linac using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, Mean ± 2SD %) were compared to those from the general-purpose linac. Results: As the results of the CLs, the conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT show similar results to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and
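The confidence limits quoted above are the mean ± 2 SD of the per-plan percentage dose differences between the TPS and the independent calculation. A sketch of that statistic with made-up dose values (for illustration only, not the study's data):

```python
import statistics

def confidence_limits(tps_doses, indep_doses):
    """Mean and 2*SD (in %) of per-plan dose differences between the TPS and
    an independent recalculation; together they define CL = mean ± 2SD."""
    diffs = [100.0 * (t - s) / t for t, s in zip(tps_doses, indep_doses)]
    return statistics.mean(diffs), 2.0 * statistics.stdev(diffs)

# Made-up doses (Gy) for four plans, for illustration only
mean, two_sd = confidence_limits([100.0, 100.0, 100.0, 100.0],
                                 [99.0, 101.0, 98.0, 102.0])
print(round(mean, 2), round(two_sd, 2))  # → 0.0 3.65
```

A plan whose difference falls outside mean ± 2SD would be flagged for manual review in such a secondary check.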

  1. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...... the boundaries of automatic verification and the connections between TAPN and the model of timed automata. Finally, we will mention the tool TAPAAL that supports modelling, simulation and verification of TAPN and discuss a small case study of alternating bit protocol....

  2. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen's University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
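A heave decay test of the kind mentioned above is commonly post-processed by estimating the damping ratio from successive peak amplitudes via the logarithmic decrement. A sketch under that standard assumption (synthetic peaks, not Task 10 code or data):

```python
import math

def damping_ratio(peaks):
    """Damping ratio estimated from successive decay-test peak amplitudes:
    logarithmic decrement delta = ln(a_i / a_{i+1}), averaged over pairs,
    then zeta = delta / sqrt(4*pi^2 + delta^2)."""
    decs = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(decs) / len(decs)
    return delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)

# Synthetic peaks of an exponential decay with decrement 0.3
peaks = [math.exp(-0.3 * k) for k in range(4)]
print(round(damping_ratio(peaks), 4))  # → 0.0477
```

Comparing such damping estimates between codes and experiment is one simple way the decay-test results can be made comparable across participants.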

  3. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary mission. RLS is composed of the SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design let a high level of laser trace reach the detector; although a certain level of laser trace is required for calibration purposes, the high level degraded the signal-to-noise ratio and confounded some Raman peaks. So, after the breadboard campaign, some light design modifications were implemented in order to fix the desired amount of laser trace, and after the fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. A brief description of the iOH design update for the engineering and qualification model (iOH EQM), as well as the assembly process, is given in this paper. In addition, the results of the integration verification and the first functional tests, carried out with the RLS calibration target (CT), are reported.

  4. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
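As an illustration of the kind of biospecimen sample-size calculation such a statistical framework formalizes, the sketch below uses the standard two-sided, two-sample normal approximation for a given standardized effect size. This is a textbook formula offered for orientation, not the workshop's exact method:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Biospecimens per group needed to detect a standardized mean difference
    (Cohen's d) between cases and controls, via the normal approximation
    n = 2 * (z_{1-alpha/2} + z_{power})**2 / d**2."""
    z = NormalDist().inv_cdf
    n = 2.0 * (z(1.0 - alpha / 2.0) + z(power)) ** 2 / effect_size ** 2
    return math.ceil(n)

# A moderate effect at the discovery-to-verification transition
print(n_per_group(0.5))  # → 63
```

Verification stages with many candidate biomarkers would additionally adjust `alpha` for multiple testing, which raises the required cohort size further.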

  5. Verification of the Hydrodynamic and Sediment Transport Hybrid Modeling System for Cumberland Sound and Kings Bay Navigation Channel, Georgia

    Science.gov (United States)

    1989-07-01

    different locations and should not be directly evaluated. These data are provided for [scanned-report residue removed: verification plates for the 1985 geometry comparing physical model, numerical model, and field high-water-level data at Stations 843 and 396].

  6. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Full Text Available Routing data packets in a dynamic network is a difficult and important problem in computer networks. Because the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique for routing in a dynamic network. To analyze the correctness of the Q-Routing algorithm mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for the given metrics.
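Q-Routing (Boyan and Littman's algorithm) has each node update its estimated delivery time toward a neighbour's best remaining estimate plus the observed queueing and transmission delays. A minimal sketch of that update rule (node names and delay values are hypothetical):

```python
def q_routing_update(Q, x, y, d, q_time, s_time, eta=0.5):
    """One Q-Routing update at node x after forwarding a packet bound for
    destination d to neighbour y. q_time and s_time are the observed queueing
    and transmission delays; t is y's best estimate of the remaining time."""
    t = 0.0 if y == d else min(Q[y][d].values())
    old = Q[x][d][y]
    Q[x][d][y] = old + eta * (q_time + s_time + t - old)
    return Q[x][d][y]

# Hypothetical 3-node line A-B-C, destination C: A refines its estimate via B
Q = {"A": {"C": {"B": 5.0}}, "B": {"C": {"C": 1.0}}}
print(q_routing_update(Q, "A", "B", "C", q_time=0.5, s_time=1.0))  # → 3.75
```

Because each estimate is updated from locally observed delays, the routing policy adapts as congestion shifts, which is the property the paper's SPIN model aims to verify.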

  7. Experimental verification of the energetic model of the dry mechanical reclamation process

    Directory of Open Access Journals (Sweden)

    R. Dańko

    2008-04-01

    Full Text Available The experimental results of the dry mechanical reclamation process, which constituted the basis for verifying the energetic model of this process developed by the author on the grounds of Rittinger's deterministic hypothesis of the crushing process, are presented in this paper. Used foundry sands with bentonite, with water glass from the floster technology, and used sands with furan FL 105 resin were employed in the reclamation tests. In the mechanical and mechanical-cryogenic reclamation, a wide range of treatment times and reclamation conditions influencing the intensity of the reclamation process (covering all parameters used in industrial devices) were applied. The developed theoretical model constitutes a new tool for selecting optimal reclamation treatment times for a given spent foundry sand at the assumed process intensity realized in rotor reclaimers, with leaves or rods as grinding elements mounted horizontally on the rotor axis.
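Rittinger's hypothesis, on which the energetic model is based, states that comminution energy is proportional to the new surface area created, i.e. to the difference of reciprocal characteristic grain sizes. A sketch of that relation with assumed constants and sizes (not the paper's fitted parameters):

```python
def rittinger_energy(c_r, d_initial, d_final):
    """Specific crushing energy under Rittinger's hypothesis: proportional to
    the new surface created, E = C_R * (1/d_final - 1/d_initial).
    c_r is a material constant; units are whatever c_r is expressed in."""
    return c_r * (1.0 / d_final - 1.0 / d_initial)

# Assumed: grinding spent-binder lumps from 1.0 mm down to 0.2 mm equivalent size
print(round(rittinger_energy(c_r=0.8, d_initial=1.0, d_final=0.2), 2))  # → 3.2
```

In a reclamation context, the energy input over the treatment time can then be related to the degree of binder removal, which is what the experimental verification checks.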

  8. A Verification Study on the Loop-Breaking Logic of FTREX

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2008-01-01

    The logical loop problem in fault tree analysis (FTA) has been solved by breaking circular logics manually or automatically. The breaking of logical loops is one of the sources of uncertainty in fault tree analyses. A practical method that can verify fault tree analysis results was developed by Choi. The method can handle logical loop problems. It has been implemented in a FORTRAN program called the VETA (Verification and Evaluation of fault Tree Analysis results) code. FTREX, a well-known fault tree quantifier developed by KAERI, has an automatic loop-breaking logic. In order to confirm the correctness of the loop-breaking logic of FTREX, some typical trees with complex loops were developed and applied in this study. This paper presents some verification results of the loop-breaking logic tested by the VETA code
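Breaking logical loops presupposes finding them: a loop is a cycle in the gate-dependency graph of the fault tree. A sketch of loop detection by depth-first search (gate names are hypothetical, and this is not the VETA or FTREX algorithm):

```python
def find_logical_loops(gates):
    """Find circular logic in a fault tree given as gate -> list of inputs
    (an input may be another gate or a basic event). Returns the gate cycles
    discovered by depth-first search, or [] for a loop-free tree."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {g: WHITE for g in gates}
    loops, stack = [], []

    def visit(g):
        color[g] = GRAY
        stack.append(g)
        for child in gates.get(g, []):
            if child not in gates:        # basic event: cannot form a loop
                continue
            if color[child] == GRAY:      # back edge: circular logic found
                loops.append(stack[stack.index(child):] + [child])
            elif color[child] == WHITE:
                visit(child)
        stack.pop()
        color[g] = BLACK

    for g in gates:
        if color[g] == WHITE:
            visit(g)
    return loops

# Mutually dependent support-system gates form a loop: G1 -> G2 -> G1
print(find_logical_loops({"G1": ["G2", "E1"], "G2": ["G1", "E2"]}))  # → [['G1', 'G2', 'G1']]
```

Once a cycle is identified, a loop-breaking rule decides which edge of the cycle to cut before quantification; different rules give slightly different results, which is the uncertainty the study examines.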

  9. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out to verify existing dynamic radioecological models, especially the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137 and I-131 measured in foodstuffs and environmental samples after the Chernobyl reactor accident, and the results of field experiments on radionuclide translocation after foliar uptake or root absorption by edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were experimentally determined by a single sprinkling with Chernobyl rainwater and, as a function of sprinkling time, were measured to be the following: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors for root absorption of Cs-137 averaged 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. There was good agreement between the ECOSYS model predictions and the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de

  10. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    International Nuclear Information System (INIS)

    Mao, S P; Rottenberg, X; Rochus, V; Czarnecki, P; Helin, P; Severi, S; Tilmans, H A C; Nauwelaers, B

    2017-01-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, so that a wide frequency range response of the cMUT cell can be simulated by the equivalent circuit model. The importance of the cross modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in an upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by the model are in good agreement with the FEM simulation results for a single cMUT cell operated in either transmission or reception, and also agree well with the experimental results for the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells. (paper)

  11. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, so that a wide frequency range response of the cMUT cell can be simulated by the equivalent circuit model. The importance of the cross modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in an upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by the model are in good agreement with the FEM simulation results for a single cMUT cell operated in either transmission or reception, and also agree well with the experimental results for the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  12. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model is described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited to cases where surface wave heights are significant compared to the mean water depth, such as estuaries and coastal regions. The latter is suited to cases of small surface wave heights compared to depth, because surface elevation is removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  13. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  14. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  15. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    International Nuclear Information System (INIS)

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-01-01

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a multi-institutional study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head and neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed on patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9 % and −5.6±3.6 % for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9 % and −3.0±3.7 % for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%

  16. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to cases of small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  17. A study on periodic safety verification on MOV performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Du Eon; Park, Jong Ho; Han, Jae Seob; Kang, Hyeon Taek; Lee, Jeong Min; Song, Kyu Jo; Shin, Wan Sun; Lee, Taek Sang [Chungnam National Univ., Taejon (Korea, Republic of)

    2000-03-15

    The objectives of this study, therefore, are to define optimized valve diagnostic variables that detect abnormal conditions early during valve surveillance and consequently reduce radiation exposure. The main direction of the development is to detect valve degradation in advance by monitoring the motor current and power signals, which can be obtained remotely at the Motor Control Center (MCC). A series of valve operation experiments was performed under several kinds of abnormal conditions using a test apparatus consisting of a 3-inch gate valve, a motor (0.33 Hp, 460 V, 0.8 A, 1560 rpm), an actuator (SMB-000-2 type), measuring devices (power analyzer, oscilloscope, data recorder, current transformer, and AC current and voltage transducers) and connection cables.

  18. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  19. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1], operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring long-term trending, and satellite retrieved products [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements in a time series and angular dependency. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is a very useful tool for cross-sensor verifications. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between measurements of the two instruments. The CRTM is particularly useful to reduce the difference between instruments for climate studies [4]. In this study, we will carry out the assessment of the Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and data for Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive bands. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high quality radiosondes were launched when Suomi NPP flew over NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data includes air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q

  20. Modelling of hydrodynamics and mecury transport in lake Velenje. Part 2, Modelling and model verification

    OpenAIRE

    Kotnik, Jože; Žagar, Dušan; Rajar, Rudi; Horvat, Milena

    2004-01-01

    PCFLOW3D - a three-dimensional mathematical model that was developed at the Chair of Fluid Mechanics of the Faculty of Civil and Geodetic Engineering, University of Ljubljana, was used for hydrodynamic and Hg transport simulations in Lake Velenje. The model is fully non-linear and computes three velocity components, water elevation and pressure. Transport-dispersion equations for salinity and heat (and/or any pollutant) are further used to compute the distributions of these par...

  1. Statistical Modeling, Simulation, and Experimental Verification of Wideband Indoor Mobile Radio Channels

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ma

    2018-01-01

    Full Text Available This paper focuses on the modeling, simulation, and experimental verification of wideband single-input single-output (SISO) mobile fading channels for indoor propagation environments. The indoor reference channel model is derived from a geometrical rectangle scattering model, which consists of an infinite number of scatterers. It is assumed that the scatterers are exponentially distributed over the two-dimensional (2D) horizontal plane of a rectangular room. Analytical expressions are derived for the probability density function (PDF) of the angle of arrival (AOA), the PDF of the propagation path length, the power delay profile (PDP), and the frequency correlation function (FCF). An efficient sum-of-cisoids (SOC) channel simulator is derived from the nonrealizable reference model by employing the SOC principle. It is shown that the SOC channel simulator approximates closely the reference model with respect to the FCF. The SOC channel simulator enables the performance evaluation of wideband indoor wireless communication systems with reduced realization expenditure. Moreover, the rationality and usefulness of the derived indoor channel model is confirmed by various measurements at 2.4, 5, and 60 GHz.
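
    The SOC principle described in this record can be illustrated numerically. The sketch below is not the paper's model: the parameter values, the equal-power-cisoid assumption, and the exponential delay distribution are illustrative choices. It builds the frequency correlation function of a sum-of-cisoids channel and checks it against the analytic FCF of an exponential power delay profile, which is 1/(1 + j2πν·σ_τ):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed illustrative parameters (not from the paper): equal-power cisoids
    # whose delays follow an exponential distribution with mean sigma_tau.
    N = 20_000
    sigma_tau = 50e-9                       # delay spread in seconds
    tau = rng.exponential(sigma_tau, N)     # propagation path delays
    gains = np.full(N, 1.0 / N)             # |c_n|^2, normalized to unit total power

    def fcf_soc(nu):
        """FCF of the SOC model: sum_n |c_n|^2 * exp(-j 2 pi nu tau_n)."""
        return np.sum(gains[None, :] * np.exp(-2j * np.pi * np.outer(nu, tau)), axis=1)

    nu = np.linspace(0.0, 5e6, 50)          # frequency separation (Hz)
    fcf_ref = 1.0 / (1.0 + 2j * np.pi * nu * sigma_tau)  # analytic FCF, exponential PDP
    err = np.max(np.abs(fcf_soc(nu) - fcf_ref))
    print(err)  # small: the SOC simulator closely approximates the reference FCF
    ```

    With enough cisoids the maximum deviation from the reference FCF shrinks as roughly 1/sqrt(N), which mirrors the paper's observation that the simulator approximates the reference model closely with respect to the FCF.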

  2. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of the processes in a hybrid solid-propellant rocket engine was performed for the quasi-steady-state flow regime. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE — BAYSAVER TECHNOLOGIES, INC. BAYSAVER SEPARATION SYSTEM, MODEL 10K

    Science.gov (United States)

    Verification testing of the BaySaver Separation System, Model 10K was conducted on a 10 acre drainage basin near downtown Griffin, Georgia. The system consists of two water tight pre-cast concrete manholes and a high-density polyethylene BaySaver Separator Unit. The BaySaver Mod...

  4. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Printing materials were selected to correspond to mouse tissues. To represent the lung, the selected material was partially used together with an air layer. To verify material equivalence, the photon attenuation characteristics were compared with those of a super-flex bolus. For the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D-printer-based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies

  5. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Printing materials were selected to correspond to mouse tissues. To represent the lung, the selected material was partially used together with an air layer. To verify material equivalence, the photon attenuation characteristics were compared with those of a super-flex bolus. For the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D-printer-based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies.

  6. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is a part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, construction and verification of airport simulation model using Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  7. [Verification of the VEF photon beam model for dose calculations by the Voxel-Monte-Carlo-Algorithm].

    Science.gov (United States)

    Kriesen, Stephan; Fippel, Matthias

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning.

  8. Verification of the VEF photon beam model for dose calculations by the voxel-Monte-Carlo-algorithm

    International Nuclear Information System (INIS)

    Kriesen, S.; Fippel, M.

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tuebingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning. (orig.)

  9. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    International Nuclear Information System (INIS)

    Amna, S; Samreen, N; Khalid, B; Shamim, A

    2013-01-01

    Depending upon the topography, there is extreme variation in temperature across Pakistan. Heat waves are weather-related events with significant impacts on humans, including all socioeconomic activities and health issues, and they vary according to the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) to model seasonal weather hind-casts for three selected areas: Islamabad, Jhelum and Muzaffarabad. This research was carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models (ECMWF, ERA-40, MPI, Meteo France and UKMO) for the selected areas were acquired from the Pakistan Meteorological Department. The data comprised statistical temperature records of 32 years for the months of June, July and August. The study was based on EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts using the standard probabilistic measures of Brier Score, Brier Skill Score, cross validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models gave significant results only by omitting particular initial conditions.
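
    For reference, the Brier Score and Brier Skill Score used for verification in this record can be computed in a few lines. The probabilities and observations below are made-up toy values, not data from the study:

    ```python
    import numpy as np

    def brier_score(p, o):
        """Mean squared difference between forecast probabilities and binary outcomes."""
        p, o = np.asarray(p, float), np.asarray(o, float)
        return float(np.mean((p - o) ** 2))

    def brier_skill_score(p, o):
        """Skill relative to a climatology forecast that always issues the base rate."""
        o = np.asarray(o, float)
        bs_ref = brier_score(np.full_like(o, o.mean()), o)
        return 1.0 - brier_score(p, o) / bs_ref

    # Toy values: forecast probabilities of a heat-wave day versus what
    # was actually observed (1 = heat wave occurred).
    probs = [0.9, 0.8, 0.2, 0.1, 0.7, 0.3]
    obs = [1, 1, 0, 0, 1, 0]
    print(round(brier_score(probs, obs), 4))        # 0.0467
    print(round(brier_skill_score(probs, obs), 4))  # 0.8133
    ```

    A Brier Score of 0 is a perfect probabilistic forecast; a Brier Skill Score above 0 means the forecast beats climatology, which is the comparison the record's model ranking rests on.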

  10. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibr...... and to incorporate that in the design, operation and control of urban drainage structures. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  11. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  12. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Abstract Background A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area-under-the-curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results In a single simulated data set, varying false negatives from 0 to 4 led to verification bias corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th–97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
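
    The instability the authors report can be reproduced with a minimal simulation. The sketch below applies a Begg-Greenes-style correction to sensitivity rather than to the AUC studied in the paper, for brevity, and every parameter value is an illustrative assumption, not taken from the paper. All test-positives but only a fraction of test-negatives are verified; because few false negatives exist, the handful that reach verification dominates the corrected estimate:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def corrected_sensitivity(n=1000, prev=0.1, sens=0.98, spec=0.9, p_verify_neg=0.1):
        """One simulated screening study with partial verification of test-negatives.

        Returns a Begg-Greenes-style bias-corrected sensitivity estimate.
        All parameter values are illustrative assumptions.
        """
        d = rng.random(n) < prev                       # true disease status
        t = np.where(d, rng.random(n) < sens,          # test result
                     rng.random(n) < 1 - spec)
        verified = t | (rng.random(n) < p_verify_neg)  # all T+, a sample of T-
        p_d_tp = d[t & verified].mean()                # P(D | T+) among verified
        p_d_tn = d[~t & verified].mean()               # P(D | T-) among verified
        p_tp = t.mean()
        num = p_d_tp * p_tp
        return num / (num + p_d_tn * (1 - p_tp))

    # With ~2 expected false negatives per study and only 10% of negatives
    # verified, the corrected estimate swings widely across replications.
    estimates = [corrected_sensitivity() for _ in range(200)]
    print(round(min(estimates), 3), round(max(estimates), 3))
    ```

    Replications in which no verified false negative happens to be sampled return a corrected sensitivity of exactly 1.0, while replications that catch one or two drop sharply, which is the same excess variation the abstract describes for the AUC.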

  13. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable by exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests that verify the vectorized models by checking their consistency with the corresponding Geant4 models, and validate them against experimental data.
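    The "automated statistical tests" mentioned here amount to comparing samples drawn from a vectorized model against samples from the reference model and flagging any statistically significant disagreement. A minimal, hedged sketch using a two-sample Kolmogorov-Smirnov check (the actual GeantV test suite is more elaborate; the samplers below are stand-ins):

```python
import math
import random

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of the two samples."""
    a, b = sorted(a), sorted(b)
    i = j = 0
    d = 0.0
    while i < len(a) and j < len(b):
        x = min(a[i], b[j])
        while i < len(a) and a[i] == x:     # advance through ties together
            i += 1
        while j < len(b) and b[j] == x:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

def samples_consistent(a, b, coeff=1.36):
    """Accept the two samplers as consistent at roughly the 5% level
    (1.36 is the asymptotic Kolmogorov critical coefficient)."""
    n, m = len(a), len(b)
    return ks_statistic(a, b) <= coeff * math.sqrt((n + m) / (n * m))

# Example: a "reference" and a "vectorized" sampler of the same distribution.
rng = random.Random(42)
reference = [rng.gauss(0.0, 1.0) for _ in range(2000)]
vectorized = [rng.gauss(0.0, 1.0) for _ in range(2000)]
```

A vectorized sampler whose output distribution drifts from the reference (for instance through a lost branch in a rejection loop) fails the check even when every individual value is physically plausible.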

  14. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable by exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests that verify the vectorized models by checking their consistency with the corresponding Geant4 models, and validate them against experimental data.

  15. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid-lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last in this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest and grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data for both summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  16. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

    The article presents the verification of the forward kinematics of the Fanuc AM100iB robot. The developed kinematic model of the machine was verified by tests on the actual robot. The tests consisted of positioning the robot, operating in the mode that controls the values of the natural angles, at selected points of its workspace and reading off the coordinate values of the TCP point in the robot's global coordinate system on the operator panel. Validation of the model consisted of entering the same natural-angle values used for positioning the robot into the inputs of the machine's CAE model, calculating the coordinate values of its TCP, and then comparing the calculated results with the values read. These results are an introduction to a partial verification of the dynamic model of the analysed device.

  17. The middle range verification of numerical model performance for heavy rainfall in North China

    Science.gov (United States)

    Zhang, Bo; Zhao, Bin; Niu, Ruoyun

    2017-04-01

    The heavy rainfall forecast in North China is a focus and a difficulty of medium-range numerical weather forecasting. 70 typical heavy precipitation cases in North China in summer from 2010 to 2016 were selected and divided into vortex type and west-trough/shear-line type according to the atmospheric circulation. Based on the ECMWF model and the Chinese operational model T639, the spatial verification method MODE was used to evaluate the medium-range forecast abilities for heavy summer rain in North China by contrasting differences in centroid distance, axis angle, and aspect ratio. It is found that both the ECMWF and T639 models show weak predictive ability for low-vortex-type heavy rainfall in North China across all similarity measures. When the rainfall area is large, the precipitation patterns of the two models are mostly oriented northeast-southwest, which is consistent with observations. For a large precipitation area, both models also predict an aspect ratio of less than 1, indicating a long, narrow rain band, again consistent with observations. However, both the T639 and ECMWF models show systematic deviations in the location of the precipitation area: the predicted area lies to the southwest of the observed one, and for smaller/larger areas of precipitation the predicted area is larger/smaller than observed. In addition, a sensitivity test for the regional heavy precipitation process in North China (in the Huanghuai region, among others) from July 18 to 20, 2016 showed that none of the numerical models predicted the process successfully. Therefore, further research is needed on correcting the systematic biases of numerical models for regional heavy precipitation in medium-range forecasting.
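    MODE-style object comparison reduces each contiguous rain area to a few geometric attributes and compares those between forecast and observation. A hedged sketch of how the three attributes named above — centroid, axis angle, and aspect ratio — can be computed from a binary rain mask via image moments (the gridding and thresholding that precede this step are omitted):

```python
import math

def object_attributes(mask):
    """Centroid, major-axis angle and aspect ratio of a binary rain object,
    computed from the second central moments of the set of wet grid cells."""
    pts = [(i, j) for i, row in enumerate(mask) for j, v in enumerate(row) if v]
    n = len(pts)
    ci = sum(i for i, _ in pts) / n
    cj = sum(j for _, j in pts) / n
    mu20 = sum((i - ci) ** 2 for i, _ in pts) / n
    mu02 = sum((j - cj) ** 2 for _, j in pts) / n
    mu11 = sum((i - ci) * (j - cj) for i, j in pts) / n
    angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)       # major-axis orientation
    common = math.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2                     # eigenvalues of the
    lam2 = (mu20 + mu02 - common) / 2                     # moment (covariance) matrix
    aspect = math.sqrt(lam2 / lam1) if lam1 > 0 else 1.0  # minor/major axis ratio
    return (ci, cj), angle, aspect
```

An aspect ratio near 0 indicates a long, narrow rain band; the centroid offset between the forecast and observed objects exposes exactly the kind of systematic southwest displacement reported above.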

  18. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VVandC)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of

  19. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

    In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
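    Operator splitting as described — advance transport, then hand the result to the geochemical engine — can be illustrated with a toy 1-D problem: explicit upwind advection followed by exact first-order decay. The scheme, grid, and rate below are purely illustrative and are not ParFlow.RT's actual numerics:

```python
import math

def step_transport(c, u, dx, dt):
    """One explicit upwind advection step (u > 0):
    c_i <- c_i - (u*dt/dx) * (c_i - c_{i-1}), with a fixed inflow at i = 0."""
    cr = u * dt / dx
    assert cr <= 1.0                      # CFL stability condition
    return [c[0]] + [c[i] - cr * (c[i] - c[i - 1]) for i in range(1, len(c))]

def step_reaction(c, k, dt):
    """First-order decay solved exactly over the sub-step (the 'chemistry')."""
    decay = math.exp(-k * dt)
    return [ci * decay for ci in c]

def operator_split(c, u, k, dx, dt, nsteps):
    """Non-iterative operator splitting: transport, then reaction, each step."""
    for _ in range(nsteps):
        c = step_reaction(step_transport(c, u, dx, dt), k, dt)
    return c
```

The appeal of the approach is exactly what the abstract exploits: the transport solver and the reaction solver never need to know about each other, so a mature geochemical engine can be swapped in behind a narrow interface such as Alquimia.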

  20. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

    Currently, the ATHLET code is widely applied to the modelling of several WWER-type power plants with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons with experimental data. Consideration of the circulation in the water inventory of the secondary side proved to be necessary. (orig.). 3 refs.

  1. Modeling and verification of the diffraction-limited visible light telescope aboard the solar observing satellite HINODE

    Science.gov (United States)

    Katsukawa, Y.; Suematsu, Y.; Tsuneta, S.; Ichimoto, K.; Shimizu, T.

    2011-09-01

    HINODE, Japanese for "sunrise", is a spacecraft dedicated to observations of the Sun; it was launched in 2006 to study the Sun's magnetic fields and how their explosive energies propagate through the different atmospheric layers. The spacecraft carries the Solar Optical Telescope (SOT), which has a 50 cm diameter clear aperture and provides a continuous series of diffraction-limited visible light images from space. The telescope was developed through international collaboration between Japan and the US. In order to achieve diffraction-limited performance, thermal and structural modeling of the telescope was used extensively in its development phase to predict how the optical performance changes depending on the thermal conditions in orbit. Beyond the modeling, we devoted considerable effort to verifying the optical performance in ground tests before launch. The verification in the ground tests helped us to find many issues, such as temperature-dependent focus shifts, which could not have been identified through thermal-structural modeling alone. Another critical issue was micro-vibrations induced by internal disturbances of the mechanical gyroscopes and momentum wheels used for attitude control of the spacecraft. Because the structural modeling was not accurate enough to predict how much the image quality would be degraded by the micro-vibrations, we measured their transmission in a spacecraft-level test.

  2. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Fehnker, Ansgar; Fehnker, Ansgar

    2002-01-01

    We report on the use of model checking techniques for both the verification of a process control program and the derivation of optimal control schedules. Most of this work has been carried out as part of a case study for the EU VHS project (Verification of Hybrid Systems), in which the program for a

  3. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  4. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    Full Text Available A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters such as the creep coefficient, Poisson’s ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of the uniformly accelerated motion of the load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Project of the Ministry of Science of the Republic of Serbia, No. ON 174027: Computational Mechanics in Structural Engineering, and No. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: investigation and environmental assessment of possible applications]

  5. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Full Text Available Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results obtained in COMSOL Multiphysics using the "Finer" grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm⁻¹) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of the load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in the thickness of the load within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal oscillations from 26.58 to 26.35 kHz and of the electrical admittance from 0.86 to 0.44 mS.

  6. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code we use a propagating underdriven detonation wave in 1-D. This is about the only test case for which an accurate solution can be determined from the theoretical structure of the solution. The solution consists of a steady ZND reaction-zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock-detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
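    The quantitative measure used here — the discrete L2 norm of the difference between the numerical and exact solutions — is easy to state in code. The convergence demo below uses a forward-difference derivative of sin(x) as a stand-in first-order scheme; it has nothing to do with the SURF burn model itself, but shows how the norm is used to measure order of accuracy under grid refinement:

```python
import math

def l2_error(numeric, exact, dx):
    """Discrete L2 norm of the pointwise difference on a uniform grid:
    sqrt(dx * sum((num_i - exact_i)^2))."""
    return math.sqrt(dx * sum((a - b) ** 2 for a, b in zip(numeric, exact)))

def derivative_error(n):
    """L2 error of a forward-difference derivative of sin on [0, 2*pi)."""
    dx = 2 * math.pi / n
    x = [i * dx for i in range(n)]
    approx = [(math.sin(xi + dx) - math.sin(xi)) / dx for xi in x]
    return l2_error(approx, [math.cos(xi) for xi in x], dx)
```

For a first-order scheme, halving the grid spacing should roughly halve the L2 error; near discontinuities, as the abstract notes, pointwise errors do not shrink this way, which is why an integral norm is the fairer overall measure.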

  7. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    National Research Council Canada - National Science Library

    Broy, Manfred; Leucker, Martin

    2007-01-01

    The objective of this project is the development of an integrated suite of technologies focusing on end-to-end software development supporting requirements analysis, design, implementation, and verification...

  8. Functions of social support and self-verification in association with loneliness, depression, and stress.

    Science.gov (United States)

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  9. 3D MODELING WITH PHOTOGRAMMETRY BY UAVS AND MODEL QUALITY VERIFICATION

    Directory of Open Access Journals (Sweden)

    V. Barrile

    2017-11-01

    Full Text Available This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria) concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years this area was affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images was acquired through a UAV equipped with a digital camera and GPS. Subsequently, the processing for the production of a 3D georeferenced model was performed using the commercial software Agisoft PhotoScan. Similarly, the use of a terrestrial laser scanning technique allowed the production of dense point clouds and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between the image- and range-based methods was performed.

  10. Thermal Pollution Mathematical Model. Volume 2; Verification of One-Dimensional Numerical Model at Lake Keowee

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1980-01-01

    A one-dimensional model for studying the thermal dynamics of cooling lakes was developed and verified. The model is essentially a set of partial differential equations which are solved by finite difference methods. The model includes the effects of the variation of area with depth, surface heating due to solar radiation absorbed at the upper layer, and internal heating due to the transmission of solar radiation to the sub-surface layers. The exchange of mechanical energy between the lake and the atmosphere is included through the coupling of thermal diffusivity and wind speed. The effects of discharge and intake by power plants are also included. The numerical model was calibrated by applying it to Cayuga Lake. The model was then verified through a long-term simulation using the Lake Keowee database. The comparison between measured and predicted vertical temperature profiles over the nine years is good. The physical limnology of Lake Keowee is presented through a set of graphical representations of the measured database.
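    A drastically reduced sketch of the kind of finite-difference core such a model is built on: one explicit step of 1-D vertical heat diffusion with a prescribed surface heat flux. This toy omits everything that makes the lake model realistic (area variation with depth, radiation absorption, wind-coupled diffusivity, discharge and intake), and every symbol is illustrative:

```python
def step_temperature(T, kappa, dz, dt, surface_flux=0.0):
    """One explicit finite-difference step of dT/dt = kappa * d2T/dz2 on a
    vertical column: prescribed heat flux at the surface (index 0) and an
    insulated bottom."""
    r = kappa * dt / dz ** 2
    assert r <= 0.5                              # explicit stability limit
    n = len(T)
    new = list(T)
    for i in range(1, n - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    new[0] = T[0] + r * (T[1] - T[0]) + surface_flux * dt / dz   # surface layer
    new[-1] = T[-1] + r * (T[-2] - T[-1])                        # no-flux bottom
    return new
```

A useful sanity check on any such scheme is conservation: with zero surface flux the column's total heat content must not change from step to step, which this discretization satisfies exactly.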

  11. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

    Full Text Available The simulation of robot systems is becoming very popular, especially with the lowering of the cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to find the solution to the inverse kinematics problem. This angle can be optimized with respect to a given specified key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle in the task space). Modeling and simulation of robots can be achieved using any of the following models: the geometric model (positions, postures), the kinematic model and the dynamic model. To do so, the modelization of a 2-R robot type is implemented. Our main tasks are comparing two robot postures along the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. This can easily be generalized to a 3-R robot and hence to any serial robot (SCARA, PUMA, etc.). The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the validity of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two software packages is obtained.
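    For a planar 2-R arm, the forward and inverse kinematics underlying such a comparison are compact enough to write down directly. This is a generic textbook sketch, not the paper's code; the link lengths and the elbow-posture flag (selecting between the two postures that reach the same point) are illustrative:

```python
import math

def fk(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-R arm: joint angles -> end-effector."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def ik(x, y, l1=1.0, l2=1.0, elbow_up=True):
    """Inverse kinematics; elbow_up selects one of the two postures
    that reach the same point (the comparison made in the paper)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    s2 = math.sqrt(max(0.0, 1.0 - c2 * c2))
    if elbow_up:
        s2 = -s2
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return theta1, theta2
```

The round trip fk(ik(x, y)) returning the original point, for both elbow postures, is the basic self-consistency check before any comparison between software packages is meaningful.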

  12. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    Science.gov (United States)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser with a two-external-cavity feedback structure for the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two directions, based on the F-P cavity, is deduced, and numerical simulation and experimental verification were conducted. Experimental results show that the SMI with the two-external-cavity feedback structure under weak optical feedback is similar to the sum of two SMIs.

  13. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    Full Text Available BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify assumptions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the Government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain the mutual relationships among the observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in the triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in the triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic predispositions and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0. Running predispositions were a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0

  14. Presentation and verification of a simple mathematical model for identification of the areas behind a noise barrier with the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

    Full Text Available Background and aims: Traffic noise barriers are the most important measure for controlling environmental noise pollution. Diffraction from the top edge of a noise barrier is the most important path by which indirect sound waves travel towards the receiver. Therefore, most studies focus on improving this edge. Methods: T-shape profile barriers are among the most successful of the many different profiles. This investigation uses the theory of the destructive interference between the wave diffracted from the real edge of the barrier and the wave diffracted from the image of the barrier, with a phase difference of π radians. First, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss, based on the destructive effect of the indirect path via the barrier image, is introduced; two different profile barriers, one reflective and one absorptive, are then used for verification of the introduced model. Results: The results are compared with the results of a verified two-dimensional boundary element method at 1/3-octave band frequencies over a wide field behind those barriers. Very good agreement between the results has been achieved. In this method, an effective height is used for barriers of any profile. Conclusion: The introduced model is very simple, flexible and fast, and can be used for choosing the best location of rigid and absorptive profile barriers to achieve the highest performance.
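    The destructive-interference condition behind such a model — the image-path wave arriving half a wavelength out of phase with the directly diffracted wave — pins down the cancelled frequencies once the extra path length is known. A hedged one-function sketch (the path-length value and speed of sound in the example are illustrative, and the real model accounts for much more than this geometric condition):

```python
def destructive_frequencies(extra_path, c=343.0, n_max=5):
    """Frequencies at which the wave travelling the longer image path arrives
    half a wavelength (pi radians) out of phase with the direct diffracted
    wave, so the two cancel: extra_path = (2n - 1) * lambda / 2."""
    return [(2 * n - 1) * c / (2 * extra_path) for n in range(1, n_max + 1)]
```

For example, an extra path of 0.343 m yields cancellation near 500 Hz, 1500 Hz, 2500 Hz, and so on; receiver positions where these frequencies dominate the traffic spectrum are candidates for the high-insertion-loss zones the model maps.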

  15. Kinematic Modeling and Simulation of a 2-R Robot by Using Solid Works and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

    Full Text Available Simulation of robot systems, which is getting very popular, especially with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Modelling an object-staging scenario with robots involves, whether for the object or the robot, the following models: the geometric one, the kinematic one and the dynamic one. To that end, the modelling of a 2-R robot type is implemented. The main tasks are comparing two robot postures along the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the rightness of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two software packages is obtained.

  16. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  17. Efficient Source Data Verification Using Statistical Acceptance Sampling : A Simulation Study

    NARCIS (Netherlands)

    van den Bor, Rutger M.; Oosterman, Bas; Oostendorp, MB; Grobbee, Diederick E.; Roes, Kit C B

    2016-01-01

    Background: One approach to increase the efficiency of clinical trial monitoring is to replace 100% source data verification (SDV) by verification of samples of source data. An intuitive strategy for determining appropriate sampling plans (ie, sample sizes and the maximum tolerable number of

  18. The Bakery Protocol: A Comparative Case-Study in Formal Verification

    NARCIS (Netherlands)

    Griffioen, D.; Korver, H.

    Groote and the second author verified (a version of) the Bakery Protocol in μCRL. Their process-algebraic verification is rather complex compared to the protocol. Now the question is: How do other verification techniques perform on this protocol? In this paper we present a new correctness proof by

  19. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    Science.gov (United States)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on some problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code has been performed for two analytical problems: the first is a point heat source with exponentially decreasing power, the second a linear heat source with similar behaviour. The problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste; the analytical solutions were obtained by the authors. Verification was performed for several meshes of different resolution, and good convergence between the analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste heats the containers, the engineered safety barriers and the host rock. Maximum temperatures and the corresponding times of their establishment have been determined.
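
    The kind of closed-form benchmark described here can be illustrated for a source with exponentially decreasing power Q(t) = Q0·exp(-lam·t): its cumulative released energy has the closed form Q0·(1 - exp(-lam·t))/lam, against which a numerical quadrature (standing in for a code's time integration) can be checked. The numbers below are illustrative only.

```python
import math

def energy_numeric(q0, lam, t, n=100_000):
    """Trapezoidal integral of the source power Q(s) = q0*exp(-lam*s) over [0, t]."""
    h = t / n
    s = 0.5 * (q0 + q0 * math.exp(-lam * t))          # endpoint terms
    s += sum(q0 * math.exp(-lam * (i * h)) for i in range(1, n))
    return s * h

def energy_exact(q0, lam, t):
    """Closed-form cumulative energy of the exponentially decaying source."""
    return q0 * (1.0 - math.exp(-lam * t)) / lam
```

    Agreement between the two to within the quadrature error is the one-dimensional analogue of the mesh-convergence check performed for the code.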

  20. A study of applications scribe frame data verifications using design rule check

    Science.gov (United States)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks, and a final check confirms that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. We therefore tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of this DRC-based scribe frame data verification. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by use of pattern matching and DRC verification, the new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and that the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
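
    A minimal flavour of such a rule check can be sketched as a minimum-spacing DRC over mark bounding boxes; the geometry and rule value below are hypothetical, and a real flow would first pattern-match each mark against the library.

```python
def spacing(a, b):
    """Edge-to-edge spacing between axis-aligned boxes (x1, y1, x2, y2); 0 if overlapping."""
    dx = max(b[0] - a[2], a[0] - b[2], 0.0)
    dy = max(b[1] - a[3], a[1] - b[3], 0.0)
    return (dx * dx + dy * dy) ** 0.5

def drc_min_spacing(marks, min_space):
    """Return index pairs of marks that violate the minimum-spacing rule."""
    return [(i, j)
            for i in range(len(marks))
            for j in range(i + 1, len(marks))
            if spacing(marks[i], marks[j]) < min_space]
```

    Each violation pair would be reported with its rule name and location in a production DRC deck.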

  1. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical properties inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR in high resolution (1 nm, 0.05°, 15 min) that can be used for spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) in a temporal range varying from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), 5-6 % underestimation for the integrated NN and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 and -20 to 20 W m-2, for the 15 min and monthly mean global horizontal irradiance (GHI) averages, respectively, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10 % as compared to the ground-based measurements. The proposed system aims to be utilized through studies and real-time applications which are related to solar energy production planning and use.
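
    Verification statistics of the kind quoted (bias, RMSE, and median percentage error against ground truth) can be sketched as below; the sample values are made up, not BSRN data.

```python
def verification_stats(model, obs):
    """Bias, RMSE and median percentage error of modelled vs observed irradiance."""
    n = len(model)
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / n
    rmse = (sum(d * d for d in diffs) / n) ** 0.5
    pct = sorted(100.0 * (m - o) / o for m, o in zip(model, obs) if o != 0)
    mid = len(pct) // 2
    median_pct = pct[mid] if len(pct) % 2 else 0.5 * (pct[mid - 1] + pct[mid])
    return bias, rmse, median_pct
```

    Applied to 15 min or monthly-mean pairs, these give exactly the style of numbers reported in the validation above.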

  2. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H.

    2006-09-01

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields: gas, continuous liquid and entrained liquid. All three fields are allowed to have their own velocities; the temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer; assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for future work. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below. It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and the transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly and that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability; a further study would be required to enhance the code's capability in this regard.

  3. A systematic approach for model verification: application on seven published activated sludge models.

    Science.gov (United States)

    Hauduc, H; Rieger, L; Takács, I; Héduit, A; Vanrolleghem, P A; Gillot, S

    2010-01-01

    The quality of simulation results can be significantly affected by errors in the published model (typing, inconsistencies, gaps or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to point out the typing errors, inconsistencies and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verify models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Then, stoichiometry and kinetic rate expressions are checked for each model and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks.
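
    The continuity check at the heart of this approach verifies, for each process in the Gujer (Petersen) matrix, that the stoichiometric coefficients weighted by each component's content of a conserved quantity (COD, nitrogen, charge) sum to zero. A minimal COD balance for an aerobic heterotrophic growth process is sketched below; the yield is a typical ASM1-style default, and the three-component matrix is a toy, not one of the seven published models.

```python
def continuity_residuals(stoich, composition):
    """For each process row, sum of nu_ij * i_j; a zero residual means the
    conserved quantity (here COD) is balanced for that process."""
    return [sum(nu * i for nu, i in zip(row, composition)) for row in stoich]

Y = 0.67  # heterotrophic yield, g COD biomass per g COD substrate (typical default)
# Components: soluble substrate S_S, biomass X_BH, dissolved oxygen S_O.
stoich = [[-1.0 / Y, 1.0, -(1.0 - Y) / Y]]  # aerobic heterotrophic growth
cod = [1.0, 1.0, -1.0]                       # COD content; O2 counts as negative COD
residuals = continuity_residuals(stoich, cod)
```

    Running such a residual check over every process row and every conservative is exactly how typing errors in published matrices are tracked down.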

  4. Optimal Calculation of Residuals for ARMAX Models with Applications to Model Verification

    DEFF Research Database (Denmark)

    Knudsen, Torben

    1997-01-01

    Residual tests for sufficient model orders are based on the assumption that prediction errors are white when the model is correct. If an ARMAX system has zeros in the MA part which are close to the unit circle, then the standard predictor can have large transients. Even when the correct model...
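
    The underlying residual test can be sketched with a Ljung-Box statistic on the prediction errors: for a sufficient model order the residuals are white and Q stays below the chi-square threshold, while correlated residuals blow it up. The series below is synthetic.

```python
import random

def ljung_box_q(res, max_lag=10):
    """Ljung-Box statistic: large Q => autocorrelated residuals => insufficient order."""
    n = len(res)
    mean = sum(res) / n
    c0 = sum((r - mean) ** 2 for r in res) / n
    q = 0.0
    for k in range(1, max_lag + 1):
        ck = sum((res[i] - mean) * (res[i - k] - mean) for i in range(k, n)) / n
        q += (ck / c0) ** 2 / (n - k)
    return n * (n + 2) * q

# Strongly autocorrelated "residuals" (an AR(1) series) should fail the test;
# the chi-square 0.95 quantile for 10 degrees of freedom is about 18.31.
random.seed(0)
e = [random.gauss(0.0, 1.0) for _ in range(500)]
ar = [0.0]
for t in range(1, 500):
    ar.append(0.9 * ar[t - 1] + e[t])
```

    The transient issue raised in the abstract matters here: a biased early-sample predictor inflates exactly this statistic even for a correct model.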

  5. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  6. A verification of the bakery protocol combining algebraic and model-oriented techniques

    NARCIS (Netherlands)

    C. Brovedani; A.S. Klusener (Steven)

    1996-01-01

    textabstractIn this paper we give a specification of the so called Bakery protocol in an extension of the process algebra ACP with abstract datatypes. We prove that this protocol is equal to a Queue, modulo branching bisimulation equivalence. The verification is as follows. First we give a linear

  7. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  8. Substantiation and verification of the heat exchange crisis model in a rod bundles by means of the KORSAR thermohydraulic code

    International Nuclear Information System (INIS)

    Bobkov, V.P.; Vinogradov, V.N.; Efanov, A.D.; Sergeev, V.V.; Smogalev, I.P.

    2003-01-01

    The results of verifying the model for calculating the heat exchange crisis in uniformly heated rod bundles, as implemented in the best-estimate code KORSAR, are presented. The model for calculating the critical heat fluxes in this code is based on the tabular (look-up table) method. The experimental data bank of the Branch base center of thermophysical data GNTs RF - FEhI for rod bundles structurally similar to the WWER fuel assemblies was used for the verification, covering a wide range of parameters: pressure from 0.11 up to 20 MPa and mass velocity from 5 up to 5000 kg/(m²·s)
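
    At its core the tabular method reduces to interpolation in a CHF look-up table over the local flow parameters; a bilinear sketch over pressure and mass velocity is shown below with a toy grid (the values are illustrative, not the KORSAR table, which also interpolates over quality).

```python
from bisect import bisect_right

def interp_chf(p, g, p_grid, g_grid, table):
    """Bilinear interpolation of a CHF look-up table over pressure and mass velocity."""
    i = min(max(bisect_right(p_grid, p) - 1, 0), len(p_grid) - 2)
    j = min(max(bisect_right(g_grid, g) - 1, 0), len(g_grid) - 2)
    tp = (p - p_grid[i]) / (p_grid[i + 1] - p_grid[i])
    tg = (g - g_grid[j]) / (g_grid[j + 1] - g_grid[j])
    return ((1 - tp) * (1 - tg) * table[i][j] + tp * (1 - tg) * table[i + 1][j]
            + (1 - tp) * tg * table[i][j + 1] + tp * tg * table[i + 1][j + 1])

# Toy grid spanning the quoted parameter range (illustrative CHF values, MW/m^2):
p_grid = [0.11, 10.0, 20.0]        # MPa
g_grid = [5.0, 2500.0, 5000.0]     # kg/(m^2 s)
chf = [[7.0, 4.0, 3.0],
       [5.0, 3.0, 2.0],
       [3.0, 2.0, 1.5]]
```

    Verification then amounts to comparing such interpolated values, with the code's correction factors applied, against the experimental data bank.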

  9. Development and Verification of CFD Models for Modeling Wind Conditions on Forested Wind Turbine Sites

    DEFF Research Database (Denmark)

    Andersen, Morten Q.; Mortensen, Kasper; Nielsen, Daniel E.

    2009-01-01

    This paper describes a proposed CFD model to simulate the wind conditions on a forested site. The model introduces porous subdomains representing the forests in the terrain. Obtained simulation values are compared to field measurements in- and outside a forest. Initial results are very promising...

  10. Unsatisfying forecast of a Mediterranean cyclone: a verification study employing state-of-the-art techniques

    Science.gov (United States)

    Casaioli, M.; Mariani, S.; Accadia, C.; Tartaglione, N.; Speranza, A.; Lavagnini, A.; Bolliger, M.

    2006-09-01

    On 16-17 November 2000, a relatively intense precipitation event over north-western Italy was heavily underestimated, mainly due to a shifting error, by three operational 10-km limited area models (LAMs) which differ in basic equations, domain size, and parameterisation schemes. The aim of the work is to investigate possible common error sources independent of the single model, in particular the effect of initialisation. Thus, the complex evolution over the western Mediterranean Sea of the cyclone responsible for the event was investigated. Several objective and subjective verification techniques have been employed to check one of the LAMs' forecasts against the available observations (precipitation from rain gauges and retrieved from ground-based radar, and satellite-retrieved atmospheric humidity patterns). Although a definitive conclusion is not reached, the results indicate that high sensitivity to the initial conditions and the inadequacy of the observational network over the southern Mediterranean area can play a major role in producing the forecast shifting error on the target area.

  11. Unsatisfying forecast of a Mediterranean cyclone: a verification study employing state-of-the-art techniques

    Directory of Open Access Journals (Sweden)

    M. Casaioli

    2006-01-01

    Full Text Available On 16–17 November 2000, a relatively intense precipitation event over north-western Italy was heavily underestimated, mainly due to a shifting error, by three operational 10-km limited area models (LAMs) which differ in basic equations, domain size, and parameterisation schemes. The aim of the work is to investigate possible common error sources independent of the single model, in particular the effect of initialisation. Thus, the complex evolution over the western Mediterranean Sea of the cyclone responsible for the event was investigated. Several objective and subjective verification techniques have been employed to check one of the LAMs' forecasts against the available observations (precipitation from rain gauges and retrieved from ground-based radar, and satellite-retrieved atmospheric humidity patterns). Although a definitive conclusion is not reached, the results indicate that high sensitivity to the initial conditions and the inadequacy of the observational network over the southern Mediterranean area can play a major role in producing the forecast shifting error on the target area.

  12. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    Science.gov (United States)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  13. Verification and validation of models: far-field modelling of radionuclide migration

    International Nuclear Information System (INIS)

    Porter, J.D.; Herbert, A.W.; Clarke, D.S.; Roe, P.; Vassilic Melling, D.; Einfeldt, B.; Mackay, R.; Glendinning, R.

    1992-01-01

    The aim of this project was to improve the capability, efficiency and realism of the NAMMU and NAPSAC codes, which simulate groundwater flow and solute transport. Using NAMMU, various solution methods for nonlinear problems were investigated. The Broyden method gave a useful reduction in computing time and appeared robust; the relative saving obtained with this method increased with problem size, and this was also the case when parameter stepping was used. The existing empirical sorption models in NAMMU were generalized and a ternary heterogeneous ion exchange model was added. These modifications were tested and gave excellent results. The desirability of coupling NAMMU to an existing geochemical speciation code was assessed

  14. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    sources. Combining these three source types it is possible to model huge machinery in an easy and visually clear way. Traditionally, room acoustic simulations have been aimed at auditorium acoustics, where the aim has been to model the room acoustic measuring setup consisting of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with the ones measured in real rooms. However, when simulating the acoustic environment in industrial rooms, the sound sources are often far from point-like, as they can be distributed over a large space...

  15. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  16. Modeling of containment response for Krsko NPP Full Scope Simulator verification

    International Nuclear Information System (INIS)

    Kljenak, I.; Skerlavaj, A.

    2000-01-01

    Containment responses during the first 10000 s of Anticipated Transient Without Scram and Small Break Loss-of-Coolant Accident scenarios in the Krsko two-loop Westinghouse pressurized water reactor nuclear power plant were simulated with the CONTAIN computer code. Sources of coolant were obtained from simulations with the RELAP5 code. The simulations were carried out so that the results could be used for the verification of the Krsko Full Scope Simulator. (author)

  17. Mathematical modeling of a fluidized bed rice husk gasifier: Part 3 -- Model verification

    Energy Technology Data Exchange (ETDEWEB)

    Mansaray, K.G.; Ghaly, A.E.; Al-Taweel, A.M.; Ugursal, V.I.; Hamdullahpur, F.

    2000-04-01

    The validity of the two-compartment model developed for fluidized bed gasification of biomass was tested using experimental data obtained from a dual-distributor-type fluidized bed gasifier. The fluidized bed was operated on rice husks at various bed heights (19.5, 25.5, and 31.5 cm), fluidization velocities (0.22, 0.28, and 0.33 m/s), and equivalence ratios (0.25, 0.30, and 0.35). The model gave reasonable predictions of the core, annulus, and exit temperatures as well as the mole fractions of the combustible gas components and product gas higher heating value, except for the overall carbon conversion, which was overestimated. This could be attributed to uncertainties in the sampling procedure.
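
    The product-gas higher heating value such models predict is a mole-fraction-weighted sum over the combustible species. The heating values below are standard approximate volumetric HHVs at normal conditions, and the composition is a hypothetical producer-gas example, not data from the study.

```python
# Approximate volumetric higher heating values, MJ per normal cubic metre.
HHV = {"H2": 12.7, "CO": 12.6, "CH4": 39.8}

def gas_hhv(mole_fractions):
    """HHV of a gas mixture from mole (= volume) fractions; inerts contribute zero."""
    return sum(HHV.get(species, 0.0) * x for species, x in mole_fractions.items())

# Hypothetical rice-husk producer gas composition:
hhv = gas_hhv({"H2": 0.05, "CO": 0.15, "CH4": 0.03, "CO2": 0.15, "N2": 0.62})
```

    Comparing such computed HHVs against measured gas compositions is one of the model-verification checks reported above.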

  18. Mathematical modeling of a fluidized bed rice husk gasifier: Part 3 - Model verification

    Energy Technology Data Exchange (ETDEWEB)

    Mansaray, K.G.; Ghaly, A.E.; Al-Taweel, A.M.; Ugursal, V.I.; Hamdullahpur, F.

    2000-03-01

    The validity of the two-compartment model developed for fluidized bed gasification of biomass was tested using experimental data obtained from a dual-distributor-type fluidized bed gasifier. The fluidized bed was operated on rice husks at various bed heights (19.5, 25.5, and 31.5 cm), fluidization velocities (0.22, 0.28, and 0.33 m/s), and equivalence ratios (0.25, 0.30, and 0.35). The model gave reasonable predictions of the core, annulus, and exit temperatures as well as the mole fractions of the combustible gas components and product gas higher heating value, except for the overall carbon conversion, which was overestimated. This could be attributed to uncertainties in the sampling procedure. (Author)

  19. Modeling radon entry into houses with basements: Model description and verification

    International Nuclear Information System (INIS)

    Revzan, K.L.; Fisk, W.J.; Gadgil, A.J.

    1991-01-01

    We model radon entry into basements using a previously developed three-dimensional steady-state finite difference model that has been modified in the following ways: first, cylindrical coordinates are used to take advantage of the symmetry of the problem in the horizontal plane; second, the configuration of the basement has been made more realistic by incorporating the concrete footer; third, a quadratic relationship between the pressure and the flow in the L-shaped gap between slab, footer, and wall has been employed; fourth, the natural convection of the soil gas which follows from the heating of the basement in winter has been taken into account. The temperature field in the soil is determined from the equation of energy conservation, using the basement, surface, and deep-soil temperatures as boundary conditions. The pressure field is determined from Darcy's law and the equation of mass conservation (continuity), assuming that there is no flow across any boundary except the soil surface (atmospheric pressure) and the opening in the basement shell (fixed pressure). After the pressure and temperature fields have been obtained, the velocity field is found from Darcy's law. Finally, the radon concentration field is found from the equation of mass transport. The convective radon entry rate through the opening or openings is then calculated. In this paper we describe the modified model, compare the predicted radon entry rates with and without the consideration of thermal convection, and compare the predicted rates with those determined from data from 7 houses in the Spokane River valley of Washington and Idaho. Although the predicted rate is much lower than the mean of the rates determined from measurements, errors in the measurement of soil permeability and variations in the permeability of the area immediately under the basement slab, which has a significant influence on the pressure field, can account for the range of entry rates inferred from the data. 25 refs., 8 figs
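
    The Darcy step that turns the pressure field into a soil-gas velocity field can be sketched as follows; the permeability, viscosity and pressure gradient below are typical orders of magnitude, not values from the study.

```python
def darcy_velocity(k, mu, grad_p):
    """Darcy flux v = -(k/mu) * grad(p); k in m^2, mu in Pa*s, grad_p in Pa/m."""
    return tuple(-(k / mu) * g for g in grad_p)

# Typical orders of magnitude: soil permeability ~1e-11 m^2, air viscosity
# ~1.8e-5 Pa*s, a few Pa of basement depressurization over ~1 m of soil depth:
v = darcy_velocity(1e-11, 1.8e-5, (0.0, 5.0))
```

    In the full model this flux is evaluated on every cell of the cylindrical grid and then drives the radon mass-transport equation.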

  20. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin

    2012-01-01

    In this study, we propose a continuous-time Markov chain model for the availability of n-node clusters in a Distributed Rendering System. The model is infinite; we formalized it and, based on the model, implemented software that builds the corresponding model automatically in the PRISM language. With the tool, whe...
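
    The automatic-modelling step can be sketched as emitting a PRISM CTMC for a cluster with per-node failure and single-server repair; the rates below are hypothetical, and a real generator would also emit reward structures and CSL availability properties for the checker.

```python
def prism_cluster_model(n, fail_rate=0.001, repair_rate=0.1):
    """Emit a PRISM CTMC: any of the `up` nodes may fail (rate up*lambda),
    and failed nodes are repaired one at a time (rate mu)."""
    return "\n".join([
        "ctmc",
        f"const double lambda = {fail_rate};",
        f"const double mu = {repair_rate};",
        "module cluster",
        f"  up : [0..{n}] init {n};",
        "  [fail]   up > 0 -> up * lambda : (up' = up - 1);",
        f"  [repair] up < {n} -> mu : (up' = up + 1);",
        "endmodule",
    ])
```

    Feeding the emitted text to PRISM then allows steady-state availability queries such as the long-run probability that at least one node is up.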

  1. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays the procedure on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error would lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri nets when applied to an Emergency Operating Computerized Procedure. A program that converts a Computerized Procedure (CP) to STPN has also been developed. The formal verification and validation methods of CP with STPN increase the safety of a nuclear power plant and provide the digital quality assurance means that are needed as the role and function of the CPS increase.

  2. The USP Performance Verification Test, Part II: collaborative study of USP's Lot P Prednisone Tablets.

    Science.gov (United States)

    Glasgow, Maria; Dressman, Shawn; Brown, William; Foster, Thomas; Schuber, Stefan; Manning, Ronald G; Wahab, Samir Z; Williams, Roger L; Hauck, Walter W

    2008-05-01

    Periodic performance verification testing (PVT) is used by laboratories to assess and demonstrate proficiency and for other purposes as well. For dissolution, the PVT is specified in the US Pharmacopeia General Chapter Dissolution under the title Apparatus Suitability Test. For Apparatus 1 and 2, USP provides two reference standard tablets for this purpose. For each new lot of these reference standards, USP conducts a collaborative study. For new USP Lot P Prednisone Tablets, 28 collaborating laboratories provided data. The study was conducted with three sets of tablets: Lot O open label, Lot O blinded, and Lot P blinded. The blinded Lot O data were used for apparatus suitability testing. Acceptance limits were determined after dropping data due to failure of apparatus suitability, identification of data as unusual on control charts, or protocol violations. Results yielded acceptance criteria of (47, 82) for Apparatus 1 and (37, 70) for Apparatus 2. Results generally were similar for Lot P compared to results from Lot O except that the average percent dissolved for Lot P is greater than for Lot O with Apparatus 2.

  3. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. 
This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use

  4. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is then verified by feeding the HP to KeYmaera. The advantage of the approach is that it models CPS intuitively and verifies their safety rigorously while avoiding state-space explosion

  5. Tracer experiment data sets for the verification of local and meso-scale atmospheric dispersion models including topographic effects

    International Nuclear Information System (INIS)

    Sartori, E.; Schuler, W.

    1992-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. More specifically, this article proposes a scheme for acquiring, storing and distributing atmospheric tracer experiment (ATE) data required for the verification of atmospheric dispersion models, especially the most advanced ones, which include topographic effects and are specific to the local and meso-scale. These well-documented data sets will form a valuable complement to the set of atmospheric dispersion computer codes distributed internationally. Modellers will be able to gain confidence in the predictive power of their models or to verify their modelling skills. (au)

  6. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
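
    One widely used family of code-verification benchmarks is the method of manufactured solutions: pick an exact solution, derive the source term it implies, and confirm that the solver reproduces the scheme's formal order of accuracy. A minimal sketch for a 1D Poisson problem (illustrative, not from the paper):

    ```python
    import numpy as np

    def solve_poisson(f, n):
        """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 by central differences."""
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)        # interior grid points
        A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
        return x, np.linalg.solve(A, f(x))

    # Manufactured solution u(x) = sin(pi x)  =>  source term f = pi^2 sin(pi x)
    u_exact = lambda x: np.sin(np.pi * x)
    source = lambda x: np.pi**2 * np.sin(np.pi * x)

    errors = []
    for n in (19, 39, 79):                    # h = 1/20, 1/40, 1/80
        x, u = solve_poisson(source, n)
        errors.append(np.max(np.abs(u - u_exact(x))))

    # Observed order of accuracy between successive halvings of h (expect ~2)
    orders = [np.log2(errors[i] / errors[i + 1]) for i in range(2)]
    print(orders)
    ```

    If the observed order falls short of the scheme's formal order, that flags a coding error; this is the sense in which a manufactured solution serves as a code-verification benchmark rather than a validation test.
    
    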

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    Directory of Open Access Journals (Sweden)

    Coomarasamy Aravinthan

    2004-05-01

    Full Text Available Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 had verification delayed > 24 hrs after testing. The effect of delay in verification of diagnosis on estimates of accuracy was evaluated using meta-regression with the diagnostic odds ratio (dOR) as the accuracy measure. This analysis was adjusted for study quality and type of test (miniature endometrial biopsy or endometrial ultrasound). Results Compared to studies with immediate verification of diagnosis (dOR 67.2, 95% CI 21.7–208.8), those with delayed verification (dOR 16.2, 95% CI 8.6–30.5) underestimated the diagnostic accuracy by 74% (95% CI 7%–99%; P value = 0.048). Conclusion Among studies of miniature endometrial biopsy and endometrial ultrasound, diagnostic accuracy is considerably underestimated if there is a delay in histological verification of diagnosis.
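
    For readers unfamiliar with the accuracy measure, the diagnostic odds ratio is computed from a 2x2 table of test results against the reference standard. A small sketch (the 2x2 counts below are hypothetical; only the two summary dORs come from the abstract, and the abstract's 74% figure is from adjusted meta-regression, so the raw ratio is only a rough check):

    ```python
    def diagnostic_odds_ratio(tp, fp, fn, tn):
        """dOR = (TP/FN) / (FP/TN): odds of a positive test among the diseased
        divided by the odds of a positive test among the non-diseased."""
        return (tp / fn) / (fp / tn)

    # Hypothetical 2x2 table for a single study
    print(diagnostic_odds_ratio(tp=45, fp=10, fn=5, tn=90))  # -> 81.0

    # Unadjusted ratio of the two pooled dORs reported in the abstract
    print(16.2 / 67.2)   # ~0.24, i.e. roughly a four-fold underestimate
    ```
    
    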

  9. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking for a broad overview of the spectrum of formal verification techniques, as well as of approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  10. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met: - An auditable quality assurance process complying with international development standards shall be established and maintained; - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities; defining a validated domain (a domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use); - Sufficient documentation shall be available; - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available; - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - the equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  11. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the ...

  12. Quality assurance of sterilized products: verification of a model relating frequency of contaminated items and increasing radiation dose

    International Nuclear Information System (INIS)

    Khan, A.A.; Tallentire, A.; Dwyer, J.

    1977-01-01

    Values of the γ-radiation resistance parameters (k and n of the 'multi-hit' expression) for Bacillus pumilus E601 spores and Serratia marcescens cells have been determined and the constancy of values for a given test condition demonstrated. These organisms, differing by a factor of about 50 in k value, have been included in a laboratory test system for use in verification of a model describing the dependence of the proportion of contaminated items in a population of items on radiation dose. The proportions of contaminated units of the test system at various γ-radiation doses have been measured for different initial numbers and types of organisms present in units either singly or together. Using the model, the probabilities of contaminated units for corresponding sets of conditions have been evaluated together with associated variances. Measured proportions and predicted probabilities agree well, showing that the model holds in a laboratory contrived situation. (author)
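
    The logic of such a model can be sketched as follows: a 'multi-hit' survival curve for the organism, combined with a Poisson count of survivors per item, gives the expected proportion of contaminated items at each dose. All parameter values below are illustrative, not the measured ones for B. pumilus E601 or S. marcescens:

    ```python
    import math

    def multi_hit_survival(dose, k, n):
        """'Multi-hit' survival fraction: S(D) = 1 - (1 - exp(-k * D))**n."""
        return 1.0 - (1.0 - math.exp(-k * dose)) ** n

    def p_contaminated(dose, n0, k, n):
        """Probability that an item carries at least one surviving organism,
        assuming the survivor count is Poisson with mean n0 * S(D)."""
        return 1.0 - math.exp(-n0 * multi_hit_survival(dose, k, n))

    # Illustrative parameters only: inactivation rate k per kGy, hit number n,
    # and initial count n0 of organisms per item
    k, n, n0 = 0.5, 2.0, 1.0e3
    for dose in (0.0, 10.0, 25.0):
        print(f"{dose:5.1f} kGy -> P(contaminated) = {p_contaminated(dose, n0, k, n):.3g}")
    ```

    Comparing such predicted probabilities with measured proportions of contaminated units at each dose is exactly the verification exercise the abstract describes.
    
    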

  13. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    OpenAIRE

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL-formulae to Spin format, and counterexamples in terms of automata. Interactive verification gives the possibility to decrease verification time and increase the maxi...
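
    At its core, the kind of verification Spin automates is exhaustive exploration of a model's state space against a property. The toy stand-in below (our illustration, not the authors' tool, and plain safety reachability rather than full LTL) runs a breadth-first search to check mutual exclusion in a naive turn-based protocol:

    ```python
    from collections import deque

    def replace(t, i, v):
        """Return tuple t with element i replaced by v."""
        return t[:i] + (v,) + t[i + 1:]

    # Toy transition system: two processes looping idle -> trying -> critical,
    # guarded by a shared turn variable. State = (per-process locations, turn).
    def successors(state):
        pcs, turn = state
        for i in (0, 1):
            if pcs[i] == "idle":
                yield (replace(pcs, i, "trying"), turn)
            elif pcs[i] == "trying" and turn == i:
                yield (replace(pcs, i, "critical"), turn)
            elif pcs[i] == "critical":
                yield (replace(pcs, i, "idle"), 1 - i)   # pass the turn on exit

    def check_safety(init, bad):
        """BFS over the reachable state space; returns a bad state or None."""
        seen, queue = {init}, deque([init])
        while queue:
            s = queue.popleft()
            if bad(s):
                return s
            for nxt in successors(s):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return None

    mutex_violation = lambda s: s[0] == ("critical", "critical")
    init = (("idle", "idle"), 0)
    print(check_safety(init, mutex_violation))   # None -> mutual exclusion holds
    ```

    Spin does the same exploration over Promela models, but with partial-order reduction, bitstate hashing, and Büchi automata for full LTL, which is what makes it scale to realistic programs.
    
    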

  14. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC (North American Electric Reliability Corporation) MOD reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  15. Towards a phase field model of the microstructural evolution of duplex steel with experimental verification

    DEFF Research Database (Denmark)

    Poulsen, Stefan Othmar; Voorhees, P.W.; Lauridsen, Erik Mejdal

    2012-01-01

    A phase field model to study the microstructural evolution of a polycrystalline dual-phase material with conserved phase fraction has been implemented, and 2D simulations have been performed. For 2D simulations, the model predicts the cubic growth well-known for diffusion-controlled systems. Some...

  16. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

    the verification and "lessons learned" from the semantic and technical issues we discovered as we implemented the approach. Subject terms: any programming language in use at CAA for modeling or other data analysis applications, including R, Python, Scheme, Common Lisp, Julia, and Mathematica

  17. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  18. Automatic verification of SSD and generation of respiratory signal with lasers in radiotherapy: a preliminary study.

    Science.gov (United States)

    Prabhakar, Ramachandran

    2012-01-01

    Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for determining the respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central-axis transverse computed tomography (CT) cut is imported into the software. Another important aspect in radiotherapy is the generation of the respiratory signal. The changes in the lasers' separation as the patient breathes are converted to produce a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head, and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD observed with the optical distance indicator (ODI) on the machine and the SSD measured by the SSDLas software were found to agree within the tolerance limit. The methodology described for generating the respiratory signals will be useful for the treatment of mobile tumors such as those of the lung, liver, breast and pancreas. The technique described for determining the SSD and generating the respiratory signals using lasers is cost-effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. Predicted and actual indoor environmental quality: Verification of occupants' behaviour models in residential buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm; Fabi, Valentina; Corgnati, Stefano P.

    2016-01-01

    Occupants' interactions with the building envelope and building systems can have a large impact on the indoor environment and energy consumption in a building. As a consequence, any realistic forecast of building performance must include realistic models of the occupants' interactions with the building controls (windows, thermostats, solar shading etc.). During the last decade, studies about stochastic models of occupants' behaviour in relation to control of the indoor environment have been published. Often the overall aim of these models is to enable more reliable predictions of building performance using building energy performance simulations (BEPS). However, the validity of these models has only been sparsely tested. In this paper, stochastic models of occupants' behaviour from literature were tested against measurements in five apartments. In a monitoring campaign, measurements of indoor...
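
    Stochastic occupant-behaviour models of the kind tested here are commonly logistic regressions giving the probability of an action (e.g., opening a window) per time step as a function of indoor drivers. A minimal sketch with invented coefficients, purely to illustrate the model form:

    ```python
    import math
    import random

    def p_open_window(t_in, co2_ppm, a=-10.0, b=0.25, c=0.002):
        """Illustrative logistic model: per-time-step probability of opening a
        window, driven by indoor temperature [degC] and CO2 concentration [ppm].
        Coefficients a, b, c are invented, not fitted values from the paper."""
        z = a + b * t_in + c * co2_ppm
        return 1.0 / (1.0 + math.exp(-z))

    def simulate(hours, t_in, co2_ppm, seed=1):
        """One Bernoulli draw per time step -> a synthetic opening sequence."""
        rng = random.Random(seed)
        return [rng.random() < p_open_window(t_in, co2_ppm) for _ in range(hours)]

    print(p_open_window(t_in=22.0, co2_ppm=600))    # mild conditions
    print(p_open_window(t_in=28.0, co2_ppm=1500))   # warm, stuffy -> higher
    ```

    In a BEPS coupling, such Bernoulli draws replace fixed schedules, so repeated simulations yield a distribution of performance predictions rather than a single trace, which is what the paper's verification against monitored apartments probes.
    
    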

  20. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    OpenAIRE

    Clark, T Justin; ter Riet, Gerben; Coomarasamy, Aravinthan; Khan, Khalid S

    2004-01-01

    Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 ha...

  1. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    Customer: NASA's Launch Services Program (LSP), Ground Systems Development and Operations (GSDO), and Space Launch System (SLS) programs. NASA's LSP, GSDO, SLS and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). For example, to determine if they need to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as a guide for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67 kilometer Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the AMU high-resolution WRF Environmental Modeling System (EMS) model (Watson 2013) in real-time. The tasking group agreed to this proposal; therefore the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The model was set up with a triple-nested grid configuration over KSC/CCAFS based on previous AMU work (Watson 2013). The outer domain (D01) has 12-kilometer grid spacing, the middle domain (D02) has 4-kilometer grid spacing, and the inner domain (D03) has 1.33-kilometer grid spacing. The model runs a 12-hour forecast every hour, D01 and D02 domain outputs are available once an hour and D03 is every 15 minutes during the forecast period. The AMU assessed the WRF-EMS 1
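
    The triple nest described above follows WRF's standard 3:1 parent-to-child grid-spacing ratio. A trivial check of the configuration arithmetic (the helper name is ours, not part of WRF):

    ```python
    # Grid spacings from the record, in km (D03's 1.33 km is 4/3 km exactly)
    domains = {"D01": 12.0, "D02": 4.0, "D03": 4.0 / 3.0}

    def nest_ratios(spacings):
        """Parent/child grid-spacing ratio for each consecutive pair of nests."""
        keys = sorted(spacings)
        return [spacings[a] / spacings[b] for a, b in zip(keys, keys[1:])]

    print(nest_ratios(domains))            # both ratios are 3:1

    # D03 writes output every 15 minutes across the 12-hour forecast (incl. t=0)
    n_outputs = 12 * 60 // 15 + 1
    print(n_outputs)
    ```
    
    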

  2. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip-theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
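
    The Craig-Bampton reduction mentioned above condenses a substructure to its interface degrees of freedom plus a set of fixed-interface modes. A small numerical sketch on a toy 4-DOF spring-mass chain (our illustration, not SubDyn's implementation); when every fixed-interface mode is kept, as here, the reduction is an exact change of basis and reproduces the full eigenvalues:

    ```python
    import numpy as np

    # Toy 4-DOF chain: DOF 0 is the interface (boundary), DOFs 1-3 are interior
    K = np.array([[ 2.0, -1.0,  0.0,  0.0],
                  [-1.0,  2.0, -1.0,  0.0],
                  [ 0.0, -1.0,  2.0, -1.0],
                  [ 0.0,  0.0, -1.0,  1.0]])
    M = np.eye(4)
    b, i = [0], [1, 2, 3]

    Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]

    # Constraint modes: static interior response to a unit interface displacement
    phi_c = -np.linalg.solve(Kii, Kib)

    # Fixed-interface normal modes (Mii = I here, so a plain symmetric eig); all kept
    _, phi_n = np.linalg.eigh(Kii)

    # Craig-Bampton transformation: physical DOFs = T @ [interface; modal] coords
    T = np.block([[np.eye(1), np.zeros((1, 3))],
                  [phi_c,     phi_n]])
    K_red, M_red = T.T @ K @ T, T.T @ M @ T

    w2_full = np.sort(np.linalg.eigvalsh(K))                      # M = I
    w2_red = np.sort(np.linalg.eigvals(np.linalg.solve(M_red, K_red)).real)
    print(np.allclose(w2_full, w2_red))
    ```

    In practice only the lowest few fixed-interface modes are retained, which shrinks the model while keeping the low-frequency dynamics that matter for coupling to hydrodynamic loads and tower dynamics.
    
    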

  3. Modelling and Analysis of Smart Grid: A Stochastic Model Checking Case Study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Zhu, Huibiao; Nielson, Hanne Riis

    2012-01-01

    Cyber-physical systems integrate information and communication technology functions to the physical elements of a system for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. In this context, an important issue is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ the stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  4. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ the stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.
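
    Quantitative verification of the kind PRISM performs ultimately reduces to solving the underlying Markov model for measures such as long-run averages. The minimal stand-in below (states, transition probabilities, and power figures are all invented for illustration) computes the stationary distribution of a 3-state component model by power iteration:

    ```python
    import numpy as np

    # Hypothetical 3-state discrete-time Markov chain for a smart-meter component:
    # state 0 = idle, 1 = transmitting, 2 = sleep. Rows sum to 1.
    P = np.array([[0.7, 0.2, 0.1],
                  [0.6, 0.3, 0.1],
                  [0.5, 0.0, 0.5]])

    def stationary(P, iters=200):
        """Power iteration for the stationary distribution pi satisfying pi = pi P."""
        pi = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(iters):
            pi = pi @ P
        return pi

    pi = stationary(P)
    power_w = np.array([1.0, 5.0, 0.1])    # hypothetical per-state draw [W]
    print(pi, float(pi @ power_w))         # long-run average power
    ```

    PRISM expresses the same kind of query as a reward property over a formally specified model, and additionally supports CTMCs, MDPs, and probabilistic temporal logics such as PCTL and CSL.
    
    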

  5. Simulating the Water Use of Thermoelectric Power Plants in the United States: Model Development and Verification

    Science.gov (United States)

    Betrie, G.; Yan, E.; Clark, C.

    2016-12-01

    Thermoelectric power plants use more freshwater than any sector except agriculture. However, there is a scarcity of information characterizing the freshwater use of these plants in the United States, which can be attributed to the lack of the models and data required to conduct analysis and gain insights. Competition for freshwater among sectors will increase in the future as the amount of freshwater becomes limited due to climate change and population growth. A model that makes use of less data is urgently needed to conduct analysis and identify adaptation strategies. The objectives of this study are to develop a model and simulate the water use of thermoelectric power plants in the United States. The developed model has heat-balance, climate, cooling-system, and optimization modules. It computes the amount of heat rejected to the environment, estimates the quantity of heat exchanged with the environment through latent and sensible heat, and computes the amount of water required per unit generation of electricity. To verify the model, we simulated a total of 876 fossil-fired, nuclear, and gas-turbine power plants with different cooling systems using 2010-2014 data obtained from the Energy Information Administration. The cooling systems include once-through with cooling pond, once-through without cooling pond, recirculating with induced-draft towers, and recirculating with natural-draft towers. The results show that the model reproduced the observed water use per unit generation of electricity for most of the power plants. It is also noticed that the model slightly overestimates the water use during the summer period, when the input water temperatures are higher. We are investigating the possible reasons for the overestimation and will address them in future work. The model could be used individually or coupled to regional models to analyze various adaptation strategies and improve the water use efficiency of thermoelectric power plants.
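
    The heat-balance core of such a model can be sketched in a few lines: the heat rejected to cooling water follows from net plant efficiency and flue losses, and the evaporative water consumption follows from the latent heat of vaporization. All parameter values below are illustrative, not the study's calibrated inputs:

    ```python
    def water_use_per_mwh(eta, f_flue=0.1, f_latent=0.8,
                          h_fg=2.45e6, rho=997.0):
        """Evaporative water consumption [m^3 per MWh electric] for a
        recirculating tower, from a simple plant heat balance.

        eta      : net plant efficiency (electric out / fuel heat in)
        f_flue   : fraction of fuel heat lost via the flue gas and misc. losses
        f_latent : fraction of rejected heat removed by evaporation (latent)
        h_fg     : latent heat of vaporization [J/kg]
        rho      : water density [kg/m^3]
        """
        q_fuel = 3.6e9 / eta                        # J of fuel heat per MWh(e)
        q_rejected = q_fuel * (1 - f_flue) - 3.6e9  # heat to the cooling water
        m_evap = q_rejected * f_latent / h_fg       # kg of water evaporated
        return m_evap / rho                         # m^3 per MWh

    # Illustrative: 35%-efficient steam unit vs. 50%-efficient combined cycle
    print(water_use_per_mwh(0.35))
    print(water_use_per_mwh(0.50))
    ```

    Once-through systems withdraw far more water but consume less of it; in this sketch that corresponds to a much smaller f_latent, with the balance returned to the source as sensible heat.
    
    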

  6. Simulation studies for the in-vivo dose verification of particle therapy

    International Nuclear Information System (INIS)

    Rohling, Heide

    2015-01-01

    An increasing number of cancer patients is treated with proton beams or other light ion beams, which allow dose to be delivered precisely to the tumor. However, the depth dose distribution of these particles, which enables this precision, is sensitive to deviations from the treatment plan, e.g. anatomical changes. Thus, to assure the quality of the treatment, a non-invasive in-vivo dose verification is highly desired. This monitoring of particle therapy relies on the detection of secondary radiation which is produced by interactions between the beam particles and the nuclei of the patient's tissue. Up to now, the only clinically applied method for in-vivo dosimetry is Positron Emission Tomography, which makes use of the β+-activity produced during the irradiation (PT-PET). Since the applied dose cannot be directly deduced from a PT-PET measurement, the simulated distribution of β+-emitting nuclei is used as a basis for the analysis of the measured PT-PET data. Therefore, reliable modeling of the production rates and the spatial distribution of the β+-emitters is required. PT-PET applied during instead of after the treatment is referred to as in-beam PET. A challenge concerning in-beam PET is the design of the PET camera, because a standard full-ring scanner is not feasible. Thus, for in-beam PET and prompt gamma imaging (PGI), dedicated detection systems and, moreover, profound knowledge about the corresponding radiation fields are required. Using various simulation codes, this thesis contributes to the modelling of the β+-emitters and photons produced during particle irradiation, as well as to the evaluation and optimization of hardware for both techniques. Concerning the modeling of the production of the relevant β+-emitters, the abilities of the Monte Carlo simulation code PHITS and of the deterministic, one-dimensional code HIBRAC were assessed. HIBRAC was substantially extended to enable the modeling of the depth-dependent yields of specific nuclides. For proton
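
    The β+-emitter inventory behind PT-PET follows simple production-decay kinetics: during irradiation the activity approaches saturation, and after beam-off it decays exponentially. A sketch with an invented production rate (the half-life is the nominal value for 11C, one of the dominant emitters produced in tissue):

    ```python
    import math

    def activity(t, rate, half_life):
        """Activity [Bq] of a nuclide produced at a constant rate [nuclei/s]
        during irradiation: A(t) = rate * (1 - exp(-lambda * t))."""
        lam = math.log(2) / half_life
        return rate * (1.0 - math.exp(-lam * t))

    def activity_after(t_irr, t_decay, rate, half_life):
        """Activity once the beam is off: exponential decay of the buildup."""
        lam = math.log(2) / half_life
        return activity(t_irr, rate, half_life) * math.exp(-lam * t_decay)

    T_HALF_C11 = 20.4 * 60       # s, roughly the 11C half-life
    R = 1.0e5                    # nuclei/s, illustrative production rate

    print(activity(120.0, R, T_HALF_C11))              # after 2 min in-beam
    print(activity_after(120.0, 600.0, R, T_HALF_C11)) # 10 min after beam-off
    ```

    The contrast between in-beam and post-treatment activity levels is one reason in-beam PET is attractive despite its camera-geometry constraints.
    
    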

  7. Verification of simulation model with COBRA-IIIP code by comparison with experimental results

    International Nuclear Information System (INIS)

    Silva Galetti, M.R. da; Pontedeiro, A.C.; Oliveira Barroso, A.C. de

    1985-01-01

    An evaluation of the COBRA-IIIP/MIT code (for thermal-hydraulic analysis by subchannels) is presented, comparing its results with experimental data obtained in steady-state and transient regimes. A study was carried out to calculate the spatial and temporal critical heat flux. A sensitivity study of the simulation model with respect to turbulent mixing and the number of axial intervals is also presented. (M.C.K.) [pt

  8. Outcomes of the JNT 1955 Phase I Viability Study of Gamma Emission Tomography for Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, Holly R.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2017-05-17

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups, and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a “universal” GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.

  9. Field verification of linear and nonlinear hybrid wave models for offshore tower response prediction

    Energy Technology Data Exchange (ETDEWEB)

    Couch, A.T. [Hudson Engineering, Houston, TX (United States). Offshore Structural Div.; Conte, J.P. [Rice Univ., Houston, TX (United States). Dept. of Civil Engineering

    1996-12-31

Accuracy of the prediction of the dynamic response of deepwater fixed offshore platforms to irregular sea waves depends very much on the theory used to determine water kinematics from the mudline to the free surface. A common industry practice consists of using linear wave theory, which assumes infinitesimal wave steepness, in conjunction with empirical wave stretching techniques to provide a more realistic representation of near-surface kinematics. The current velocity field is then added to the wave-induced fluid velocity field, and the wave-and-current forces acting on the structure are computed via Morison's equation. The first objective of this study is to compare the predicted responses of Cognac, a deepwater fixed platform, obtained from various empirically stretched linear wave models with the response of Cognac predicted based on the Hybrid Wave Model. The latter is a recently developed higher-order, and therefore more accurate, wave model which satisfies, up to the second order in wave steepness, the local mass conservation and the free surface boundary conditions up to the free surface. The second objective of this study consists of comparing the various analytical response predictions with the measured response of the Cognac platform. Availability of a set of oceanographic and structural vibration data for Cognac provides a unique opportunity to evaluate the prediction ability of traditional analytical models used in designing such structures. The results of this study indicate that (1) the use of the Hybrid Wave Model provides a predicted platform response which is in closer agreement with the measured response than the predictions based on the various stretched linear wave models; and (2) the Wheeler stretching technique produces platform response results which are more accurate than those obtained by using the other stretching schemes considered here.

  10. Migration of 90Sr, 137Cs and Pu in soils. Verification of a computer model on the behaviour of these radiocontaminants in soils of Western Europe

    International Nuclear Information System (INIS)

    Frissel, M.J.; Poelstra, P.; Klugt, N. van der.

    1980-01-01

The main emphasis in 1979 was on the 239+240Pu model for simulating translocations in soil. The verification was hampered because data for 239Pu were available from only two locations. A comparison between the observed and predicted Pu distributions, however, indicated the possibility of using the available simulation approach for 239+240Pu. (Auth.)

  11. DISCRETE DYNAMIC MODEL OF BEVEL GEAR – VERIFICATION OF THE PROGRAM SOURCE CODE FOR NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-06-01

Full Text Available This article presents a new physical and mathematical model of a bevel gear for studying the influence of design parameters and operating factors on the dynamic state of the gear transmission. It discusses the process of verifying the proper operation of the author's calculation program used to determine solutions of the dynamic bevel gear model, and presents the block diagram of the computing algorithm that was used to create the program for numerical simulation. The program source code is written in MATLAB, an interactive environment for scientific and engineering calculations.

  12. Development of a GoldSim Biosphere Model, Evaluation, and Its Verification

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Hwang, Yong Soo

    2009-12-01

For the purpose of evaluating the dose rate to an individual due to long-term release of nuclides from an HLW or pyroprocessing repository, a biosphere assessment model and an implementing program based on the BIOMASS methodology have been developed using GoldSim, a general model-development tool. To show its practicability and usability, as well as the sensitivity of the annual exposure to parametric and scenario variations, probabilistic calculations are made and investigated. For cases in which the exposure groups and associated GBIs are changed and selected input values are varied, all of which seem important for the biosphere evaluation, the dose rate per nuclide release rate is probabilistically calculated and analyzed. A series of comparison studies with JAEA, Japan has also been carried out to verify the model

  13. A fission gas release model for MOX fuel and its verification

    International Nuclear Information System (INIS)

    Koo, Y.H.; Sohn, D.S.; Strijov, P.

    2000-01-01

A fission gas release model for MOX fuel has been developed based on a model for UO2 fuel. Using the concept of an equivalent cell, the model considers the uneven distribution of Pu within the fuel matrix and a number of Pu-rich particles that could lead to a non-uniform fission rate and fission gas distribution across the fuel pellet. The model has been incorporated into a code, COSMOS, and some parametric studies were made to analyze the effect of the size and Pu content of Pu-rich agglomerates. The model was then applied to the experimental data obtained from the FIGARO program, which consisted of the base irradiation of MOX fuels in the BEZNAU-1 PWR and the subsequent irradiation of four refabricated fuel segments in the Halden reactor. The calculated gas releases show good agreement with the measured ones. In addition, the present analysis indicates that the microstructure of the MOX fuel used in the FIGARO program is such that it produced little difference in gas release compared with UO2 fuel. (author)

  14. Modeling and experimental verification of thermally induced residual stress in RF-MEMS

    International Nuclear Information System (INIS)

    Somà, Aurelio; Saleem, Muhammad Mubasher

    2015-01-01

Electrostatically actuated radio frequency microelectromechanical systems (RF-MEMS) generally consist of microcantilevers and clamped–clamped microbeams. The presence of residual stress in these microstructures affects the static and dynamic behavior of the device. In this study, nonlinear finite element method (FEM) modeling and the experimental validation of residual stress induced in the clamped–clamped microbeams and the symmetric toggle RF-MEMS switch (STS) is presented. The formation of residual stress due to plastic deformation during the thermal loading-unloading cycle in the plasma etching step of the microfabrication process is explained and modeled using the Bauschinger effect. The difference between the designed and the measured natural frequency and pull-in voltage values for the clamped–clamped microbeams is explained by the presence of nonhomogeneous tensile residual stress. For the STS switch specimens, three-dimensional (3D) FEM models are developed and the initial deflection at zero bias voltage, observed during the optical profile measurements, is explained by the residual stress developed during the plasma etching step. The simulated residual stress due to the plastic deformation is included in the STS models to obtain the switch pull-in voltage. At the end of the simulation process, a good correspondence is obtained between the FEM model results and the experimental measurements for both the clamped–clamped microbeams and the STS switch specimens. (paper)

  15. [Verification of the double dissociation model of shyness using the implicit association test].

    Science.gov (United States)

    Fujii, Tsutomu; Aikawa, Atsushi

    2013-12-01

The "double dissociation model" of shyness proposed by Asendorpf, Banse, and Mücke (2002) was demonstrated in Japan by Aikawa and Fujii (2011). However, the generalizability of the double dissociation model of shyness was uncertain. The present study examined whether the results reported in Aikawa and Fujii (2011) would be replicated. In Study 1, college students (n = 91) completed explicit self-ratings of shyness and other personality scales. In Study 2, forty-eight participants completed an IAT (Implicit Association Test) for shyness, and their friends (n = 141) rated those participants on various personality scales. The results revealed that only the explicit self-concept ratings predicted other-rated low praise-seeking behavior, sociable behavior and high rejection-avoidance behavior (controlled shy behavior). Only the implicit self-concept measured by the shyness IAT predicted other-rated high interpersonal tension (spontaneous shy behavior). The results of this study are similar to the findings of the previous research, which supports the generalizability of the double dissociation model of shyness.

  16. Implementation and verification of a coupled fire model as a thermal boundary condition within P3/THERMAL

    International Nuclear Information System (INIS)

    Hensinger, D.M.; Gritzo, L.A.; Koski, J.A.

    1996-01-01

A user-defined boundary condition subroutine has been implemented within P3/THERMAL to represent the heat flux between a noncombusting object and an engulfing fire. The heat flux calculation includes a simple 2D fire model in which energy and radiative heat transport equations are solved to produce estimates of the heat fluxes at the fire-object interface. These estimates reflect the radiative coupling between a cold object and the flow of hot combustion gases that has been observed in fire experiments. The model uses a database of experimental pool fire measurements for far-field boundary conditions and volumetric heat release rates. Taking into account the coupling between a structure and the fire is an improvement over the σT^4 approximation frequently used as a boundary condition for engineered system response, and is a preliminary step in the development of a fire model with predictive capability. This paper describes the implementation of the fire model as a P3/THERMAL boundary condition and presents the results of a verification calculation carried out using the model
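
    For comparison, the σT^4 boundary condition that this coupled model improves upon can be sketched as a net gray-body exchange between the fire and the engulfed object. The emissivity and temperatures below are illustrative assumptions, not values from the paper.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_flux(t_fire, t_obj, emissivity=0.9):
    """Net radiative heat flux (W/m^2) from the fire onto a gray-body object."""
    return emissivity * SIGMA * (t_fire**4 - t_obj**4)

# A roughly 1000 C engulfing fire and a cold object: this simple estimate
# is what the coupled model refines by accounting for the cold object
# locally cooling the nearby combustion gases.
q = radiative_flux(t_fire=1273.0, t_obj=300.0)
```

    The coupled 2D fire model replaces this single-temperature estimate with fluxes resolved along the fire-object interface.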

  17. Theoretical model and experimental verification on the PID tracking method using liquid crystal optical phased array

    Science.gov (United States)

    Wang, Xiangru; Xu, Jianhua; Huang, Ziqiang; Wu, Liang; Zhang, Tianyi; Wu, Shuanghong; Qiu, Qi

    2017-02-01

The liquid crystal optical phased array (LC-OPA) is considered a promising non-mechanical laser deflector because it is fabricated using the photolithographic patterning technology that has been well advanced by the electronics and display industries. As a key application of the LC-OPA, free-space laser communication has demonstrated its merits in communication bandwidth. Before data communication, however, the ATP (acquisition, tracking and pointing) process takes a relatively long time and creates a bottleneck for free-space laser communication, while accurate real-time tracking is essential to maintaining a stable communication link. The electro-optic liquid crystal medium, with its low driving voltage, can be used as the laser beam deflector. This paper presents a fast tracking method using a liquid crystal optical phased array as the beam deflector and a CCD as the beacon light detector. A PID (proportional-integral-derivative) control loop is used to generate the corresponding steering angle. To achieve fast and accurate tracking, theoretical analysis and experimental verification demonstrate that the closed PID loop can suppress random attitude vibration. Theoretical analysis shows that the tracking accuracy can be better than 6.5 μrad, in reasonable agreement with the experimental result, obtained after 10 adjustments, of a tracking accuracy better than 12.6 μrad.
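
    A minimal sketch of the PID tracking loop described above, assuming an idealized deflector whose angle moves by the commanded correction each control cycle. The gains, time step, and beacon position are invented for illustration and are not values from the paper.

```python
class PID:
    """Discrete PID controller in positional form."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # Accumulate the integral, approximate the derivative, and
        # combine the three terms into one steering correction.
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Close the loop: the CCD measures the residual pointing error and the
# LC-OPA applies the commanded correction each cycle.
pid = PID(kp=0.5, ki=0.02, kd=0.01, dt=1.0)
angle, target = 0.0, 100.0  # steering angle and beacon position, in urad
for _ in range(1000):
    angle += pid.update(target - angle)
residual = abs(target - angle)  # tracking error after settling
```

    In a real system the gains would be tuned against the measured deflector response; here they are chosen only so that the simulated loop settles smoothly.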

  18. Modelling of near-field radionuclide transport phenomena in a KBS-3V type of repository for nuclear waste with Goldsim Code - and verification against previous methods

    International Nuclear Information System (INIS)

    Pulkkanen, V.-M.; Nordman, H.

    2010-03-01

Traditional radionuclide transport models significantly overestimate some phenomena and completely ignore others, which motivates the development of new, more precise models. This work describes the commissioning of a new KBS-3V near-field radionuclide transport model built with the commercial software GoldSim. Like earlier models, the GoldSim model uses rz coordinates, but the solubilities of radionuclides are treated more precisely. The physical phenomena governing near-field transport are first introduced as they are represented in GoldSim, and GoldSim's computational methods are introduced and compared to those used earlier. The actual verification of the GoldSim model was carried out by comparing GoldSim results for simple cases with the corresponding results obtained with REPCOM, a software developed by VTT and used in several safety assessments; the results agree well. Finally, a few complicated cases were studied, in which REPCOM's limitations in handling some phenomena become evident. The differences in the results are caused especially by the extension of the solubility limit to the whole computational domain and by the element-wise, rather than nuclide-wise, treatment of solubilities. This work was carried out as a special assignment for the former laboratory of Advanced Energy Systems at Helsinki University of Technology. The work was done at VTT. (orig.)

  19. Modelling of near-field radionuclide transport phenomena in a KBS-3V type of repository for nuclear waste with Goldsim Code - and verification against previous methods

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkanen, V.-M.; Nordman, H. (VTT Technical Research Centre, Espoo (Finland))

    2010-03-15

Traditional radionuclide transport models significantly overestimate some phenomena and completely ignore others, which motivates the development of new, more precise models. This work describes the commissioning of a new KBS-3V near-field radionuclide transport model built with the commercial software GoldSim. Like earlier models, the GoldSim model uses rz coordinates, but the solubilities of radionuclides are treated more precisely. The physical phenomena governing near-field transport are first introduced as they are represented in GoldSim, and GoldSim's computational methods are introduced and compared to those used earlier. The actual verification of the GoldSim model was carried out by comparing GoldSim results for simple cases with the corresponding results obtained with REPCOM, a software developed by VTT and used in several safety assessments; the results agree well. Finally, a few complicated cases were studied, in which REPCOM's limitations in handling some phenomena become evident. The differences in the results are caused especially by the extension of the solubility limit to the whole computational domain and by the element-wise, rather than nuclide-wise, treatment of solubilities. This work was carried out as a special assignment for the former laboratory of Advanced Energy Systems at Helsinki University of Technology. The work was done at VTT. (orig.)

  20. Verification of RBMK-1500 reactor main circulation circuit model with Cathare V1.3L

    International Nuclear Information System (INIS)

    Jasiulevicius, A.

    2001-01-01

Among other computer codes, the French code CATHARE has been applied to RBMK reactor calculations. This paper presents results of such an application to the main circulation circuit of the Ignalina NPP reactor (RBMK-1500 type). Three transient calculations were performed: trip of all main circulation pumps (MCP), trip of one main circulation pump, and trip of one main circulation pump without closure of the check valve on the pump line. Calculation results were compared to data from the Ignalina NPP, where all these transients were recorded in 1986, 1996 and 1998. The studies demonstrate the capability of the CATHARE code to treat thermal-hydraulic transients with a reactor scram in the RBMK in the case of single or multiple pump trips. However, the model needs further improvements in order to simulate loss-of-coolant accidents: the emergency core cooling system should be included in the model, and additional improvement is needed to obtain more independent pressure behavior in the two loops. Flow rates through the reactor channels should also be modeled by dividing the channels into several groups according to channel power (in an RBMK, the power produced by channels located in different parts of the core is not the same). The point-neutron kinetics model of the CATHARE code is not suitable for predicting transients when the reactor is operating at nominal power; such transients would require a 3D neutron kinetics model to properly describe the strong space-time effects on the power distribution in the reactor core

  1. Verification of RBMK-1500 reactor main circulation circuit model with Cathare V1.3L

    Energy Technology Data Exchange (ETDEWEB)

    Jasiulevicius, A. [Kaunas University of Technology, Dept. of Thermal and Nuclear Energy, Kaunas (Lithuania)

    2001-07-01

Among other computer codes, the French code CATHARE has been applied to RBMK reactor calculations. This paper presents results of such an application to the main circulation circuit of the Ignalina NPP reactor (RBMK-1500 type). Three transient calculations were performed: trip of all main circulation pumps (MCP), trip of one main circulation pump, and trip of one main circulation pump without closure of the check valve on the pump line. Calculation results were compared to data from the Ignalina NPP, where all these transients were recorded in 1986, 1996 and 1998. The studies demonstrate the capability of the CATHARE code to treat thermal-hydraulic transients with a reactor scram in the RBMK in the case of single or multiple pump trips. However, the model needs further improvements in order to simulate loss-of-coolant accidents: the emergency core cooling system should be included in the model, and additional improvement is needed to obtain more independent pressure behavior in the two loops. Flow rates through the reactor channels should also be modeled by dividing the channels into several groups according to channel power (in an RBMK, the power produced by channels located in different parts of the core is not the same). The point-neutron kinetics model of the CATHARE code is not suitable for predicting transients when the reactor is operating at nominal power; such transients would require a 3D neutron kinetics model to properly describe the strong space-time effects on the power distribution in the reactor core.

  2. Verification of voltage/ frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    International Nuclear Information System (INIS)

    Hur, J.S.; Roh, M.S.

    2013-01-01

Full-text: One major cause of plant shutdown is the loss of electrical power. This study examines the coping actions against station blackout, including the emergency diesel generator and the sequential loading of safety systems, and verifies with a modeling tool that the emergency diesel generator meets its requirements, especially the voltage and frequency criteria. The paper also considers changes to the sequencing time and load capacity, but only to find the electrical design margin; any revision of the load list must be verified by safety analysis. This study shows that a new load calculation is a key factor in EDG localization and in increasing in-house capability. (author)

  3. From model conception to verification and validation, a global approach to multiphase Navier-Stokes models with an emphasis on volcanic explosive phenomenology

    Energy Technology Data Exchange (ETDEWEB)

    Dartevelle, Sebastian

    2007-10-01

Large-scale volcanic eruptions are hazardous events that cannot be characterized by detailed and accurate in situ measurements: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors across many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become ever more critical as more and more volcanologists use numerical data for the assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control volcanic clouds, namely the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose comparing numerical results against a set of simple and well-constrained analog experiments, each of which uniquely and unambiguously represents one of the key phenomena.

  4. Chemical transport in a fissured rock: verification of a numerical model

    International Nuclear Information System (INIS)

    Rasmuson, A.; Narasimham, T.N.; Neretnieks.

    1982-01-01

Due to the very long-term, high toxicity of some nuclear waste products, models are required that can predict, in certain cases, the spatial and temporal distribution of chemical concentrations less than 0.001% of the concentration released from the repository. A numerical model, TRUMP, which solves the advective diffusion equation in general three dimensions, with or without decay and a source term, has been verified. The method is based on an integrated finite difference approach. The studies show that as long as the magnitude of advectance is equal to or less than that of conductance for the closed surface bounding any volume element in the region (that is, as long as the numerical Peclet number is sufficiently small), the computational errors are on the order of 10^-3 % or less. The realistic input parameters used in the sample calculations suggest that such a range of Peclet numbers is indeed likely to characterize deep groundwater systems in granitic and ancient argillaceous formations. A sensitivity analysis suggests that the errors in prediction introduced by uncertainties in the input parameters are likely to be larger than the computational inaccuracies introduced by the numerical model. Currently, a disadvantage of the TRUMP model is that the iterative method of solving the set of simultaneous equations is rather slow when time constants vary widely over the flow region. Although the iterative solution may be desirable for large three-dimensional problems in order to minimize computer storage, it seems preferable to use a direct solver in conjunction with the mixed explicit-implicit approach whenever possible. Work in this direction is in progress
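
    The numerical Peclet criterion discussed above, the ratio of advective to diffusive (conductive) transport across a volume element, can be checked with a one-line calculation. The parameter values below are hypothetical, not taken from the TRUMP study.

```python
def grid_peclet(velocity, dx, dispersion):
    """Numerical (grid) Peclet number Pe = v * dx / D for one volume element."""
    return velocity * dx / dispersion

# Deep-groundwater-like values (illustrative): slow pore velocity,
# half-metre volume elements, modest dispersion coefficient.
v = 1e-7    # pore velocity, m/s
dx = 0.5    # volume element size, m
D = 1e-7    # dispersion coefficient, m^2/s

pe = grid_peclet(v, dx, D)
acceptable = pe <= 2.0  # common rule of thumb for low numerical dispersion
```

    For fixed velocity and dispersion, refining the mesh (smaller dx) lowers the grid Peclet number and hence the numerical dispersion error.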

  5. Simulating and Predicting Cereal Crop Yields in Ethiopia: Model Calibration and Verification

    Science.gov (United States)

    Yang, M.; Wang, G.; Ahmed, K. F.; Eggen, M.; Adugna, B.; Anagnostou, E. N.

    2017-12-01

Agriculture in developing countries is extremely vulnerable to climate variability and change. In East Africa, most people live in rural areas with outdated agricultural techniques and infrastructure. Smallholder agriculture continues to play a key role in this area, and the rate of irrigation is among the lowest in the world. As a result, seasonal and inter-annual weather patterns play an important role in the spatiotemporal variability of crop yields. This study investigates how various climate variables (e.g., temperature, precipitation, sunshine) and agricultural practices (e.g., fertilization, irrigation, planting date) influence cereal crop yields using a process-based model (DSSAT) and statistical analysis, focusing on the Blue Nile Basin of Ethiopia. The DSSAT model is driven with meteorological forcing from ECMWF's latest reanalysis product, which covers the past 35 years; the statistical model will be developed by linking the same meteorological reanalysis data with harvest data at the woreda level from the Ethiopian national dataset. Results from this study will set the stage for the development of a seasonal prediction system for weather and crop yields in Ethiopia, which will serve multiple sectors in coping with the agricultural impact of climate variability.

  6. Single-pass beam measurements for the verification of the LHC magnetic model

    CERN Document Server

    Calaga, R; Redaelli, S; Sun, Y; Tomas, R; Venturini-Delsolaro, W; Zimmermann, F

    2010-01-01

    During the 2009 LHC injection tests, the polarities and effects of specific quadrupole and higher-order magnetic circuits were investigated. A set of magnet circuits had been selected for detailed investigation based on a number of criteria. On or off-momentum difference trajectories launched via appropriate orbit correctors for varying strength settings of the magnet circuits under study - e.g. main, trim and skew quadrupoles; sextupole families and spool piece correctors; skew sextupoles, octupoles - were compared with predictions from various optics models. These comparisons allowed confirming or updating the relative polarity conventions used in the optics model and the accelerator control system, as well as verifying the correct powering and assignment of magnet families. Results from measurements in several LHC sectors are presented.

  7. Modelling and verification of single slope solar still using ANSYS-CFX

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [Research Scholar, Kadi Sarvavishwavidyalaya University, Gandhinagar (India); Shah, P.K. [Principal, Silver Oak College of Engineering and Technology, Ahmedabad (India)

    2011-07-01

Solar distillation is an easy, small-scale and cost-effective technique for providing safe water. It requires an energy input as heat, for which solar radiation can be the source. A solar still is a device that uses the process of solar distillation. Here, a two-phase, three-dimensional model of the evaporation and condensation processes in a solar still was built and simulated using ANSYS CFX. Simulation results were compared with experimental data from a single-basin solar still under the climate conditions of Mehsana (23°12' N, 72°30'). Good agreement was found between simulation and experiment for distillate output, water temperature and heat transfer coefficients. Overall, the study shows that ANSYS CFX is a powerful tool for both diagnosis and analysis of solar stills.

  8. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  9. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    International Nuclear Information System (INIS)

    Takahashi, R; Kamima, T; Tachibana, H; Baba, H; Itano, M; Yamazaki, T; Ishibashi, S; Higuchi, Y; Shimizu, H; Yamamoto, T; Yamashita, M; Sugawara, Y; Sato, A; Nishiyama, S; Kawai, D; Miyaoka, S

    2015-01-01

Purpose: To present the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle3 (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean ± 2SD) of the dose difference between the TPS and the Indp were evaluated for 18 sites (head, breast, lung, pelvis, etc.). Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0 ± 3.7%, 2.0 ± 2.5% and 6.2 ± 4.4%, respectively. In conventional plans, most sites were within the 5% TG-114 action level; however, there were systematic differences (4.0 ± 4.0% and 2.5 ± 5.8% for breast and lung, respectively). In SRS plans, the results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity corrections strongly affect the dose distribution
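
    The confidence-limit statistic used above (CL = mean ± 2SD of the per-field percent dose differences between the TPS and the Indp) is straightforward to reproduce. The sample differences below are invented for illustration.

```python
import statistics

def confidence_limit(differences):
    """Return (mean, 2*SD) of percent dose differences, as in AAPM TG-114."""
    mean = statistics.mean(differences)
    two_sd = 2.0 * statistics.stdev(differences)
    return mean, two_sd

# Hypothetical per-field % dose differences for one site (TPS minus Indp).
diffs = [0.5, -1.2, 2.0, 0.8, -0.4, 1.5, 0.1, -0.9]
mean, two_sd = confidence_limit(diffs)
# A field would then be flagged if its difference fell outside mean +/- 2SD
# or exceeded the site-specific action level.
```

    With a retrospective pool of thousands of fields per site, the resulting (mean, 2SD) pairs are exactly the per-site CLs tabulated in the abstract.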

  10. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
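
    One plausible way to convert the measured machine-versus-model distance discrepancies into probability-based buffer distances is a Gaussian tail estimate, sketched below. The distributional assumption and sample values are illustrative; the paper's exact statistical procedure is not reproduced here.

```python
import statistics
from statistics import NormalDist

def safety_buffer(discrepancies_cm, collision_prob):
    """Buffer distance b such that P(clearance error > b) <= collision_prob,
    assuming normally distributed discrepancies (an illustrative choice)."""
    mu = statistics.mean(discrepancies_cm)
    sd = statistics.stdev(discrepancies_cm)
    return NormalDist(mu, sd).inv_cdf(1.0 - collision_prob)

# Invented gantry-to-couch discrepancies (cm) between model and machine.
errors = [0.2, -0.1, 0.4, 0.0, 0.3, -0.2, 0.5, 0.1]
buffer_cm = safety_buffer(errors, collision_prob=1e-3)
```

    Tighter collision probabilities (e.g. 0.001% rather than 0.1%) push the quantile further into the tail and therefore demand larger buffers, matching the treatment-site-specific buffers described above.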

  11. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    International Nuclear Information System (INIS)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations, measuring the corresponding 300 gantry-to-couch and gantry-to-phantom distances, and comparing these measurements with the model predictions. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and the couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate, optimized via an in-house noncoplanar radiotherapy platform, were converted into XML scripts for automated delivery, and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimensions of the 14 cm cubic phantom to within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom distances was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
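The site-specific safety buffers described above follow from the distribution of model-versus-measurement discrepancies. As a hedged illustration only (the Gaussian assumption, function name, and sample data are mine, not the authors'), a buffer for a target collision probability could be sketched as:

```python
import math
import statistics

def safety_buffer(discrepancies_cm, collision_prob):
    """Estimate a safety buffer (cm) such that the chance of a
    discrepancy exceeding the buffer is at most collision_prob,
    assuming discrepancies are approximately normal (an assumption)."""
    mu = statistics.fmean(discrepancies_cm)
    sigma = statistics.stdev(discrepancies_cm)

    def cdf(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    # Invert the tail probability via bisection on the CDF.
    lo, hi = 0.0, 20.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if 1.0 - cdf(mid) > collision_prob:
            lo = mid
        else:
            hi = mid
    z = 0.5 * (lo + hi)
    return mu + z * sigma

# Synthetic gantry-to-phantom discrepancies in cm (illustrative data)
d = [0.4, 1.1, 0.8, 1.9, 0.6, 1.3, 0.9, 2.2, 0.7, 1.5]
print(round(safety_buffer(d, 0.001), 2))
```

A smaller allowed collision probability yields a larger buffer, which matches the 0.1%/0.01%/0.001% hierarchy in the abstract.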

  12. Acculturation and mental health--empirical verification of J.W. Berry's model of acculturative stress.

    Science.gov (United States)

    Koch, M W; Bjerregaard, P; Curtis, C

    2004-01-01

    Many studies of mental health among ethnic minorities have used the concept of acculturation as an explanatory model, in particular J.W. Berry's model of acculturative stress. But Berry's theory has only been empirically verified a few times. The aims of the study were to examine whether Berry's hypothesis about the connection between acculturation and mental health can be empirically verified for Greenlanders living in Denmark, and to analyse whether acculturation plays a significant role in the mental health of Greenlanders living in Denmark. The study used data from the 1999 Health Profile for Greenlanders in Denmark. As a measure of mental health we applied the General Health Questionnaire (GHQ-12). Acculturation was assessed from answers to questions about how the respondents value children maintaining their traditional cultural identity as Greenlanders, and how well the respondents speak Greenlandic and Danish. The statistical methods included binary logistic regression. We found no connection between Berry's definition of acculturation and mental health among Greenlanders in Denmark. On the other hand, our findings showed a significant relation between mental health and gender, age, marital status, occupation and long-term illness. The findings indicate that acculturation as Berry defines it plays a smaller role in the mental health of Greenlanders in Denmark than socio-demographic and socio-economic factors. Therefore we cannot empirically verify Berry's hypothesis.

  13. HUMTRN: documentation and verification for an ICRP-based age- and sex-specific human simulation model for radionuclide dose assessment

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Wenzel, W.J.

    1984-06-01

    The dynamic human simulation model HUMTRN is designed specifically as a major module of BIOTRAN to integrate climatic, hydrologic, atmospheric, food crop, and herbivore simulation with human dietary and physiological characteristics, metabolism, and radionuclide transport, in order to predict radiation doses to selected organs of both sexes in different age groups. The model is based on age- and weight-specific equations developed for predicting human radionuclide transport from metabolic and physical characteristics. These characteristics are modeled from studies documented by the International Commission on Radiological Protection (ICRP 23). HUMTRN allows cumulative doses from uranium or plutonium radionuclides to be predicted by modeling age-specific anatomical, physiological, and metabolic properties of individuals between 1 and 70 years of age, and can track radiation exposure and radionuclide metabolism for any age group over specified daily or yearly time periods. The simulated daily dose integration of eight or more simultaneous air, water, and food intakes gives a new, comprehensive, dynamic picture of radionuclide intake, uptake, and hazard analysis for complex scenarios. A detailed example using site-specific data based on the Pantex studies is included for verification. 14 references, 24 figures, 10 tables

  14. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources--for example, via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, its known radioactive half-life, and its freedom from chemical production or loss and from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.

  15. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS® for the probabilistic analysis, and NASGRO® for the fracture

  16. THE FLOOD RISK IN THE LOWER GIANH RIVER: MODELLING AND FIELD VERIFICATION

    Directory of Open Access Journals (Sweden)

    NGUYEN H. D.

    2016-03-01

    Full Text Available Problems associated with flood risk definitely represent a highly topical issue in Vietnam. The case of the lower Gianh River in the central area of Vietnam, with a watershed area of 353 km², is particularly interesting. In this area, periodically subject to flood risk, the scientific question is strongly linked to risk management. In addition, flood risk is the consequence of the hydrological hazard of an event and the damages related to this event. For this reason, our approach is based on hydrodynamic modelling using Mike Flood to simulate the runoff during a flood event. Unfortunately, the data in the studied area are quite limited. Our computation of the flood risk is based on a three-step modelling process, using rainfall data from 8 stations, cross sections, the topographic map and the land-use map. The first step consists of creating a 1-D model using Mike 11, in order to simulate the runoff in the minor river bed. In the second step, we use Mike 21 to create a 2-D model to simulate the runoff in the flood plain. The last step allows us to couple the two models in order to precisely describe the variables for the hazard analysis in the flood plain (the water level, the speed, and the extent of the flooding). Moreover, the model is calibrated and verified using observational data of the water level at hydrologic stations and field control data (flood height measurements on the one hand, and interviews with the community and local councillors on the other). We then generate GIS maps in order to improve flood hazard management, which allows us to create flood hazard maps by coupling the flood plain map and the runoff speed map. Our results show that the flood peak, caused by typhoon Nari, reached more than 6 m on October 16th, 2013 at 4 p.m. (its area extended over 149 km²), and that the typhoon constitutes an extreme flood hazard for 11.39%, very high for 10.60%, high for 30.79%, medium for 31.91% and a light flood hazard for 15
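Hazard classes like those reported above are typically derived from the simulated water level and flow speed. Below is a minimal sketch of such a classification step using the common depth-times-velocity index; the thresholds are purely illustrative placeholders, not the values used in the Gianh River study:

```python
def hazard_class(depth_m, speed_m_s):
    """Classify flood hazard from water depth (m) and flow speed (m/s)
    using the depth-times-velocity index. Thresholds are illustrative
    only, not taken from the study."""
    hv = depth_m * speed_m_s
    if hv < 0.5:
        return "light"
    if hv < 1.0:
        return "medium"
    if hv < 2.0:
        return "high"
    if hv < 4.0:
        return "very high"
    return "extreme"

print(hazard_class(1.2, 0.3))  # hv = 0.36 -> "light"
```

In a GIS workflow, a function like this would be applied per raster cell by coupling the flood-depth map and the runoff-speed map, as the abstract describes.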

  17. Verification of SEBAL and Hargreaves–Samani Models to Estimate Evapotranspiration by Lysimeter Data

    Directory of Open Access Journals (Sweden)

    Ali Morshedi

    2017-02-01

    Full Text Available Introduction: Evapotranspiration (ET) is an important component of the hydrological cycle, surface energy equations and the water balance. ET estimation is needed in various fields of science, such as hydrology, agriculture, forestry and pasture management, and water resources management. Conventional methods estimate evapotranspiration from point measurements. Remote sensing models can estimate ET at larger scales using surface albedo, surface temperature and vegetation indices. The Surface Energy Balance Algorithm for Land (SEBAL) estimates ET at the moment of the satellite pass as the residual of the energy balance equation for each pixel. In this study, ET from the Hargreaves-Samani (HS) and SEBAL models was compared with data from an alfalfa lysimeter located in the Shahrekord plain within the Karun basin. Satellite imagery was based on Landsat 7 ETM+ sensor data for seven satellite passes on path 164 and row 38 of the World Reference System, matching the lysimeter sampling period from April to October 2011. SEBAL uses the energy balance equation to estimate evapotranspiration. Equation (1) shows the energy balance equation for an evaporative surface: λET = Rn − G − H [1]. In this equation Rn, H, G and λET represent the net radiation flux input to the surface (W/m²), sensible heat flux (W/m²), soil heat flux (W/m²), and latent heat flux of vaporization (W/m²), respectively. Only vertical fluxes are considered; horizontal energy fluxes are neglected. The equation should therefore be applied to large, uniformly and fully vegetated areas. SEBAL estimates ET using a minimum of ground-measured data. The model has been applied and tested in more than 30 countries with an accuracy of about 85% at field scale, and 95% at daily and seasonal scales. In the Borkhar watershed (east of Isfahan, Iran), ASTER and MODIS satellite imagery was used with SEBAL for comparison with the Penman-Monteith model. Results showed that estimated
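The residual computation behind equation (1) is simple to state in code. The sketch below uses example flux values and an approximate latent heat of vaporization of 2.45 MJ/kg to convert the residual into an ET rate; the numbers and function names are illustrative, not SEBAL's implementation:

```python
def latent_heat_flux(rn, g, h):
    """Latent heat flux (W/m^2) as the residual of the surface energy
    balance, lambda*ET = Rn - G - H, per equation (1) in the abstract."""
    return rn - g - h

def et_rate_mm_per_hour(lambda_et):
    """Convert latent heat flux (W/m^2) to an evapotranspiration rate,
    using a latent heat of vaporization of ~2.45 MJ/kg (approximate)."""
    kg_per_m2_per_s = lambda_et / 2.45e6
    return kg_per_m2_per_s * 3600.0  # 1 kg/m^2 of water == 1 mm depth

# Example fluxes for a single pixel (illustrative values)
lam_et = latent_heat_flux(rn=500.0, g=50.0, h=150.0)
print(lam_et)                                # 300.0 W/m^2
print(round(et_rate_mm_per_hour(lam_et), 3)) # ~0.441 mm/h
```

In SEBAL this residual is evaluated per pixel at the satellite overpass time and then scaled to daily ET.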

  18. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated; in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing from authenticated operator weighing systems, such as accountancy scales and process load cells, is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  19. Steps in the construction and verification of an explanatory model of psychosocial adjustment

    Directory of Open Access Journals (Sweden)

    Arantzazu Rodríguez-Fernández

    2016-06-01

    Full Text Available The aim of the present study was to empirically test an explanatory model of psychosocial adjustment during adolescence, with psychosocial adjustment at this stage understood as a combination of school adjustment (or school engagement) and subjective well-being. According to the hypothesized model, psychosocial adjustment depends on self-concept and resilience, which in turn act as mediators of the influence of perceived social support (from family, peers and teachers) on this adjustment. Participants were 1250 secondary school students (638 girls and 612 boys) aged between 12 and 15 years (Mean = 13.72; SD = 1.09). The results provided evidence of: (a) the influence of all three types of perceived support on subject resilience and self-concept, with perceived family support being particularly important in this respect; (b) the influence of support received from teachers on school adjustment and of support received from the family on psychological wellbeing; and (c) the absence of any direct influence of peer support on psychosocial adjustment, although indirect influence was observed through the psychological variables studied. These results are discussed from an educational perspective and in terms of future research.

  1. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized, light-water-cooled, medium-power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. The SMART design verification phase followed, comprising various separate-effects tests and comprehensive integral-effect tests. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) was constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and ECT is sufficient to enable natural circulation of the coolant.

  2. Preliminary verification results of the DWD limited area model LME and evaluation of its storm forecasting skill over the area of Cyprus

    Directory of Open Access Journals (Sweden)

    A. Orphanou

    2006-01-01

    Full Text Available A preliminary verification and evaluation is made of the forecast fields of the non-hydrostatic limited area model LME of the German Weather Service (DWD) for a recent three-month period. For this purpose, observations from two synoptic stations in Cyprus are utilized. In addition, days with depressions over the area were selected in order to evaluate the model's skill in storm forecasting.

  3. PET/CT imaging for treatment verification after proton therapy: a study with plastic phantoms and metallic implants.

    Science.gov (United States)

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B; Bonab, Ali A; Alpert, Nathaniel M; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants, which produce x-ray CT artifacts and fluence perturbations that may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  4. Model for Sulfate Diffusion Depth in Concrete under Complex Aggressive Environments and Its Experimental Verification

    Directory of Open Access Journals (Sweden)

    Yingwu Zhou

    2015-01-01

    Full Text Available Sulfate attack is one of the most important factors leading to performance deterioration of concrete materials. The sulfate diffusion depth in concrete is an important index that quantitatively characterizes the rate of concrete damage, cracking, and spalling due to sulfate attack. The progress of the sulfate diffusion depth in concrete is systematically investigated in this paper through both theoretical and experimental study. A new time-varying model of the diffusion depth is developed, which for the first time comprehensively considers a large set of parameters describing complex environments. On this basis, a method is further proposed for effectively predicting the residual life of in-service concrete structures subject to sulfate attack. Integrating data from a self-designed high-temperature dry-wet accelerated corrosion test and a large amount of experimental data reported in the existing literature, the effectiveness and accuracy of the time-varying model of the sulfate diffusion depth are finally verified.
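The paper's time-varying model accounts for many environmental parameters; as a far simpler hedged sketch, a Fickian square-root-of-time law illustrates how a diffusion depth translates into a residual-life estimate. The functional form, coefficient, and numbers below are illustrative only, not the authors' model:

```python
import math

def diffusion_depth_mm(d_eff_mm2_per_year, years):
    """Penetration depth from a generic Fickian sqrt-of-time law,
    x = 2*sqrt(D_eff * t). Illustrative sketch only; the paper's
    model is time-varying and multi-parameter."""
    return 2.0 * math.sqrt(d_eff_mm2_per_year * years)

def residual_life_years(cover_mm, d_eff_mm2_per_year):
    """Years until the diffusion front reaches a given cover depth,
    by inverting the sqrt-of-time law."""
    return (cover_mm / 2.0) ** 2 / d_eff_mm2_per_year

print(round(diffusion_depth_mm(4.0, 25.0), 1))   # depth after 25 years: 20.0 mm
print(round(residual_life_years(40.0, 4.0), 1))  # years to reach 40 mm: 100.0
```

Residual-life prediction in the paper follows the same inversion idea: given a damage criterion expressed as a depth, solve the depth model for time.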

  5. Analytical model for performance verification of liquid poison injection system of a nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kansal, Anuj Kumar, E-mail: anujkumarkansal@gmail.com; Maheshwari, Naresh Kumar, E-mail: nmahesh@barc.gov.in; Vijayan, Pallippattu Krishnan, E-mail: vijayanp@barc.gov.in

    2014-08-15

    Highlights: • One-dimensional modelling of shutdown system-2. • Semi-empirical correlation for poison jet progression. • Validation of the code. - Abstract: Shutdown system-2 (SDS-2) of an advanced vertical pressure-tube-type reactor provides rapid reactor shutdown by high-pressure injection of a neutron-absorbing liquid, called poison, into the moderator in the calandria. Poison inside the calandria is distributed by poison jets issued from holes in the injection tubes. Effectiveness of the system depends on the rate and spread of the poison in the moderator. In this study, a transient one-dimensional (1D) hydraulic code, COPJET, is developed to predict the performance of the system by predicting the progression of the poison jet with time. COPJET is validated against data available in the literature and thereafter applied to the advanced vertical pressure-tube-type reactor.

  6. Type of cultural orientation and empathy in brazilians: Verification of a theoretical model

    Directory of Open Access Journals (Sweden)

    Nilton Formiga

    2013-01-01

    Full Text Available Contemporary events have affected social, economic and cultural spaces as well as interpersonal relations. It is believed that an individualist or collectivist orientation influences people's capacity for interpersonal resonance, that is, their ability to empathize. The present study aims to verify a theoretical model in which the type of cultural orientation is associated with empathy. A total of 456 subjects, male and female, aged 12 to 67 years, of different educational levels from public and private institutions in the cities of Joao Pessoa-PB and Rio de Janeiro-RJ, answered the Multidimensional Scale of Interpersonal Reactivity, the Scale of the Attributes of Individualistic and Collectivistic Cultural Orientation, and socio-demographic questions. There was a positive association between collectivist orientation and empathy; individualistic orientation, on the other hand, was negatively associated with empathy. The results also showed a higher average score on collectivism, which was related to higher scores on empathy.

  7. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method and to provide a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is easy to use because assessment scenarios can be defined through diagrams. Software verification was performed by comparing its results against SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than those from the developed software; however, this is still acceptable since dose estimation in SRS-19 is based on a conservative approach. Compared to the CROM software, identical results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features need to be added and new models developed to improve the capability of the software. (author)
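For a single ingestion pathway, an SRS-19-style screening calculation reduces to concentration times annual intake times a dose coefficient. The sketch below is a hedged illustration: the function name and the concentration/intake values are mine; only the Cs-137-like ingestion dose coefficient of 1.3e-8 Sv/Bq is a standard published value:

```python
def annual_dose_sv(concentration_bq_per_kg, intake_kg_per_year,
                   dose_coefficient_sv_per_bq):
    """Annual ingestion dose as concentration x intake x dose
    coefficient -- the generic screening chain of SRS-19-style
    assessments. Inputs here are illustrative, not SRS-19 defaults."""
    return (concentration_bq_per_kg * intake_kg_per_year
            * dose_coefficient_sv_per_bq)

# Example: 50 Bq/kg in food, 100 kg/year intake, 1.3e-8 Sv/Bq (Cs-137)
dose = annual_dose_sv(50.0, 100.0, 1.3e-8)
print(f"{dose:.2e} Sv/year")  # 6.50e-05 Sv/year
```

A full SRS-19 assessment chains such factors behind a dispersion step (atmospheric or aquatic dilution) that yields the environmental concentration in the first place.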

  8. Storage and growth of denitrifiers in aerobic granules: part II. model calibration and verification.

    Science.gov (United States)

    Ni, Bing-Jie; Yu, Han-Qing; Xie, Wen-Ming

    2008-02-01

    A mathematical model describing the simultaneous storage and growth activities of denitrifiers in aerobic granules under anoxic conditions was developed in an accompanying article. The sensitivity of the nitrate uptake rate (NUR) to the stoichiometric and kinetic coefficients is analyzed in this article. The model parameter values are estimated by minimizing the sum of squares of the deviations between the measured and model-predicted values. The model is successfully calibrated, and a set of stoichiometric and kinetic parameters for the anoxic storage and growth of the denitrifiers is obtained. Thereafter, the established model is verified with three sets of experimental data. Comparison of the established model with ASM1 and ASM3 shows that the present model is appropriate for simulating and predicting the performance of a granule-based denitrification system. (c) 2007 Wiley Periodicals, Inc.
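The calibration step described above, minimizing the sum of squared deviations between measured and predicted NUR, can be sketched as follows. The Monod form, parameter names, and synthetic data are illustrative stand-ins, not the paper's actual storage-and-growth kinetics; a coarse grid search replaces a proper optimizer for clarity:

```python
def nur_model(s_no3, q_max, k_s):
    """Monod-type nitrate uptake rate -- a generic stand-in for the
    kinetics calibrated in the paper (parameter names are mine)."""
    return q_max * s_no3 / (k_s + s_no3)

def calibrate(data, q_max):
    """Estimate K_S by minimizing the sum of squared deviations
    between measured and predicted NUR (coarse grid search)."""
    best_ks, best_sse = None, float("inf")
    for i in range(1, 1000):
        ks = i * 0.01
        sse = sum((nur - nur_model(s, q_max, ks)) ** 2 for s, nur in data)
        if sse < best_sse:
            best_ks, best_sse = ks, sse
    return best_ks

# Synthetic (substrate, NUR) measurements generated with K_S = 0.5
data = [(0.25, 3.33), (0.5, 5.0), (1.0, 6.67), (2.0, 8.0), (5.0, 9.09)]
print(calibrate(data, q_max=10.0))  # K_S estimate, ~0.5
```

In practice a gradient-based or Levenberg-Marquardt routine would replace the grid search, but the objective function is the same least-squares criterion the abstract describes.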

  9. Runtime Verification with State Estimation

    Science.gov (United States)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of an observation sequence under the model) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
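The classic forward algorithm that the authors extend can be sketched as follows; the two-state model and all parameter values are made up for illustration, and the sketch computes only the plain observation-sequence probability, not the paper's property-satisfaction extension:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Classic HMM forward algorithm: returns P(observation sequence)
    by summing over all hidden state paths."""
    # Initialization: alpha_1(s) = pi(s) * b_s(o_1)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    # Induction: alpha_t(s) = b_s(o_t) * sum_r alpha_{t-1}(r) * a_{r,s}
    for o in obs[1:]:
        alpha = {
            s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
            for s in states
        }
    # Termination: sum over final states
    return sum(alpha.values())

# Toy two-state monitored program (illustrative parameters)
states = ("ok", "error")
start_p = {"ok": 0.9, "error": 0.1}
trans_p = {"ok": {"ok": 0.95, "error": 0.05},
           "error": {"ok": 0.3, "error": 0.7}}
emit_p = {"ok": {"ack": 0.8, "nack": 0.2},
          "error": {"ack": 0.1, "nack": 0.9}}

p = forward(("ack", "nack", "ack"), states, start_p, trans_p, emit_p)
print(round(p, 4))  # 0.1171
```

The paper's extension runs this recursion jointly with a monitor automaton for the temporal property, so the final sum ranges only over accepting configurations.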

  10. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  11. Gas Chromatographic Verification of a Mathematical Model: Product Distribution Following Methanolysis Reactions.

    Science.gov (United States)

    Lam, R. B.; And Others

    1983-01-01

    Investigated the application of binomial statistics to the equilibrium distribution of ester systems, employing gas chromatography to verify the mathematical model used. Discusses model development and experimental techniques, indicating that the model enables a straightforward extension to symmetrical polyfunctional esters, and presents a mathematical basis…
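The binomial model of the product distribution can be sketched as follows; the function and parameter names are mine, and the sketch assumes independent, random exchange at each ester site with a fixed methanol-derived fraction:

```python
from math import comb

def product_fractions(n_sites, x_methyl):
    """Equilibrium fraction of molecules bearing k exchanged (methyl)
    groups out of n_sites, assuming each site exchanges independently
    with probability x_methyl -- a binomial distribution over k.
    Names and assumptions are illustrative, not the paper's notation."""
    return [comb(n_sites, k) * x_methyl**k * (1 - x_methyl)**(n_sites - k)
            for k in range(n_sites + 1)]

# Symmetrical difunctional ester at a 50:50 equilibrium mixture:
fracs = product_fractions(2, 0.5)
print([round(f, 2) for f in fracs])  # [0.25, 0.5, 0.25]
```

Gas chromatography then verifies the model by comparing measured peak areas for the mixed and unmixed ester species against these predicted fractions.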

  12. Experimental Verification of Some Simple Equilibrium Models of Masonry Shear Walls

    Science.gov (United States)

    Jasiński, Radosław

    2017-10-01

    This paper contains the theoretical fundamentals of strut-and-tie models used for unreinforced horizontal shear walls. Depending on support conditions and wall loading, we can distinguish models with discrete bars when a point load is applied to the wall (type I model) or with continuous bars (type II model) when the load is uniformly distributed at the wall boundary. The main part of this paper compares calculated results with the authors' own tests on horizontal shear walls made of solid brick, silicate elements and autoclaved aerated concrete. The tests were performed in Poland. The model required some modifications due to the specific load and static diagram.

  13. Process verification of a hydrological model using a temporal parameter sensitivity analysis

    OpenAIRE

    M. Pfannerstill; B. Guse; D. Reusser; N. Fohrer

    2015-01-01

    To ensure reliable results of hydrological models, it is essential that the models reproduce the hydrological process dynamics adequately. Information about simulated process dynamics is provided by looking at the temporal sensitivities of the corresponding model parameters. For this, the temporal dynamics of parameter sensitivity are analysed to identify the simulated hydrological processes. Based on these analyses it can be verified if the simulated hydrological processes ...

  14. Experimental verification of computational model for wind turbine blade geometry design

    Directory of Open Access Journals (Sweden)

    Štorch Vít

    2015-01-01

    A 3D potential flow solver with an unsteady force-free wake model, intended for optimization of blade shape for wind power generation, is applied to a test case scenario formed by a wind turbine with a vertical axis of rotation. The calculation is sensitive to correct modelling of the wake and its interaction with the blades. The validity of the flow solver is verified by comparing experimentally obtained performance data of the model rotor with numerical results.

  15. Induction Heating of Carbon-Fiber Composites: Experimental Verification of Models

    National Research Council Canada - National Science Library

    Fink, Bruce

    2000-01-01

    .... The validity of the global thermal generation model is established through an experimental test matrix in which various specimen configurations are evaluated and compared with theoretical predictions...

  16. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    developed, all models constructed in each phase are verifiable. This requires that the modelling notations are formally defined and related in order to have tool support developed for the integration of sophisticated checkers, generators and transformations. This paper summarises our research on the method...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  17. Use of the Long Duration Exposure Facility's thermal measurement system for the verification of thermal models

    Science.gov (United States)

    Berrios, William M.

    1992-01-01

    The Long Duration Exposure Facility (LDEF) postflight thermal model predicted temperatures were matched to flight temperature data recorded by the Thermal Measurement System (THERM), LDEF experiment P0003. Flight temperatures, recorded at intervals of approximately 112 minutes for the first 390 days of LDEF's 2105-day mission, were compared with predictions using the thermal mathematical model (TMM). This model was unverified prior to flight. The postflight analysis has reduced the thermal model uncertainty at the temperature sensor locations from ±40°F to ±18°F. The improved temperature predictions will be used by the LDEF's principal investigators to calculate improved flight temperatures experienced by 57 experiments located on 86 trays of the facility.

  18. Initial Experimental Verification of the Neutron Beam Modeling for the LBNL BNCT Facility

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; McDonald, R.J.; Smith, A.R.; Stone, N.A.; Vuji, J.

    1999-01-01

    In preparation for future clinical BNCT trials, neutron production via the 7Li(p,n) reaction as well as subsequent moderation to produce epithermal neutrons have been studied. Proper design of a moderator and filter assembly is crucial in producing an optimal epithermal neutron spectrum for brain tumor treatments. Based on in-phantom figures-of-merit, desirable assemblies have been identified. Experiments were performed at the Lawrence Berkeley National Laboratory's 88-inch cyclotron to characterize epithermal neutron beams created using several microamperes of 2.5 MeV protons on a lithium target. The neutron moderating assembly consisted of Al/AlF3 and Teflon, with a lead reflector to produce an epithermal spectrum strongly peaked at 10-20 keV. The thermal neutron fluence was measured as a function of depth in a cubic lucite head phantom by neutron activation in gold foils. Portions of the neutron spectrum were measured by in-air activation of six cadmium-covered materials (Au, Mn, In, Cu, Co, W) with high epithermal neutron absorption resonances. The results are reasonably reproduced in Monte Carlo computational models, confirming their validity.

  19. Integrated Ray Tracing Model for End-to-end Performance Verification of Amon-Ra Instrument

    Science.gov (United States)

    Lee, Jae-Min; Park, Won Hyun; Ham, Sun-Jeong; Yi, Hyun-Su; Yoon, Jee Yeon; Kim, Sug-Whan; Choi, Ki-Hyuk; Kim, Zeen Chul; Lockwood, Mike

    2007-03-01

    The international EARTHSHINE mission is to measure 1% anomaly of the Earth global albedo and total solar irradiance using Amon-Ra instrument around Lagrange point 1. We developed a new ray tracing based integrated end-to-end simulation tool that overcomes the shortcomings of the existing end-to-end performance simulation techniques. We then studied the in-orbit radiometric performance of the breadboard Amon-Ra visible channel optical system. The TSI variation and the Earth albedo anomaly, reported elsewhere, were used as the key input variables in the simulation. The output flux at the instrument focal plane confirms that the integrated ray tracing based end-to-end science simulation delivers the correct level of incident power to the Amon-Ra instrument well within the required measurement error budget of better than ±0.28%. Using the global angular distribution model (ADM), the incident flux is then used to estimate the Earth global albedo and the TSI variation, confirming the validity of the primary science cases at the L1 halo orbit. These results imply that the integrated end-to-end ray tracing technique, reported here, can serve as an effective and powerful building block of the on-line science analysis tool in support of the international EARTHSHINE mission currently being developed.

  20. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based `Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/ MDP/ Mask/ silicon lithography flow. The important potential sources of variation we focus on here originate on the basis of VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood for costly long-loop iterations between OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or by judicious design of VSB shots and dose assignment.

  1. TU-FG-BRB-06: A Prompt Gamma-Ray Spectroscopy System for Clinical Studies of in Vivo Proton Range Verification

    International Nuclear Information System (INIS)

    Verburg, J; Bortfeld, T

    2016-01-01

    Purpose: We present a new system to perform prompt gamma-ray spectroscopy during proton pencil-beam scanning treatments, which enables in vivo verification of the proton range. This system will be used for the first clinical studies of this technology. Methods: After successful pre-clinical testing of prompt gamma-ray spectroscopy, a full scale system for clinical studies is now being assembled. Prompt gamma-rays will be detected during patient treatment using an array of 8 detector modules arranged behind a tungsten collimator. Each detector module consists of a lanthanum(III) bromide scintillator, a photomultiplier tube, and custom electronics for stable high voltage supply and signal amplification. A new real-time data acquisition and control system samples the signals from the detectors with analog-to-digital converters, analyses events of interest, and communicates with the beam delivery systems. The timing of the detected events was synchronized to the cyclotron radiofrequency and the pencil-beam delivery. Range verification is performed by matching measured energy- and time-resolved gamma-ray spectra to nuclear reaction models based on the clinical treatment plan. Experiments in phantoms were performed using clinical beams in order to assess the performance of the systems. Results: The experiments showed reliable real-time analysis of more than 10 million detector events per second. The individual detector modules acquired accurate energy- and time-resolved gamma-ray measurements at a rate of 1 million events per second, which is typical for beams delivered with a clinical dose rate. The data acquisition system successfully tracked the delivery of the scanned pencil-beams to determine the location of range deviations within the treatment field. Conclusion: A clinical system for proton range verification using prompt gamma-ray spectroscopy has been designed and is being prepared for use during patient treatments. We anticipate starting a first clinical study

  2. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as result a state transition system modelling

  3. Verification of back stress in a constitutive model for cyclic plasticity

    International Nuclear Information System (INIS)

    Ishikawa, H.

    1991-01-01

    Aiming at a formulation of a unified constitutive model of cyclic plasticity and creep, the concept of effective stress, defined as the stress measured from the current center of the yield surface, is employed to explain the intermittent creep period after cyclic prestraining. The experimental results show a remarkable regularity which may make it easy to construct the unified constitutive model. (author)

  4. Erythrocyte lysis in isotonic solution of ammonium chloride: Theoretical modelling and experimental verification

    NARCIS (Netherlands)

    Chernyshev, A.V.; Tarasov, P.A.; Semianov, K.A.; Nekrasov, V.M.; Hoekstra, A.G.; Maltsev, V.P.

    2008-01-01

    A mathematical model of erythrocyte lysis in an isotonic solution of ammonium chloride is presented within the framework of a statistical approach. The model is used to evaluate several parameters of mature erythrocytes (volume, surface area, hemoglobin concentration, number of anionic exchangers on membrane,

  5. Motion/posture modeling and simulation verification of physically handicapped in manufacturing system design

    Science.gov (United States)

    Fu, Yan; Li, Shiqi; Chen, Gwen-guo

    2013-03-01

    Non-obstacle design is critical to accommodate physically handicapped workers in a manufacturing system. Simultaneous consideration of variability in physically disabled users, machines and environment of the manufacturing system is extremely complex and generally requires modeling of the physically handicapped's interaction with the system. Most current modeling concentrates on either task results or functional disability. The integration of physical constraints with task constraints is far more complex because of functional disability and its extended influence on adjacent body parts. A framework is proposed to integrate the two constraints and thus model the specific behavior of the physically handicapped in a virtual environment generated by product specifications. Within the framework a simplified model of the physically disabled body is constructed, and body motion is generated based on 3 levels of constraints (effector constraints, kinematics constraints and physical constraints). The kinematics and dynamic calculations are made and optimized based on the weighting manipulated by the kinematics constraints and dynamic constraints. With an object-transferring task as example, the model is validated in Jack 6.0. Modelled task motion elements, except for squatting and overreaching, matched well with captured motion elements. The proposed modeling method can model the complex behavior of the physically handicapped by integrating both task and physical disability constraints.

  6. TARDEC FIXED HEEL POINT (FHP): DRIVER CAD ACCOMMODATION MODEL VERIFICATION REPORT

    Science.gov (United States)

    2017-11-09

    Distribution Statement A. Approved for public release; distribution is unlimited. ...the CAD model was compared to the outputs of the UMTRI Soldier Driver Accommodation (2017) model spreadsheet; and boundary manikin hip and eye

  7. Car-following model with relative-velocity effect and its experimental verification.

    Science.gov (United States)

    Shamoto, Daisuke; Tomoeda, Akiyasu; Nishi, Ryosuke; Nishinari, Katsuhiro

    2011-04-01

    In driving a vehicle, drivers respond to the changes of both the headway and the relative velocity to the vehicle in front. In this paper a new car-following model including these maneuvers is proposed. The acceleration of the model becomes infinite (has a singularity) when the distance between two vehicles is zero, and the asymmetry between the acceleration and the deceleration is incorporated in a nonlinear way. The model is simple but contains enough features of driving for reproducing real vehicle traffic. From the linear stability analysis, we confirm that the model shows the metastable homogeneous flow around the critical density, beyond which a traffic jam emerges. Moreover, we perform experiments to verify this model. From the data it is shown that the acceleration of a vehicle has a positive correlation with the relative velocity.
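    The qualitative features described above can be sketched in a few lines: an acceleration that depends on the relative velocity, diverges as the headway approaches zero, and brakes more strongly than it accelerates. The functional form and parameters below are illustrative assumptions, not the authors' exact equations.

```python
# Illustrative car-following sketch (hypothetical form, not the paper's model)
def acceleration(headway, dv, a=1.0, b=2.0):
    gain = b if dv < 0 else a   # dv = v_front - v_self; braking weighted more
    return gain * dv / headway  # singular as headway -> 0

def simulate(steps=5000, dt=0.01):
    """Explicit-Euler integration of a follower behind a constant-speed leader."""
    x_lead, v_lead = 20.0, 10.0  # leader position and speed
    x, v = 0.0, 12.0             # follower starts too fast
    for _ in range(steps):
        v += acceleration(x_lead - x, v_lead - v) * dt
        x += v * dt
        x_lead += v_lead * dt
    return x_lead - x, v         # final headway and follower speed

headway, v = simulate()
```

In this sketch the follower relaxes to the leader's speed while the singular term keeps the headway strictly positive, mirroring the collision-free behaviour the abstract attributes to the model.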

  8. Verification of some numerical models for operationally predicting mesoscale winds aloft

    International Nuclear Information System (INIS)

    Cornett, J.S.; Randerson, D.

    1977-01-01

    Four numerical models are described for predicting mesoscale winds aloft for a 6 h period. These models are all tested statistically against persistence as the control forecast and against predictions made by operational forecasters. Mesoscale winds aloft data were used to initialize the models and to verify the predictions on an hourly basis. The model yielding the smallest root-mean-square vector errors (RMSVE's) was the one based on the most physics which included advection, ageostrophic acceleration, vertical mixing and friction. Horizontal advection was found to be the most important term in reducing the RMSVE's followed by ageostrophic acceleration, vertical advection, surface friction and vertical mixing. From a comparison of the mean absolute errors based on up to 72 independent wind-profile predictions made by operational forecasters, by the most complete model, and by persistence, we conclude that the model is the best wind predictor in the free air. In the boundary layer, the results tend to favor the forecaster for direction predictions. The speed predictions showed no overall superiority in any of these three models
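    The verification statistic used to rank the models above, the root-mean-square vector error, can be computed as follows for winds given as (u, v) component pairs (the data here are made up for illustration):

```python
import math

def rmsve(pred, obs):
    """Root-mean-square vector error between predicted and observed
    wind vectors, each a sequence of (u, v) component pairs."""
    sq = [(pu - ou) ** 2 + (pv - ov) ** 2
          for (pu, pv), (ou, ov) in zip(pred, obs)]
    return math.sqrt(sum(sq) / len(sq))

err = rmsve([(3.0, 4.0)], [(0.0, 0.0)])  # single 3-4-5 error vector
```

Because the error is taken on the full vector difference, it penalises direction errors as well as speed errors, which is why it is a natural score for winds-aloft forecasts.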

  9. Implementation and verification of interface constitutive model in FLAC3D

    Directory of Open Access Journals (Sweden)

    Hai-min Wu

    2011-09-01

    Due to the complexity of soil-structure interaction, the simple constitutive models typically used for interface elements in general computer programs cannot satisfy the requirements of discontinuous deformation analysis of structures that contain different interfaces. In order to simulate the strain-softening characteristics of interfaces, a nonlinear strain-softening interface constitutive model was incorporated into Fast Lagrangian Analysis of Continua in Three Dimensions (FLAC3D) through a user-defined program in the FISH environment. A numerical simulation of a direct shear test for geosynthetic interfaces was conducted to verify that the interface model was implemented correctly. Results of the numerical tests show good agreement with the results obtained from theoretical calculations, indicating that the model incorporated into FLAC3D can simulate the nonlinear strain-softening behavior of interfaces involving geosynthetic materials. The results confirmed the validity and reliability of the improved interface model. The procedure and method of implementing an interface constitutive model into a commercial computer program also provide a reference for implementation of a new interface constitutive model in FLAC3D.

  10. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. The review of static analysis discusses the deductive method and model-checking methods, emphasising the pros and cons of each, and gives a classification of testing techniques for each method. The paper presents and analyses the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in code verified with static methods. Attention is paid to the identification of dependencies within the framework of abstract interpretation, together with an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analysed, along with the kinds of tools that can be applied to software when using dynamic analysis. Based on this work a conclusion is drawn describing the most relevant problems of the analysis techniques, methods of their solution and

  11. Helicopter stability and control modeling improvements and verification on two helicopters

    Science.gov (United States)

    Schrage, D. P.; Peters, D. A.; Prasad, J. V. R.; Stumpf, W. F.; He, Chengjian

    1988-01-01

    A linearized model of helicopter flight dynamics is developed which includes the flapping, lead-lag, and dynamic inflow degrees of freedom (DOF). The model is a combination of analytical terms and numerically determined stability derivatives, and is used to investigate the importance of the rotor DOF to stability and control modeling. The results show that the rotor DOF can have a significant impact on some of the natural modes in a linear model. The flap and dynamic inflow DOF show the greatest influence. Flapping exhibits strong coupling to the body, dynamic inflow, and to lead-lag to a lesser extent. Dynamic inflow tends to damp the high-frequency flapping modes, and reduces the damping on coupled body-flap motion. Dynamic inflow also couples to the flapping motion to produce complex roots. With body-flap and lag regressing modes as exceptions, the results show essentially similar behavior for most modes of articulated and hingeless rotor helicopters.

  12. Analysis of Changes in Auditory Nerve Signals Following Simulated Tinnitus for the Verification of Cochlear Model

    National Research Council Canada - National Science Library

    Yi, Y

    2001-01-01

    For an interpretation of the tinnitus phenomenon, the reticular lamina, which transmits energy in the cochlea, was assumed to be a mass, and components for stiffness and control were added to the model...

  13. The Bilevel Design Problem for Communication Networks on Trains: Model, Algorithm, and Verification

    Directory of Open Access Journals (Sweden)

    Yin Tian

    2014-01-01

    This paper proposes a novel method to solve the problem of train communication network design. Firstly, we put forward a general description of the problem. Then, taking advantage of bilevel programming theory, we created the cost-reliability-delay (CRD) model that consists of two parts: the physical topology part aims at obtaining the networks with the maximum reliability under constrained cost, while the logical topology part focuses on the communication paths yielding minimum delay based on the physical topology delivered from the upper level. We also suggest a method to solve the CRD model, which combines the genetic algorithm and the Floyd-Warshall algorithm. Finally, we use a practical example to verify the accuracy and the effectiveness of the CRD model and further apply the novel method on a train with six carriages.
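    The lower-level step of such a bilevel scheme can be sketched with the Floyd-Warshall algorithm, which computes all-pairs minimum communication delay over a given physical topology. The 4-node topology and link delays below are hypothetical, not taken from the paper.

```python
INF = float("inf")

def floyd_warshall(w):
    """All-pairs minimum delay; w[i][j] is the link delay, INF if no link."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# 4-node example: direct link 0-3 costs 10 ms, but route 0-1-2-3 costs 9 ms
w = [[0, 5, INF, 10],
     [5, 0, 3, INF],
     [INF, 3, 0, 1],
     [10, INF, 1, 0]]
d = floyd_warshall(w)
```

In the paper's scheme, an upper-level search (there, a genetic algorithm) would propose physical topologies, each scored in part by delays computed this way.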

  14. Nascap-2k Spacecraft-Plasma Environment Interactions Modeling: New Capabilities and Verification

    National Research Council Canada - National Science Library

    Davis, V. A; Mandell, M. J; Cooke, D. L; Ferguson, D. C

    2007-01-01

    .... Here we examine the accuracy and limitations of two new capabilities of Nascap-2k: modeling of plasma plumes such as generated by electric thrusters and enhanced PIC computational capabilities...

  15. Study of the flow field past dimpled aerodynamic surfaces: numerical simulation and experimental verification

    Science.gov (United States)

    Binci, L.; Clementi, G.; D’Alessandro, V.; Montelpare, S.; Ricci, R.

    2017-11-01

    This work presents a study of the flow field past a dimpled laminar airfoil. The fluid-dynamic behaviour of these elements has not yet been deeply studied in the scientific community. Computational Fluid Dynamics (CFD) is therefore used here to analyze the flow field induced by dimples on the NACA 64-014A laminar airfoil at Re = 1.75 × 10⁵ and α = 0°. Reynolds-Averaged Navier–Stokes (RANS) equations and Large-Eddy Simulations (LES) were compared with wind tunnel measurements in order to evaluate their effectiveness in modeling this kind of flow field. The LES equations were solved using a specifically developed OpenFOAM solver adopting an L-stable Singly Diagonally Implicit Runge–Kutta (SDIRK) technique with an iterated PISO-like procedure for handling pressure–velocity coupling within each RK stage. The dynamic Smagorinsky subgrid model was employed. The LES results provided good agreement with experimental data, while the RANS equations closed with the $k$–$\omega$–$\gamma$–$\mathrm{Re}_{\theta t}$ transition model overestimate the laminar separation bubble (LSB) extension of the dimpled and un-dimpled configurations. Moreover, through skin friction coefficient analysis, we found a different representation of the turbulent zone between the numerical models; with the RANS model the LSB seems to be divided into two different parts, while the LES model shows a global reduction of the LSB.

  16. Modeling and experimental verification of infusion speed of liquids in photonic crystal fibers

    DEFF Research Database (Denmark)

    Sørensen, Thorkild; Noordegraaf, Danny; Nielsen, Kristian

    A theoretical method for predicting infusion time of liquids in microcapillaries is formulated. Through a microscopical, a fluorescent, and, finally, through a reflectometric measurement method, the model is successfully verified in real photonic crystal fibers.

  17. Predictive Simulation of Material Failure Using Peridynamics-Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    system solver utilities • PyTrilinos - a Python interface providing Python wrappers for many Trilinos packages, and offering compatibility between...parallelization is achieved using Epetra data structures for distributed variables. Model force evaluations are coded in Python, making extensive use...analytical solutions. For comparison, equivalent models are created and analyzed in Abaqus 6.12 to verify simple cases.

  18. Experimental verification of mathematical model of the heat transfer in exhaust system

    OpenAIRE

    Petković Snežana; Pešić Radivoje; Lukić Jovanka

    2011-01-01

    A catalytic converter has maximal efficiency when it reaches its working temperature. In the cold-start phase the efficiency of the catalyst is low and exhaust emissions contain high levels of air pollutants. Optimizing the exhaust system to decrease the time needed to reach the catalyst working temperature reduces total vehicle emissions. The implementation of mathematical models in the development of exhaust systems decreases total costs and reduces development time. The mathematical model has to be...

  19. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    Science.gov (United States)

    2017-01-23

    including the effects of body size, vehicle layout, and Soldier protective equipment and gear. The boundaries defined include the required space and...90% of Soldier population, 85% male). The model can guide vehicle designers in creating an optimized work space for the occupant. The CAD...posture metrics observed in vehicles. The results of the validation testing may lead to the FHP accommodation model being adjusted to address

  20. Verification of Nine-phase PMSM Model in d-q Coordinates with Mutual Couplings

    OpenAIRE

    Kozovský, Matúš; Blaha, Petr; Václavek, Pavel

    2016-01-01

    Electric motors with more than three phases have many advantages compared to an ordinary three-phase motor. For this reason it is natural to pay attention to them and to work on advanced control methods. The development of control algorithms requires a model of the motor. This paper presents a modeling concept for a nine-phase permanent magnet synchronous motor (PMSM) in a three-times-three-phase arrangement fed by a nine-phase voltage source inverter (VSI). Magnetic interaction between...

  1. Design, analysis and verification of a knee joint oncological prosthesis finite element model.

    Science.gov (United States)

    Zach, Lukáš; Kunčická, Lenka; Růžička, Pavel; Kocich, Radim

    2014-11-01

    The aim of this paper was to design a finite element model for a hinged PROSPON oncological knee endoprosthesis and to verify the model by comparing its ankle flexion angle against knee-bending experimental data obtained previously. Visible Human Project CT scans were used to create a general lower extremity bones model and to compose a 3D CAD knee joint model to which muscles and ligaments were added. Into the assembly the designed finite element PROSPON prosthesis model was integrated and an analysis focused on the PEEK-OPTIMA hinge pin bushing stress state was carried out. To confirm the stress state analysis results, contact pressure was investigated. The analysis was performed in the knee-bending position within the 15.4-69.4° hip joint flexion range. The results showed that the maximum stress achieved during the analysis (46.6 MPa) did not exceed the yield strength of the material (90 MPa); the condition of plastic stability was therefore met. The stress state analysis results were confirmed by the distribution of contact pressure during knee-bending. The applicability of our designed finite element model for prediction of real implant behaviour was proven on the basis of good correlation of the analytical and experimental ankle flexion angle data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is a reactant and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those
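    The three-compartment structure can be sketched with simple first-order exchange between adjacent compartments. The rate constants and volumes below are illustrative assumptions, not V-HAB parameters; the point is that water is conserved while the compartments relax toward equilibrium.

```python
# Minimal three-compartment water exchange sketch (illustrative, not V-HAB):
# plasma <-> interstitial <-> intracellular, first-order exchange.
def step(plasma, interstitial, intracellular, dt=0.01,
         k_pi=0.1, k_ic=0.05):
    f1 = k_pi * (plasma - interstitial)        # plasma/interstitial flux
    f2 = k_ic * (interstitial - intracellular) # interstitial/intracellular flux
    return (plasma - f1 * dt,
            interstitial + (f1 - f2) * dt,
            intracellular + f2 * dt)

state = (5.0, 11.0, 26.0)   # litres; roughly 60% of a 70 kg body mass
for _ in range(10000):      # explicit Euler over 100 time units
    state = step(*state)
```

Because each flux leaves one compartment and enters another, total water is conserved to rounding error, which is the sanity check such a balance model must pass before being coupled to an ECLSS simulation.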

  3. Verification of a Monte Carlo model of the Missouri S&T reactor

    Science.gov (United States)

    Richardson, Brad Paul

The purpose of this research is to ensure that an MCNP model of the Missouri S&T reactor produces accurate results so that it may be used to predict the effects of some desired upgrades to the reactor. The desired upgrades are an increase in licensed power from 200 kW to 400 kW, and the installation of a secondary cooling system to prevent heating of the pool. This was done by comparing simulations performed using the model with experiments performed using the reactor. The experiments performed were: the approach-to-criticality method of predicting the critical control rod height, measurement of the axial flux profile, measurement of the moderator temperature coefficient of reactivity, and measurement of the void coefficient of reactivity. The results of these experiments and results from the simulation show that the model produces a similar axial flux profile, and that it models the void and temperature coefficients of reactivity well. The model does, however, over-predict the criticality of the core, such that it predicts a lower critical rod height and a keff greater than one when simulating conditions in which the reactor was at a stable power. It is assumed that this is because the model uses fuel compositions from when the fuel was new, while in reality the reactor has been operating with this fuel for nearly 20 years. It has therefore been concluded that the fuel composition should be updated by performing a burnup analysis, and that an accurate heat transfer and fluid flow analysis should be performed to better represent the temperature profile before the model is used to simulate the effects of the desired upgrades.
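The approach-to-criticality measurement mentioned above is commonly reduced to a 1/M (inverse multiplication) extrapolation; a minimal sketch with invented rod heights and count rates (not Missouri S&T data):

```python
import numpy as np

# Illustrative approach-to-criticality data (hypothetical, not reactor data)
rod_heights = np.array([0.0, 10.0, 20.0, 30.0, 35.0])         # cm withdrawn
count_rates = np.array([100.0, 140.0, 230.0, 520.0, 1100.0])  # detector counts/s

inv_m = count_rates[0] / count_rates  # 1/M, normalized to the initial state

# Extrapolate the last few points to 1/M = 0 to predict the critical height
slope, intercept = np.polyfit(rod_heights[-3:], inv_m[-3:], 1)
critical_height = -intercept / slope
```

As 1/M approaches zero the multiplication diverges, so the intercept estimates the critical control rod height.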

  4. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
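In M3 the mixing step operates on principal components, but the flavor of "mixing proportions first, reactions as the residual" can be sketched directly on concentrations with constrained least squares. All end-member and sample compositions below are invented for illustration.

```python
import numpy as np

# Rows: Cl, Na, HCO3 (mg/L); columns: reference waters (meteoric, brine, glacial)
end_members = np.array([[5.0, 6000.0, 1.0],
                        [10.0, 3000.0, 2.0],
                        [300.0, 10.0, 50.0]])
sample = np.array([1202.8, 605.6, 197.0])  # measured groundwater composition

# Least squares for mixing proportions; the sum-to-one constraint is
# enforced through a heavily weighted extra row
A = np.vstack([end_members, 1e4 * np.ones(3)])
b = np.append(sample, 1e4)
proportions, *_ = np.linalg.lstsq(A, b, rcond=None)

# Whatever mixing cannot explain is reported as a source/sink term (mg/L)
reaction_term = sample - end_members @ proportions
```

The proportions describe the sample as a blend of reference waters; the residual vector is the part attributed to reactions.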

  5. Future directions of nuclear verification

    International Nuclear Information System (INIS)

    Blix, H.

    1997-01-01

    Future directions of nuclear verification are discussed including the following topics: verification of non-proliferation commitments; practicalities of strengthening safeguards; new tasks in nuclear verification

  6. Experimental Verification of Modeled Thermal Distribution Produced by a Piston Source in Physiotherapy Ultrasound

    Directory of Open Access Journals (Sweden)

    M. I. Gutierrez

    2016-01-01

Objectives. To present a quantitative comparison of thermal patterns produced by the piston-in-a-baffle approach with those generated by a physiotherapy ultrasonic device and to show the dependency between thermal patterns and acoustic intensity distributions. Methods. The finite element (FE) method was used to model an ideal acoustic field and the resulting thermal pattern, which were compared with the experimental acoustic and temperature distributions produced by a real ultrasonic applicator. A thermal model using the measured acoustic profile as input is also presented for comparison. Temperature measurements were carried out with thermocouples inserted in a muscle phantom. The insertion place of the thermocouples was monitored with ultrasound imaging. Results. Modeled and measured thermal profiles were compared within the first 10 cm of depth. The ideal acoustic field did not adequately represent the measured field, yielding different temperature profiles (errors of 10% to 20%). The experimental field was concentrated near the transducer, producing a region with higher temperatures, while the modeled ideal temperature was linearly distributed along the depth. The error was reduced to 7% when the measured acoustic field was introduced as the input variable in the FE temperature modeling. Conclusions. Temperature distributions are strongly related to the acoustic field distributions.
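The FE thermal model itself is not reproduced here, but the underlying physics can be sketched with a 1D finite-difference analogue: an attenuated plane-wave intensity deposits heat that diffuses through a phantom. All material constants and the intensity profile are illustrative, not the paper's values.

```python
import numpy as np

nz, dz, dt = 100, 1e-3, 0.05        # 10 cm depth, 1 mm cells, 50 ms time step
k, rho, c = 0.6, 1000.0, 4000.0     # W/m/K, kg/m^3, J/kg/K (phantom-like)
alpha = 8.0                         # amplitude attenuation coefficient, Np/m
I0 = 2.0e4                          # surface intensity, W/m^2 (2 W/cm^2)

z = np.arange(nz) * dz
q = 2 * alpha * I0 * np.exp(-2 * alpha * z)  # absorbed power density, W/m^3

T = np.full(nz, 37.0)               # baseline temperature, deg C
kappa = k / (rho * c)               # thermal diffusivity
for _ in range(1200):               # 60 s of continuous insonation
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    # boundaries (transducer face and far end) are held at baseline
    T[1:-1] += dt * (kappa * lap + q[1:-1] / (rho * c))

peak_rise = T.max() - 37.0          # hottest point of the axial profile
```

As in the measurements, the heating concentrates where the intensity (and hence the volumetric source) is largest.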

  7. The Verification of Structural Decision-Making Model for Evaluating Education on Facebook

    Directory of Open Access Journals (Sweden)

    Kozel Roman

    2013-09-01

The aim of this paper is to present the work of the research team who constructed a model that explores students' general opinions about education on Facebook, as well as their opinions about education on the social page for the course E-marketing, using a structural equation model. Facebook is already present at universities, as students use it as a primary source of information about news in courses, duties, and so on. The research team carried out an experiment in the course E-marketing at the FE of VŠB – TUO, in which Facebook was used as a tool for communication between students and teachers. The research on students' attitudes towards education on Facebook was conducted by questioning using predefined variables. The first form of the model was designed by factor analysis with the Varimax method, in which six groups of factors that affect respondents' opinions about education were defined. A structural equation model was used to verify the validity of the model. According to the testing performed, four groups of factors mainly affect respondents' attitudes to this type of education: Engagement, Information and Modern Technologies, Lecturers and Scores, and Education on Facebook. The research team also determined the statistically most significant variables within these factors that most affect students' opinions about education.
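The Varimax rotation used in the factor-analysis step is a standard algorithm; a compact NumPy version is sketched below (the loading matrix is arbitrary, not the study's data).

```python
import numpy as np

def varimax(loadings, n_iter=100, tol=1e-6):
    """Orthogonally rotate a factor-loading matrix toward simple structure."""
    p, k = loadings.shape
    rotation = np.eye(k)
    last = 0.0
    for _ in range(n_iter):
        lam = loadings @ rotation
        # standard varimax criterion gradient, solved via SVD
        target = lam**3 - lam @ np.diag((lam**2).sum(axis=0)) / p
        u, s, vt = np.linalg.svd(loadings.T @ target)
        rotation = u @ vt
        if s.sum() < last * (1 + tol):
            break
        last = s.sum()
    return loadings @ rotation

raw = np.array([[0.8, 0.3], [0.7, 0.4], [0.2, 0.9], [0.1, 0.8]])
rotated = varimax(raw)
```

Since the rotation is orthogonal, it redistributes variance across factors without changing the total communalities.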

  8. Model-Based Interpretation and Experimental Verification of ECT Signals of Steam Generator Tubes

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young Hwan; Kim, Eui Lae; Yim, Chang Jae; Lee, Jin Ho

    2004-01-01

Model-based inversion tools for eddy current signals have been developed by combining neural networks and finite element modeling, for quantitative flaw characterization in steam generator tubes. In the present work, interpretation of experimental eddy current signals was carried out in order to validate the developed inversion tools. A database was constructed using the synthetic flaw signals generated by the finite element model. The hybrid neural networks composed of a PNN classifier and BPNN size estimators were trained using the synthetic signals. Experimental eddy current signals were obtained from axisymmetric artificial flaws. Interpretation of flaw signals was conducted by feeding the experimental signals into the neural networks. The interpretation was excellent, which shows that the developed inversion tools would be applicable to the interpretation of real eddy current signals

  9. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model are compared with experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. All data are predicted within the ±10% error band, with the mean average error below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental database obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)
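The ±error-band style of comparison used in both parts can be reproduced in a few lines; the heat transfer coefficients below are invented, not the Okkonen et al. or Liu and Theofanous data.

```python
import numpy as np

# Hypothetical measured vs. predicted heat transfer coefficients, W/m^2/K
measured = np.array([250.0, 310.0, 420.0, 500.0, 660.0])
predicted = np.array([240.0, 330.0, 400.0, 560.0, 610.0])

rel_err = (predicted - measured) / measured
within_30pct = np.mean(np.abs(rel_err) <= 0.30)   # fraction inside ±30% band
mean_abs_rel_err = np.mean(np.abs(rel_err))       # mean average relative error
```

Reporting both the band coverage and the mean relative error mirrors the two statistics quoted in the abstract.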

  10. Model and verification of electrokinetic flow and transport in a micro-electrophoresis device.

    Science.gov (United States)

    Barz, Dominik P J; Ehrhard, Peter

    2005-09-01

We investigate the electrokinetic flow and transport within a micro-electrophoresis device. A mathematical model is set up, which allows us to perform two-dimensional, time-dependent finite-element simulations. The model reflects the dominant features of the system, namely electroosmosis, electrophoresis, externally applied electrical potentials, and equilibrium chemistry. For the solution of the model equations we rely on numerical simulations of the core region, while the immediate wall region is treated analytically at leading order. This avoids extreme refinements of the numerical grid within the EDL. An asymptotic matching of both solutions and subsequent superposition, nevertheless, provides an approximation for the solution in the entire domain. The results of the simulations are verified against experimental observations and show good agreement.
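Leading-order wall treatments of this kind are typically equivalent to a Helmholtz-Smoluchowski slip condition; a one-line sketch with nominal water-like values (all parameters illustrative, not the paper's):

```python
eps0 = 8.854e-12   # vacuum permittivity, F/m
eps_r = 78.5       # relative permittivity of water
zeta = -0.05       # zeta potential of the channel wall, V (assumed)
mu = 1.0e-3        # dynamic viscosity, Pa s
E = 1.0e4          # applied axial field, V/m (100 V/cm)

# Helmholtz-Smoluchowski electroosmotic slip velocity at the wall
u_slip = -eps0 * eps_r * zeta * E / mu   # m/s
```

For these values the slip velocity comes out in the sub-mm/s range typical of electroosmotic microflows.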

  11. Experimental verification of a radiofrequency power model for Wi-Fi technology.

    Science.gov (United States)

    Fang, Minyu; Malone, David

    2010-04-01

    When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors: less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
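The duty-cycle argument can be made concrete with a back-of-envelope calculation; transmit power, per-packet airtime, and traffic level are hypothetical, and this ignores the protocol overheads that the paper's model accounts for.

```python
tx_power_mw = 100.0        # nominal transmit power, mW (20 dBm)
packet_airtime_s = 1.5e-3  # airtime per packet at a low PHY rate (assumed)
packets_per_s = 50.0       # offered traffic

# Fraction of time the radio is actually transmitting
duty_cycle = min(1.0, packets_per_s * packet_airtime_s)
avg_power_mw = tx_power_mw * duty_cycle   # time-averaged emitted power
```

Even a fairly busy link transmits only a small fraction of the time, which is why average emitted power is far below the nominal transmit power, and why saturation is the worst case.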

  12. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  13. A Modified Approach to Modeling of Diffusive Transformation Kinetics from Nonisothermal Data and Experimental Verification

    Science.gov (United States)

    Chen, Xiangjun; Xiao, Namin; Cai, Minghui; Li, Dianzhong; Li, Guangyao; Sun, Guangyong; Rolfe, Bernard F.

    2016-09-01

An inverse model is proposed to construct the mathematical relationship between continuous cooling transformation (CCT) kinetics with constant rates and the isothermal kinetics. The kinetic parameters in the JMAK equations of isothermal kinetics can be deduced from the experimental CCT kinetics. Furthermore, a generalized model with a new additivity rule is developed for predicting the kinetics of nucleation and growth during diffusional phase transformation with arbitrary cooling paths based only on the CCT curve. A generalized contribution coefficient is introduced into the new additivity rule to describe the influences of current temperature and cooling rate on the incubation time of nuclei. Finally, the reliability of the proposed model is validated using dilatometry experiments on a microalloyed steel with a fully bainitic microstructure under various cooling routes.
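For context, the classical additivity (Scheil) rule that the coefficient-modified rule above generalizes can be sketched as follows; the C-curve incubation function and the cooling rates are invented.

```python
def incubation_time(T):
    # Hypothetical isothermal C-curve with its nose near 600 deg C
    return 5.0 + 0.01 * (T - 600.0) ** 2        # seconds

def scheil_start(T0, cooling_rate, dt=0.01):
    """Return (time, started): transformation starts when the accumulated
    fractions of isothermal incubation time, sum(dt / tau(T)), reach 1."""
    t, s, T = 0.0, 0.0, T0
    while s < 1.0 and T > 200.0:
        s += dt / incubation_time(T)
        t += dt
        T = T0 - cooling_rate * t               # constant-rate cooling
    return t, s >= 1.0

t_slow, started_slow = scheil_start(900.0, 5.0)    # 5 deg C/s
t_fast, started_fast = scheil_start(900.0, 50.0)   # 50 deg C/s
```

Slow cooling accumulates enough incubation fractions to trigger the transformation near the C-curve nose; the fast path passes the nose too quickly and reaches the cutoff untransformed.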

  14. COUPLING OF CORONAL AND HELIOSPHERIC MAGNETOHYDRODYNAMIC MODELS: SOLUTION COMPARISONS AND VERIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Merkin, V. G. [The Johns Hopkins University Applied Physics Laboratory, Laurel, MD 20723 (United States); Lionello, R.; Linker, J.; Török, T.; Downs, C. [Predictive Science, Inc., San Diego, CA 92121 (United States); Lyon, J. G., E-mail: slava.merkin@jhuapl.edu [Department of Physics and Astronomy, Dartmouth College, Hanover, NH 03755 (United States)

    2016-11-01

Two well-established magnetohydrodynamic (MHD) codes are coupled to model the solar corona and the inner heliosphere. The corona is simulated using the MHD algorithm outside a sphere (MAS) model. The Lyon–Fedder–Mobarry (LFM) model is used in the heliosphere. The interface between the models is placed in a spherical shell above the critical point and allows both models to work in either a rotating or an inertial frame. Numerical tests are presented examining the coupled model solutions from 20 to 50 solar radii. The heliospheric simulations are run with both LFM and the MAS extension into the heliosphere, and use the same polytropic coronal MAS solutions as the inner boundary condition. The coronal simulations are performed for idealized magnetic configurations, with an out-of-equilibrium flux rope inserted into an axisymmetric background, with and without including the solar rotation. The temporal evolution at the inner boundary of the LFM and MAS solutions is shown to be nearly identical, as are the steady-state background solutions, prior to the insertion of the flux rope. However, after the coronal mass ejection has propagated through a significant portion of the simulation domain, the heliospheric solutions diverge. Additional simulations with different resolution are then performed and show that the MAS heliospheric solutions approach those of LFM when run with progressively higher resolution. Following these detailed tests, a more realistic simulation driven by the thermodynamic coronal MAS is presented, which includes solar rotation and an azimuthally asymmetric background and extends to the Earth’s orbit.

  15. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
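The Monte Carlo layer of such a methodology reduces, in toy form, to sampling event parameters and counting impacts on a target. Everything below (distributions, drag-free ballistics, the target strip) is invented and far simpler than TORMIS.

```python
import math
import random

random.seed(1)
n_trials = 100_000
target_near, target_far = 500.0, 520.0   # target strip along the range axis, m

hits = 0
for _ in range(n_trials):
    speed = random.uniform(30.0, 90.0)   # missile injection speed, m/s
    elev = random.uniform(0.5, 1.0)      # launch elevation angle, rad
    # drag-free ballistic range; real transport models are far more complex
    impact_range = speed**2 * math.sin(2 * elev) / 9.81
    if target_near <= impact_range <= target_far:
        hits += 1

p_hat = hits / n_trials                               # impact probability
std_err = math.sqrt(p_hat * (1 - p_hat) / n_trials)   # sampling uncertainty
```

The standard error shows why plant-specific assessments of rare events need large trial counts (or variance reduction).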

  16. Bounded Model Checking and Inductive Verification of Hybrid Discrete-Continuous Systems

    DEFF Research Database (Denmark)

    Becker, Bernd; Behle, Markus; Eisenbrand, Fritz

    2004-01-01

    verication, bounded plan- ning and heuristic search, combinatorial optimization and integer programming. Af- ter sketching the overall verication ow we present rst results indicating that the combination and tight integration of dierent verication engines is a rst step to pave the way to fully automated BMC......We present a concept to signicantly advance the state of the art for bounded model checking (BMC) and inductive verication (IV) of hybrid discrete-continuous systems. Our approach combines the expertise of partners coming from dierent domains, like hybrid systems modeling and digital circuit...

  17. Mathematical modeling and microbiological verification of ohmic heating of a multicomponent mixture of particles in a continuous flow ohmic heater system with electric field parallel to flow.

    Science.gov (United States)

    Kamonpatana, Pitiya; Mohamed, Hussein M H; Shynkaryk, Mykola; Heskitt, Brian; Yousef, Ahmed E; Sastry, Sudhir K

    2013-11-01

To accomplish continuous flow ohmic heating of a low-acid food product, sufficient heat treatment needs to be delivered to the slowest-heating particle at the outlet of the holding section. This research was aimed at developing mathematical models for sterilization of a multicomponent food in a pilot-scale ohmic heater with the electric field oriented parallel to the flow, and at validating microbial inactivation by inoculated-particle methods. The model involved 2 sets of simulations, one for determination of fluid temperatures, and a second for evaluating the worst-case scenario. A residence time distribution study was conducted using radio frequency identification methodology to determine the residence time of the fastest-moving particle from a sample of at least 300 particles. Thermal verification of the mathematical model showed good agreement between calculated and experimental fluid temperatures (P > 0.05) at the heater and holding tube exits, with a maximum error of 0.6 °C. To achieve a specified target lethal effect at the cold spot of the slowest-heating particle, the length of holding tube required was predicted to be 22 m for a 139.6 °C process temperature with a volumetric flow rate of 1.0 × 10⁻⁴ m³/s and a tube diameter of 0.05 m. To verify the model, a microbiological validation test was conducted using at least 299 chicken-alginate particles inoculated with Clostridium sporogenes spores per run. The inoculated pack study indicated the absence of viable microorganisms at the target treatment and their presence for a subtarget treatment, thereby verifying model predictions. © 2013 Institute of Food Technologists®
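The "target lethal effect" in process calculations like this is an F-value: the time integral of the lethal rate at the cold spot. A sketch with an invented cold-spot heating curve (the reference temperature, z-value, and trace are illustrative, not the paper's process data):

```python
import numpy as np

t = np.linspace(0.0, 120.0, 241)                  # s, time in holding section
T = 121.0 + 18.6 * (1.0 - np.exp(-t / 15.0))      # deg C, cold-spot temperature
T_ref, z = 121.1, 10.0                            # reference temperature, z-value

# Lethal rate relative to holding at T_ref; trapezoidal time integration
lethal_rate = 10.0 ** ((T - T_ref) / z)
F_value = np.sum(0.5 * (lethal_rate[1:] + lethal_rate[:-1]) * np.diff(t)) / 60.0
```

F_value is the process lethality in equivalent minutes at T_ref; the required holding-tube length then follows from the fastest particle's residence time at the design flow rate.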

  18. Verification of high-speed solar wind stream forecasts using operational solar wind models

    DEFF Research Database (Denmark)

    Reiss, Martin A.; Temmer, Manuela; Veronig, Astrid M.

    2016-01-01

    and the background solar wind conditions. We found that both solar wind models are capable of predicting the large-scale features of the observed solar wind speed (root-mean-square error, RMSE ≈100 km/s) but tend to either overestimate (ESWF) or underestimate (WSA) the number of high-speed solar wind streams (threat...
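The two verification measures named in the abstract are easy to state precisely; the observed and predicted speed series below are invented.

```python
import numpy as np

observed = np.array([380.0, 420.0, 650.0, 700.0, 510.0, 360.0, 620.0])   # km/s
predicted = np.array([400.0, 390.0, 560.0, 720.0, 540.0, 380.0, 500.0])  # km/s

rmse = np.sqrt(np.mean((predicted - observed) ** 2))   # root-mean-square error

threshold = 550.0   # km/s, assumed definition of a "high-speed stream" event
hits = np.sum((predicted >= threshold) & (observed >= threshold))
false_alarms = np.sum((predicted >= threshold) & (observed < threshold))
misses = np.sum((predicted < threshold) & (observed >= threshold))
threat_score = hits / (hits + false_alarms + misses)   # critical success index
```

RMSE measures the large-scale speed error, while the threat score counts event detections, which is why a model can score well on one and poorly on the other.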

  19. Measurement of Temperature and Soil Properties for Finite Element Model Verification

    Science.gov (United States)

    2012-08-01

In recent years, ADOT&PF personnel have used TEMP/W, a commercially available two-dimensional finite element program, to conduct thermal modeling of various embankment configurations in an effort to reduce the thawing of ice-rich permafrost through...

  20. Verification of Overall Safety Factors In Deterministic Design Of Model Tested Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    2001-01-01

The paper deals with concepts of safety implementation in design. An overall safety factor concept is evaluated on the basis of a reliability analysis of a model-tested rubble mound breakwater with a monolithic superstructure. Also discussed are design load identification and failure mode limit

  1. Tunable n-path notch filters for blocker suppression: modeling and verification

    NARCIS (Netherlands)

    Ghaffari, A.; Klumperink, Eric A.M.; Nauta, Bram

    2013-01-01

    N-path switched-RC circuits can realize filters with very high linearity and compression point while they are tunable by a clock frequency. In this paper, both differential and single-ended N-path notch filters are modeled and analyzed. Closed-form equations provide design equations for the main

  2. Verification and improvement of analytical modeling of seismic isolation bearings and isolated structures

    International Nuclear Information System (INIS)

    Forni, M.; La Grotteria, M.; Martelli, A.; Bertola, S.; Bettinali, F.; Dusi, A.; Bergamo, G.; Bonacina, G.

    2002-01-01

Due to the complexity of the dynamic behaviour of seismic isolation (SI) devices, the high cost of their tests, and the non-negligible number of devices with excellent potential for nuclear applications, several countries judged it to be of great interest to extend validation of their numerical models of such devices to the analysis of experimental data obtained by others. Thus, a four-year Coordinated Research Program (CRP) on Intercomparison of Analysis Methods for Isolated Nuclear Structures, proposed by ENEA (1995), was endorsed by the IAEA in 1995. In the CRP, Italy was jointly represented by ENEA, ENEL and ISMES, and supplied test results concerning both High Damping Rubber Bearings (HDRBs) and the MISS (Model of Isolated Steel Structure) mock-up, which had been isolated using such bearings. Test data provided by Italy to the other countries were also re-analysed to improve mathematical models. The aim of this final report is to summarise, after a brief description of the devices and structures considered, the most important results and conclusions of the numerical analyses carried out by Italy. For more detailed information, especially as far as the execution of the tests and the implementation of the numerical models are concerned, please refer to the technical reports presented by Italy to the Research Coordination Meetings (RCMs). (author)

  3. Contaminant transport in groundwater in the presence of colloids and bacteria: model development and verification.

    Science.gov (United States)

    Bekhit, Hesham M; El-Kordy, Mohamed A; Hassan, Ahmed E

    2009-09-01

Colloids and bacteria (microorganisms) naturally exist in groundwater aquifers and can significantly impact contaminant migration rates. A conceptual model is first developed to account for the different physiochemical and biological processes, reaction kinetics, and different transport mechanisms of the combined system (contaminant-colloids-bacteria). All three constituents are assumed to be reactive, with the reactions taking place between each constituent and the porous medium and also among the different constituents. A general linear kinetic reaction model is assumed for all reactive processes considered. The mathematical model is represented by fourteen coupled partial differential equations describing mass balance and reaction processes. Two of these equations describe colloid movement and reactions with the porous medium, four equations describe bacterial movement and reactions with colloids and the porous medium, and the remaining eight equations describe contaminant movement and its reactions with bacteria, colloids, and the porous medium. The mass balance equations are numerically solved for two-dimensional groundwater systems using a third-order, total variation diminishing (TVD) scheme for the advection terms. Due to the complex coupling of the equations, they are solved iteratively at each time step until a convergence criterion is met. The model is tested against experimental data and the results are favorable.
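A drastically reduced 1D analogue shows the numerical ingredients (advection differencing plus linear kinetic exchange with an immobile phase). The scheme here is first-order upwind rather than the paper's third-order TVD scheme, and all parameters are invented.

```python
import numpy as np

nx, dx, dt = 200, 0.5, 0.1            # cells, cell size (m), time step (days)
v = 1.0                               # pore velocity, m/day (CFL = v*dt/dx = 0.2)
k_att, k_det = 0.05, 0.01             # attachment/detachment rates, 1/day

c = np.zeros(nx)                      # aqueous contaminant concentration
s = np.zeros(nx)                      # attached (colloid/bacteria-bound) phase
c[:10] = 1.0                          # initial contaminant pulse

for _ in range(500):                  # 50 days
    adv = np.empty(nx)
    adv[0] = -v * c[0] / dx           # clean water enters at the upstream end
    adv[1:] = -v * (c[1:] - c[:-1]) / dx      # first-order upwind differencing
    exch = k_att * c - k_det * s              # linear kinetic exchange
    c += dt * (adv - exch)
    s += dt * exch

total_mass = c.sum() + s.sum()        # conserved until the front exits the grid
```

The exchange terms only transfer mass between phases, so total mass is conserved while the aqueous peak migrates downstream, retarded by attachment.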

  4. Climate studies from satellite observations - Special problems in the verification of earth radiation balance, cloud climatology, and related climate experiments

    Science.gov (United States)

    Vonder Haar, T. H.

    1982-01-01

    A body of techniques that have been developed and planned for use during the Earth Radiation Budget Experiment (ERBE), the International Satellite Cloud Climatology Project (ISCCP), and related climate experiments of the 1980's are reviewed. Validation and verification methods must apply for systems of satellites. They include: (1) use of a normalization or intercalibration satellite, (2) special intensive observation areas located over ground-truth sites, and (3) monitoring of sun and earth by several satellites and/or several instruments at the same time. Since each climate application area has a hierarchy of user communities, validation techniques vary from very detailed methods to those that simply assure high relative accuracy in detecting space and time variations for climate studies. It is shown that climate experiments generally require more emphasis on long-term stability and internal consistency of satellite data sets than high absolute accuracy.

  5. Integrated Medical Model (IMM) 4.0 Verification and Validation (VV) Testing (HRP IWS 2016)

    Science.gov (United States)

    Walton, M; Kerstman, E.; Arellano, J.; Boley, L.; Reyes, D.; Young, M.; Garcia, Y.; Saile, L.; Myers, J.

    2016-01-01

Timeline, partial treatment, and alternate medication capabilities were added to the IMM to improve the fidelity of the model and enhance its decision support capabilities. Using standard design reference missions, IMM VV testing compared outputs from the current operational IMM (v3) with those from the model with the added functionalities (v4). These new capabilities were examined in a comparative, stepwise approach as follows: a) comparison of the current operational IMM v3 with the enhanced functionality of timeline alone (IMM 4.T), b) comparison of IMM 4.T with timeline and partial treatment (IMM 4.TPT), and c) comparison of IMM 4.TPT with timeline, partial treatment, and alternative medication (IMM 4.0).

  6. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    OpenAIRE

    Shrirang Ambaji KULKARNI; Raghavendra G. RAO

    2017-01-01

Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and variable link costs due to congestion and bandwidth limitations. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply the model-based Q-Routing technique ...
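The core Q-routing update (in the style of Boyan and Littman) can be sketched on a toy topology; the network, link delays, and exploration policy below are invented for illustration.

```python
import random

random.seed(0)
# Toy network: nodes 0..3, packets destined for node 3; delays are invented
neighbors = {0: [1, 2], 1: [3], 2: [3]}
delay = {(0, 1): 1.0, (0, 2): 3.0, (1, 3): 1.0, (2, 3): 1.0}
Q = {link: 10.0 for link in delay}   # estimated delivery time via each link
eta = 0.5                            # learning rate

def best(node):
    """A node's own best estimate of remaining delivery time."""
    return 0.0 if node == 3 else min(Q[(node, y)] for y in neighbors[node])

for _ in range(200):
    x = random.choice([0, 1, 2])         # node currently holding a packet
    y = random.choice(neighbors[x])      # explore links uniformly while learning
    # Q-routing update: neighbor reports its best remaining-time estimate
    Q[(x, y)] += eta * (delay[(x, y)] + best(y) - Q[(x, y)])

best_next_hop = min(neighbors[0], key=lambda n: Q[(0, n)])
```

The estimates converge to the true shortest delivery times, so greedy forwarding at node 0 learns to prefer the cheaper path through node 1; under topology or load changes the same update re-adapts the estimates.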

  7. Modelling in Pinnacle for extended source-patient distance and verification with the EBT2 film technique

    International Nuclear Information System (INIS)

    Perucha Ortega, M.; Luis simon, J.; Rodriguez Alarcon, C.; Baeza Trujillo, M.; Sanchez Carmona, G.; Vicente Granado, D.; Gutierrez Ramos, S.; Herrador Cordoba, M.

    2013-01-01

The objective of this work is to model in the Pinnacle planning system the geometry used in our Center for the total body irradiation technique, which consists of irradiating the patient, whose midline is 366 cm from the source, in the lateral decubitus position, with two anteroposterior 40 x 40 cm2 fields, the collimator rotated 45°, and a 1 cm thick methacrylate screen interposed 29 cm in front of the midline. (Author)
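At such an extended distance the dominant change in machine output follows the inverse square law; a quick sketch with illustrative numbers (the calibration distance and dose rate are assumptions, not values from the paper):

```python
d_cal = 100.0            # cm, assumed calibration source-axis distance
d_treat = 366.0          # cm, source-to-midline distance in this setup
dose_rate_cal = 600.0    # cGy/min at calibration distance (assumed)

# Inverse square scaling of the dose rate at the treatment distance
dose_rate_treat = dose_rate_cal * (d_cal / d_treat) ** 2
```

Attenuation by the methacrylate screen and scatter conditions modify this further, which is what the planning-system model and film measurements account for.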

  8. Improvement, Verification, and Refinement of Spatially Explicit Exposure Models in Risk Assessment - SEEM Demonstration

    Science.gov (United States)

    2015-06-01

Only fragments of the report text are indexed: an abbreviation list (UCL, upper confidence limit; USACHPPM, U.S. Army Center for Health Promotion and Preventive Medicine; USEPA, U.S. Environmental Protection Agency), a median soil-to-earthworm BAF of 3.342 taken from Table 4.2 of USACHPPM (2004), and a cited study of the renal toxicity of mercury to small mammals at a contaminated terrestrial field site (Ecotoxicology 2:243-256).

  9. Improvement, Verification, and Refinement of Spatially-Explicit Exposure Models in Risk Assessment - SEEM

    Science.gov (United States)

    2015-06-01

Only fragments of the report text are indexed: an abbreviation list (UCL, upper confidence limit; USACHPPM, U.S. Army Center for Health Promotion and Preventive Medicine; USEPA, U.S. Environmental Protection Agency), a median soil-to-earthworm BAF of 3.342 taken from Table 4.2 of USACHPPM (2004), and a cited study of the renal toxicity of mercury to small mammals at a contaminated terrestrial field site (Ecotoxicology 2:243-256).

  10. Construction and verification of a model of passenger response to STOL aircraft characteristics

    Science.gov (United States)

    Jacobson, I. D.

    1976-01-01

    A technique for evaluating passenger acceptance of a transportation system's environment has been developed. This includes a model of passenger reaction to the vehicle, as well as the relative satisfaction compared to other system attributes. The technique is applied to two commercial airline operations - a U.S. commuter, and the Canadian Airtransit STOL system. It is demonstrated that system convenience and aircraft interior seating can play a large role in satisfying the passenger.

  11. SKA aperture array verification system: electromagnetic modeling and beam pattern measurements using a micro UAV

    Science.gov (United States)

    de Lera Acedo, E.; Bolli, P.; Paonessa, F.; Virone, G.; Colin-Beltran, E.; Razavi-Ghods, N.; Aicardi, I.; Lingua, A.; Maschio, P.; Monari, J.; Naldi, G.; Piras, M.; Pupillo, G.

    2017-12-01

In this paper we present the electromagnetic modeling and beam pattern measurements of a 16-element ultra-wideband sparse random test array for the low-frequency instrument of the Square Kilometre Array telescope. We discuss the importance of a small array test platform for the development of technologies and techniques towards the final telescope, highlighting the most relevant aspects of its design. We also describe the electromagnetic simulations and modeling work, as well as the embedded-element and array pattern measurements using an unmanned aerial vehicle (UAV) system. The latter are helpful both for the validation of the models and the design, and for the future instrumental calibration of the telescope, thanks to the stable, accurate and strong radio frequency signal transmitted by the UAV. At this stage of the design, these measurements have shown a general agreement between experimental results and numerical data and have revealed the localized effect of uncalibrated cable lengths on the inner side-lobes of the array pattern.
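The array-pattern idea underlying such measurements can be sketched as the far-field array factor of a sparse random layout of isotropic elements; the element positions and frequency below are randomly generated placeholders, not the actual test-array layout.

```python
import numpy as np

rng = np.random.default_rng(42)
positions = rng.uniform(-3.0, 3.0, size=(16, 2))   # element x, y positions, m
freq = 100e6                                        # 100 MHz, in the SKA-low band
k = 2 * np.pi * freq / 3e8                          # wavenumber, rad/m

# Scan in one plane: phase of each element vs. look angle theta (uses x only)
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
phase = np.outer(np.sin(theta), positions[:, 0]) * k
af = np.abs(np.exp(1j * phase).sum(axis=1)) / 16    # normalized array factor
```

With random sparse positions the pattern has a clean main lobe at boresight but elevated, irregular side-lobes, which is exactly where un-calibrated cable lengths leave their localized signature.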

  12. Noncontact measurement of electrostatic fields: Verification of modeled potentials within ion mobility spectrometer drift tube designs

    International Nuclear Information System (INIS)

    Scott, Jill R.; Tremblay, Paul L.

    2007-01-01

    The heart of an ion mobility spectrometer is the drift region where ion separation occurs. While the electrostatic potentials within a drift tube design can be modeled, no method for independently validating the electrostatic field has previously been reported. Two basic drift tube designs were modeled using SIMION 7.0 to reveal the expected electrostatic fields: (1) A traditional alternating set of electrodes and insulators and (2) a truly linear drift tube. One version of the alternating electrode/insulator drift tube and two versions of linear drift tubes were then fabricated. The stacked alternating electrodes/insulators were connected through a resistor network to generate the electrostatic gradient in the drift tube. The two linear drift tube designs consisted of two types of resistive drift tubes with one tube consisting of a resistive coating within an insulating tube and the other tube composed of resistive ferrites. The electrostatic fields within each type of drift tube were then evaluated by a noncontact method using a Kelvin-Zisman type electrostatic voltmeter and probe (results for alternative measurement methods provided in supplementary material). The experimental results were then compared with the electrostatic fields predicted by SIMION. Both the modeling and experimental measurements reveal that the electrostatic fields within a stacked ion mobility spectrometer drift tube are only pseudo-linear, while the electrostatic fields within a resistive drift tube approach perfect linearity
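
    The resistor network feeding the stacked electrodes is electrically a plain voltage divider, which is why the electrode potentials themselves are exactly linear even though the field between rings is only pseudo-linear. A minimal sketch (electrode count, drift voltage, and resistor value are illustrative, not the fabricated tubes):

    ```python
    # Electrode potentials of a stacked-ring drift tube fed through a chain
    # of identical divider resistors (illustrative values).
    n_electrodes = 10
    v_top, v_bottom = 2000.0, 0.0     # drift voltage across the tube (V)
    r = 1.0e6                         # each divider resistor (ohms)

    current = (v_top - v_bottom) / (r * (n_electrodes - 1))
    potentials = [v_top - current * r * i for i in range(n_electrodes)]

    # The divider enforces equal voltage steps at the electrodes;
    # deviations from linearity arise only in the gaps between rings.
    print(potentials)
    ```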

  13. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both aspects of hardware and software; however there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are being suggested every year by industry and academic researchers which claim that they can improve accuracy of measurements [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way for evaluation of these newly developed systems and algorithms is through hardware implementation which can be both expensive and time consuming. This paper presents an accurate Simulink based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as productivity enhancement tool during their research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.
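
    As an example of the signal-processing core such a behavioural model must reproduce, below is the classic ratio-of-ratios SpO2 estimate; the linear calibration SpO2 = 110 - 25R and the synthetic PPG traces are illustrative textbook assumptions, not the authors' Simulink model:

    ```python
    import numpy as np

    def estimate_spo2(red, ir):
        """Ratio-of-ratios SpO2 estimate from red/infrared PPG traces.
        Uses an illustrative linear calibration (real devices fit
        empirical curves against co-oximetry data)."""
        def ac_dc(x):
            return np.ptp(x), np.mean(x)   # peak-to-peak AC, mean DC
        ac_r, dc_r = ac_dc(red)
        ac_i, dc_i = ac_dc(ir)
        r = (ac_r / dc_r) / (ac_i / dc_i)
        return 110.0 - 25.0 * r

    # Synthetic 1 Hz pulsatile signals sampled at 100 Hz
    t = np.arange(0.0, 5.0, 0.01)
    red = 1.0 + 0.02 * np.sin(2 * np.pi * t)
    ir = 1.0 + 0.04 * np.sin(2 * np.pi * t)
    print(round(estimate_spo2(red, ir), 1))   # R = 0.5 -> 97.5
    ```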

  15. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    International Nuclear Information System (INIS)

    Kawai, D; Takahashi, R; Kamima, T; Baba, H; Yamamoto, T; Kubo, Y; Ishibashi, S; Higuchi, Y; Takahashi, H; Tachibana, H

    2015-01-01

    Purpose: The accuracy of dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans shown in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average ± 2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error of 2.9±3.2% than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC) and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs from the CLs were evaluated. A Clarkson-based system shows a large systematic variation because of inhomogeneity correction. The AAA showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.
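
    The confidence limit used above is simply the mean ± 2·SD of per-beam dose deviations between the primary TPS and the secondary check (AAPM TG-114); a minimal sketch with illustrative deviation values, not the study data:

    ```python
    import numpy as np

    def confidence_limit(deviations_pct):
        """TG-114-style confidence limit: (mean, 2*SD) of per-beam dose
        deviations (%) between the TPS and the independent check."""
        dev = np.asarray(deviations_pct, dtype=float)
        return dev.mean(), 2.0 * dev.std(ddof=1)

    mean, band = confidence_limit([4.1, 6.3, 5.2, 7.0, 4.9])  # illustrative
    print(f"CL = {mean:.1f} ± {band:.1f} %")
    ```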

  16. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle Ida Antoinette; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that

  17. Verification and Validation of Encapsulation Flow Models in GOMA, Version 1.1; TOPICAL

    International Nuclear Information System (INIS)

    MONDY, LISA ANN; RAO, REKHA R.; SCHUNK, P. RANDALL; SACKINGER, PHILIP A.; ADOLF, DOUGLAS B.

    2001-01-01

    Encapsulation is a common process used in manufacturing most non-nuclear components including: firing sets, neutron generators, trajectory sensing signal generators (TSSGs), arming, fusing and firing devices (AF&Fs), radars, programmers, connectors, and batteries. Encapsulation is used to contain high voltage, to mitigate stress and vibration and to protect against moisture. The purpose of the ASCI Encapsulation project is to develop a simulation capability that will aid in the encapsulation design process, especially for neutron generators. The introduction of an encapsulant poses many problems because of the need to balance ease of processing with the properties necessary to achieve the design benefits, such as tailored encapsulant properties, an optimized cure schedule and reduced failure rates. Encapsulants can fail through fracture or delamination as a result of cure shrinkage, thermally induced residual stresses, voids or incomplete component embedding, and particle gradients. Manufacturing design requirements include (1) maintaining uniform composition of particles in order to maintain the desired coefficient of thermal expansion (CTE) and density, (2) mitigating void formation during mold fill, (3) mitigating cure and thermally induced stresses during cure and cool down, and (4) eliminating delamination and fracture due to cure shrinkage/thermal strains. The first two require modeling of the fluid phase, and it is proposed to use the finite element code GOMA to accomplish this. The latter two require modeling of the solid state; however, ideally the effects of particle distribution would be included in the calculations, and thus initial conditions would be set from GOMA predictions. These models, once they are verified and validated, will be transitioned into the SIERRA framework and the ARIA code. This will facilitate exchange of data with the solid mechanics calculations in SIERRA/ADAGIO.

  18. Modelling and Experimental Verification of a DEAP based 2-D rotational positioner

    DEFF Research Database (Denmark)

    Iskandarani, Yosef; Bilberg, Arne; Sarban, Rahimullah

    limited to 3000 V. This work will examine the ability of positioning a shaft coupled to a laser beam pointer in x-y direction which will provide insight into (a) the practicality of using the material for two dimensional rotational positioning and (b) to highlight feasible positioning applications. A test...... of the active actuators with 0.5 kV steps. For each, of the testing combinations the angle target was determined and results compared with the model of the two dimensional positioner. The feedback of positioner was not addressed in this work; through the relevant control theory which will be implemented...

  19. Flow aerodynamics modeling of an MHD swirl combustor - calculations and experimental verification

    International Nuclear Information System (INIS)

    Gupta, A.K.; Beer, J.M.; Louis, J.F.; Busnaina, A.A.; Lilley, D.G.

    1981-01-01

    This paper describes a computer code for calculating the flow dynamics of constant-density flow in the second-stage trumpet-shaped nozzle section of a two-stage MHD swirl combustor for application to a disk generator. The primitive pressure-velocity variable, finite difference computer code has been developed to allow the computation of inert, nonreacting, turbulent swirling flows in an axisymmetric MHD model swirl combustor. The method and program involve a staggered grid system for axial and radial velocities, and a line relaxation technique for efficient solution of the equations. The code produces as output the flow field map of the non-dimensional stream function and the axial and swirl velocities. 19 refs
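
    The line relaxation mentioned above solves one grid line implicitly at a time, which reduces to a tridiagonal (Thomas-algorithm) sweep per line; a self-contained sketch of that kernel, checked on a 1-D Poisson problem with a known solution:

    ```python
    def thomas(a, b, c, d):
        """Solve a tridiagonal system: a is the sub-diagonal (a[0] unused),
        b the main diagonal, c the super-diagonal (c[-1] unused), d the
        right-hand side. This single-line solve is the kernel applied to
        each grid line during line relaxation."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Check: -u'' = 2 with u(0) = u(1) = 0 has u = x(1 - x); the standard
    # second-order discretization reproduces it exactly at the nodes.
    n, h = 5, 1.0 / 6
    sol = thomas([0.0] + [1.0] * (n - 1), [-2.0] * n,
                 [1.0] * (n - 1) + [0.0], [-2.0 * h * h] * n)
    print([round(v, 4) for v in sol])   # symmetric, max 0.25 at midpoint
    ```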

  20. EXPERIMENTAL VERIFICATION OF COMPUTER MODEL OF COOLING SYSTEM FOR POWERFUL SEMI-CONDUCTOR DEVICE

    Directory of Open Access Journals (Sweden)

    I. A. Khorunzhii

    2007-01-01

    Full Text Available A cooling system for a powerful semiconductor device (power of about 1 kW), consisting of a pin-type radiator and a body, is considered in the paper. Cooling is carried out by forced convection of a coolant. Calculated temperatures on the radiator surface have been compared with experimentally measured temperatures at the same surface points. It has been shown that the difference between calculated and experimentally measured temperatures does not exceed 0.1-0.2 °C, which is comparable with the experimental error. The given results confirm the correctness of the computer model.

  1. Modeling and Experimental Verification of an Electromagnetic and Piezoelectric Hybrid Energy Harvester

    Directory of Open Access Journals (Sweden)

    Fan Yuanyuan

    2016-11-01

    Full Text Available This paper describes mathematical models of an electromagnetic and piezoelectric hybrid energy harvesting system and provides an analysis of the relationship between the resonance frequency and the configuration parameters of the system. An electromagnetic and piezoelectric energy harvesting device was designed and the experimental results showed good agreement with the analytical results. The maximum load power of the hybrid energy harvesting system achieved 4.25 mW at a resonant frequency of 18 Hz when the acceleration was 0.7 g, which is an increase of 15% compared with the 3.62 mW achieved by a single electromagnetic technique.
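
    In the simplest lumped spring-mass view of such a harvester, the resonant frequency follows f_n = (1/2π)·√(k/m); the stiffness and proof-mass values below are back-calculated illustrations consistent with the reported 18 Hz, not the paper's parameters:

    ```python
    import math

    def resonant_frequency(k_eff, m):
        """Natural frequency (Hz) of a lumped spring-mass harvester model:
        f_n = (1 / 2*pi) * sqrt(k / m)."""
        return math.sqrt(k_eff / m) / (2.0 * math.pi)

    # Illustrative: ~12.8 g proof mass with ~163.7 N/m effective stiffness
    print(round(resonant_frequency(163.7, 0.0128), 1))   # ~18 Hz
    ```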

  2. Model Development and Verification of the CRIPTE Code for Electromagnetic Coupling

    Science.gov (United States)

    2005-10-01

    Only fragmented excerpts of this report's abstract are available. The recoverable topics are: 2D and 3D FDTD codes implemented in the C language and run on a Pentium XEON-processor Linux server; identification of a washer through horn-antenna pattern discrimination (Fig. IIF-4); a GYROTRON radiation facility; commercial simulation tools (Silvaco, CST) together with in-house FDTD simulations for comparison; and EMP interaction studies (topological).

  3. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information.

  4. Application of Steinberg vibration fatigue model for structural verification of space instruments

    Science.gov (United States)

    García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo

    2018-01-01

    Electronic components in spaceships are subjected to vibration loads during the ascent phase of the launcher. It is important to verify by tests and analysis that all parts can survive in the most severe load cases. The purpose of this paper is to present the methodology and results of the application of the Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on the use of the FEM of the entire instrument to calculate the relative displacement RDSD and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with the Steinberg's fatigue damage equation and the Miner's cumulative fatigue index. The estimations are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and the functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.
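
    Steinberg's method pairs a three-band approximation of Gaussian random vibration (cycles at 1σ, 2σ and 3σ amplitude for 68.3%, 27.1% and 4.33% of the time) with Miner's cumulative damage index; the natural frequency, test duration and S-N cycle counts below are illustrative assumptions, not the EPT-HET data:

    ```python
    def miner_index(f_n_hz, duration_s, cycles_to_failure):
        """Miner's cumulative fatigue index under Steinberg's three-band
        approximation; cycles_to_failure maps sigma level -> N_i from the
        component's S-N curve. R < 1 suggests the part survives."""
        time_fraction = {1: 0.683, 2: 0.271, 3: 0.0433}
        total_cycles = f_n_hz * duration_s
        return sum(time_fraction[s] * total_cycles / cycles_to_failure[s]
                   for s in (1, 2, 3))

    # 200 Hz PCB mode, 120 s of random vibration, assumed S-N data
    r = miner_index(200.0, 120.0, {1: 1e9, 2: 1e7, 3: 1e6})
    print(f"Miner index R = {r:.4f}")
    ```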

  5. APPROACH FOR THE SEMI-AUTOMATIC VERIFICATION OF 3D BUILDING MODELS

    Directory of Open Access Journals (Sweden)

    P. Helmholz

    2013-04-01

    Full Text Available In the field of spatial sciences, there are a large number of disciplines and techniques for capturing data to solve a variety of different tasks and problems for different applications. Examples include: traditional survey for boundary definitions, aerial imagery for building models, and laser scanning for heritage facades. These techniques have different attributes such as the number of dimensions, accuracy and precision, and the format of the data. However, because of the number of applications and jobs, often over time these data sets captured from different sensor platforms and for different purposes will overlap in some way. In most cases, while this data is archived, it is not used in future applications to value add to the data capture campaign of current projects. It is also the case that newly acquire data are often not used to combine and improve existing models and data integrity. The purpose of this paper is to discuss a methodology and infrastructure to automatically support this concept. That is, based on a job specification, to automatically query existing and newly acquired data based on temporal and spatial relations, and to automatically combine and generate the best solution. To this end, there are three main challenges to examine; change detection, thematic accuracy and data matching.

  6. Verification of high voltage rf capacitive sheath models with particle-in-cell simulations

    Science.gov (United States)

    Wang, Ying; Lieberman, Michael; Verboncoeur, John

    2009-10-01

    Collisionless and collisional high voltage rf capacitive sheath models were developed in the late 1980's [1]. Given the external parameters of a single-frequency capacitively coupled discharge, plasma parameters including sheath width, electron and ion temperature, plasma density, power, and ion bombarding energy can be estimated. One-dimensional electrostatic PIC codes XPDP1 [2] and OOPD1 [3] are used to investigate plasma behaviors within rf sheaths and bulk plasma. Electron-neutral collisions only are considered for collisionless sheaths, while ion-neutral collisions are taken into account for collisional sheaths. The collisionless sheath model is verified very well by PIC simulations for the rf current-driven and voltage-driven cases. Results will be reported for collisional sheaths also. [1] M. A. Lieberman, IEEE Trans. Plasma Sci. 16 (1988) 638; 17 (1989) 338 [2] J. P. Verboncoeur, M. V. Alves, V. Vahedi, and C. K. Birdsall, J. Comp. Phys. 104 (1993) 321 [3] J. P. Verboncoeur, A. B. Langdon and N. T. Gladd, Comp. Phys. Comm. 87 (1995) 199

  7. Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling

    Science.gov (United States)

    Virozub, Alexander; Rasin, Igal G.; Brandon, Simon

    2008-12-01

    Methods for estimating growth angle (θgr) values, based on the a posteriori analysis of directionally solidified material (e.g. drops), often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assumed to be constant, we use these values as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θgr = 0°, yields excellent agreement with experimental profiles using our estimated values for silicon (θgr = 10°) and germanium (θgr = 14.3°) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.

  8. Fractional Market Model and its Verification on the Warsaw STOCK Exchange

    Science.gov (United States)

    Kozłowska, Marzena; Kasprzak, Andrzej; Kutner, Ryszard

    We analyzed the rising and relaxation of the cusp-like local peaks superposed with oscillations which were well defined by the Warsaw Stock Exchange index WIG in a daily time horizon. We found that the falling paths of all index peaks were described by a generalized exponential function or the Mittag-Leffler (ML) one superposed with various types of oscillations. However, the rising paths (except the first one of WIG, which rises exponentially, and the most important last one, which rises again according to the ML function) can be better described by bullish anti-bubbles or inverted bubbles [2-4]. The ML function superposed with oscillations is a solution of the nonhomogeneous fractional relaxation equation, which defines here our Fractional Market Model (FMM) of index dynamics, which can also be called the Rheological Model of Market. This solution is a generalized analog of an exactly solvable fractional version of the Standard or Zener Solid Model of viscoelastic materials commonly used in modern rheology [5]. For example, we found that the falling paths of the index can be considered to be a system in the intermediate state lying between two complex ones, defined by short and long-time limits of the Mittag-Leffler function; these limits are given by the Kohlrausch-Williams-Watts (KWW) law for the initial times, and the power-law or the Nutting law for asymptotic time. Some rising paths (i.e., the bullish anti-bubbles) are a kind of log-periodic oscillations of the market in the bullish state initiated by a crash. The peaks of the index can be viewed as precritical or precrash ones since: (i) the financial market changes its state too early from the bullish to bearish one before it reaches a scaling region (defined by the diverging power-law of return per unit time), and (ii) they are affected by a finite size effect. These features could be a reminiscence of a significant risk aversion of the investors and their finite number, respectively. However, this means that the
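
    The Mittag-Leffler function central to the FMM can be evaluated directly from its defining power series E_α(z) = Σ_k z^k / Γ(αk + 1); a minimal sketch (series form, valid for moderate |z|):

    ```python
    import math

    def mittag_leffler(alpha, z, terms=100):
        """Series evaluation of E_alpha(z); converges quickly for moderate
        |z|. The FMM relaxation path is E_alpha(-(t/tau)**alpha), which
        interpolates between KWW stretched-exponential behavior at short
        times and power-law (Nutting) behavior at long times."""
        return sum(z ** k / math.gamma(alpha * k + 1.0) for k in range(terms))

    # alpha = 1 recovers ordinary exponential relaxation: E_1(-1) = exp(-1)
    print(round(mittag_leffler(1.0, -1.0), 6))   # 0.367879
    print(round(mittag_leffler(0.5, -1.0), 6))   # heavier tail than exp(-1)
    ```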

  9. A preliminary study of in-house Monte Carlo simulations: an integrated Monte Carlo verification system.

    Science.gov (United States)

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.

  10. Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    J. Coetzer

    2004-04-01

    Full Text Available We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT and a hidden Markov model (HMM. Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER of 18% when only high-quality forgeries (skilled forgeries are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
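
    The EERs quoted above are the operating points where the false rejection rate equals the false acceptance rate as the decision threshold sweeps over the verification scores; a sketch on synthetic score distributions (illustrative, not the signature databases):

    ```python
    import numpy as np

    def equal_error_rate(genuine, forgery):
        """Sweep a decision threshold over verification scores (higher =
        more genuine-like) and return the error rate where the false
        rejection (FRR) and false acceptance (FAR) curves cross."""
        best_gap, eer = 2.0, 0.5
        for t in np.sort(np.concatenate([genuine, forgery])):
            frr = float(np.mean(genuine < t))    # genuine scores rejected
            far = float(np.mean(forgery >= t))   # forgery scores accepted
            if abs(frr - far) < best_gap:
                best_gap, eer = abs(frr - far), (frr + far) / 2.0
        return eer

    # Synthetic, well-separated score distributions (illustrative only)
    rng = np.random.default_rng(1)
    gen = rng.normal(1.0, 0.5, 500)
    forg = rng.normal(-1.0, 0.5, 500)
    print(f"EER = {equal_error_rate(gen, forg):.3f}")
    ```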

  11. Safety verification of radiation shielding and heat transfer for a model of dry spent fuel storage

    International Nuclear Information System (INIS)

    Yu, Haiyan; Tang, Xiaobin; Wang, Peng; Chen, Feida; Chai, Hao; Chen, Da

    2015-01-01

    Highlights: • New type of dry spent fuel storage was designed. • MC method and FEM were used to verify the reliability of the new storage. • Radiation shielding and heat transfer both meet IAEA standards: 2 mSv/h, 0.1 mSv/h and 190 °C, 85 °C. • Provided possibilities for future implementation of this type of dry storage. - Abstract: The goal of this research is to develop a type of dry spent fuel storage called the CHN-24 container, which could contain an equivalent load of 45 GWD/MTU of spent fuel after 10 years of cooling. Radiation shielding performance and safe removal of decay heat, which play important roles in the safety performance, were checked and validated using the Monte Carlo method and finite element analysis to establish the radiation dose rate calculation model and the three-dimensional heat transfer model for the CHN-24 container. The dose rates at the surface of the container and at a distance of 1 m from the surface were 0.42 mSv/h and 0.06 mSv/h, respectively. These conform to the International Atomic Energy Agency (IAEA) radioactive material transportation safety standards of 2 mSv/h and 0.1 mSv/h. The results show that the CHN-24 container maintains its structural and material integrity under normal thermal steady-state heat transfer as well as in case of extreme fire, as evinced by transient-state analysis. The temperatures inside and on the surface of the container were 150.91 °C and 80 °C under normal storage conditions, indicating that the design also conforms to the IAEA heat transfer safety standards of 190 °C and 85 °C.

  12. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  13. Verification of Fourier phase and amplitude values from simulated heart motion using a hydrodynamic cardiac model

    International Nuclear Information System (INIS)

    Yiannikas, J.; Underwood, D.A.; Takatani, Setsuo; Nose, Yukihiko; MacIntyre, W.J.; Cook, S.A.; Go, R.T.; Golding, L.; Loop, F.D.

    1986-01-01

    Using pusher-plate-type artificial hearts, changes in the degree of synchrony and stroke volume were compared to phase and amplitude calculations from the first Fourier component of individual-pixel time-activity curves generated from gated radionuclide images (RNA) of these hearts. In addition, the ability of Fourier analysis to quantify paradoxical volume shifts was tested using a ventricular aneurysm model by which the Fourier amplitude was correlated to known increments of paradoxical volume. Predetermined phase-angle differences (incremental increases in asynchrony) and the mean phase-angle difference calculated from RNAs showed an agreement of -7° ± 4.4° (mean ± SD). A strong correlation was noted between stroke volume and Fourier amplitude (r=0.98; P<0.0001) as well as between the paradoxical volume accepted by the 'aneurysm' and the Fourier amplitude (r=0.97; P<0.0001). The degree of asynchrony and changes in stroke volume were accurately reflected by the Fourier phase and amplitude values, respectively. In the specific case of ventricular aneurysms, the data demonstrate that using this method, the paradoxically moving areas may be localized, and the expansile volume within these regions can be quantified. (orig.)
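
    The per-pixel phase and amplitude images come from the first Fourier harmonic of each time-activity curve; a minimal sketch on a synthetic curve (the phase-sign convention is an assumption here, it varies between systems):

    ```python
    import numpy as np

    def first_harmonic(curve):
        """First-Fourier-component phase (deg) and amplitude of one pixel's
        time-activity curve, as used for RNA phase/amplitude images."""
        c = np.fft.fft(np.asarray(curve, dtype=float))[1]
        amplitude = 2.0 * np.abs(c) / len(curve)
        phase_deg = -np.degrees(np.angle(c))   # delay of the cosine peak
        return phase_deg, amplitude

    # Synthetic curve: mean counts 100, stroke amplitude 20, 60 deg delay
    n = np.arange(16)
    curve = 100 + 20 * np.cos(2 * np.pi * n / 16 - np.radians(60))
    phase, amp = first_harmonic(curve)
    print(round(phase, 1), round(amp, 1))   # 60.0 20.0
    ```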

  14. Development and verification of fission gas release model for the design and analysis of future fuel

    International Nuclear Information System (INIS)

    Ku, Yang Hyun; Sohn, Dong Sung.

    1997-08-01

    A mechanistic model has been developed to predict the release behavior of fission gas during steady-state and transient conditions for both LWR UO2 and MOX fuel. Under the assumption that the UO2 grain surface is composed of fourteen identical circular faces and that a grain edge bubble can be represented by a triangulated tube around the circumference of three circular grain faces, it introduces the concept of continuous formation of open grain edge tunnels in proportion to grain edge swelling. In addition, it takes into account the interaction between the gas release from matrix to grain boundary and the reintroduction of gas atoms into the matrix by the irradiation-induced re-solution of grain face bubbles. It also treats analytically the behavior of intragranular, intergranular, and grain edge bubbles under the assumption that both intragranular and intergranular bubbles are uniform in both radius and number density. The effect of contact pressure between clad and pellet on the intergranular bubble's storage capacity of fission gas has been considered. (author). 43 refs., 4 tabs., 35 figs

  15. Dynamic CT myocardial perfusion imaging: detection of ischemia in a porcine model with FFR verification

    Science.gov (United States)

    Fahmi, Rachid; Eck, Brendan L.; Vembar, Mani; Bezerra, Hiram G.; Wilson, David L.

    2014-03-01

    Dynamic cardiac CT perfusion (CTP) is a high resolution, non-invasive technique for assessing myocardial blood flow (MBF), which in concert with coronary CT angiography enables CT to provide a unique, comprehensive, fast analysis of both coronary anatomy and functional flow. We assessed perfusion in a porcine model with and without coronary occlusion. To induce occlusion, each animal underwent left anterior descending (LAD) stent implantation and angioplasty balloon insertion. Normal flow condition was obtained with the balloon completely deflated. Partial occlusion was induced by balloon inflation against the stent, with FFR used to assess the extent of occlusion. Prospective ECG-triggered partial scan images were acquired at end systole (45% R-R) using a multi-detector CT (MDCT) scanner. Images were reconstructed using FBP and a hybrid iterative reconstruction (iDose4, Philips Healthcare). Processing included: beam hardening (BH) correction, registration of image volumes using 3D cubic B-spline normalized mutual-information, and spatio-temporal bilateral filtering to reduce partial scan artifacts and noise variation. Absolute blood flow was calculated with a deconvolution-based approach using singular value decomposition (SVD). The arterial input function was estimated from the left ventricle (LV) cavity. Regions of interest (ROIs) were identified in healthy and ischemic myocardium and compared in normal and occluded conditions. Under-perfusion was detected in the correct LAD territory and flow reduction agreed well with FFR measurements. Flow was reduced, on average, in LAD territories by 54%.
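
    The SVD deconvolution step recovers the flow-scaled impulse-residue function from the tissue and arterial curves; a noiseless sketch with a known flow (the exponential AIF, residue function and truncation threshold are illustrative, not the animal data):

    ```python
    import numpy as np

    def svd_flow(aif, tissue, dt, lam=1e-6):
        """Truncated-SVD deconvolution of a tissue enhancement curve by the
        arterial input function (AIF); flow is taken as the peak of the
        recovered flow-scaled impulse-residue function. `lam` sets the
        relative singular-value cutoff (raise it for noisy data)."""
        n = len(aif)
        A = dt * np.array([[aif[i - j] if i >= j else 0.0
                            for j in range(n)] for i in range(n)])
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s > lam * s.max(), 1.0 / s, 0.0)
        k = Vt.T @ (s_inv * (U.T @ tissue))    # flow * residue function
        return k.max()

    # Noiseless synthetic check: known flow, exponential residue function
    dt = 0.5
    t = np.arange(0.0, 30.0, dt)
    aif = np.exp(-t / 2.0)                     # simplified AIF
    tissue = 1.2 * dt * np.convolve(aif, np.exp(-t / 4.0))[: len(t)]
    print(round(svd_flow(aif, tissue, dt), 2))   # recovers flow = 1.2
    ```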

  16. Empirical Verification of Fault Models for FPGAs Operating in the Subcritical Voltage Region

    DEFF Research Database (Denmark)

    Birklykke, Alex Aaen; Koch, Peter; Prasad, Ramjee

    2013-01-01

    We present a rigorous empirical study of the bit-level error behavior of field programmable gate arrays operating in the subcritical voltage region. This region is of significant interest as voltage scaling under normal circumstances is halted by the first occurrence of errors. However, accurate...

  17. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  18. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and

  19. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  20. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)]

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  1. CFD modeling and experimental verification of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler operating at 90-170 Hz

    Science.gov (United States)

    Zhao, Yibo; Yu, Guorui; Tan, Jun; Mao, Xiaochen; Li, Jiaqi; Zha, Rui; Li, Ning; Dang, Haizheng

    2018-03-01

    This paper presents the CFD modeling and experimental verifications of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler (MCSPTC) operating at 90-170 Hz. It uses neither double-inlet nor multi-bypass while the inertance tube with a gas reservoir becomes the only phase-shifter. The effects of the frequency on flow and heat transfer processes in the pulse tube are investigated, which indicates that a low enough frequency would lead to a strong mixing between warm and cold fluids, thereby significantly deteriorating the cooling performance, whereas a high enough frequency would produce the downward sloping streams flowing from the warm end to the axis and almost puncturing the gas displacer from the warm end, thereby creating larger temperature gradients in radial directions and thus undermining the cooling performance. The influence of the pulse tube length on the temperature and velocity when the frequencies are much higher than the optimal one are also discussed. A MCSPTC with an overall mass of 1.1 kg is worked out and tested. With an input electric power of 59 W and operating at 144 Hz, it achieves a no-load temperature of 61.4 K and a cooling capacity of 1.0 W at 77 K. The changing tendencies of tested results are in good agreement with the simulations. The above studies will help to thoroughly understand the underlying mechanism of the inertance MCSPTC operating at very high frequencies.

  2. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    International Nuclear Information System (INIS)

    Carlson, T.M.; Gregory, S.D.

    1995-01-01

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs); however, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program to ensure that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  3. The Structure of Goal Contents Revisited. A Verification of the Model in Polish Samples

    Directory of Open Access Journals (Sweden)

    Górnik-Durose Małgorzata

    2016-12-01

    Full Text Available The article presents an attempt to confirm the circumplex structure of goal contents, identified in 15 cultures around the world (Grouzet et al., 2005), in nine Polish samples. The procedure followed the steps of the original study and included testing the assumed 11-factor goal structure and the two-dimensional circular organization of the goal contents. None of the analyses yielded outcomes that would explicitly confirm the results attained in the original study. The CFA showed rather poor fits. Results of the MDS generally supported the assumption of a two-dimensional goal-contents structure; however, the ipsative distance analysis reproduced only one of the two assumed dimensions. Finally, although the CIRCUM analysis showed in principle that in the Polish sample the organization of goal contents on the circumference was quite similar to the original, the RMSEA indicated poor fit. Methodological and conceptual reasons for the replication failure are analyzed and discussed.

  4. PEPT: An invaluable tool for 3-D particle tracking and CFD simulation verification in hydrocyclone studies

    Directory of Open Access Journals (Sweden)

    Hoffmann Alex C.

    2013-05-01

    Full Text Available Particle tracks in a hydrocyclone generated both experimentally by positron emission particle tracking (PEPT) and numerically with Eulerian-Lagrangian CFD have been studied and compared. A hydrocyclone with a cylinder-on-cone design was used in this study, the geometries used in the CFD simulations and in the experiments being identical. It is shown that it is possible to track a fast-moving particle in a hydrocyclone using PEPT with high temporal and spatial resolutions. The numerical 3-D particle trajectories were generated using the Large Eddy Simulation (LES) turbulence model for the fluid and Lagrangian particle tracking for the particles. The behaviors of the particles were analyzed in detail and were found to be consistent between experiments and CFD simulations. The tracks of the particles are discussed and related to the fluid flow field visualized in the CFD simulations using the cross-sectional static pressure distribution.
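    The Lagrangian particle-tracking idea used alongside the LES fluid solution can be illustrated with a minimal one-way-coupled sketch. This assumes Stokes drag only and forward-Euler integration; the function names and the analytic flow field are ours, not the study's CFD setup.

```python
import numpy as np

def track_particle(pos, vel, fluid_vel, tau_p, dt, steps):
    """One-way-coupled Lagrangian tracking sketch: the particle relaxes
    toward the local fluid velocity under Stokes drag.
    fluid_vel(x) returns the fluid velocity at position x;
    tau_p is the particle response (relaxation) time."""
    traj = [pos.copy()]
    for _ in range(steps):
        drag = (fluid_vel(pos) - vel) / tau_p   # Stokes drag acceleration
        vel = vel + drag * dt                   # forward-Euler velocity update
        pos = pos + vel * dt                    # forward-Euler position update
        traj.append(pos.copy())
    return np.array(traj)                       # (steps + 1, ndim) trajectory
```

    A small `tau_p` makes the particle a near-tracer that follows the flow, which is the regime in which PEPT tracer tracks and simulated fluid paths can be compared directly.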

  5. Multi-level nonlinear modeling verification scheme of RC high-rise wall buildings

    OpenAIRE

    Alwaeli, W.; Mwafy, A.; Pilakoutas, K.; Guadagnini, M.

    2017-01-01

    Earthquake-resistant reinforced concrete (RC) high-rise wall buildings are designed and detailed to respond well beyond the elastic range under the expected earthquake ground motions. However, despite their considerable section depth, in terms of analysis, RC walls are still often treated as linear elements, ignoring the effect of deformation compatibility. Due to the limited number of available comprehensive experimental studies on RC structural wall systems subjected to cycling loading, few...

  6. Verification of a TRACE EPR™ model on the basis of a scaling calculation of an SBLOCA ROSA test

    International Nuclear Information System (INIS)

    Freixa, J.; Manera, A.

    2011-01-01

    Research highlights: → Verification of a TRACE input deck for the EPR™ generation III PWR. → Scaling simulation of an SBLOCA experiment at the ROSA/LSTF integral test facility. → The EPR™ model was compared with the TRACE results of the ROSA/LSTF model. - Abstract: In cooperation with the Finnish Radiation and Nuclear Safety Authority (STUK), a project has been launched at the Paul Scherrer Institute (PSI) aimed at performing safety evaluations of the Olkiluoto-3 nuclear power plant (NPP), the first EPR™, a generation III pressurized water reactor (PWR), with particular emphasis on small- and large-break loss-of-coolant accidents (SB/LB-LOCAs) and main steam-line breaks. As a first step of this work, the best-estimate system code TRACE has been used to develop a model of Olkiluoto-3. In order to test the nodalization, a scaling calculation from the Rig of Safety Assessment (ROSA) test facility has been performed. The ROSA large scale test facility (LSTF) was built to simulate Westinghouse-design pressurized water reactors (PWRs) with a four-loop configuration. Even though there are differences between the EPR™ and Westinghouse designs, the number of similarities is large enough to carry out scaling calculations on SBLOCA and LOCA cases from the ROSA facility; as a matter of fact, the main differences are located in the secondary side. Test 6-1 of the ROSA-1 programme, an SBLOCA with the break situated in the upper head of the reactor pressure vessel (RPV), was of special interest since very good agreement with the experiment was obtained with a TRACE input deck. In order to perform such a scaling calculation, the set-points of the secondary relief and safety valves in the EPR™ nodalization had to be changed to those used in the ROSA facility, the break size and the core power had to be scaled by a factor of 60 (according to the core power and core volume), and the pump coast-down had to be adapted to that of the test. The calculation showed

  7. Hydraulic pitch control system for wind turbines: Advanced modeling and verification of an hydraulic accumulator

    DEFF Research Database (Denmark)

    Irizar, Victor; Andreasen, Casper Schousboe

    2017-01-01

    Hydraulic pitch systems provide robust and reliable control of the power and speed of modern wind turbines. During emergency stops, where the pitch of the blades has to be taken to the full stop position to avoid overspeed situations, hydraulic accumulators play a crucial role. Their efficiency...... and capability of providing enough energy to rotate the blades is affected by thermal processes due to the compression and decompression of the gas chamber. This paper presents an in-depth study of the thermodynamic processes involved in a hydraulic accumulator during operation, and how they affect the energy...

  8. Calibration, verification, and use of a water-quality model to simulate effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota

    Science.gov (United States)

    Wesolowski, E.A.

    1994-01-01

    A 30.8-mile reach of the Red River of the North receives treated wastewater from plants at Fargo, North Dakota, and Moorhead, Minnesota, and streamflows from the Sheyenne River. A one-dimensional, steady-state, stream water-quality model, the Enhanced Stream Water Quality Model (QUAL2E), was calibrated and verified for summer stream flow conditions to simulate some of the biochemical processes that result from discharging treated wastewater into this reach of the river. Data obtained to define the river's transport conditions are measurements of channel geometry, streamflow, traveltime, specific conductance, and temperature. Data obtained to define the river's water-quality conditions are measurements of concentrations of selected water-quality constituents and estimates of various reaction coefficients. Most of the water-quality data used to calibrate and verify the model were obtained during two synoptic samplings in August 1989 and August 1990. The water-quality model simulates specific conductance, water temperature, dissolved oxygen, ultimate carbonaceous biochemical oxygen demand, total nitrite plus nitrate as nitrogen, total ammonia as nitrogen, total organic nitrogen as nitrogen, total phosphorus as phosphorus, and algal biomass as chlorophyll a. Of the nine properties and constituents that the calibrated model simulates, all except algae were verified. When increases in dissolved-oxygen concentration are considered, model sensitivity analyses indicate that dissolved-oxygen concentration is most sensitive to maximum specific algal growth rate. When decreases in dissolved-oxygen concentration are considered, model sensitivity analyses indicate that dissolved-oxygen concentration is most sensitive to point-source ammonia. Model simulations indicate nitrification and sediment oxygen demand consume most of the dissolved oxygen in the study reach. The Red River at Fargo Water-Quality Model and the verification data set, including associated reaction
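    The BOD-DO coupling that QUAL2E simulates can be illustrated by the classical Streeter-Phelps oxygen-sag solution, a simplification that omits the nitrification, sediment oxygen demand and algal terms the full model carries. This is a textbook sketch, not the USGS model; parameter names are ours.

```python
import numpy as np

def streeter_phelps(D0, L0, kd, ka, t):
    """Dissolved-oxygen deficit D(t) downstream of a waste discharge.
    D0: initial DO deficit (mg/L), L0: initial ultimate BOD (mg/L),
    kd: deoxygenation rate (1/day), ka: reaeration rate (1/day),
    t: travel time (days)."""
    t = np.asarray(t, dtype=float)
    if np.isclose(kd, ka):
        # Degenerate case kd == ka of the classical solution.
        return (D0 + kd * L0 * t) * np.exp(-ka * t)
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)
```

    The deficit rises to a critical sag point where deoxygenation and reaeration balance, then recovers; QUAL2E resolves the same competition reach by reach with the additional sinks and sources listed in the abstract.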

  9. Acculturation and mental health--empirical verification of J.W. Berry's model of acculturative stress

    DEFF Research Database (Denmark)

    Koch, M W; Bjerregaard, P; Curtis, C

    2004-01-01

    to examine whether Berry's hypothesis about the connection between acculturation and mental health can be empirically verified for Greenlanders living in Denmark and to analyse whether acculturation plays a significant role for mental health among Greenlanders living in Denmark. STUDY DESIGN AND METHODS...... identity as Greenlander and how well the respondents speak Greenlandic and Danish. The statistical methods included binary logistic regression. RESULTS: We found no connection between Berry's definition of acculturation and mental health among Greenlanders in Denmark. On the other hand, our findings showed...... a significant relation between mental health and gender, age, marital position, occupation and long-term illness. CONCLUSION: The findings indicate that acculturation in the way Berry defines it plays a lesser role for mental health among Greenlanders in Denmark than socio-demographic and socio-economic factors...

  10. An ethnopharmacological study on Verbascum species: from conventional wound healing use to scientific verification.

    Science.gov (United States)

    Süntar, Ipek; Tatlı, I Irem; Küpeli Akkol, Esra; Keleş, Hikmet; Kahraman, Çiğdem; Akdemir, Zeliha

    2010-11-11

    The leaves, flowers, and whole aerial parts of Verbascum L. (Scrophulariaceae) species are used to treat eczema and other types of inflammatory skin conditions and as a desiccant for wounds in Turkish traditional medicine. In the present study, the methanolic extracts of 13 Verbascum species growing in Turkey, including Verbascum chionophyllum Hub.-Mor., Verbascum cilicicum Boiss., Verbascum dudleyanum (Hub.-Mor.) Hub.-Mor., Verbascum lasianthum Boiss., Verbascum latisepalum Hub.-Mor., Verbascum mucronatum Lam., Verbascum olympicum Boiss., Verbascum pterocalycinum var. mutense Hub.-Mor., Verbascum pycnostachyum Boiss. & Heldr., Verbascum salviifolium Boiss., Verbascum splendidum Boiss., Verbascum stachydifolium Boiss. & Heldr and Verbascum uschackense (Murb.) Hub.-Mor. were assessed for their in vivo wound healing activity. In vivo wound healing activity of the plants were evaluated by linear incision and circular excision experimental models subsequently histopathological analysis. The healing potential was comparatively assessed with a reference ointment Madecassol(®), which contains 1% extract of Centella asiatica. The methanolic extracts of Verbascum olympicum, Verbascum stachydifolium and Verbascum uschackense demonstrated the highest activities on the both wound models. Moreover, the methanolic extracts of Verbascum latisepalum, Verbascum mucronatum, and Verbascum pterocalycinum var. mutense were found generally highly effective. On the other hand, the rest of the species did not show any remarkable wound healing effect. Results of the present study support the continued and expanded utilization of these plant species employed in Turkish folk medicine. The experimental study revealed that Verbascum species display remarkable wound healing activity. Crown Copyright © 2010. Published by Elsevier Ireland Ltd. All rights reserved.

  11. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  12. Electromagnetic Head-And-Neck Hyperthermia Applicator: Experimental Phantom Verification and FDTD Model

    International Nuclear Information System (INIS)

    Paulides, Margarethus M.; Bakker, Jurriaan F.; Rhoon, Gerard C. van

    2007-01-01

    Purpose: To experimentally verify the feasibility of focused heating in the neck region by an array of two rings of six electromagnetic antennas. We also measured the dynamic specific absorption rate (SAR) steering possibilities of this setup and compared these SAR patterns to simulations. Methods and Materials: Using a specially constructed laboratory prototype head-and-neck applicator, including a neck-mimicking cylindrical muscle phantom, we performed SAR measurements by electric field, Schottky-diode sheet measurements and, using the power-pulse technique, by fiberoptic thermometry and infrared thermography. Using phase steering, we also steered the SAR distribution in radial and axial directions. All measured distributions were compared with the predictions by a finite-difference time-domain-based electromagnetic simulator. Results: A central 50% iso-SAR focus of 35 ± 3 mm in diameter and about 100 ± 15 mm in length was obtained for all investigated settings. Furthermore, this SAR focus could be steered toward the desired location in the radial and axial directions with an accuracy of ∼5 mm. The SAR distributions as measured by all three experimental methods were well predicted by the simulations. Conclusion: The results of our study have shown that focused heating in the neck is feasible and that this focus can be effectively steered in the radial and axial directions. For quality assurance measurements, we believe that the Schottky-diode sheet provides the best compromise among effort, speed, and accuracy, although a more specific and improved design is warranted
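    The radial and axial SAR steering described here rests on conjugate-phase focusing: each antenna's phase is set to cancel its path delay to the target so all contributions add in phase. The free-space isotropic-source sketch below is ours; the paper's FDTD simulations resolve the actual tissue and antennas.

```python
import numpy as np

def focusing_phases(antenna_pos, target, wavelength):
    """Phase per antenna that compensates the path delay to the target,
    so the fields add coherently there (conjugate-phase focusing)."""
    d = np.linalg.norm(np.asarray(antenna_pos) - np.asarray(target), axis=1)
    return np.mod(2 * np.pi * d / wavelength, 2 * np.pi)

def intensity(antenna_pos, phases, point, wavelength):
    """Relative field intensity at a point from isotropic point sources
    with 1/d amplitude decay (toy free-space model)."""
    d = np.linalg.norm(np.asarray(antenna_pos) - np.asarray(point), axis=1)
    field = np.exp(1j * (phases - 2 * np.pi * d / wavelength)) / d
    return abs(field.sum()) ** 2
```

    Moving the target point in `focusing_phases` and re-evaluating reproduces, in this toy model, the radial and axial steering of the SAR focus measured in the phantom.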

  13. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Valeriy Vyatkin

    2008-03-01

    Full Text Available This paper deals with further development of a graphical specification language resembling timing-diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  14. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    Full Text Available Abstract This paper deals with further development of a graphical specification language resembling timing-diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  15. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes on an important role. The conventional Petri net approach that has been studied recently for knowledge base verification has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  16. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  17. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  18. Modeling and verification of acoustic wave propagation in indoors using Sabine model in turbine hall of a gas power plant

    Directory of Open Access Journals (Sweden)

    H. Ekhlas

    2014-05-01

    Conclusion: The presented model is simple and practical and allows managers to model noise-pollution-reduction scenarios in indoor environments before incurring the large expense of actual control measures. The method is faster than numerical modeling methods, and its accuracy is also acceptable.
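    For reference, the Sabine model named in the title reduces to a one-line formula for the reverberation time RT60 = 0.161 · V / A (metric units), with A the total absorption summed over surfaces. The surface list in the test is a made-up example, not data from the turbine hall study.

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine reverberation time: RT60 = 0.161 * V / A (seconds),
    where A = sum(area_i * alpha_i) is the total absorption in m^2 sabins.
    surfaces: iterable of (area_m2, absorption_coefficient) pairs."""
    absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption
```

    Raising any absorption coefficient shortens RT60, which is how control-measure scenarios can be screened before installation.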

  19. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    Science.gov (United States)

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study has confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about the ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system despite its being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and the specific ureolysis rate of the indigenous bacteria were greatly enhanced, as confirmed by a biomass-dependent Michaelis-Menten model. The period for complete ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study has proved that the enrichment of indigenous bacteria in the SUPR system can lead to sufficient ureolytic activity for phosphate precipitation, thus providing an efficient and economical method for urine P recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
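    A biomass-dependent Michaelis-Menten model of the kind fitted in this study can be sketched numerically as follows. The explicit-Euler integration, parameter names and exponential growth law are our assumptions; the paper's fitted constants are not reproduced.

```python
import numpy as np

def ureolysis_time_course(S0, X0, vmax, Km, mu, dt, t_end):
    """Biomass-dependent Michaelis-Menten sketch:
        dS/dt = -vmax * X * S / (Km + S),
    with biomass X(t) = X0 * exp(mu * t) growing during cultivation.
    Returns (time points, remaining substrate concentration)."""
    n = int(round(t_end / dt))
    t = np.linspace(0.0, n * dt, n + 1)
    S = np.empty(n + 1)
    S[0] = S0
    X = X0
    for i in range(n):
        rate = vmax * X * S[i] / (Km + S[i])     # Michaelis-Menten uptake
        S[i + 1] = max(S[i] - rate * dt, 0.0)    # explicit Euler, clipped at 0
        X *= np.exp(mu * dt)                     # exponential biomass growth
    return t, S
```

    With growing biomass the substrate-depletion curve steepens over successive batches, which is the qualitative behavior behind the drop from 180 h to 2.5 h for complete ureolysis.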

  20. Verification and validation for CIPRNet

    NARCIS (Netherlands)

    Voogd, J.

    2016-01-01

    In this chapter it is shown that if an appreciable risk is present in the use of Modelling and Simulation (M&S), Verification and Validation (V&V) should be employed to manage and mitigate that risk. The use of M&S in the domain of critical infrastructure (CI) will always be accompanied by such a

  1. Registration of DRRs and portal images for verification of stereotactic body radiotherapy: a feasibility study in lung cancer treatment

    Science.gov (United States)

    Künzler, Thomas; Grezdo, Jozef; Bogner, Joachim; Birkfellner, Wolfgang; Georg, Dietmar

    2007-04-01

    Image guidance has become a pre-requisite for hypofractionated radiotherapy, where the applied dose per fraction is increased. Particularly in stereotactic body radiotherapy (SBRT) for lung tumours, one has to account for set-up errors and intrafraction tumour motion. In our feasibility study, we compared digitally reconstructed radiographs (DRRs) of lung lesions with MV portal images (PIs) to obtain the displacement of the tumour before irradiation. The verification of the tumour position was performed by rigid intensity-based registration with three different merit functions: the sum of squared pixel intensity differences, normalized cross correlation and normalized mutual information. The registration process then provided a translation vector that defines the displacement of the target in order to align the tumour with the isocentre. To evaluate the registration algorithms, 163 test images were created and, subsequently, a lung phantom containing an 8 cm3 tumour was built. In a further step, the registration process was applied to patient data containing 38 tumours in 113 fractions. To potentially improve registration outcome, two filter types (histogram equalization and display equalization) were applied and their impact on the registration process was evaluated. Generated test images showed an increase in successful registrations when applying a histogram equalization filter, whereas the lung phantom study proved the accuracy of the selected algorithms, i.e. deviations of the calculated translation vector were below 1 mm for all test algorithms. For clinical patient data, successful registrations occurred in about 59% of anterior-posterior (AP) and 46% of lateral projections, respectively. When patients with a clinical target volume smaller than 10 cm3 were excluded, successful registrations rose to 90% in AP and 50% in lateral projection. In addition, a reliable identification of the tumour position was found to be difficult for clinical target volumes
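    Of the three merit functions compared, normalized mutual information is the least obvious to implement. A minimal histogram-based sketch (bin count and function name are our choices, not the study's implementation) is:

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """Registration merit function sketch: NMI = (H(A) + H(B)) / H(A, B),
    estimated from a joint intensity histogram of two images a and b."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()                 # joint probability estimate

    def entropy(q):
        q = q[q > 0]                        # ignore empty bins
        return -(q * np.log(q)).sum()

    return (entropy(p.sum(axis=1)) + entropy(p.sum(axis=0))) / entropy(p.ravel())
```

    NMI peaks when the two images are in register, which is why an optimizer over translations can maximize it to recover the displacement vector; it reaches 2 for identical images and falls toward 1 as they decorrelate.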

  2. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    Science.gov (United States)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  3. Numerical Verification Of Geotechnical Structure In Unfavourable Geological Conditions – Case Study

    Directory of Open Access Journals (Sweden)

    Drusa Marián

    2015-06-01

    Full Text Available Numerical modelling represents a powerful tool not only for special geotechnical calculations involving complicated structure designs or difficult foundation conditions, but also for regular structure-foundation tasks. The finite element method is the most widely used method of numerical modelling. This method was used for calculations of a retaining wall monitored for 5 years after construction. A retaining wall for a parking lot, with facing made of gabion blocks, was chosen for the numerical model. Besides the unfavourable geological conditions, the soft nature of the facing was also a difficult part of the modelling. This paper presents the results of the modelling, in which the exact geometry, material characteristics and construction stages were simulated. The results capture the trend of displacements even though basic material models were utilized. The modelling proved the ability of the finite element method to model the retaining structure with sufficient accuracy as well as a reasonable demand on the quality and quantity of input data. The method can therefore be used as a regular design tool during project preparation.

  4. Self-shielding effects in burnup of Gd used as burnable absorber. Previous studies on its experimental verification

    International Nuclear Information System (INIS)

    Abbate, Maximo J.; Sbaffoni, Maria M.

    2003-01-01

    Continuing with the domestic 'Burnable Absorbers Research Plan', studies were performed to estimate self-shielding effects during the burnup of Gd2O3 used as a burnable absorber in fuel pins of a CAREM geometry. The burnup was calculated both without and with self-shielding; for the second case, values were obtained as a function of internal pin radius, together with an effective value for the homogenized pin. For Gd-157, the burnup was 52.6% in the first case versus 1.23% for the effective value, which shows the magnitude of the effects under study. Considering that an experimental verification is necessary, calculational results are also presented for the irradiation of a pellet containing natural UO2 and 8 wt% Gd2O3, as a function of cooling time. These include measurable isotope concentrations, expected activities, and photon spectra, for conditions that can be compared with two-dimensional calculations with self-shielding. The assumed irradiation time was 30 full-power days (dpp) in the RA-3 reactor at 10 MW. (author)
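    The magnitude of such a self-shielding effect can be illustrated with a one-group depletion sketch, in which an effective self-shielding factor reduces the absorption cross-section seen in the pin interior. The cross-section, flux, and self-shielding factor below are illustrative assumptions, not values from the study.

```python
import math

def burnup_fraction(sigma_barns, flux, t_seconds, self_shielding=1.0):
    """Fraction of initial absorber consumed after irradiation time t.

    One-group depletion N(t) = N0 * exp(-f * sigma * phi * t), where the
    self-shielding factor f (0 < f <= 1) reduces the effective absorption
    cross-section seen by nuclides inside the pin.
    """
    sigma_cm2 = sigma_barns * 1e-24          # barns -> cm^2
    return 1.0 - math.exp(-self_shielding * sigma_cm2 * flux * t_seconds)

# Illustrative values: ~2.54e5 b thermal absorption for Gd-157,
# a thermal flux of 1e13 n/cm^2/s, and 30 full-power days.
t = 30 * 86400.0
unshielded = burnup_fraction(2.54e5, 1e13, t, self_shielding=1.0)
shielded = burnup_fraction(2.54e5, 1e13, t, self_shielding=0.02)
print(f"without self-shielding: {unshielded:.3f}, with: {shielded:.3f}")
```

    Even this crude sketch reproduces the qualitative picture in the abstract: without self-shielding the absorber is almost fully consumed, while strong flux depression in the pin interior leaves most of it intact.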

  5. Implementation of project Safe in Amber. Verification study for SFR 1 SAR-08

    Energy Technology Data Exchange (ETDEWEB)

    Thomson, Gavin; Herben, Martin; Lloyd, Pam; Rose, Danny; Smith, Chris; Barraclough, Ian (Enviros Consulting Ltd (GB))

    2008-03-15

    This report documents an exercise in which AMBER has been used to represent the models used in Project SAFE, a safety assessment undertaken on SFR 1. (AMBER is a flexible, graphical-user-interface-based tool that allows users to build their own dynamic compartmental models to represent the migration, degradation and fate of contaminants in an environmental system; it allows the user to assess routine, accidental and long-term contaminant releases.) AMBER has been used to undertake assessment calculations on the whole of the disposal system, including all disposal tunnels and the Silo, the geosphere and several biosphere modules. The near-field conceptual models were implemented with minimal changes to the approach taken previously in Project SAFE. Model complexity varied between individual disposal facilities, increasing significantly from the BLA to the BTF and BMA tunnels and the Silo. Radionuclide transport through the fractured granite geosphere was approximated using a compartment-model approach in AMBER. Several biosphere models were implemented in AMBER, including reasonable biosphere development, which considered the evolution of the Forsmark area from coastal to lacustrine to agricultural environments in response to land uplift. Parameters were sampled from distributions and simulations were run for 1,000 realisations. In comparing AMBER with the various codes and calculation tools used in Project SAFE, it was necessary to undertake a detailed analysis of the modelling approach previously adopted, with particular focus on the near-field models. As a result, some discrepancies in the implementation of the models and in the documentation were noted. The exercise demonstrates that AMBER is fully capable of representing the features of the SFR 1 disposal system in a safety assessment suitable for SAR-08.
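    The compartment-model approach that AMBER implements can be sketched as a set of coupled first-order transfer equations with radioactive decay. The three-compartment chain and all rate constants below are hypothetical, not Project SAFE values.

```python
import numpy as np

# Minimal compartmental transport sketch: first-order transfers between a
# near-field, a geosphere and a biosphere compartment, plus uniform decay.
LAMBDA = 1e-4        # decay constant (1/y), hypothetical
K_NF_GEO = 1e-3      # near-field -> geosphere transfer rate (1/y)
K_GEO_BIO = 5e-4     # geosphere -> biosphere transfer rate (1/y)

def step(inv, dt):
    """One explicit-Euler step of the compartment balance equations."""
    nf, geo, bio = inv
    d_nf = -(LAMBDA + K_NF_GEO) * nf
    d_geo = K_NF_GEO * nf - (LAMBDA + K_GEO_BIO) * geo
    d_bio = K_GEO_BIO * geo - LAMBDA * bio
    return inv + dt * np.array([d_nf, d_geo, d_bio])

inv = np.array([1.0, 0.0, 0.0])   # unit inventory starts in the near field
for _ in range(10000):            # 10,000 years in 1-year steps
    inv = step(inv, 1.0)
print("near field, geosphere, biosphere:", inv)
```

    Because the transfers conserve mass, the total inventory decays at exactly the decay constant, which makes a convenient sanity check on any such implementation.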

  6. A semi-empirical model of the direct methanol fuel cell performance. Part I. Model development and verification

    Science.gov (United States)

    Argyropoulos, P.; Scott, K.; Shukla, A. K.; Jackson, C.

    A model equation is developed to predict the cell voltage versus current density response of a liquid-feed direct methanol fuel cell (DMFC). The equation is based on a semi-empirical approach in which methanol oxidation and oxygen reduction kinetics are combined with effective mass transport coefficients for the fuel cell electrodes. The model equation is validated against experimental data for a small-scale fuel cell and is applicable over a wide range of methanol concentrations and temperatures.
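    The paper's specific correlation is not reproduced in the abstract, but a generic semi-empirical polarisation curve of the same flavour (a Tafel activation term, an ohmic term, and a mass-transport term that diverges near the limiting current density) can be sketched as follows; all parameter values are hypothetical.

```python
import math

# Generic semi-empirical fuel-cell polarisation curve. Parameters are
# illustrative placeholders, not the fitted DMFC correlation from the paper.
E0 = 0.7       # effective open-circuit voltage (V)
B = 0.05       # Tafel slope parameter (V)
J0 = 1e-4      # exchange current density (A/cm^2)
R = 0.25       # area-specific resistance (ohm cm^2)
J_LIM = 0.4    # limiting current density (A/cm^2)

def cell_voltage(j):
    """Cell voltage (V) at current density j (A/cm^2), for 0 < j < J_LIM."""
    activation = B * math.log(j / J0)                 # kinetic loss
    ohmic = R * j                                     # resistive loss
    mass_transport = -B * math.log(1.0 - j / J_LIM)   # concentration loss
    return E0 - activation - ohmic - mass_transport

for j in (0.05, 0.2, 0.35):
    print(f"j = {j:.2f} A/cm^2 -> V = {cell_voltage(j):.3f} V")
```

    Fitting the few parameters of such a form to measured V-j data is what makes the approach "semi-empirical": the terms have a kinetic interpretation, but their coefficients are determined experimentally.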

  7. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  8. Two-Dimensional Model Test Verification of the New Cubipod Armoured Western Breakwater for the Port of Hanstholm

    DEFF Research Database (Denmark)

    Eldrup, Mads Røge; Andersen, Thomas Lykke

    The present report presents results from a two-dimensional model test study carried out at Aalborg University in December 2017 with the trunk section proposed for the new cubipod armoured western breakwater in the Port of Hanstholm by the contractor Aarsleff and their consultant Cowi. The objectives of the model tests were to study the stability of the armour layer, toe erosion, overtopping and transmission. The scale used for the model tests was 1:44.6. Initially the model was built at a scale of 1:47, but it was adapted to 1:44.6 due to a mismatch in the density of the rented cubipods. Unless …

  9. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey …

  10. Constraint specialisation in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query–answer transformed version of a given set of clauses and a goal. The constraints from the model are then used to compute … underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (based on a convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools.

  11. Three-Dimensional Model Test Verification of the New Cubipod Armoured Western Breakwater for Port of Hanstholm

    DEFF Research Database (Denmark)

    Eldrup, Mads Røge; Andersen, Thomas Lykke

    The present report presents results from a three-dimensional model test study carried out at Aalborg University in January 2018 with the new western breakwater in the Port of Hanstholm as proposed by the contractor Aarsleff and their consultant Cowi. The objectives of the model tests were to study the …

  12. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  13. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    …based model checking style of verification. The next paper by D'Souza & Thiagarajan presents an automata-theoretic approach to analysing timing properties of systems. The last paper by Mohalik and Ramanujam presents the assumption…

  14. Design studies of hydraulic verification test section of windowless target of liquid lead-bismuth eutectic experimental platform

    International Nuclear Information System (INIS)

    Zhu Linglin; Bai Yunqing; Chen Zhao

    2010-01-01

    As one of the key components of the accelerator driven system (ADS), the windowless spallation target requires both theoretical analysis and experimental verification; the formation and control of its free surface is one of the key design issues. The FDS team is launching the conceptual design of an ADS reactor and is developing an integrated experimental platform for liquid lead-bismuth eutectic technology, of which the hydraulic verification facility for the windowless spallation target is an important part. In this paper, the aim of the design and the major parameters of the facility are described, and the major components and experimental schemes are presented. (authors)

  15. Modeling and simulation of the main metabolism in Escherichia coli and its several single-gene knockout mutants with experimental verification

    Directory of Open Access Journals (Sweden)

    McFadden Johnjoe

    2010-11-01

    Full Text Available Abstract Background It is quite important to simulate the metabolic changes of a cell in response to changes in the culture environment and/or specific gene knockouts, particularly for industrial applications. If this could be done, cell design could proceed without exhaustive experiments: promising candidates could be screened out in silico, followed by experimental verification of a select few of particular interest. Although several models have been proposed, most of them focus on specific metabolic pathways. It is preferable to model the whole of the main metabolic pathways of Escherichia coli, allowing for the estimation of energy generation and cell synthesis based on intracellular fluxes, which may be used to characterize phenotypic growth. Results In the present study, we simulated the main metabolic pathways, namely glycolysis, the TCA cycle, the pentose phosphate (PP) pathway, and the anaplerotic pathways, using enzymatic reaction models of E. coli. Once intracellular fluxes were computed by this model, the specific ATP, CO2, and NADPH production rates could be estimated. The specific ATP production rate thus computed was used to estimate the specific growth rate; the CO2 production rate could be used to estimate cell yield; and the specific NADPH production rate could be used to determine the flux of the oxidative PP pathway. Batch and continuous cultivations were simulated, and the changing patterns of extracellular and intracellular metabolite concentrations were compared with experimental data. Moreover, the effects of knockouts of pathways such as Ppc, Pck and Pyk on the metabolism were simulated. It was shown to be difficult for the cell to grow in the Ppc mutant due to the low concentration of OAA, while the Pck mutant does not necessarily show this phenomenon. The slower growth rate of the Ppc mutant was properly…
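    The core idea of such kinetic models, computing fluxes from enzymatic rate laws and deriving specific production rates from them, can be sketched with a single lumped Michaelis-Menten reaction. The Vmax, Km, and ATP yield below are hypothetical placeholders, not the paper's fitted parameters.

```python
# Toy kinetic-model sketch: one lumped glycolytic reaction with
# Michaelis-Menten kinetics; substrate depletion is integrated in time and
# a specific "ATP production rate" is read off as yield times flux.
VMAX, KM, ATP_YIELD = 10.0, 0.5, 2.0   # mmol/gDW/h, mM, mol ATP per mol flux

def flux(s):
    """Michaelis-Menten rate law for substrate concentration s (mM)."""
    return VMAX * s / (KM + s)

s, dt = 10.0, 0.001                    # initial glucose (mM), time step (h)
atp_rate = []
for _ in range(2000):                  # 2 h of simulated batch culture
    v = flux(s)
    atp_rate.append(ATP_YIELD * v)     # specific ATP production rate proxy
    s = max(s - v * dt, 0.0)           # explicit Euler substrate depletion
print(f"final substrate: {s:.4f} mM, final ATP rate: {atp_rate[-1]:.4f}")
```

    In the full model many such rate laws are coupled so that the fluxes, and hence the ATP, CO2 and NADPH production rates, respond to metabolite concentrations and to gene knockouts that remove individual reactions.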

  16. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses the sequential control system of a machine for separating and grouping workpieces for processing. The considered problem concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller: the behaviour of the actuators is verified against the logic relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data-exchange tool in the form of an OPC server.
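    The underlying verification idea, checking a controller's step sequence against interlock rules of an actuator model, can be sketched without any PLC or OPC machinery. The three-cylinder sequence and interlock rules below are illustrative, not the logic of the system in the paper.

```python
# Sketch of sequential actuator-logic verification: three pneumatic
# cylinders (A, B, C) must extend in order and retract in reverse order.
# The "controller" emits steps; the "model" enforces interlock rules.
SEQUENCE = ["A+", "B+", "C+", "C-", "B-", "A-"]

def run(sequence):
    state = {"A": False, "B": False, "C": False}
    trace = []
    for step in sequence:
        name, extend = step[0], step[1] == "+"
        # Interlock rules of the model: B may extend only after A,
        # and C only after B (illustrative rules).
        if name == "B" and extend and not state["A"]:
            raise RuntimeError("interlock violated: B+ before A+")
        if name == "C" and extend and not state["B"]:
            raise RuntimeError("interlock violated: C+ before B+")
        state[name] = extend
        trace.append(dict(state))
    return trace

trace = run(SEQUENCE)
print("sequence verified, final state:", trace[-1])
```

    In the paper's setup the same check happens across process boundaries: the PLC program produces the steps, the FluidSIM model plays the role of `run`, and the OPC server carries the signals between them.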

  17. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  18. SU-F-T-287: A Preliminary Study On Patient Specific VMAT Verification Using a Phosphor-Screen Based Geometric QA System (Raven QA)

    International Nuclear Information System (INIS)

    Lee, M; Yi, B; Wong, J; Ding, K

    2016-01-01

    Purpose: The RavenQA system (LAP Laser, Germany) is a QA device with a phosphor-screen detector for performing the QA tasks of TG-142. This study tested whether it is feasible to use the system for patient-specific QA of Volumetric Modulated Arc Therapy (VMAT). Methods: Water-equivalent material (5 cm) is attached to the front of the detector plate of the RavenQA for dosimetry purposes. The plate is then attached to the gantry to synchronize the movement of the detector and the gantry. Since the detector moves together with the gantry, the 'Reset gantry to 0' function of the Eclipse planning system (Varian, CA) is used to reproduce the measurement geometry when calculating the dose to the detector plate. The same gantry setup is used when delivering the treatment beam for the feasibility tests. Cumulative dose is acquired for each arc. The optical scatter component of each image captured by the CCD camera is corrected by deconvolving a 2D spatially invariant optical scatter kernel (OSK). We assume that the OSK is a 2D isotropic point spread function decreasing as the inverse square of the radius from the center. Results: Three VMAT plans (head & neck, whole pelvis, and abdomen-pelvis) were tested. Setup time for measurements was less than 5 minutes. Absolute gamma passing rates were 99.3%, 98.2% and 95.9%, respectively, for the 3%/3mm criteria and 96.2%, 97.1% and 86.4% for the 2%/2mm criteria. The abdomen-pelvis plan has long treatment fields (37 cm) that exceed the detector plate (25 cm); this plan showed a relatively lower passing rate than the other plans. Conclusion: An algorithm for IMRT/VMAT verification using the RavenQA has been developed and tested. The spatially invariant OSK model works well for deconvolution purposes. It is proved that the RavenQA can be used for patient-specific verification of VMAT. This work is funded in part by a Maryland Industrial Partnership Program grant to the University of Maryland and to JPLC, who owns the…
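    The scatter-correction step described above can be sketched as FFT-based Wiener deconvolution with an inverse-square kernel. The grid size, regularisation constant, and test image below are illustrative assumptions, not the RavenQA implementation.

```python
import numpy as np

# Build a 2D isotropic kernel falling off as 1/r^2, blur a test image by
# FFT convolution, then recover it by Wiener-regularised FFT deconvolution.
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r2 = x**2 + y**2
osk = 1.0 / np.maximum(r2, 1.0)           # inverse-square decay, capped at r=1
osk /= osk.sum()                           # normalise the point spread function

img = np.zeros((n, n))
img[24:40, 24:40] = 1.0                    # synthetic "dose" region

OSK = np.fft.fft2(np.fft.ifftshift(osk))   # optical transfer function
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * OSK))

eps = 1e-3                                 # Wiener regularisation constant
recovered = np.real(np.fft.ifft2(
    np.fft.fft2(blurred) * np.conj(OSK) / (np.abs(OSK) ** 2 + eps)))

print("mean abs error, blurred vs recovered:",
      np.abs(blurred - img).mean(), np.abs(recovered - img).mean())
```

    The regularisation constant trades off noise amplification against residual blur; in a measured image it would be tuned against the camera's noise level rather than fixed a priori.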

  19. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed to determine the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
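    The role of reference-laboratory remeasurements in estimating systematic error can be sketched as a simple paired-difference calculation: the mean difference between facility and reference results estimates the bias, and its standard error indicates whether the bias is significant. The measurement values below are made up for illustration.

```python
import numpy as np

# Paired facility vs. reference-laboratory results on the same items
# (hypothetical data, e.g. grams of nuclear material per item).
facility = np.array([10.12, 9.98, 10.25, 10.08, 10.31, 9.95, 10.18, 10.22])
reference = np.array([10.02, 9.91, 10.11, 10.01, 10.19, 9.88, 10.07, 10.10])

diff = facility - reference
bias = diff.mean()                            # estimated systematic error
sem = diff.std(ddof=1) / np.sqrt(len(diff))   # standard error of the mean
print(f"estimated bias = {bias:.3f} +/- {sem:.3f}")
```

    A bias several standard errors away from zero, as in this made-up data set, would point to a systematic error in the facility's measurement system rather than random scatter.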

  20. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective of siting a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume; the additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 fall into this category. Other data complement older information of identical character, both inside and outside this volume, including the character and kinematics of deformation zones and fracture mineralogy. In general terms, all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data used in that work. In particular, although the new high-resolution ground magnetic data slightly modify the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data…