WorldWideScience

Sample records for modeling development verification

  1. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the ground subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced, since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  2. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years, most climate models have grown from relatively simple representations of a few atmospheric processes into complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively on some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
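As a flavor of the fine-grained "unit" tests the authors advocate, the sketch below (in Python, with a hypothetical Tetens-style helper standing in for a small climate-model kernel — the function and its coefficients are illustrative, not from any particular model) checks a known value and a physical invariant in isolation, without running a full simulation:

```python
import math

def saturation_vapor_pressure(temp_k):
    """Hypothetical kernel: a Tetens-style approximation of saturation
    vapor pressure (hPa) over liquid water, standing in for the kind of
    small physical routine a climate model contains."""
    temp_c = temp_k - 273.15
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def test_known_value():
    # At 0 degrees C the formula returns 6.112 hPa by construction.
    assert abs(saturation_vapor_pressure(273.15) - 6.112) < 1e-9

def test_monotonicity():
    # Physically, saturation pressure must increase with temperature.
    values = [saturation_vapor_pressure(260.0 + i) for i in range(40)]
    assert all(a < b for a, b in zip(values, values[1:]))
```

Tests like these pinpoint a defect to a single routine, where a regression test on full-simulation output can only signal that something, somewhere, changed.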

  3. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

    An analytical model of an air-core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations...

  4. Development and verification of an agent-based model of opinion leadership.

    Science.gov (United States)

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The
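The parameter-sweep workflow described above can be illustrated with a deliberately toy agent-based model; all rules, names, and values below are invented for illustration and are not the study's actual model:

```python
import itertools
import random

def run_unit(n_nurses, leader_credibility, uncertainty, seed=0):
    """Toy opinion-leadership unit: when uncertain, each nurse consults
    the most credible colleague. Returns per-nurse consultation counts.
    All rules and parameter values here are invented for illustration."""
    rng = random.Random(seed)
    credibility = [rng.random() for _ in range(n_nurses)]
    credibility[0] = leader_credibility        # designated candidate leader
    consulted = [0] * n_nurses
    for _step in range(100):
        for i in range(n_nurses):
            if rng.random() < uncertainty:     # nurse i feels uncertain
                peers = [j for j in range(n_nurses) if j != i]
                target = max(peers, key=lambda j: credibility[j])
                consulted[target] += 1
    return consulted

# A miniature "parameter sweep" over the two input variables.
sweep = {
    (cred, unc): run_unit(10, cred, unc)
    for cred, unc in itertools.product([0.5, 0.9], [0.2, 0.8])
}
```

As in the study, sweeping the inputs systematically lets one check whether the emergent behavior (here, one agent becoming the dominant opinion resource) tracks the posited relationships.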

  5. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)
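For context, the drift-flux closure mentioned for the boiling sections relates the void fraction to the phase volumetric fluxes; a generic one-line form (with illustrative coefficients, not the correlation actually used in BOILER-2) looks like:

```python
def drift_flux_void_fraction(j_g, j_l, c0=1.13, v_gj=0.24):
    """Generic drift-flux relation alpha = j_g / (c0*j + v_gj), where j
    is the total volumetric flux. The distribution parameter c0 and
    drift velocity v_gj (m/s) are illustrative values, not the
    correlation used in BOILER-2."""
    j = j_g + j_l                  # total volumetric flux (m/s)
    return j_g / (c0 * j + v_gj)
```

With j_g = 0.5 m/s and j_l = 1.0 m/s this gives a void fraction of about 0.26.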

  6. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  8. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model-derivation and a model-validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of one major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
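The published decision rule is simple enough to state directly in code; the sketch below merely transcribes the criteria as given in the abstract (the 25% threshold and the risk labels are from the source, the function name and signature are mine):

```python
def verification_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
    """Transcription of the abstract's rule: one major criterion
    (fingerprint dystrophy area >= 25%) and two minor criteria
    (long horizontal lines, long vertical lines)."""
    if dystrophy_area_pct >= 25:
        return "almost always fails"
    minors = int(long_horizontal_lines) + int(long_vertical_lines)
    if minors == 2:
        return "high risk of failure"
    if minors == 1:
        return "low risk of failure"
    return "almost always passes"
```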

  9. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

    In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
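A minimal illustration of the operator-splitting idea described above: transport and chemistry are advanced separately within each time step. Here the transport substep is a first-order upwind advection and the "chemistry" is a toy first-order decay standing in for the PFLOTRAN/CrunchFlow engine:

```python
import math

def advect(c, velocity, dx, dt):
    """First-order upwind advection step (assumes velocity > 0 and a
    fixed inflow value at the left boundary)."""
    cfl = velocity * dt / dx
    return [c[0]] + [c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))]

def react(c, k, dt):
    """Toy chemistry substep: first-order decay, solved exactly."""
    decay = math.exp(-k * dt)
    return [ci * decay for ci in c]

def operator_split_step(c, velocity, dx, dt, k):
    """One sequential operator-splitting step: transport, then reaction."""
    return react(advect(c, velocity, dx, dt), k, dt)
```

The appeal of the split is exactly what the abstract exploits: each substep can be delegated to a specialized code through a common interface.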

  10. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated using these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  11. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  12. Development and Verification of CFD Models for Modeling Wind Conditions on Forested Wind Turbine Sites

    DEFF Research Database (Denmark)

    Andersen, Morten Q.; Mortensen, Kasper; Nielsen, Daniel E.

    2009-01-01

    This paper describes a proposed CFD model to simulate the wind conditions on a forested site. The model introduces porous subdomains representing the forests in the terrain. Obtained simulation values are compared to field measurements in- and outside a forest. Initial results are very promising...

  13. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6, May 2017. Temperature Modeling of Applegate Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model. The model and the corresponding results from the study provided CENWP with more refined estimates of water temperatures so that more defendable water...

  14. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
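For orientation, two of the basic continuous verification statistics that MET can report — bias (mean error) and RMSE — reduce to a few lines when computed directly on matched forecast-observation pairs (this is a conceptual sketch, not the MET implementation):

```python
import math

def continuous_stats(forecasts, observations):
    """Bias (mean error) and RMSE for matched forecast/observation
    pairs, two basic continuous verification statistics."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse
```

A positive bias indicates the model systematically over-forecasts the quantity; RMSE additionally penalizes scatter about that systematic offset.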

  15. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H.

    2006-09-01

    In this study, a semi-implicit pilot code is developed for a one-dimensional channel flow with three fields. The three fields comprise a gas field, a continuous liquid field, and an entrained liquid field. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer as well as momentum transfer. The fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving the wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between these flow conditions. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect the code's stability. A further study would be required to enhance the code's capability in this regard.
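The phase-depletion issue noted at the end suggests the usual numerical guard of flooring and renormalizing the field volume fractions; the sketch below shows one generic way to do this (the floor value is an assumption for illustration, not something taken from the pilot code):

```python
def limit_phase_depletion(alphas, floor=1e-6):
    """Generic guard against complete phase depletion: clip each field's
    volume fraction (gas, continuous liquid, entrained liquid) to a
    small floor, then renormalize so the fractions sum to one. The
    floor value is an illustrative assumption."""
    clipped = [max(a, floor) for a in alphas]
    total = sum(clipped)
    return [a / total for a in clipped]
```

Keeping every field's fraction strictly positive prevents the degenerate equations that arise when a phase vanishes entirely from a cell.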

  16. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for the formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and produces as its result a state transition system modelling...

  17. Simulating the Water Use of Thermoelectric Power Plants in the United States: Model Development and Verification

    Science.gov (United States)

    Betrie, G.; Yan, E.; Clark, C.

    2016-12-01

    Thermoelectric power plants use more freshwater than any sector except agriculture. However, there is a scarcity of information characterizing the freshwater use of these plants in the United States, which can be attributed to the lack of the models and data required to conduct analysis and gain insights. Competition for freshwater among sectors will increase in the future as the available freshwater becomes limited due to climate change and population growth. A model that makes use of less data is urgently needed to conduct analysis and identify adaptation strategies. The objectives of this study are to develop a model and simulate the water use of thermoelectric power plants in the United States. The developed model has heat-balance, climate, cooling-system, and optimization modules. It computes the amount of heat rejected to the environment, estimates the quantity of heat exchanged with the environment through latent and sensible heat, and computes the amount of water required per unit generation of electricity. To verify the model, we simulated a total of 876 fossil-fired, nuclear, and gas-turbine power plants with different cooling systems (CS) using 2010-2014 data obtained from the Energy Information Administration. The CS include once-through with cooling pond, once-through without cooling pond, recirculating with induced draft, and recirculating with natural draft. The results show that the model reproduced the observed water use per unit generation of electricity for most of the power plants. It is also noticed that the model slightly overestimates the water use during the summer period, when the input water temperatures are higher. We are investigating the possible reasons for the overestimation and will address them in future work. The model could be used individually or coupled to regional models to analyze various adaptation strategies and improve the water use efficiency of thermoelectric power plants.
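The heat-balance core of such a model can be sketched in a few lines; all parameter values below (heat rate, latent fraction, latent heat) are generic illustrative numbers, not the study's:

```python
def evaporative_water_use(heat_rate_btu_per_kwh=10000.0,
                          latent_heat_kj_per_kg=2260.0,
                          latent_fraction=0.8):
    """Heat-balance sketch: heat rejected per kWh of electricity is the
    fuel heat input minus the 3600 kJ delivered as electricity; the
    latent fraction of that rejected heat evaporates cooling water.
    All parameter values are generic illustrations, not the study's."""
    heat_in_kj = heat_rate_btu_per_kwh * 1.055      # Btu -> kJ
    heat_rejected_kj = heat_in_kj - 3600.0          # 1 kWh = 3600 kJ
    return latent_fraction * heat_rejected_kj / latent_heat_kj_per_kg  # kg water/kWh
```

With these defaults the estimate comes out near 2.5 kg of evaporated water per kWh, a plausible order of magnitude for recirculating systems; the summer overestimation the authors mention would enter through the latent fraction's dependence on input water temperature.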

  18. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  19. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are suggested every year by industry and academic researchers, who claim that they can improve the accuracy of measurements [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time-consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as a productivity-enhancement tool during the research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment in which new ideas can be rapidly evaluated long before the real implementation.
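As a taste of what a behavioural pulse-oximeter model must reproduce, the classic "ratio of ratios" estimate with a generic linear calibration can be written as follows (the coefficients are textbook illustrations, not the paper's model):

```python
def spo2_from_ratio(ac_red, dc_red, ac_ir, dc_ir):
    """'Ratio of ratios' R from the pulsatile (AC) and baseline (DC)
    photoplethysmogram components at red and infrared wavelengths, with
    a generic linear calibration SpO2 ~ 110 - 25*R. The coefficients
    are illustrative, not the paper's behavioural model."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return max(0.0, min(100.0, 110.0 - 25.0 * r))
```

A behavioural model lets a new algorithm be exercised against synthetic AC/DC signals, including noise and motion artefacts, before any hardware exists.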

  20. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
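One plausible way to turn measured model-vs-machine discrepancies into probability-specific safety buffers, as the abstract describes, is a one-sided Gaussian quantile; this sketch shows the idea only and is not necessarily the authors' exact statistical procedure:

```python
from statistics import NormalDist, mean, stdev

def safety_buffer(discrepancies_cm, collision_probability):
    """Illustrative buffer estimate: fit a normal distribution to the
    measured model-vs-machine distance discrepancies and take the
    one-sided quantile corresponding to the target collision
    probability. The paper's actual procedure may differ."""
    dist = NormalDist(mean(discrepancies_cm), stdev(discrepancies_cm))
    return dist.inv_cdf(1.0 - collision_probability)
```

Lower target collision probabilities (0.001% vs 0.1%) push the quantile further into the tail and therefore demand larger clearance buffers.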

  2. Contaminant transport in groundwater in the presence of colloids and bacteria: model development and verification.

    Science.gov (United States)

    Bekhit, Hesham M; El-Kordy, Mohamed A; Hassan, Ahmed E

    2009-09-01

    Colloids and bacteria (microorganisms) naturally exist in groundwater aquifers and can significantly impact contaminant migration rates. A conceptual model is first developed to account for the different physicochemical and biological processes, reaction kinetics, and different transport mechanisms of the combined system (contaminant-colloids-bacteria). All three constituents are assumed to be reactive, with the reactions taking place between each constituent and the porous medium and also among the different constituents. A general linear kinetic reaction model is assumed for all reactive processes considered. The mathematical model is represented by fourteen coupled partial differential equations describing mass balance and reaction processes. Two of these equations describe colloid movement and reactions with the porous medium, four equations describe bacterial movement and reactions with colloids and the porous medium, and the remaining eight equations describe contaminant movement and its reactions with bacteria, colloids, and the porous medium. The mass balance equations are numerically solved for two-dimensional groundwater systems using a third-order, total variation diminishing (TVD) scheme for the advection terms. Due to the complex coupling of the equations, they are solved iteratively each time step until a convergence criterion is met. The model is tested against experimental data and the results are favorable.
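The iterative per-time-step coupling described above is essentially a fixed-point (Gauss-Seidel-style) sweep over the equation blocks; a generic sketch of that control flow, with each block reduced to a single scalar update for clarity:

```python
def coupled_step(state, solvers, tol=1e-8, max_iter=50):
    """Generic fixed-point (Gauss-Seidel-style) sweep over coupled
    equation blocks: each named solver updates its variable from the
    current state; iterate until the largest change falls below tol."""
    for _ in range(max_iter):
        new_state = dict(state)
        for name, solve in solvers.items():
            new_state[name] = solve(new_state)
        change = max(abs(new_state[k] - state[k]) for k in state)
        state = new_state
        if change < tol:
            return state
    raise RuntimeError("coupling iteration did not converge")
```

For example, the coupled pair x = y/2 + 1 and y = x/2 converges under this sweep to (4/3, 2/3); in the paper's setting each "solver" would advance one block of the fourteen coupled PDEs.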

  3. Development and verification of fission gas release model for the design and analysis of future fuel

    International Nuclear Information System (INIS)

    Ku, Yang Hyun; Sohn, Dong Sung.

    1997-08-01

    A mechanistic model has been developed to predict the release behavior of fission gas during steady-state and transient conditions for both LWR UO 2 and MOX fuel. Under the assumption that the UO 2 grain surface is composed of fourteen identical circular faces and that a grain edge bubble can be represented by a triangulated tube around the circumference of three circular grain faces, it introduces the concept of continuous formation of open grain-edge tunnels in proportion to grain edge swelling. In addition, it takes into account the interaction between the gas release from matrix to grain boundary and the reintroduction of gas atoms into the matrix by the irradiation-induced re-solution of grain face bubbles. It also treats analytically the behavior of intragranular, intergranular, and grain edge bubbles under the assumption that both intragranular and intergranular bubbles are uniform in both radius and number density. The effect of contact pressure between cladding and pellet on the intergranular bubble's storage capacity of fission gas has been considered. (author). 43 refs., 4 tabs., 35 figs.

  4. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
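The core of any hydrostatic column calculation is a static pressure balance along the wellbore. The sketch below is a hedged illustration of that balance, not the SPR HCM itself: it computes a wellhead pressure from an assumed cavern pressure and a stack of fluid segments (e.g. nitrogen over oil over brine) whose densities and heights are invented for illustration:

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(cavern_pressure_pa, segments):
    """Wellhead pressure for a static fluid column.

    segments: list of (density_kg_m3, height_m) tuples, wellhead down to cavern.
    Balance: P_wellhead = P_cavern - sum(rho_i * g * h_i).
    """
    head = sum(rho * G * h for rho, h in segments)
    return cavern_pressure_pa - head
```

In a leak-detection setting, the measurable quantity is the wellhead pressure; a drift inconsistent with the hydrostatic head of the declared column is what flags anomalous behavior.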

  5. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. 1 Project background and state of the art: Over the next 10 years all Danish railway ... standards [2] for railways to use formal (i.e. mathematical) logic and models for the unambiguous description of requirements and designs as well as for exhaustive verification, as they give a higher assurance of safety compared to conventional methods. The use of domain-specific methods is another trend ... can be combined and used for an efficient development and verification of new fail-safe systems. The expected result is a methodology for using domain-specific, formal languages, techniques and tools for more efficient development and verification of robust software for railway control systems ...

  6. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
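IVSEM's integration of detection estimates across sensor technologies can be illustrated with the standard combination rule for independent detectors: an event is missed only if every subsystem misses it. This is a hedged sketch of the general idea, not IVSEM's actual algorithm (which also models location accuracy and identification capability):

```python
def integrated_detection_probability(p_subsystems):
    """Probability that at least one of several independent monitoring
    technologies detects an event, given per-technology detection
    probabilities (e.g. seismic, infrasound, radionuclide, hydroacoustic)."""
    miss = 1.0
    for p in p_subsystems:
        if not 0.0 <= p <= 1.0:
            raise ValueError("probabilities must lie in [0, 1]")
        miss *= 1.0 - p  # event escapes detection only if every subsystem misses
    return 1.0 - miss
```

The independence assumption is the simplification here; real technologies share failure modes (e.g. evasive testing degrades several subsystems at once), which is exactly why a tool like IVSEM models the interactions explicitly.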

  7. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  8. A semi-empirical model of the direct methanol fuel cell performance. Part I. Model development and verification

    Science.gov (United States)

    Argyropoulos, P.; Scott, K.; Shukla, A. K.; Jackson, C.

    A model equation is developed to predict the cell voltage versus current density response of a liquid feed direct methanol fuel cell (DMFC). The equation is based on a semi-empirical approach in which methanol oxidation and oxygen reduction kinetics are combined with effective mass transport coefficients for the fuel cell electrodes. The model equation is validated against experimental data for a small-scale fuel cell and is applicable over a wide range of methanol concentrations and temperatures.
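The record does not reproduce the authors' equation, but semi-empirical polarization models of this kind typically combine a Tafel-type activation term, an ohmic term, and a mass-transport term. The sketch below uses one widely cited generic form of that combination; the parameter values are hypothetical, chosen only to produce a plausible-looking curve, and are not the fitted DMFC coefficients from the paper:

```python
import math

def cell_voltage(j, e0=0.7, b=0.05, r=2.0e-4, m=3.0e-5, n=8.0e-3):
    """Generic semi-empirical polarization curve V(j) for a fuel cell:
    open-circuit term minus activation (Tafel), ohmic, and mass-transport
    losses. j is current density in mA/cm^2; all parameters illustrative."""
    if j <= 0:
        raise ValueError("current density must be positive")
    return e0 - b * math.log10(j) - r * j - m * math.exp(n * j)
```

Fitting the handful of lumped parameters to measured V-j data is what makes such a model "semi-empirical": the functional form encodes the physics, while the coefficients absorb electrode-specific mass-transport effects.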

  9. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  10. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  11. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    There have been very few mesoscale modelling studies of the Indian monsoon with a focus on the verification and intercomparison of operational real-time forecasts. With the exception of Das et al. (2008), most of the studies in the literature are either case studies of tropical cyclones and thunderstorms or the sensitivity ...

  12. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  13. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the guideline to be verified against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined from a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. 
    Copyright 2010 Elsevier Inc.
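At its core, the model checking step described above asks whether any reachable state of the guideline model violates a requirement. A minimal explicit-state sketch shows the principle (the state names are hypothetical and this is not the GBDSSGenerator tool chain): breadth-first search over a transition system, returning a counterexample path when a "bad" state is reachable:

```python
from collections import deque

def check_safety(initial, transitions, is_bad):
    """Explicit-state reachability check. Returns a counterexample path to a
    bad state, or None if no bad state is reachable (safety property holds)."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            # reconstruct the shortest path from the initial state
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in transitions.get(state, ()):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None
```

A safety requirement such as "two conflicting treatments are never active together" maps onto the `is_bad` predicate; real model checkers add temporal-logic properties and symbolic state representations on top of this idea.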

  15. MACCS2 development and verification efforts

    International Nuclear Information System (INIS)

    Young, M.; Chanin, D.

    1997-01-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.

  16. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  17. TU electric reactor model verification

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1989-01-01

    Power reactor benchmark calculations using the code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles include gadolinia as a burnable absorber, natural uranium axial blankets, and increased water-to-fuel ratio. The calculated results for both low-power physics tests (boron end points, control rod worths, and isothermal temperature coefficients) and full-power operation (power distributions and boron letdown) are compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important physics parameters for power reactors

  18. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with estimation of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
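The published model has five cardiac compartments plus metabolism and transport; as a much-reduced illustration of the same PBPK machinery, the sketch below integrates a two-compartment, perfusion-limited plasma/heart system with forward Euler. Every parameter value here is invented for illustration and does not come from the paper:

```python
def simulate_pbpk(dose_mg, t_end_h, dt_h=0.001,
                  v_plasma_l=3.0, v_heart_l=0.3,
                  q_heart_l_h=15.0, kp=2.0, cl_l_h=30.0):
    """Forward-Euler simulation of a minimal perfusion-limited PBPK model:
    plasma <-> heart-tissue exchange plus systemic clearance from plasma.
    Returns (plasma_conc, heart_conc) in mg/L at t_end. Illustrative only."""
    a_p = dose_mg      # drug amount in plasma (IV bolus at t = 0)
    a_h = 0.0          # drug amount in heart tissue
    for _ in range(int(t_end_h / dt_h)):
        c_p = a_p / v_plasma_l
        c_h = a_h / v_heart_l
        # perfusion-limited exchange: arterial inflow, venous return at C_h/Kp
        j_in = q_heart_l_h * c_p
        j_out = q_heart_l_h * c_h / kp
        elim = cl_l_h * c_p
        a_p += dt_h * (j_out - j_in - elim)
        a_h += dt_h * (j_in - j_out)
    return a_p / v_plasma_l, a_h / v_heart_l
```

With clearance switched off, mass is conserved and the tissue-to-plasma concentration ratio settles at the partition coefficient Kp, which is the basic behavior any larger PBPK structure must also reproduce.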

  19. Verification of a model for predicting the effect of inconstant temperature on embryonic development of lake whitefish (Coregonus clupeaformis)

    Science.gov (United States)

    Berlin, William H.; Brooke, L.T.; Stone, Linda J.

    1977-01-01

    Eggs stripped from lake whitefish (Coregonus clupeaformis) spawning in Lake Michigan were incubated in the laboratory at temperatures similar to those on whitefish spawning grounds in Lake Michigan during December-April. Observed times from fertilization to attainment of each of 21 developmental stages were used to test a model that predicts the rate of development at daily fluctuating temperatures; the model relates rate of development for any given stage j, expressed as the reciprocal of time (Rj), to temperature (T). The generalized equation for a developmental stage has the form Rj = a·b^T·T^c, where a, b, and c are constants fitted for each stage.
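Predictions under fluctuating temperature in models of this family rest on rate summation: each day contributes a fraction R(T) of the stage's development, and the stage is complete when the fractions sum to one. The sketch below assumes a rate law of the form R = a·b^T·T^c with made-up coefficients; it illustrates the bookkeeping only, not the fitted whitefish constants:

```python
def development_rate(t_celsius, a=0.0005, b=1.05, c=1.2):
    """Daily development rate R = a * b**T * T**c (fraction of stage per day).
    The coefficients here are hypothetical, for illustration only."""
    return a * (b ** t_celsius) * (t_celsius ** c) if t_celsius > 0 else 0.0

def days_to_stage(daily_temps):
    """Rate summation: accumulate each day's fractional development until the
    stage is complete (accumulated R reaches 1). Returns the day count, or
    None if the stage is not reached within the temperature record."""
    total = 0.0
    for day, t in enumerate(daily_temps, start=1):
        total += development_rate(t)
        if total >= 1.0:
            return day
    return None
```

Because the rate enters additively day by day, the same machinery handles constant and inconstant temperature regimes, which is precisely what the verification against fluctuating lake temperatures tests.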

  20. Algorithm development and verification of UASCM for multi-dimension and multi-group neutron kinetics model

    International Nuclear Information System (INIS)

    Si, S.

    2012-01-01

    The Universal Algorithm of Stiffness Confinement Method (UASCM) for neutron kinetics model of multi-dimensional and multi-group transport equations or diffusion equations has been developed. The numerical experiments based on transport theory code MGSNM and diffusion theory code MGNEM have demonstrated that the algorithm has sufficient accuracy and stability. (authors)

  1. Effective Development and Verification of Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This document presents a method for effective development of software for a product line of similar railway control systems. The software is constructed in three steps: first a specification in a domain-specific language is created, then a formal behavioural controller model is automatically created from the specification, and finally the model is compiled into executable object code. Formal verification is performed automatically by tools at three levels: (1) the specification is checked to follow the rules of the domain, (2) the controller model is checked to ensure safety, and (3) ...

  2. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in most modern train techniques of the transportation system. How to ensure the security of MVB has become an important issue, and traditional testing cannot ensure system correctness. This paper is concerned with modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.
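The HCPN modeling above builds on the basic Petri net firing rule: a transition is enabled when its input places hold enough tokens, and firing it consumes those tokens and produces tokens on its output places. A minimal place/transition sketch (the place names are hypothetical and not taken from the paper's MVB model):

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from input places (pre) and produce
    tokens on output places (post). Returns the new marking."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    new = dict(marking)
    for p, n in pre.items():
        new[p] -= n
    for p, n in post.items():
        new[p] = new.get(p, 0) + n
    return new
```

Colored and hierarchical Petri nets enrich this rule with typed tokens and nested subnets, but verification still reduces to exploring the markings reachable through sequences of such firings.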

  3. Development and Verification of a Mouse Model for Roux-en-Y Gastric Bypass Surgery with a Small Gastric Pouch

    Science.gov (United States)

    Hao, Zheng; Zhao, Zhiyun; Berthoud, Hans-Rudolf; Ye, Jianping

    2013-01-01

    Existing mouse models of Roux-en-Y gastric bypass (RYGB) surgery are not comparable to human RYGB in gastric pouch volume: the pouch is either too large or absent. The aim of this study was to develop and characterize a mouse RYGB model that closely replicates the gastric pouch size of human RYGB surgery, about 5% of total gastric volume. We established this model in diet-induced obese (DIO) mice of C57BL/6J. This surgery resulted in a sustained 30% weight loss, entirely accounted for by decreased fat mass but not lean mass, compared to sham-operated mice on the high fat diet. Compared to sham-operated mice, energy expenditure corrected for total body weight was significantly increased by about 25%, and substrate utilization was shifted toward higher carbohydrate utilization at 8 weeks after RYGB when body weight had stabilized at the lower level. The energy expenditure persisted and carbohydrate utilization was even more pronounced when the mice were fed chow diet. Although significantly increased during daytime, overall locomotor activity was not significantly different. In response to cold exposure, RYGB mice exhibited an improved capacity to maintain body temperature. In insulin tolerance tests, exogenous insulin-induced suppression of plasma glucose levels was significantly greater in RYGB mice at 4 weeks after surgery. Paradoxically, food intake measured at 5 weeks after surgery was significantly increased, possibly in compensation for increased fecal energy loss and energy expenditure. In conclusion, this new model is a viable alternative to existing murine RYGB models and matches human RYGB surgery in anatomy. This model will be useful for studying molecular mechanisms involved in the beneficial effects of RYGB on body weight and glucose homeostasis. PMID:23326365

  4. Development and verification of a mouse model for Roux-en-Y gastric bypass surgery with a small gastric pouch.

    Directory of Open Access Journals (Sweden)

    Zheng Hao

    Existing mouse models of Roux-en-Y gastric bypass (RYGB) surgery are not comparable to human RYGB in gastric pouch volume: the pouch is either too large or absent. The aim of this study was to develop and characterize a mouse RYGB model that closely replicates the gastric pouch size of human RYGB surgery, about 5% of total gastric volume. We established this model in diet-induced obese (DIO) mice of C57BL/6J. This surgery resulted in a sustained 30% weight loss, entirely accounted for by decreased fat mass but not lean mass, compared to sham-operated mice on the high fat diet. Compared to sham-operated mice, energy expenditure corrected for total body weight was significantly increased by about 25%, and substrate utilization was shifted toward higher carbohydrate utilization at 8 weeks after RYGB when body weight had stabilized at the lower level. The energy expenditure persisted and carbohydrate utilization was even more pronounced when the mice were fed chow diet. Although significantly increased during daytime, overall locomotor activity was not significantly different. In response to cold exposure, RYGB mice exhibited an improved capacity to maintain body temperature. In insulin tolerance tests, exogenous insulin-induced suppression of plasma glucose levels was significantly greater in RYGB mice at 4 weeks after surgery. Paradoxically, food intake measured at 5 weeks after surgery was significantly increased, possibly in compensation for increased fecal energy loss and energy expenditure. In conclusion, this new model is a viable alternative to existing murine RYGB models and matches human RYGB surgery in anatomy. This model will be useful for studying molecular mechanisms involved in the beneficial effects of RYGB on body weight and glucose homeostasis.

  5. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and, potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) develop a preliminary design for a prototype NDA system, (2) refine PNNL's MCNP models of the NDA system, and (3) procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  6. Development, Verification and Use of Gust Modeling in the NASA Computational Fluid Dynamics Code FUN3D

    Science.gov (United States)

    Bartels, Robert E.

    2012-01-01

    This paper presents the implementation of gust modeling capability in the CFD code FUN3D. The gust capability is verified by computing the response of an airfoil to a sharp-edged gust and comparing this result with the theoretical result. The present simulations are also compared with other CFD gust simulations. This paper additionally serves as a user's manual for FUN3D gust analyses using a variety of gust profiles. Finally, the development of an Auto-Regressive Moving-Average (ARMA) reduced order gust model using a gust with a Gaussian profile in the FUN3D code is presented. ARMA-simulated results of a sequence of one-minus-cosine gusts are shown to compare well with the same gust profile computed with FUN3D. Proper Orthogonal Decomposition (POD) is combined with the ARMA modeling technique to predict the time-varying pressure coefficient increment distribution due to a novel gust profile. The aeroelastic response of a pitch/plunge airfoil to a gust environment is computed with a reduced order model and compared with a direct simulation of the system in the FUN3D code. The two results are found to agree very well.
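A reduced order model like the ARMA one described above replaces the full CFD simulation with a cheap recurrence fitted to its output. As a hedged simplification (a pure AR(2) model with no moving-average terms, fitted by least squares through the 2x2 normal equations), the sketch below recovers the coefficients of a known second-order recurrence from its samples:

```python
def fit_ar2(y):
    """Least-squares fit of an AR(2) model y[t] ≈ a1*y[t-1] + a2*y[t-2],
    solving the 2x2 normal equations in closed form."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(y)):
        x1, x2 = y[t - 1], y[t - 2]
        s11 += x1 * x1
        s12 += x1 * x2
        s22 += x2 * x2
        b1 += x1 * y[t]
        b2 += x2 * y[t]
    det = s11 * s22 - s12 * s12
    return (s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det

def predict_next(y, a1, a2):
    """One-step-ahead prediction from the fitted AR(2) coefficients."""
    return a1 * y[-1] + a2 * y[-2]
```

Once fitted to a training gust response, such a recurrence predicts the response to new gust sequences at negligible cost, which is the point of the reduced order approach.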

  7. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

Analysis of multiprocessor system-on-chips is a major challenge due to the freedom of interrelated choices concerning the application level, the configuration of the execution platform and the mapping of the application onto this platform. We present a discrete model of computation for such systems and characterize the size of the computation tree it suffices to consider when checking for schedulability. The computational model provides a basis for formal analysis of systems. The model is translated to timed automata, and a tool for system verification and simulation has been developed using Uppaal as backend.

  8. Verification of the NWP models operated at ICM, Poland

    Science.gov (United States)

    Melonek, Malgorzata

    2010-05-01

Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since that time, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office, run as a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present, two non-hydrostatic NWP models are running in a quasi-operational regime. The main model, a new UM with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is run four times a day. The second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System) developed by the US Naval Research Laboratory, configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is run twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e. COAMPS computed on the 13 km mesh and UM, are verified against observations from Polish synoptic stations. Verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and wind direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. For precipitation, contingency tables for different thresholds are computed, and some of the verification scores, such as FBI, ETS, POD and FAR, are graphically presented. The verification sample covers nearly one year.
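The verification statistics named above are standard and straightforward to compute; a minimal sketch follows, with made-up forecast/observation pairs for illustration.

```python
import numpy as np

def continuous_scores(forecast, observed):
    """Mean Error, Mean Absolute Error and Root Mean Squared Error."""
    d = np.asarray(forecast, float) - np.asarray(observed, float)
    return {"ME": d.mean(), "MAE": np.abs(d).mean(), "RMSE": np.sqrt((d ** 2).mean())}

def precipitation_scores(forecast, observed, threshold):
    """FBI, ETS, POD and FAR from a 2x2 contingency table at a given threshold."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    n = f.size
    # Hits expected by chance, used by the Equitable Threat Score.
    hits_random = (hits + false_alarms) * (hits + misses) / n
    return {
        "FBI": (hits + false_alarms) / (hits + misses),
        "POD": hits / (hits + misses),
        "FAR": false_alarms / (hits + false_alarms),
        "ETS": (hits - hits_random) / (hits + false_alarms + misses - hits_random),
    }

fc = [0.0, 2.0, 5.0, 1.0, 0.0, 3.0]   # illustrative 12-h precipitation forecasts
ob = [0.0, 1.5, 4.0, 0.0, 0.5, 2.5]   # corresponding observations
print(continuous_scores(fc, ob))
print(precipitation_scores(fc, ob, threshold=1.0))
```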

  9. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematically based method for automatically checking formalized requirements against the system.

  10. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

This report presents a compilation of the following three test reports prepared by TRW for the Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993; (2) Combustor Cold Flow Model Report, dated August 28, 1992; (3) Coal Feed System Cold Flow Model Report, dated October 28, 1992. In this compilation, the three reports are presented in one volume consisting of three parts, with TRW proprietary information excluded.

  11. Effect of Terrestrial and Marine Organic Aerosol on Regional and Global Climate: Model Development, Application, and Verification with Satellite Data

    Energy Technology Data Exchange (ETDEWEB)

    Meskhidze, Nicholas; Zhang, Yang; Kamykowski, Daniel

    2012-03-28

In this DOE project, improvements to the parameterization of marine primary organic matter (POM) emissions, the hygroscopic properties of marine POM, marine isoprene-derived secondary organic aerosol (SOA) emissions, surfactant effects, and a new cloud droplet activation parameterization have been implemented into the Community Atmosphere Model (CAM 5.0) with the seven-mode aerosol module from the Pacific Northwest National Laboratory (PNNL)'s Modal Aerosol Model (MAM7). The effects of marine aerosols derived from sea spray and ocean-emitted biogenic volatile organic compounds (BVOCs) on the microphysical properties of clouds were explored by conducting 10-year CAM5.0-MAM7 model simulations at a grid resolution of 1.9° by 2.5° with 30 vertical layers. The model-predicted relationship between ocean physical and biological systems and the abundance of CCN in the remote marine atmosphere was compared to data from the A-Train satellites (MODIS, CALIPSO, AMSR-E). Model simulations show that, on average, primary and secondary organic aerosol emissions from the ocean can yield up to a 20% increase in Cloud Condensation Nuclei (CCN) at 0.2% supersaturation, and up to 5% increases in the droplet number concentration of global maritime shallow clouds. Marine organics were treated as internally or externally mixed with sea salt. Changes associated with cloud properties reduced (in absolute value) the model-predicted shortwave cloud forcing from -1.35 W m^-2 to -0.25 W m^-2. By using different emission scenarios and droplet activation parameterizations, this study suggests that the addition of marine primary aerosols and biologically generated reactive gases makes an important difference in radiative forcing assessments. All baseline and sensitivity simulations for 2001 and 2050 using global-through-urban WRF/Chem (GU-WRF) were completed. The main objective of these simulations was to evaluate the capability of GU-WRF for an accurate representation of the global atmosphere by exploring the most accurate

  12. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  13. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  14. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  16. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Science.gov (United States)

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  17. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  18. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examinations of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  19. Demonstrations and verification of debris bed models in the MEDICI reactor cavity code

    International Nuclear Information System (INIS)

    Trebilcock, W.R.; Bergeron, K.D.; Gorham-Bergeron, E.D.

    1984-01-01

The MEDICI reactor cavity model is under development at Sandia National Laboratories to provide a simple but realistic treatment of ex-vessel severe accident phenomena. Several demonstration cases have been run and are discussed as illustrations of the model's capabilities. Verification of the model against experiments has provided confidence in the model.

  20. Model of supercapacitor with laboratory verification

    Directory of Open Access Journals (Sweden)

    Simić Mitar S.

    2014-01-01

In this paper the historical development and the most important production technologies of supercapacitors are presented. A simple model of a supercapacitor, based on parameters obtained from the manufacturer's datasheet, is also realized. The proposed model is verified with charge, discharge and self-discharge tests, with results close to the experimental ones.
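A datasheet-level supercapacitor model of the kind described typically reduces to an ideal capacitance C with a series equivalent resistance (ESR) and a parallel leakage resistance. The sketch below uses hypothetical parameter values and a simple explicit-Euler time step; it is a generic illustration, not the authors' actual model.

```python
import numpy as np

def simulate_supercap(capacitance, esr, r_leak, i_load, v0, t_end, dt=0.1):
    """Explicit-Euler simulation of a first-order supercapacitor model:
    ideal capacitor C with series resistance ESR and parallel leakage R_leak.
    i_load(t) > 0 charges the cell; terminal voltage = v_c + ESR * i."""
    steps = round(t_end / dt)
    t = np.arange(steps + 1) * dt
    v_c = np.empty(steps + 1)     # internal capacitor voltage
    v_term = np.empty(steps + 1)  # terminal voltage seen at the leads
    v_c[0] = v0
    v_term[0] = v0 + esr * i_load(0.0)
    for k in range(steps):
        i = i_load(t[k])
        dv = (i - v_c[k] / r_leak) / capacitance  # KCL at the capacitor node
        v_c[k + 1] = v_c[k] + dv * dt
        v_term[k + 1] = v_c[k + 1] + esr * i_load(t[k + 1])
    return t, v_term

# Hypothetical datasheet values: 10 F, 25 mOhm ESR, 10 kOhm leakage.
# Charge at 1 A for 50 s, then open-circuit self-discharge.
t, v = simulate_supercap(10.0, 0.025, 1.0e4,
                         i_load=lambda t: 1.0 if t < 50.0 else 0.0,
                         v0=0.0, t_end=100.0)
print(f"voltage after charge phase: {v[500]:.3f} V")
```

The charge, discharge and self-discharge tests mentioned in the record correspond to running such a simulation with different `i_load` profiles and comparing against measured terminal voltage.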

  1. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. Tritium is therefore a crucial safety issue in this fission reactor system: it is vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process, and a tool that enables this analysis is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Due to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it is able to solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
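As an aside, in steady state a permeation model of this kind reduces to the standard Richardson expression for diffusion-limited hydrogen-isotope permeation through a metal wall. The one-liner below is a generic textbook sketch with hypothetical numbers, not BOTANIC's implementation.

```python
import math

def permeation_flux(permeability, thickness, p_upstream, p_downstream):
    """Steady-state Richardson permeation flux through a metal wall,
    J = (phi / d) * (sqrt(p_up) - sqrt(p_down)),
    valid in the diffusion-limited regime for diatomic hydrogen isotopes."""
    return permeability / thickness * (math.sqrt(p_upstream) - math.sqrt(p_downstream))

# Hypothetical values: permeability phi in mol m^-1 s^-1 Pa^-0.5, a 1 mm wall,
# 100 Pa tritium partial pressure upstream and vacuum downstream.
phi = 1.0e-10
J = permeation_flux(phi, 1.0e-3, p_upstream=100.0, p_downstream=0.0)
print(f"flux = {J:.3e} mol m^-2 s^-1")
```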

  2. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, a review by other analysts is widely used for verification. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, modeling a system for reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that of the RGGG (Reliability Graph with General Gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. However, it is also known that an RGGG model can be converted to an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.
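The spirit of such a verification, comparing two independent formulations of the same system, can be illustrated with a toy fault tree of independent basic events: an analytic gate-by-gate evaluation is cross-checked against exhaustive enumeration of basic-event states. The event probabilities and tree structure below are hypothetical.

```python
from itertools import product

def or_gate(*p):
    """Probability that at least one independent input fails."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """Probability that all independent inputs fail."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical fault tree: top event = (A or B) and (C or D).
p = {"A": 0.01, "B": 0.02, "C": 0.005, "D": 0.01}

top_analytic = and_gate(or_gate(p["A"], p["B"]), or_gate(p["C"], p["D"]))

# Independent check by exhaustive enumeration of basic-event states,
# playing the role of the second model in a cross-verification.
def top_event(a, b, c, d):
    return (a or b) and (c or d)

top_enum = 0.0
for states in product([0, 1], repeat=4):
    prob = 1.0
    for s, name in zip(states, "ABCD"):
        prob *= p[name] if s else 1.0 - p[name]
    if top_event(*states):
        top_enum += prob

print(f"analytic={top_analytic:.6e}  enumeration={top_enum:.6e}")
```

Agreement of the two numbers is exact here because the tree has no repeated events; with shared events, the enumeration (or a BDD-based method) remains correct while naive gate-by-gate evaluation does not.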

  3. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  4. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented where this model measures and validates at the ...

  5. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  6. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  7. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-12-10

...built a software development environment used to support the development and verification of on-board software for spacecraft...

  8. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  9. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  10. Study on formal modeling and verification of safety computer platform

    Directory of Open Access Journals (Sweden)

    Xi Wang

    2016-05-01

With the development of automatic control and communication technology, communication-based train control systems are adopted by more and more urban mass transit systems to automatically supervise the train speed to follow a desired trajectory. Taking reliability, availability, maintainability, and safety into consideration, a 2 × 2-out-of-2 safety computer platform is usually utilized as the hardware platform of safety-critical subsystems in communication-based train control. To enhance the safety integrity level of the safety computer platform, safety-related logic has to be verified before being integrated into practical systems. Therefore, a significant problem in developing a safety computer platform is how to guarantee that system behaviors will satisfy the functional requirements and respond to external events and processes within the required time limits. Based on qualitative and quantitative analysis of functional and timing characteristics, this article introduces a formal modeling and verification approach for this real-time system. In the proposed method, a timed automata network model of the 2 × 2-out-of-2 safety computer platform is built, and system requirements are specified and formalized as computation tree logic properties which can be verified by the UPPAAL model checker.

  11. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

The article presents the results of verification of a dynamic model of a drive system with gears. Tests were carried out on the real object in different operating conditions, and simulation studies were carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations using the dynamic model.

  12. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

Improving validation and verification (V&V) has been one of the most important tasks of the century. However, we have few measures or management interventions to make such improvement possible, and it is difficult to identify patterns that should be followed by developers because systems and processes in an...

  13. BProVe: A formal verification framework for business process models

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

Business Process Modelling has acquired increasing relevance in software development. Available notations, such as BPMN, permit the description of the activities of complex organisations. On the one hand, this shortens the communication gap between domain experts and IT specialists. On the other hand, it permits clarification of the characteristics of software systems introduced to provide automatic support for such activities. Nevertheless, the lack of a formal semantics hinders the automatic verification of relevant properties. This paper presents a novel verification framework for BPMN 2.0, called BProVe. It is based on an operational semantics, implemented using MAUDE, devised to make the verification general and effective. A complete tool chain, based on the Eclipse modelling environment, allows for rigorous modelling and analysis of Business Processes. The approach has been validated using more than one...

  14. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies, and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus, pragmatic features and attributes can be determined that could be relevant for evaluation purposes of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a feature for a specific model.

  15. Fossil Fuel Emission Verification Modeling at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Cameron-Smith, P; Kosovic, B; Guilderson, T; Monache, L D; Bergmann, D

    2009-08-06

    We have an established project at LLNL to develop the tools needed to constrain fossil fuel carbon dioxide emissions using measurements of the carbon-14 isotope in atmospheric samples. In Figure 1 we show the fossil fuel plumes from Los Angeles and San Francisco for two different weather patterns. Obviously, a measurement made at any given location is going to depend on the weather leading up to the measurement. Thus, in order to determine the GHG emissions from some region using in situ measurements of those GHGs, we use state-of-the-art global and regional atmospheric chemistry-transport codes to simulate the plumes: the LLNL-IMPACT model (Rotman et al., 2004) and the WRFCHEM community code (http://www.wrf-model.org/index.php). Both codes can use observed (aka assimilated) meteorology in order to recreate the actual transport that occurred. The measured concentration of each tracer at a particular spatio-temporal location is a linear combination of the plumes from each region at that location (for non-reactive species). The challenge is to calculate the emission strengths for each region that fit the observed concentrations. In general this is difficult because there are errors in the measurements and modeling of the plumes. We solve this inversion problem using the strategy illustrated in Figure 2. The Bayesian Inference step combines the a priori estimates of the emissions, and their uncertainty, for each region with the results of the observations, and their uncertainty, and an ensemble of model predicted plumes for each region, and their uncertainty. The result is the mathematical best estimate of the emissions and their errors. In the case of non-linearities, or if we are using a statistical sampling technique such as a Markov Chain Monte Carlo technique, then the process is iterated until it converges (ie reaches stationarity). 
For the Bayesian inference we can use both a direct inversion capability, which is fast but requires assumptions of linearity and
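The Gaussian special case of the Bayesian inference step described above can be sketched in a few lines: with a linear plume (observation) matrix, a priori emissions with prior covariance, and an observation-error covariance, the posterior is the standard best linear unbiased update. All matrices and numbers below are invented for illustration; the real system uses ensembles of simulated plumes and far larger state vectors.

```python
import numpy as np

# Hypothetical 2-region example: columns of H are the model-predicted
# concentrations per unit emission from each region (the "footprints").
H = np.array([[0.8, 0.1],
              [0.2, 0.7],
              [0.5, 0.5]])          # 3 observation sites x 2 regions
x_prior = np.array([10.0, 20.0])    # a priori emission estimates
P = np.diag([4.0, 9.0])             # prior uncertainty (variances)
R = 0.25 * np.eye(3)                # observation + model error covariance

x_true = np.array([12.0, 18.0])
rng = np.random.default_rng(0)
y = H @ x_true + rng.normal(0.0, 0.5, size=3)   # synthetic observations

# Gaussian Bayesian update (best linear unbiased estimate)
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)       # gain matrix
x_post = x_prior + K @ (y - H @ x_prior)
P_post = P - K @ H @ P               # posterior covariance

print("posterior emissions:", x_post)
print("posterior std devs :", np.sqrt(np.diag(P_post)))
```

The observations shrink the posterior variances relative to the prior, which is the quantitative content of "the mathematical best estimate of the emissions and their errors."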

  16. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2:A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-04-01

Shortwave solar radiation was available, but ERDC chose to let the model calculate it internally because this produced better results (SROC = OFF...point temperature, wind speed, wind direction, cloud cover, and shortwave solar radiation (if available) 3.1 Model geometry 3.1.1 Bathymetry data...higher value accounts for the turbidity of the lake. The BETA parameter determines the fraction of incident solar radiation absorbed at the water

  17. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language, which have been prototypically implemented in the Dymola and Open-Modelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  18. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  19. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their models in their own hydro codes and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results from different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments.
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
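The header-driven automation described above might look like the following sketch. The header field names, the `DATA:` marker, and the input-deck format are all invented for illustration; they are not the actual HED conventions.

```python
# Hypothetical sketch of the automation idea: read experiment meta-data
# from a standardized file header and emit a simulation input deck.
# Field names and deck format are invented, not the actual HED format.

def parse_header(lines):
    """Collect 'key = value' meta-data lines until the 'DATA:' marker."""
    meta = {}
    for line in lines:
        if line.strip() == "DATA:":
            break
        if "=" in line:
            key, value = line.split("=", 1)
            meta[key.strip()] = value.strip()
    return meta

def make_input_deck(meta):
    """Render a minimal input deck from the header meta-data."""
    return "\n".join([
        f"explosive  {meta['explosive']}",
        f"density    {float(meta['density_g_cc']):.3f}",
        f"p_shock    {float(meta['impact_pressure_GPa']):.2f}",
        f"run_time   {float(meta['run_time_us']):.1f}",
    ])

header = """\
explosive = PBX-9502
density_g_cc = 1.890
impact_pressure_GPa = 12.5
run_time_us = 4.0
DATA:
0.0 0.0
"""

deck = make_input_deck(parse_header(header.splitlines()))
print(deck)
```

With headers in a standard format, the same two functions could drive simulations over every experiment in the database, which is the automation payoff the abstract argues for.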

  20. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data in the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as for transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, a verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.
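A conceptual numerical benchmark of the kind mentioned above can be as simple as steady one-dimensional conduction with uniform heat generation, where a finite-difference solution is checked against the analytic solution. This generic sketch is not the AGREE model; material values are invented, and the point is only the verification pattern (computed vs. exact).

```python
import numpy as np

# Generic code-verification benchmark (not the actual AGREE model):
# steady 1-D conduction with uniform heat generation q in a slab,
# T(0) = T(L) = T0, compared against the analytic solution
# T(x) = T0 + q*x*(L - x)/(2*k).
k, q, L, T0 = 30.0, 2.0e6, 0.05, 600.0     # W/m-K, W/m^3, m, K (invented)
n = 51                                     # grid points
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]

# Assemble the finite-difference system  -k * d2T/dx2 = q
A = np.zeros((n, n))
b = np.full(n, q * dx**2 / k)
A[0, 0] = A[-1, -1] = 1.0                  # Dirichlet boundary rows
b[0] = b[-1] = T0
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = -1.0
    A[i, i] = 2.0

T_num = np.linalg.solve(A, b)
T_exact = T0 + q * x * (L - x) / (2.0 * k)
err = np.max(np.abs(T_num - T_exact))
print("max abs error [K]:", err)
```

Because the exact solution is quadratic, the second-order scheme reproduces it to round-off, a useful sanity property when building conduction benchmarks of this type.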

  1. Verification of Orthogrid Finite Element Modeling Techniques

    Science.gov (United States)

    Steeve, B. E.

    1996-01-01

The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such "short cuts" is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  2. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They have an extremely complex structure, combining a slow, laminar porous matrix and small fissures with fast, usually turbulent conduits (karst channels). Apart from simple lumped hydrological models that ignore this strong karst heterogeneity, fully hydraulic (distributed) models have been developed, based on conventional finite element and finite volume methods, that account for the complete heterogeneity structure and improve our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers faces many difficulties, such as lack of knowledge of the heterogeneity (especially the conduits), resolution of different spatial and temporal scales, connectivity between matrix and conduits, and setting of appropriate boundary conditions, among many others. A particular problem of karst flow modeling is the verification of distributed models under real aquifer conditions, owing to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6 x 2.6 x 2 m) consists of a concrete structure, a rainfall platform, 74 piezometers, 2 reservoirs, and other supply equipment. The model is filled with fine sand (the 3-D porous matrix) and plastic drainage pipes (the 1-D conduits). This setup provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as the location and geometry of the conduits. Moreover, the geometry of the conduit perforations is known, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge rates from both phases can be measured very accurately. Such information is not available at real sites, which makes this physical model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions, such as different

  3. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    The systematic error in the 850 hPa temperature indicates that largely the WRF model forecasts feature warm bias and the MM5 model forecasts feature cold bias. Features common to all the three models include warm bias over northwest India and cold bias over southeast peninsula. The 850 hPa specific humidity forecast ...

  4. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  5. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    In this book, the authors present current research on the types, design and safety issues of railways. Topics discussed include the acoustic characteristics of noise in train stations; monitoring railway structure conditions and opportunities to use wireless sensor networks as tools to improve...... the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity...

  6. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

A Monte Carlo burnup calculation code system has been developed to accurately evaluate the various quantities required in the back-end field. Using the nuclide compositions measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel rods in fuel assemblies irradiated in a commercial BWR in the Netherlands, analyses have been performed to verify the code system. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margins assumed in present criticality analyses for LWR spent fuel. (J.P.N.)
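The deterministic core of any burnup step is the solution of the Bateman depletion equations, dN/dt = A N. As a minimal sketch (not the code system above), the following solves a hypothetical two-member decay chain 1 → 2 with a matrix exponential built from an eigendecomposition, and cross-checks it against the analytic Bateman solution; the decay constants are invented.

```python
import numpy as np

# Illustrative depletion step: two-member decay chain 1 -> 2 with
# (hypothetical) decay constants l1, l2, solved as N(t) = expm(A*t) @ N0.
l1, l2 = 0.693, 0.0693            # 1/s (invented values)
A = np.array([[-l1, 0.0],
              [ l1, -l2]])
N0 = np.array([1.0e10, 0.0])      # initial atom inventories
t = 5.0

# Matrix exponential via eigendecomposition (A has distinct real eigenvalues)
w, V = np.linalg.eig(A * t)
N = (V @ np.diag(np.exp(w)) @ np.linalg.inv(V)).real @ N0

# Analytic Bateman solution for the same chain
N1 = N0[0] * np.exp(-l1 * t)
N2 = N0[0] * l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))
print("matrix exponential:", N)
print("analytic Bateman  :", N1, N2)
```

Agreement between the two is exactly the kind of verification check (code vs. analytic solution) that precedes validation against measured nuclide compositions such as the ARIANE data.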

  7. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour of a re...

  8. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

Full Text Available The results of nitrogen oxide reduction in the combustion process using primary methods are presented in this paper. The reduction of NOx emission by recirculation of combustion gases and by staging of fuel and air was investigated, and then the reduction of NOx emission by simultaneous use of the above-mentioned primary methods with pulsatory disturbances. The investigation comprises numerical modeling of NOx reduction and experimental verification of the obtained numerical results.

  9. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  10. User verification of the FRBR conceptual model

    OpenAIRE

    Pisanski, Jan; Žumer, Maja

    2015-01-01

Purpose - The paper aims to build on a previous study of mental models of the bibliographic universe, which found that the Functional Requirements for Bibliographic Records (FRBR) conceptual model is intuitive. Design/methodology/approach - A total of 120 participants were presented with a list of bibliographic entities and six graphs each. They were asked to choose the graph they thought best represented the relationships between the entities described. Findings - The graph based on the FRBR ...

  11. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...... as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practical experiments with the robot, the UKF is demonstrated to improve the reliability of the sensor signals...

  12. A verification test method for system survival probability assessment models

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

Owing to limitations of funding and test conditions, the number of test samples for large complex systems is often small. Under single-sample conditions, making an accurate evaluation of system performance is important for the reinforcement of complex systems. The technical maturity of an assessment model can be significantly improved if the model can be experimentally validated. This paper presents a test method for verifying a system survival probability assessment model: using the sample test results of the system, the method verifies the correctness of the assessment model and of the a priori information. (authors)
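The prior-plus-sparse-test idea can be illustrated with a conjugate Beta-Bernoulli update, a simplified stand-in for the paper's assessment model (which the source does not specify): a Beta prior on the survival probability is updated with a handful of pass/fail results. All numbers are hypothetical.

```python
from math import isclose

# Illustrative sketch of combining prior information with sparse test data:
# a Beta(a, b) prior on the survival probability updated with pass/fail
# trials (conjugate Beta-Bernoulli update). All numbers are hypothetical.
a, b = 8.0, 2.0            # prior: mean 0.8 from earlier-stage information
passes, fails = 1, 0       # a single full-system test, one success

a_post = a + passes        # conjugate update: add successes to a,
b_post = b + fails         # failures to b
mean_post = a_post / (a_post + b_post)
print("posterior mean survival probability:", mean_post)
```

The single test nudges the prior mean from 0.8 toward 1; conversely, a posterior that moves sharply away from the prior after one test is itself a signal that the a priori information deserves scrutiny, which is the verification angle of the abstract.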

  13. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

Full Text Available Model checking has successfully been applied to verification of security protocols, but the modeling process is always tedious, and proficient knowledge of formal methods is needed even though the final verification can be automatic depending on the specific tool. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to do WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort on employing model checking for automatic analysis of authentication protocols for WBAN.
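Under the hood, tools like PAT perform explicit-state exploration: enumerate every reachable state of the model and check a property in each. The toy below illustrates only that idea, on an invented three-node token ring with the invariant "exactly one node holds the token"; it is not CSP# and bears no relation to the actual WSN/WBAN models.

```python
from collections import deque

# Toy explicit-state "model checker": breadth-first search over all
# reachable states of a small transition system, checking an invariant
# in every state. The protocol (a 3-node token ring) is invented.

def successors(state):
    """Pass the token from its holder to the next node in the ring."""
    i = state.index(1)
    nxt = list(state)
    nxt[i] = 0
    nxt[(i + 1) % len(state)] = 1
    yield tuple(nxt)

def check_invariant(initial, invariant):
    """Return (holds, states_explored) for BFS from the initial state."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return False, len(seen)      # counterexample state found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, len(seen)

ok, n = check_invariant((1, 0, 0), lambda s: sum(s) == 1)
print("invariant holds:", ok, "| reachable states:", n)
```

Real checkers add process-algebra front-ends, partial-order reduction, and temporal-logic properties, but the exhaustive reachable-state search is the common core.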

  14. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. 
The majority of these models are not currently available

  15. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification analysis techniques, it is necessary to develop plant models in order to describe the plant behavior of those systems. However, developing a plant model implies that the designer must make decisions concerning the granularity and level of abstraction of the models, the modeling approach to adopt (global or modular), and the definition of strategies for the simulation and formal verification tasks. This paper highlights some aspects that can be considered when making those decisions. For this purpose, a case study is presented, and important aspects concerning the issues above are illustrated and discussed.

  16. A program for verification of phylogenetic network models.

    Science.gov (United States)

    Gunawan, Andreas D M; Lu, Bingxin; Zhang, Louxin

    2016-09-01

Genetic material is transferred in a non-reproductive manner across species more frequently than commonly thought, particularly among bacteria. On one hand, extant genomes are thus more properly considered as a fusion product of both reproductive and non-reproductive genetic transfers; this has motivated researchers to adopt phylogenetic networks to study genome evolution. On the other hand, a gene's evolution is usually tree-like and has been studied for over half a century. Accordingly, the relationships between phylogenetic trees and networks are the basis for the reconstruction and verification of phylogenetic networks. One important problem in verifying a network model is determining whether or not certain existing phylogenetic trees are displayed in a phylogenetic network. This problem is formally called the tree containment problem. It is NP-complete even for binary phylogenetic networks. We design an exponential-time but efficient method for determining whether or not a phylogenetic tree is displayed in an arbitrary phylogenetic network. It is developed on the basis of the so-called reticulation-visible property of phylogenetic networks. A C program is available for download at http://www.math.nus.edu.sg/∼matzlx/tcp_package. Contact: matzlx@nus.edu.sg. Supplementary data are available at Bioinformatics online.

  17. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  18. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

Full Text Available Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double-spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as a UPPAAL model. The model is used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
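The double-spending probability the model targets has a well-known analytic counterpart: Nakamoto's 2008 estimate of the probability that an attacker controlling a fraction q of the hash power ever overtakes the honest chain after z confirmations, against which a formal model's output can be cross-checked.

```python
from math import exp, factorial

# Nakamoto's (2008) closed-form estimate of double-spend success:
# the attacker's lead is Poisson-distributed with mean lam = z*q/p,
# and a deficit of z-k blocks is overcome with probability (q/p)^(z-k).
def double_spend_probability(q, z):
    p = 1.0 - q
    if q >= p:
        return 1.0                      # majority attacker always wins
    lam = z * q / p
    s = 1.0
    for k in range(z + 1):
        poisson = lam**k * exp(-lam) / factorial(k)
        s -= poisson * (1.0 - (q / p)**(z - k))
    return s

for z in (0, 1, 2, 6):
    print(f"q=0.10, z={z}: {double_spend_probability(0.10, z):.6f}")
```

For q = 0.1 the probability falls below 0.1% by six confirmations, which is the origin of the conventional six-block waiting rule.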

  19. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  20. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Contract FA9453-15-1-0315. This report covers the development of a torque sensor for verification and validation (V&V) of spacecraft attitude control actuators. The developed sensor directly

  1. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
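The traditional accuracy-based measures mentioned above are straightforward to sketch. The following computes bias, RMSE, and anomaly correlation for a model time series against observations on synthetic data; it illustrates the metric definitions only and is not LVT's actual implementation.

```python
import numpy as np

# Minimal sketch of accuracy-based evaluation metrics of the kind a
# toolkit like LVT computes (definitions only; synthetic data).
def bias(model, obs):
    return np.mean(model - obs)

def rmse(model, obs):
    return np.sqrt(np.mean((model - obs) ** 2))

def anomaly_correlation(model, obs):
    ma, oa = model - model.mean(), obs - obs.mean()
    return np.sum(ma * oa) / np.sqrt(np.sum(ma**2) * np.sum(oa**2))

rng = np.random.default_rng(42)
obs = np.sin(np.linspace(0, 4 * np.pi, 200))           # "observed" signal
model = obs + 0.1 + rng.normal(0.0, 0.2, obs.size)     # biased, noisy model

print(f"bias = {bias(model, obs):+.3f}")
print(f"rmse = {rmse(model, obs):.3f}")
print(f"corr = {anomaly_correlation(model, obs):.3f}")
```

The value of a framework lies in applying such metrics uniformly across datasets, scales, and native formats; the metrics themselves are this simple.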

  2. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

It becomes more complicated when considering the shape and phase of the ground below the seawater; therefore, different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces ongoing code-development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH), together with verification work based on some practice simulations. The newly developed code so far covers the equations of motion and the heat conduction equation, and verification of each model is complete. In addition, parallel computation using GPUs is now possible, and a GUI has also been prepared. By changing the input geometry or input values, users can simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometries, and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, application of the current SPH code is expected to extend much further, including to molten fuel behavior in severe accidents.
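The mesh-free interpolation at the heart of SPH can be shown in a few lines: field values at a particle are kernel-weighted sums over its neighbours, with no grid. The sketch below performs a one-dimensional density summation with a cubic-spline kernel over uniformly spaced particles and recovers the nominal density in the interior; it is a generic illustration, not the SNU code.

```python
import numpy as np

# 1-D SPH density summation: rho_i = sum_j m_j * W(x_i - x_j, h),
# with the standard cubic-spline kernel (1-D normalization 2/(3h)).
def cubic_spline_w(r, h):
    q = np.abs(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return (2.0 / (3.0 * h)) * w

dx = 0.1
h = 1.3 * dx                                # smoothing length
x = np.arange(0.0, 10.0 + dx / 2, dx)       # uniform particle positions
m = 1.0 * dx                                # particle mass so that rho = 1
rho = np.array([np.sum(m * cubic_spline_w(x - xi, h)) for xi in x])

interior = rho[10:-10]                      # away from the free ends
print("interior density mean:", interior.mean())
```

The interior density comes out within about 1% of the nominal value, while the ends are deficient because particles there lack neighbours on one side, which is precisely why free surfaces are natural (and boundaries tricky) in SPH.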

  3. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the Functional Mock-up Interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
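The statistical model checking idea can be sketched independently of the tool-chain: simulate N independent runs of a stochastic model and take the fraction satisfying the property as the estimate, with N chosen from the Chernoff-Hoeffding bound so that the estimate is within eps of the truth with confidence 1 - delta. The "system" below is an invented bounded random walk, not an SoS model.

```python
import random
from math import ceil, log

# Chernoff-Hoeffding bound: P(|p_hat - p| > eps) <= 2*exp(-2*N*eps^2),
# so N >= ln(2/delta) / (2*eps^2) runs suffice.
def required_runs(eps, delta):
    return ceil(log(2.0 / delta) / (2.0 * eps**2))

def one_run(rng):
    """Toy stochastic 'system': a random walk must stay below 4 for 20 steps."""
    level = 0
    for _ in range(20):
        level = max(0, level + rng.choice((-1, 1)))
        if level >= 4:
            return False          # property violated on this run
    return True

eps, delta = 0.01, 0.01
N = required_runs(eps, delta)
rng = random.Random(1)
p_hat = sum(one_run(rng) for _ in range(N)) / N
print(f"N = {N} runs, estimated satisfaction probability = {p_hat:.3f}")
```

The appeal for SoS is that each run only requires simulating the (heterogeneous) models, never enumerating their joint state space, at the cost of a probabilistic rather than exhaustive guarantee.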

  4. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification means quantifying and reducing uncertainties in parameters, models and measurements, and propagating these uncertainties through the model, so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved in [2, 3, 8], describe the viral infection dynamics of HIV disease. They are also used to make predictive estimates of viral loads and T-cell counts and to construct optimal controls for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, meaning that they cannot be uniquely determined from the observations. Such unidentifiable parameters can be partially removed by performing parameter selection, a process that determines which parameters have minimal impact on the model response. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification.
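    One simple form of parameter selection is to rank parameters by how strongly the model response depends on them. The sketch below is a generic finite-difference sensitivity ranking on an invented three-parameter model, not the dissertation's exact method; parameters with small scaled sensitivities are candidates for fixing before calibration.

```python
import numpy as np

def model(params, t):
    """Toy response y(t) = a*exp(-b*t) + c, a stand-in for a model output curve."""
    a, b, c = params
    return a * np.exp(-b * t) + c

def sensitivity_ranking(params, t, rel_step=1e-6):
    """Rank parameters by the scaled L2 norm of dy/dtheta_k (finite differences).
    Parameters with small norms barely influence the response and are
    candidates for fixing or removal during parameter selection."""
    base = model(params, t)
    norms = []
    for k, p in enumerate(params):
        step = rel_step * max(abs(p), 1.0)
        bumped = list(params)
        bumped[k] = p + step
        dy = (model(bumped, t) - base) / step
        norms.append(float(np.linalg.norm(p * dy)))  # scale by the nominal value
    order = sorted(range(len(params)), key=lambda k: -norms[k])
    return order, norms

t = np.linspace(0.0, 5.0, 50)
order, norms = sensitivity_ranking([2.0, 1.0, 0.001], t)
```

    Here the offset c = 0.001 contributes almost nothing to the response, so it ranks last, flagging it as the kind of practically unidentifiable parameter a selection step would remove before calibration.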

  5. Development and verification of a new wind speed forecasting system using an ensemble Kalman filter data assimilation technique in a fully coupled hydrologic and atmospheric model

    Science.gov (United States)

    Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle

    2013-12-01

    Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic, inherently intermittent nature of wind. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility that includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system: the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface to form the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields, including hydraulic conductivity, soil moisture, and wind speed, and demonstrated that reductions in uncertainty in these coupled fields realized through assimilation of soil moisture observations propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies and maximize the utility of the PF.WRF-DART forecasting system.
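    The analysis step of the stochastic ensemble Kalman filter that DART provides can be sketched compactly. The example below is a generic, small-dimensional illustration (not PF.WRF-DART): it assimilates an observation of one state variable and lets ensemble correlations update a second, coupled variable; all numbers are invented.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_sd, H, rng):
    """Stochastic EnKF analysis step: x_a = x_f + K (y + noise - H x_f),
    with the Kalman gain K = P H^T (H P H^T + R)^-1 built from the ensemble."""
    n = ensemble.shape[0]
    Hx = ensemble @ H.T                              # predicted observations
    X = ensemble - ensemble.mean(axis=0)
    Y = Hx - Hx.mean(axis=0)
    P_HT = X.T @ Y / (n - 1)                         # state-obs cross covariance
    S = Y.T @ Y / (n - 1) + np.diag([obs_err_sd**2] * len(obs))
    K = P_HT @ np.linalg.inv(S)
    y_pert = obs + rng.normal(0.0, obs_err_sd, size=(n, len(obs)))
    return ensemble + (y_pert - Hx) @ K.T

rng = np.random.default_rng(1)
# Forecast ensemble for a 2-variable state (think soil moisture, wind speed);
# only variable 0 is observed, and the coupling lets the update correct both.
ens = rng.normal([0.30, 8.0], [0.05, 1.0], size=(200, 2))
ens[:, 1] += 10.0 * (ens[:, 0] - 0.30)               # invented coupling
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, np.array([0.35]), obs_err_sd=0.01, H=H, rng=rng)
```

    Because the observation error (0.01) is small relative to the forecast spread (0.05), the analysis mean of the observed variable moves close to 0.35, and the unobserved variable is nudged through the ensemble covariance, the same mechanism by which assimilated soil moisture observations can constrain wind forecasts in a coupled system.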

  6. Development and verification of a time delivery model for prostate intensity modulated radiotherapy using a Siemens® Artiste™ 160 Multi-leaf Collimator Linac.

    Science.gov (United States)

    Fourie, Nicola; Ali, Omer A; Rae, William I D

    2017-03-01

    Time delivery models proposed thus far for predicting radiotherapy delivery times are not applicable to all makes of Linac. Our purpose was to develop a delivery time model that would also be applicable to a Siemens® ARTISTE™ 160 Multi-leaf Collimator (MLC) linear accelerator (Linac) and to validate the model using prostate Intensity Modulated Radiation Therapy (IMRT) treatment plans. To our knowledge, a delivery time model has not yet been proposed for a Siemens® ARTISTE™ 160 MLC Linac. We used the principles of the delivery time model created for a Varian® Linac, added a radio frequency (RF) wave component, and added the MLC delay time to the MLC travel time component. Machine input parameters were confirmed using a WIN® stopwatch. We tested our derived model on ten randomly selected 15 MV prostate IMRT treatment plans from our clinic. The delivery time was measured three times, once per day on three different days. The calculated and measured times were compared by means of correlation. The delivery time ranged between 314 and 480 s. The largest percentage difference was 3.3% (16 s) and the smallest 0.2% (1 s); the mean percentage difference was 1.9%. MLC delay and MLC speed, representing segment delivery, had the greatest uncertainties. From the successfully verified delivery time model, it is concluded that the inter-segmental component of the process is the most time-consuming. To decrease delivery time, it is proposed that the total number of segments in a treatment plan be decreased.
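    An additive delivery-time model of the kind described can be sketched as follows. The component structure (beam-on time from MU and dose rate, per-segment MLC travel and delay, an RF setup term) follows the abstract, but every coefficient and the example plan are hypothetical, not fitted values for any particular Linac.

```python
def delivery_time(segments, dose_rate_mu_s, mlc_speed_cm_s, mlc_delay_s, rf_setup_s):
    """Additive step-and-shoot IMRT delivery-time model (illustrative):
    total = RF setup + sum over segments of
            beam-on time (MU / dose rate)
            + largest leaf travel / MLC speed
            + a fixed per-segment MLC delay.
    `segments` is a list of (monitor_units, max_leaf_travel_cm) tuples."""
    total = rf_setup_s
    for mu, travel_cm in segments:
        total += mu / dose_rate_mu_s         # beam-on component
        total += travel_cm / mlc_speed_cm_s  # inter-segment MLC travel
        total += mlc_delay_s                 # per-segment delay overhead
    return total

# Hypothetical plan: 60 identical segments of 12 MU with 4 cm of leaf travel.
plan = [(12.0, 4.0)] * 60
t_total = delivery_time(plan, dose_rate_mu_s=5.0, mlc_speed_cm_s=2.0,
                        mlc_delay_s=1.5, rf_setup_s=10.0)
```

    For these invented coefficients the total is 364 s, inside the 314-480 s range reported, and the per-segment overhead terms dominate the beam-on time, consistent with the conclusion that the inter-segmental component is the most time-consuming.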

  7. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED for use in biomechanics modeling for spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  8. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to cases where surface wave heights are small compared to depth (e.g., natural or man-made inland lakes), because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  9. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    Science.gov (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis, and not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level in the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  10. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  11. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing chemistry and head data, the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) model structure is verified at Long Shot and Cannikin, where the high-resolution bathymetric data collected by CRESP
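    The backward-then-forward propagation that MCMC provides reduces, in its simplest form, to a random-walk Metropolis sampler. The sketch below conditions a single invented parameter on invented observations; it illustrates the generic algorithm, not the Amchitka model or its data.

```python
import math
import random

def metropolis(log_post, theta0, prop_sd, n_iter, seed=0):
    """Random-walk Metropolis: the chain converges to the posterior of log_post."""
    random.seed(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_iter):
        cand = theta + random.gauss(0.0, prop_sd)
        lp_cand = log_post(cand)
        if math.log(random.random()) < lp_cand - lp:  # accept/reject step
            theta, lp = cand, lp_cand
        chain.append(theta)
    return chain

# Invented example: condition a porosity-like parameter on three observations,
# prior N(0.3, 0.1^2), Gaussian likelihood with sd 0.05.
obs = [0.22, 0.26, 0.24]

def log_post(th):
    lp = -0.5 * ((th - 0.3) / 0.1) ** 2
    lp += sum(-0.5 * ((y - th) / 0.05) ** 2 for y in obs)
    return lp

chain = metropolis(log_post, theta0=0.3, prop_sd=0.05, n_iter=20000)
post_mean = sum(chain[5000:]) / len(chain[5000:])
```

    For this conjugate Gaussian setup the posterior mean is known in closed form (about 0.245), so the chain can be verified against it, the same consistency check one would apply before trusting the sampler on a real groundwater model.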

  12. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    In this paper, we combine formal modeling and analysis of infrastructures of organizations with sociological explanation to provide a framework for insider threat analysis. We use the higher order logic (HOL) proof assistant Isabelle/HOL to support this framework. In the formal model, we exhibit and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider theory constructed in Isabelle that implements this process of social explanation. To validate that the social explanation is generally useful for the analysis of insider threats and to demonstrate our framework, we model and verify the insider threat patterns of entitled independent and Ambitious Leader...

  13. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

    No matter how much its importance is emphasized, the exact assessment of the absorbed doses administered to patients in radiotherapy for various diseases, such as the recently soaring malignant tumors, is of the greatest importance. In reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment is a pressing issue to be promptly addressed (in general, doses administered to patients in radiotherapy are very large, about three times higher than lethal doses). Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, the legal and regulatory systems to implement a quality assurance program are not sufficiently stipulated, and the medical facilities do not employ sufficient qualified personnel who could run a program to maintain quality assurance and control of the generators and equipment used for radiotherapy. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered in radiotherapy, together with the procedures needed to maintain the continuing performance of radiotherapy machines and equipment. The QA program and procedures should support proper calibration of the machines and equipment and definitively establish patient safety in radiotherapy. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients, and verification of the performance of the therapy machines and equipment are performed.

  14. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and hardware/software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on automated analysis of system properties, and by producing inputs to be fed into these engines; interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation; customisable source-code generation respecting coding standards and conventions; and software performance-tuning optimisation through automated...

  15. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

    This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J., Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that have traditionally been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement: Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  16. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was completed in March 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrial complex. This paper describes the advanced design features adopted in the SMART design and introduces the design and engineering verification program. In the beginning stage of SMART development, top-level requirements for safety and economics were imposed on the SMART design features. To meet the requirements, highly advanced design features enhancing safety, reliability, performance, and operability are introduced in the SMART design. SMART consists of proven KOFA (Korea Optimized Fuel Assembly) fuel, helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. To enhance safety characteristics, innovative design features adopted in the SMART system include low core power density, a large negative Moderator Temperature Coefficient (MTC), high natural circulation capability, and an integral arrangement that eliminates large-break loss-of-coolant accidents. The progression of emergency situations into accidents is prevented by a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary...

  17. Analog Video Authentication and Seal Verification Equipment Development

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Lancaster

    2012-09-01

    Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory, in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory and Milagro Consulting, LLC, developed equipment for use within a chain of custody regime. This paper discusses two specific devices, the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained throughout is also discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain of custody applications. However, in some applications analog cameras are used. Since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain of custody and authentication activities. This seal reader is unique in its ability to image various types of seals, including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.

  18. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.
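    One of the simplest probabilistic verification scores to adapt from the forecast community is the continuous ranked probability score (CRPS). The sketch below computes the standard sample-based CRPS estimator for an ensemble; the ensembles and observation are invented.

```python
def crps_ensemble(members, obs):
    """Sample-based continuous ranked probability score:
    CRPS = mean|x_i - obs| - 0.5 * mean over pairs |x_i - x_j|  (lower is better)."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return term1 - term2

# Two invented 5-member streamflow ensembles for an observed value of 10.0:
sharp = [9.8, 10.0, 10.2, 10.1, 9.9]   # well centred, narrow
broad = [6.0, 8.0, 10.0, 12.0, 14.0]   # well centred, wide
crps_sharp = crps_ensemble(sharp, 10.0)
crps_broad = crps_ensemble(broad, 10.0)
```

    The sharp ensemble scores about 0.04 against 0.8 for the broad one: unlike error computed on the ensemble mean (identical for both ensembles here), CRPS rewards calibrated sharpness, which is why such scores are preferred when evaluating the full ensemble.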

  19. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.

  20. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    Full Text Available A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components—a deformation model involving elastic and plastic deformations; a damage model; and a failure model. The model is driven by tabular data that is generated either using laboratory tests or via virtual testing. A unidirectional composite—T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  1. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Full Text Available Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple and highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where the threat of toxic contamination with the hydraulic fluid must be avoided, for example in the food or pharmaceutical industries. A solution here can be a Pneumatic Adaptive Absorber (PAA), which is characterized by high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double-chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be defined as polytropic with thermodynamic coefficients that are constant in time. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.
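    The chamber-coupling idea can be illustrated with a much-reduced model. The sketch below integrates mass exchange between two fixed-volume gas chambers through a valve, assuming isothermal ideal gas behavior and a linearized valve law; it is not the paper's model (which updates its thermodynamic coefficients in time), and all coefficients are invented.

```python
def simulate_two_chamber(p1, p2, v1, v2, valve_coeff, dt, steps):
    """Explicit integration of gas exchange between two fixed-volume chambers.
    Assumes isothermal ideal gas (pressure proportional to mass in a fixed
    volume) and a linearized valve law: flow ~ valve_coeff * (p1 - p2)."""
    history = []
    for _ in range(steps):
        q = valve_coeff * (p1 - p2) * dt  # transferred quantity this step
        p1 -= q / v1                      # donor chamber pressure falls
        p2 += q / v2                      # receiver chamber pressure rises
        history.append((p1, p2))
    return history

# 300 kPa vs 100 kPa chambers of equal volume: pressures equalise at 200 kPa.
hist = simulate_two_chamber(300e3, 100e3, v1=1e-3, v2=1e-3,
                            valve_coeff=2.5e-4, dt=1e-3, steps=20000)
p1_end, p2_end = hist[-1]
```

    With equal volumes the scheme conserves the pressure sum exactly, and both chambers settle at 200 kPa; experimental verification of the real model proceeds in the same spirit, comparing simulated pressure histories against measured ones.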

  2. Development and preliminary verification of the 3D core neutronic code: COCO

    Energy Technology Data Exchange (ETDEWEB)

    Lu, H.; Mo, K.; Li, W.; Bai, N.; Li, J. [Reactor Design and Fuel Management Research Center, China Nuclear Power Technology Research Inst., 47F/A Jiangsu Bldg., Yitian Road, Futian District, Shenzhen (China)

    2012-07-01

    With its recent booming economic growth and ensuing environmental concerns, China is proactively pushing forward nuclear power development and encouraging the tapping of clean energy. Under this situation, CGNPC, as one of the largest energy enterprises in China, is planning to develop its own nuclear technology in order to support the increasing number of nuclear plants either under construction or in operation. This paper introduces the recent progress in software development at CGNPC. The focus is placed on the physical models and preliminary verification results from the recent development of the 3D core neutronic code COCO. In the COCO code, the non-linear Green's function method is employed to calculate the neutron flux. In order to use the discontinuity factor, the Neumann (second kind) boundary condition is utilized in the Green's function nodal method. Additionally, the COCO code includes the other necessary physical models, e.g. a single-channel thermal-hydraulic module, a burnup module, a pin power reconstruction module and a cross-section interpolation module. The preliminary verification results show that the COCO code is sufficient for reactor core design and analysis for pressurized water reactors (PWR). (authors)

  3. Control and verification of industrial hybrid systems using models specified with the formalism χ

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor belt and of a biochemical plant for the production of ethanol are specified in the formalism χ. A verification of the closed-loop systems for those examples, ...

  4. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
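    The verification idea, comparing simulated values against analytically derived stationary points, can be sketched for a Hebbian-style connection-strength model. The update rule below is one commonly assumed form, not necessarily the exact model of the paper; the check confirms that the simulation converges to the analytic equilibrium.

```python
def simulate_hebbian(x1, x2, eta, zeta, w0, dt, steps):
    """Hebbian connection-strength dynamics (one commonly used form):
    dw/dt = eta * (x1 * x2 * (1 - w) - zeta * w)."""
    w = w0
    for _ in range(steps):
        w += dt * eta * (x1 * x2 * (1.0 - w) - zeta * w)
    return w

# Analytic stationary point from dw/dt = 0:  w* = x1*x2 / (x1*x2 + zeta)
x1, x2, zeta = 0.8, 0.9, 0.1
w_star = x1 * x2 / (x1 * x2 + zeta)
w_sim = simulate_hebbian(x1, x2, eta=0.5, zeta=zeta, w0=0.0, dt=0.1, steps=2000)
```

    If the simulated w did not settle at w*, the discrepancy would point to an implementation error, exactly the verification-by-equilibrium check the paper advocates.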

  5. Providing an empirical basis for optimizing the verification and testing phases of software development

    Science.gov (United States)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.
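The classification idea can be illustrated with a toy example. A one-rule threshold classifier stands in here for the paper's multivariate stochastic models, and the component measurements are made up:

```python
# Toy illustration: separate components into low/high fault-density classes
# from a simple complexity measurement. Data and threshold are hypothetical.
components = [
    # (cyclomatic complexity, observed faults per KLOC)
    (3, 0.5), (5, 0.8), (18, 4.2), (4, 0.3), (22, 5.1), (16, 3.8),
]
threshold = 10  # hypothetical cut on complexity

predicted_high = [complexity > threshold for complexity, _ in components]
actual_high = [density > 2.0 for _, density in components]

# Fraction of components whose class was predicted correctly
accuracy = sum(p == a for p, a in zip(predicted_high, actual_high)) / len(components)
print(accuracy)
```

In practice the paper builds such predictors from many partially incomplete metrics; the point of the sketch is only the strategy of concentrating test effort on predicted high-fault-density components.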

  6. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used increasingly to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed by means of the Russian software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verifying a hydrodynamic model based on a comparison of the actual flooded area, determined by automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the Northern Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for the optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower boundaries of the model, respectively, it is possible to calculate the flooded area by means of the STREAM-2D program and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
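The image-classification step of this comparison can be sketched with a tiny unsupervised two-class K-means on pixel brightness. All values below are synthetic illustrations, not data from the study:

```python
# Sketch: classify pixels into water / non-water with 1-D K-means (k=2),
# then compare the observed flooded area with a modelled one.
def kmeans_1d(values, iters=20):
    """Two-cluster K-means on scalar values; centroids start at the extremes."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        a = [v for v in values if abs(v - c0) <= abs(v - c1)]
        b = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return c0, c1

# Synthetic radar backscatter: water pixels dark (~0.1), land bright (~0.7)
pixels = [0.08, 0.12, 0.10, 0.11, 0.65, 0.72, 0.70, 0.68, 0.09, 0.74]
c_water, c_land = kmeans_1d(pixels)
flooded = sum(1 for v in pixels if abs(v - c_water) < abs(v - c_land))

area_observed = flooded * 25.0 ** 2   # hypothetical 25 m pixel size -> m^2
area_modelled = 3200.0                # hypothetical hydrodynamic-model output
rel_error = abs(area_observed - area_modelled) / area_modelled
print(flooded, area_observed, round(rel_error, 3))
```

The real workflow operates on full ISODATA/K-means classifications of SPOT and Resurs imagery and on Radarsat-2 segmentations, but the verification metric, the mismatch between classified and modelled flooded area, has this shape.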

  7. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of the obtained analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of processes in a hybrid solid-propellant rocket engine was performed for the quasi-steady-state flow regime. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  8. Experimental verification of mathematical model of the heat transfer in exhaust system

    Directory of Open Access Journals (Sweden)

    Petković Snežana

    2011-01-01

    Full Text Available A catalytic converter reaches maximal efficiency only at its working temperature. In the cold-start phase the efficiency of the catalyst is low and the exhaust emissions contain high levels of air pollutants. Optimizing the exhaust system to decrease the time needed to reach the catalyst working temperature therefore reduces the total vehicle emissions. Implementation of mathematical models in the development of exhaust systems decreases total costs and reduces development time. A mathematical model has to be experimentally verified and calibrated in order to be useful in the optimization process. Measurement installations have been developed and used for verification of the mathematical model of unsteady heat transfer in exhaust systems. Comparisons between the experimental results and the mathematical model are presented in this paper. Based on the obtained results, it can be concluded that there is good agreement between the model and the experimental results.

  9. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a Distributed Control System (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are managing, tracking, and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  10. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    National Research Council Canada - National Science Library

    Broy, Manfred; Leucker, Martin

    2007-01-01

    The objective of this project is the development of an integrated suite of technologies focusing on end-to-end software development supporting requirements analysis, design, implementation, and verification...

  11. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevents model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper brings an answer to this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  12. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturing of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable influence on the system and will follow the related rules defined by the SaaS provider.

  13. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest, grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data both for summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  14. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    Science.gov (United States)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  15. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment, put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool was developed to solve systems of differential equations describing the transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification, with equations and parameter values, so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
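The kind of differential equation system such a tool solves can be illustrated with a two-member radionuclide decay chain, integrated numerically and checked against the analytic Bateman solution. This is a hedged sketch with made-up decay constants, not data from the report:

```python
import math

# Two-member decay chain: dN1/dt = -l1*N1, dN2/dt = l1*N1 - l2*N2
# Illustrative decay constants [1/yr]; initial inventory N1(0)=1, N2(0)=0.
l1, l2 = 0.3, 0.1
n1, n2, dt = 1.0, 0.0, 1e-4
t_end = 5.0

# Forward-Euler integration (a real tool would use a stiff ODE solver)
for _ in range(int(t_end / dt)):
    dn1 = -l1 * n1
    dn2 = l1 * n1 - l2 * n2
    n1 += dt * dn1
    n2 += dt * dn2

# Analytic Bateman solution for the daughter nuclide, used as the reference
n2_exact = l1 / (l2 - l1) * (math.exp(-l1 * t_end) - math.exp(-l2 * t_end))
assert abs(n2 - n2_exact) < 1e-3
print(round(n2, 4))
```

Comparing numerical output against such closed-form references is the same verification pattern the report applies to the SR 97 biosphere modules and the PSACOIN Level 1B test case.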

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT LASER TOUCH AND TECHNOLOGIES, LLC LASER TOUCH MODEL LT-B512

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Laser Touch model LT-B512 targeting device manufactured by Laser Touch and Technologies, LLC, for manual spray painting operations. The relative transfer efficiency (TE) improved an avera...

  17. Results of the Independent Verification and Validation Study for the D2-Puff Model

    National Research Council Canada - National Science Library

    Bowers, J

    1999-01-01

    .... The independent verification and validation (IV&V) study of D2-Puff Version 2.0.6 focused on these accreditation requirements and the implicit requirement that the model provide safe-sided hazard estimates...

  18. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  19. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    and verification techniques. This dissertation describes two projects, each exploring one particular instance of such languages: monadic second-order logic and its application to program verification, and programming languages for construction of interactive Web services. Both program verification and Web service...... applications, this implementation forms the basis of a verification technique for imperative programs that perform data-type operations using pointers. To achieve this, the basic logic is extended with layers of language abstractions. Also, a language for expressing data structures and operations along...... development are areas of programming language research that have received increased attention during the last years. We first show how the logic Weak monadic Second-order Logic on Strings and Trees can be implemented efficiently despite an intractable theoretical worst-case complexity. Among several other...

  20. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  1. 3D Modeling with Photogrammetry by UAVs and Model Quality Verification

    Science.gov (United States)

    Barrile, V.; Bilotta, G.; Nunnari, A.

    2017-11-01

    This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria) concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years this area was affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images was acquired through a UAV equipped with a digital camera and GPS. Subsequently, the processing for the production of a 3D georeferenced model was performed using the commercial software Agisoft PhotoScan. Similarly, the use of a terrestrial laser scanning technique allowed the production of dense clouds and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between image- and range-based methods was performed.

  2. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    Science.gov (United States)

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for the development of best-practice guidelines. PMID:26296709

  3. Developing a Verification and Training Phantom for Gynecological Brachytherapy System

    Directory of Open Access Journals (Sweden)

    Mahbobeh Nazarnejad

    2012-03-01

    Full Text Available Introduction: Dosimetric accuracy is a major issue in the quality assurance (QA) program for treatment planning systems (TPS). An important contribution to this process has been a proper dosimetry method to guarantee the accuracy of the dose delivered to the tumor. In brachytherapy (BT) of gynecological (Gyn) cancer it is usual to insert a combination of tandem and ovoid applicators with a complicated geometry, which makes their dosimetric verification difficult and important. Therefore, evaluation and verification of the dose distribution is necessary for accurate dose delivery to the patients. Materials and Methods: A solid phantom was made from Perspex slabs as a tool for intracavitary brachytherapy dosimetric QA. Film dosimetry (EDR2) was done for a combination of ovoid and tandem applicators introduced by the Flexitron brachytherapy system. Treatment planning was also done with the Flexiplan 3D-TPS to irradiate films sandwiched between the phantom slabs. Isodose curves obtained from the treatment planning system and from the films were compared with each other in 2D and 3D. Results: The brachytherapy solid phantom was constructed from slabs, into which tandems and ovoids loaded with a radioactive Ir-192 source could subsequently be inserted. The relative error was 3-8.6% and the average relative error was 5.08% in the comparison of the film and TPS isodose curves. Conclusion: Our results showed that the difference between the TPS and the measurements is well within the acceptable boundaries and below the action level according to AAPM TG-45. Our findings showed that this phantom, after minor corrections, can be used as a method of choice for inter-comparison analysis of TPS and to fill the existing gap for an accurate QA program in intracavitary brachytherapy. The constructed phantom also showed that it can be a valuable tool for verification of accurate dose delivery to the patients, as well as for training brachytherapy residents and physics students.

  4. Evaluation and verification of thermal stratification models for waste stabilization ponds

    African Journals Online (AJOL)

    USER

    form bacteria and verification. INTRODUCTION ... bottom. The thermal stratification can be stable, persisting for months, or intermittent, appearing for a few hours in the day (Dor et al., 1993; Pedahzur et al., 1993; Torres et al., 1997). In waste stabilization ponds the .... water depth of 0.2 and a thick sludge deposits. The WSP ...

  5. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

    In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to the appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered to be a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five different methods presented, one method, Gaussian mixture reduction as proposed by Runnalls, easily outperformed the other methods. This method of Gaussian reduction iteratively reduces the size of a GMM by successively merging pairs of component densities. Pairs are selected for merger by using a Kullback-Leibler based metric. Using Runnalls' method of reduction, we
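The pairwise merging step of Runnalls-style mixture reduction can be sketched in the 1-D case (the actual SV system operates on multivariate densities; weights and means below are illustrative). Two components are merged moment-preservingly, and the pair with the smallest KL-based upper-bound cost is selected:

```python
import math

def merge(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussians."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def cost(w1, m1, v1, w2, m2, v2):
    """Runnalls' upper bound on the KL discrepancy incurred by merging."""
    w, _, v = merge(w1, m1, v1, w2, m2, v2)
    return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

# Toy mixture: two nearly identical components plus one distant one
mix = [(0.4, 0.0, 1.0), (0.4, 0.2, 1.0), (0.2, 5.0, 1.0)]  # (weight, mean, var)

# One reduction step: merge the cheapest pair
pairs = [(i, j) for i in range(len(mix)) for j in range(i + 1, len(mix))]
i, j = min(pairs, key=lambda p: cost(*mix[p[0]], *mix[p[1]]))
print(i, j)
```

Iterating this step shrinks the GMM-UBM into the lower-resolution hash GMM described above, with the merge history defining the component shortlist.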

  6. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

    Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the Finite Volume Method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to validate that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside a top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the top wall in the driven square cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: results were compared with the numerical results of Ghia et al. (1982), and good agreement was found. Next, CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection; good agreement was found with the experimental results of Khalili et al. (2008). After the ability of the CFD solver was proved through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
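The mesh-convergence check mentioned at the end can be illustrated with Roache's standard GCI formulas. The three grid solutions below are made-up values, not results from this study:

```python
import math

# Grid-convergence sketch: solutions on three systematically refined grids
f1, f2, f3 = 0.9713, 0.9700, 0.9648  # fine, medium, coarse (illustrative)
r = 2.0                               # constant grid refinement ratio

# Observed order of convergence from the three solutions
p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)

# Grid Convergence Index on the fine grid (safety factor 1.25 for 3 grids)
e21 = (f2 - f1) / f1                  # relative error, fine vs. medium
gci_fine = 1.25 * abs(e21) / (r ** p - 1)
print(round(p, 2), round(gci_fine * 100, 3))  # order and GCI in percent
```

A small GCI on the finest grid, together with an observed order close to the scheme's formal order, is the usual evidence that the solution is mesh-converged.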

  7. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

    The article presents the verification of the forward kinematics of the Fanuc AM100iB robot. The developed kinematic model of the machine was verified using tests on an actual robot. The tests consisted of positioning the robot, operating in the mode of controlling the values of natural angles, at selected points of its workspace and reading the coordinate values of the TCP in the robot's global coordinate system on the operator panel. Validation of the model consisted of entering the same values of natural angles used for positioning the robot into the model's inputs and calculating the coordinate values of the TCP of the machine's CAE model, and then comparing the results obtained with the values read. These results are an introduction to the partial verification of the dynamic model of the analysed device.
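The verification idea generalizes beyond this machine. As a hedged illustration, a 2-link planar arm (standing in for the 6-axis Fanuc model) shows the pattern: compute the TCP position from joint angles by forward kinematics, then compare it with the pose reported by the controller.

```python
import math

def fk(theta1, theta2, l1=0.5, l2=0.3):
    """Forward kinematics of a 2-link planar arm (illustrative link lengths, m)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# TCP predicted by the kinematic model for given joint angles
x, y = fk(math.radians(30), math.radians(45))

# Hypothetical read-out from the operator panel for the same angles
x_reported, y_reported = 0.511, 0.540

# Verification: model prediction must match the read-out within tolerance
assert math.hypot(x - x_reported, y - y_reported) < 5e-3
print(round(x, 3), round(y, 3))
```

For the real robot the same comparison is done in 3D over the full joint set, with the CAE model replacing the closed-form expressions above.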

  8. Experimental verification of the energetic model of the dry mechanical reclamation process

    Directory of Open Access Journals (Sweden)

    R. Dańko

    2008-04-01

    Full Text Available The experimental results of the dry mechanical reclamation process, which constituted the bases for the verification of the energetic model of this process, developed by the author on the grounds of the Rittinger’s deterministic hypothesis of the crushing process, are presented in the paper. Used foundry sands with bentonite, with water-glass from the floster technology and used sands with furan FL 105 resin were used in the reclamation tests. In the mechanical and mechanical-cryogenic reclamation a wide range of time variations and reclamation conditions influencing intensity of the reclamation process – covering all possible parameters used in industrial devices - were applied. The developed theoretical model constitutes a new tool allowing selecting optimal times for the reclamation treatment of the given spent foundry sand at the assumed process intensity realized in rotor reclaimers - with leaves or rods as grinding elements mounted horizontally on the rotor axis.

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.

  10. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ballscrew type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and the electro-magnet. Prototypes of the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and the computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability.

  11. Development and verification test of integral reactor major components

    International Nuclear Information System (INIS)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ballscrew type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and the electro-magnet. Prototypes of the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and the computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability.

  12. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author’s practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. ·         Discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; ·         Provides a thorough introduction to Verilog from both the implementation and verification points of view. ·         Demonstrates how to use IP in applications such as memory controllers and SoC buses. ·         Describes a new ver...

  13. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data...... making the method easy for the railway engineers to use. Furthermore, the method features a 4-step verification and validation approach that can be integrated naturally into different phases of the software development process. This 4-step approach identifies possible errors in generic applications...... or configuration data as early as possible in the software development cycle, and facilitates debugging/troubleshooting if errors are discovered. The proposed method has successfully been applied to case studies of the forthcoming Danish railway interlocking systems that are compatible with the European...

  14. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    Full Text Available This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. A classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers are given at the beginning of the paper. The most important mathematical models, features, algorithms, and real problems that affect measurement accuracy are highlighted. The paper describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and accuracy of the measurement are demonstrated both by computer simulation and by experimental results available in the literature.
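
    A non-iterative acoustic location scheme of the kind described can be sketched by linearizing the sensor sphere equations: subtracting one sensor's arrival equation from the others yields a linear system for the discharge position. The sensor layout, source position and oil sound speed below are illustrative assumptions, not the configuration used in the paper:

```python
import math

def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

V = 1413.0  # acoustic velocity in transformer oil (m/s), typical value
sensors = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 1.5, 0.0), (0.0, 0.0, 1.2)]
source = (1.2, 0.8, 0.5)                      # "true" PD site to recover

toa = [math.dist(source, p) / V for p in sensors]  # simulated arrival times
d = [t * V for t in toa]                           # ranges from arrival times

# Subtracting sensor 0's sphere equation |s - p_i|^2 = d_i^2 from the
# others cancels |s|^2 and leaves a linear system:
#   2 * (p_i - p_0) . s = |p_i|^2 - |p_0|^2 - d_i^2 + d_0^2
p0 = sensors[0]
A, b = [], []
for i in range(1, 4):
    pi = sensors[i]
    A.append([2 * (pi[k] - p0[k]) for k in range(3)])
    b.append(sum(c * c for c in pi) - sum(c * c for c in p0)
             - d[i] ** 2 + d[0] ** 2)
est = solve3(A, b)   # recovered PD location, no iteration required
```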

  15. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

    Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes because of their cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (present: Japan Atomic Energy Agency), uses a variable puff trajectory model to calculate the atmospheric transport and dispersion of released radionuclides. In order to investigate the uncertainties associated with the structure of the atmospheric dispersion/deposition model in OSCAAR, we introduced more sophisticated computer codes, the regional meteorological model RAMS and the atmospheric transport model HYPACT, both developed by Colorado State University, and performed comparative analyses between OSCAAR and RAMS/HYPACT. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long term atmospheric diffusion experiments carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments agreed relatively well, and the performance of OSCAAR was shown to be comparable to that of RAMS/HYPACT. (author)
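
    In their simplest form, the straight-line plume models mentioned above reduce to the Gaussian plume equation with ground reflection. A minimal sketch follows; the release rate, wind speed and dispersion parameters are hypothetical illustrations, not OSCAAR inputs:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration (Bq/m^3) at crosswind offset y and
    height z, for release rate q (Bq/s), wind speed u (m/s) and
    effective release height h (m). sigma_y/sigma_z are the dispersion
    parameters evaluated at the downwind distance of interest."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    # Ground reflection: add the mirror source at -h.
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical case: 1e6 Bq/s release, 5 m/s wind, plume centerline
# concentration at ground level roughly 1 km downwind.
c = gaussian_plume(q=1e6, u=5.0, y=0.0, z=0.0, h=50.0,
                   sigma_y=80.0, sigma_z=40.0)
```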

  16. Developing Reading and Listening Comprehension Tests Based on the Sentence Verification Technique (SVT).

    Science.gov (United States)

    Royer, James M.

    2001-01-01

    Describes a team-based approach for creating Sentence Verification Technique (SVT) tests, a development procedure that allows teachers and other school personnel to develop comprehension tests from curriculum materials in use in their schools. Finds that if tests are based on materials that are appropriate for the population to be tested, the…

  17. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View Texas A& M; DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    This project was successfully executed to provide valuable adsorption data and improve a comprehensive model developed in previous work by the authors. Data obtained were used in an integrated computer program to predict the behavior of adsorption columns. The model is supported by experimental data and has been shown to predict capture of off gas similar to that evolving during the reprocessing of nuclear waste. The computer program structure contains (a) equilibrium models of off-gases with the adsorbate; (b) mass-transfer models to describe off-gas mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle; and (c) incorporation of these models into fixed bed adsorption modeling, which includes advection through the bed. These models are being connected with the MOOSE (Multiphysics Object-Oriented Simulation Environment) software developed at the Idaho National Laboratory through DGOSPREY (Discontinuous Galerkin Off-gas SeParation and REcoverY) computer codes developed in this project. Experiments for iodine and water adsorption have been conducted on reduced silver mordenite (Ag0Z) for single layered particles. Adsorption apparatuses have been constructed to execute these experiments over a useful range of conditions for temperatures ranging from ambient to 250°C and water dew points ranging from -69 to 19°C. Experimental results were analyzed to determine mass transfer and diffusion of these gases into the particles and to determine which models best describe the single and binary component mass transfer and diffusion processes. The experimental results were also used to demonstrate the capabilities of the comprehensive models developed to predict single-particle adsorption and transients of the adsorption-desorption processes in fixed beds. Models for adsorption and mass transfer have been developed to mathematically describe adsorption kinetics and transport via diffusion and advection

  18. Channel Verification Results for the SCME models in a Multi-Probe Based MIMO OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; S. Ashta, Jagjit

    2013-01-01

    , where the focus is on comparing results from various proposed methods. Channel model verification is necessary to ensure that the target channel models are correctly implemented inside the test area. This paper shows that all the key parameters of the SCME models, i.e., power delay profile, temporal...
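
    One of the key parameters to be verified, the power delay profile, is commonly summarized by its mean delay and RMS delay spread. The sketch below uses illustrative tap powers and delays, not the exact 3GPP SCME tables:

```python
import math

# Power delay profile given as per-tap power (dB) and delay (ns);
# values are illustrative, loosely urban-macro-like.
powers_db = [0.0, -2.2, -1.7, -5.2, -9.1, -12.5]
delays_ns = [0.0, 360.0, 255.0, 1040.0, 2730.0, 4600.0]

p = [10 ** (pd / 10.0) for pd in powers_db]        # linear tap powers

# Power-weighted mean delay and RMS delay spread of the profile.
mean_delay = sum(pi * t for pi, t in zip(p, delays_ns)) / sum(p)
rms_ds = math.sqrt(sum(pi * (t - mean_delay) ** 2
                       for pi, t in zip(p, delays_ns)) / sum(p))
```

Comparing these statistics measured inside the test zone against the target model values is precisely the kind of check the verification campaign performs.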

  19. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    We collected and analyzed domestic and international codes, standards and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. The work comprises three major parts: construction of a framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled to each other so as to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of hardware and software was partly performed using the requirements developed in the first stage for the development of the I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software were completed. The main programs for mathematics and modelling, and the supervisor program for instructions, were developed. 27 figs, 22 tabs, 69 refs. (Author).

  20. Development and verification of multicycle depletion perturbation theory

    International Nuclear Information System (INIS)

    White, J.R.; Burns, T.J.

    1980-01-01

    Recently, Williams has developed a coupled neutron/nuclide depletion perturbation theory (DPT) applicable to multidimensional and multigroup reactor analysis problems. This theoretical framework has been verified using the newly developed DEPTH module within the context of the VENTURE modular code system. The accuracy and usefulness of this alternate calculational method for burnup analyses has been demonstrated for a variety of final-time response functionals. However, these examples were restricted to single-cycle depletion analyses due to the theoretical assumption that the nuclide density field was continuous in time. Clearly, in multicycle problems, the nuclide concentrations must vary discontinuously with time to model refueling and shuffling operations or discrete control rod movements. Thus, the purpose of this work is to generalize the original DPT framework to include nuclide discontinuities and to verify that this generalization can be employed in realistic multicycle applications
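
    The nuclide discontinuities motivating this generalization can be illustrated with a toy one-nuclide depletion over two cycles separated by a refueling jump; the rates and the 75/25 shuffle fraction below are hypothetical, not taken from the DEPTH/VENTURE analyses:

```python
import math

# Toy depletion of a single nuclide under absorption and decay,
# dn/dt = -(sigma*phi + lambda) * n, over two cycles with a
# discontinuous density change (refueling/shuffling) between them.
sigma_phi = 1e-9     # absorption rate sigma*phi (1/s), hypothetical
lam = 1e-8           # decay constant (1/s), hypothetical
n = 1.0e21           # initial nuclide density (atoms/cm^3)

def deplete(n0, t):
    """Analytic solution of the linear depletion equation."""
    return n0 * math.exp(-(sigma_phi + lam) * t)

cycle = 1.2e7                         # cycle length, roughly 140 days (s)
n_eoc1 = deplete(n, cycle)            # end of cycle 1
n_boc2 = 0.75 * n_eoc1 + 0.25 * n     # refueling: 25% fresh fuel mixed in
n_eoc2 = deplete(n_boc2, cycle)       # end of cycle 2
```

The jump from `n_eoc1` to `n_boc2` is exactly the kind of discontinuity in the nuclide field that the original single-cycle DPT framework assumed away.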

  1. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available An analysis of the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described; they are based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  2. Development and Verification of the Taiwan Ocean Prediction System

    Science.gov (United States)

    Liau, J. M.; Lai, J. W.; Yang, Y.; Chen, S. H.

    2016-02-01

    Taiwan is an island state surrounded by the sea, to which its economic activity and ecological environment are closely related. The aim of the Taiwan ocean prediction system is to support the assessment of the marine environment and the reduction of risks in the coastal zone of Taiwan. The high-resolution wave and ocean numerical models, the data assimilation and the observing database are established and operated by the Taiwan Ocean Research Institute (TORI). In order to reduce computational time and increase the grid resolution, one-way nested grids are adopted for the Northwestern Pacific Ocean and the sea around Taiwan. The WAVEWATCH-III and nested SWAN wave models and the multi-scale Princeton Ocean Model (POM) are used for the predictions of wind waves and the three-dimensional ocean current, respectively. A joint effect composed of the tide and the ocean circulation is also taken into consideration to resolve the complex current field around Taiwan. Performance evaluations are carried out to assess the validity of the numerical models against observed data, such as temperature and current velocity profiles, wave height and wave period from data buoys, and sea surface currents from the TORI high frequency radar observing system. The well-verified results of the numerical ocean model, assimilated with the sea surface height, are used to investigate characteristics of the Kuroshio in the waters off eastern Taiwan, and several value-added modules, such as storm surge simulation and particle tracking, are also developed for application to maritime casualty search and rescue, hazard mitigation and disaster assistance.

  3. Status of development and verification of the CTFD code FLUBOX

    International Nuclear Information System (INIS)

    Graf, U.; Paradimitriou, P.

    2004-01-01

    The Computational Two-Fluid Dynamics (CTFD) code FLUBOX is being developed at GRS for the multidimensional simulation of two-phase flows. FLUBOX will also be used as a multidimensional module for the German system code ATHLET. The benchmark test cases of the European ASTAR project were used to verify the ability of the code FLUBOX to calculate typical two-phase flow phenomena and conditions: void and pressure wave propagation, phase transitions, countercurrent flows, sharp interface movements, compressible (vapour) and nearly incompressible (water) conditions, thermal and mechanical non-equilibrium, and stiff source terms due to mass and heat transfer between the phases. Realistic simulations of two-phase flows require, besides the pure conservation equations, additional transport equations for the interfacial area, turbulent energy and dissipation. A transport equation for the interfacial area density covering the whole two-phase flow range is in development; first validation calculations are presented in the paper. Turbulent shear stress for two-phase flows will be modelled through the development of transport equations for the turbulent kinetic energy and the turbulent dissipation rate. The development of these transport equations is mainly based on first principles for bubbles or drops and is largely free from empiricism. (author)

  4. Development and verification of the FP behavior evaluation computer code: FPRETAIN

    International Nuclear Information System (INIS)

    Yanagisawa, Kazuaki

    1989-06-01

    This paper describes the results of the development and verification study of the FP behavior evaluation computer code FPRETAIN, which has been developed to understand fuel behavior at extended burn-up stages. The birth, release and retention processes of various kinds of fission products in the fuel pellet, as well as the fuel mechanical behavior, were modelled in the code. In this work, publicly available data obtained from several test reactors (the Halden Boiling Water Reactor, the Studsvik R2 Reactor and the Riso DR-3 Reactor) and from a domestic commercial power reactor were utilized for verification. The concluding remarks obtained are: (1) Regarding fuel centerline temperatures as a function of rod linear power and burn-up, calculation coincided well with experiment up to burn-ups of 20 MWd/kgU. (2) At burn-ups within 20 MWd/kgU, increased plenum pressures due to released Xe and Kr gases were well predicted. (3) For the release rate of stable gaseous FP over the burn-up range up to 35 MWd/kgU, the code tended to underestimate, especially at higher burn-up stages; the code calculations for fuels from domestic reactors, however, resulted in good agreement with post-irradiation data. (4) Within burn-up ranges up to 35 MWd/kgU, the hoop strain calculated by the code coincided well with the data obtained from experiment. (5) Regarding the retained FP gas distribution in a fuel, the calculation agreed with the experimental data qualitatively. (author)

  5. Sorption Modeling and Verification for Off-Gas Treatment

    International Nuclear Information System (INIS)

    Tavlarides, Lawrence L.; Lin, Ronghong; Nan, Yue; Yiacoumi, Sotira; Tsouris, Costas; Ladshaw, Austin; Sharma, Ketki; Gabitto, Jorge; DePaoli, David

    2015-01-01

    The project has made progress toward developing a comprehensive modeling capability for the capture of target species in off gas evolved during the reprocessing of nuclear fuel. The effort has integrated experimentation, model development, and computer code development for adsorption and absorption processes. For adsorption, a modeling library has been initiated to include (a) equilibrium models for uptake of off-gas components by adsorbents, (b) mass transfer models to describe mass transfer to a particle, diffusion through the pores of the particle and adsorption on the active sites of the particle, and (c) interconnection of these models to fixed bed adsorption modeling which includes advection through the bed. For single-component equilibria, a Generalized Statistical Thermodynamic Adsorption (GSTA) code was developed to represent experimental data from a broad range of isotherm types; this is equivalent to a Langmuir isotherm in the two-parameter case, and was demonstrated for Kr on INL-engineered sorbent HZ PAN, water sorption on molecular sieve A sorbent material (MS3A), and Kr and Xe capture on metal-organic framework (MOF) materials. The GSTA isotherm was extended to multicomponent systems through application of a modified spreading pressure surface activity model and generalized predictive adsorbed solution theory; the result is the capability to estimate multicomponent adsorption equilibria from single-component isotherms. This advance, which enhances the capability to simulate systems related to off-gas treatment, has been demonstrated for a range of real-gas systems in the literature and is ready for testing with data currently being collected for multicomponent systems of interest, including iodine and water on MS3A. A diffusion kinetic model for sorbent pellets involving pore and surface diffusion as well as external mass transfer has been established, and a methodology was developed for determining unknown diffusivity parameters from transient
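
    The two-parameter (Langmuir) limit of the GSTA isotherm, combined with a linear-driving-force uptake step, gives a feel for how the equilibrium and mass-transfer sub-models connect. The parameter values below are hypothetical, not fitted HZ PAN or MS3A data:

```python
def langmuir_loading(p, q_max, b):
    """Equilibrium loading q (mol/kg) at partial pressure p (kPa) for
    Langmuir capacity q_max and affinity b (1/kPa) -- the 2-parameter
    limit of the GSTA isotherm described above."""
    return q_max * b * p / (1.0 + b * p)

# Linear-driving-force (LDF) uptake toward Langmuir equilibrium,
# dq/dt = k * (q_eq - q), integrated with forward Euler. All
# parameters are hypothetical illustrations.
q_max, b, k = 2.0, 0.5, 0.01      # capacity, affinity, LDF rate (1/s)
p = 10.0                          # constant gas-phase pressure (kPa)
q_eq = langmuir_loading(p, q_max, b)

q, dt = 0.0, 1.0
history = []
for step in range(600):
    q += dt * k * (q_eq - q)      # pellet loading relaxes to equilibrium
    history.append(q)
```

In the full model this particle-scale uptake is coupled to advection through the bed to produce breakthrough curves.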

  6. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence L. [Syracuse Univ., NY (United States); Lin, Ronghong [Syracuse Univ., NY (United States); Nan, Yue [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Sharma, Ketki [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-04-29

    The project has made progress toward developing a comprehensive modeling capability for the capture of target species in off gas evolved during the reprocessing of nuclear fuel. The effort has integrated experimentation, model development, and computer code development for adsorption and absorption processes. For adsorption, a modeling library has been initiated to include (a) equilibrium models for uptake of off-gas components by adsorbents, (b) mass transfer models to describe mass transfer to a particle, diffusion through the pores of the particle and adsorption on the active sites of the particle, and (c) interconnection of these models to fixed bed adsorption modeling which includes advection through the bed. For single-component equilibria, a Generalized Statistical Thermodynamic Adsorption (GSTA) code was developed to represent experimental data from a broad range of isotherm types; this is equivalent to a Langmuir isotherm in the two-parameter case, and was demonstrated for Kr on INL-engineered sorbent HZ PAN, water sorption on molecular sieve A sorbent material (MS3A), and Kr and Xe capture on metal-organic framework (MOF) materials. The GSTA isotherm was extended to multicomponent systems through application of a modified spreading pressure surface activity model and generalized predictive adsorbed solution theory; the result is the capability to estimate multicomponent adsorption equilibria from single-component isotherms. This advance, which enhances the capability to simulate systems related to off-gas treatment, has been demonstrated for a range of real-gas systems in the literature and is ready for testing with data currently being collected for multicomponent systems of interest, including iodine and water on MS3A. A diffusion kinetic model for sorbent pellets involving pore and surface diffusion as well as external mass transfer has been established, and a methodology was developed for determining unknown diffusivity parameters from transient

  7. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.
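
    The error quantification described, comparing simulated temperature and moisture against observations, rests on continuous verification statistics such as bias and RMSE. A sketch with made-up sample data (not actual KMD-WRF output):

```python
import math

# Paired forecast/observation samples; values are invented for
# illustration only.
forecast = [24.1, 25.3, 23.8, 26.0, 24.9]   # simulated 2-m temperature (C)
observed = [23.5, 25.0, 24.2, 25.1, 24.7]   # station observations (C)

errors = [f - o for f, o in zip(forecast, observed)]
bias = sum(errors) / len(errors)                             # mean error
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))   # RMS error
```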

  8. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  9. Verification and improvement of a predictive model for radionuclide migration

    International Nuclear Information System (INIS)

    Miller, C.W.; Benson, L.V.; Carnahan, C.L.

    1982-01-01

    Prediction of the rates of migration of contaminant chemical species in groundwater flowing through toxic waste repositories is essential to the assessment of a repository's capability of meeting standards for release rates. A large number of chemical transport models, of varying degrees of complexity, have been devised for the purpose of providing this predictive capability. In general, the transport of dissolved chemical species through a water-saturated porous medium is influenced by convection, diffusion/dispersion, sorption, formation of complexes in the aqueous phase, and chemical precipitation. The reliability of predictions made with the models which omit certain of these processes is difficult to assess. A numerical model, CHEMTRN, has been developed to determine which chemical processes govern radionuclide migration. CHEMTRN builds on a model called MCCTM developed previously by Lichtner and Benson
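
    The convection, dispersion and sorption processes listed above can be sketched as a 1D advection-dispersion equation with a linear retardation factor, solved here by an explicit upwind scheme. The parameters are illustrative, not CHEMTRN inputs, and linear sorption stands in for the full complexation/precipitation chemistry:

```python
# 1D advection-dispersion transport with linear sorption folded into a
# retardation factor R: R * dc/dt = -v * dc/dx + D * d2c/dx2.
v, D, R = 1.0e-5, 1.0e-7, 2.0    # velocity (m/s), dispersion (m^2/s), retardation
dx, dt, nx = 0.01, 50.0, 100     # grid spacing (m), time step (s), cells

c = [0.0] * nx
c[0] = 1.0                        # constant-concentration inlet boundary
for _ in range(400):
    new = c[:]
    for i in range(1, nx - 1):
        adv = -v * (c[i] - c[i - 1]) / dx               # upwind advection
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + disp) / R           # retarded update
    c = new
# Sorption (R > 1) slows the front: it has advanced v*t/R, not v*t.
```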

  10. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    International Nuclear Information System (INIS)

    Mao, S P; Rottenberg, X; Rochus, V; Czarnecki, P; Helin, P; Severi, S; Tilmans, H A C; Nauwelaers, B

    2017-01-01

    This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell reasonably well. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells. (paper)
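
    The mode superposition underlying the equivalent circuit builds the frequency response as a sum of modal contributions, one second-order term per eigenmode. A sketch with hypothetical modal data (not a fitted cMUT cell):

```python
import math

# Hypothetical modal data: (natural frequency Hz, damping ratio,
# modal participation factor). The harmonic modes extend the model's
# validity beyond the fundamental resonance.
modes = [
    (1.0e6, 0.02, 1.00),   # fundamental membrane mode
    (3.9e6, 0.02, 0.30),   # first harmonic mode
    (8.8e6, 0.02, 0.12),   # second harmonic mode
]

def response(f):
    """Complex compliance-like FRF at drive frequency f (Hz), summed
    over the retained modes (mode superposition)."""
    w = 2 * math.pi * f
    h = 0j
    for fn, zeta, phi in modes:
        wn = 2 * math.pi * fn
        h += phi ** 2 / (wn ** 2 - w ** 2 + 2j * zeta * wn * w)
    return h

peak = abs(response(1.0e6))   # at the fundamental resonance
off = abs(response(0.2e6))    # well below resonance
```

Each term maps onto one LC branch of the equivalent circuit, which is how the lumped model reproduces a wide-band response.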

  11. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell reasonably well. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  12. Development of an advanced real time simulation tool, ARTIST and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Cheol; Moon, S. K.; Yoon, B. J.; Sim, S. K.; Lee, W. J. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    1999-10-01

    A real time reactor system analysis code, ARTIST, based on the drift flux model has been developed to investigate transient system behavior under low pressure, low flow and low power conditions with noncondensable gas present in the system. The governing equations of the ARTIST code consist of three mass continuity equations (steam, liquid and noncondensables), two energy equations (steam and mixture) and one mixture momentum equation formulated with the drift flux model. The drift flux model of ARTIST has been validated against the THETIS experimental data by comparing the void distribution in the system. In particular, the void fraction calculated by the Chexal-Lellouche void fraction correlation at low pressure and low flow is better than the results of both the homogeneous model of the TASS code and the two-fluid model of the RELAP5/MOD3 code. For cases where noncondensable gas exists, a thermal-hydraulic state solution scheme and calculation methods for the partial derivatives were developed. Numerical consistency and convergence were tested with one-volume problems, and a manometric oscillation case was assessed to examine the calculation methods for the partial derivatives. The calculated thermal-hydraulic state for each test shows consistent and expected behaviour. In order to evaluate the ARTIST code's capability in predicting the two-phase thermal-hydraulic phenomena of a loss-of-RHR accident during midloop operation, BETHSY test 6.9d was simulated. From the results, it is judged that a reflux condensation model and a critical flow model for the noncondensable gas are necessary to correctly predict the thermal-hydraulic behaviour. Finally, verification runs were performed without the drift flux model and the noncondensable gas model for postulated accidents of real plants. The ARTIST code well reproduces the parametric trends calculated by the TASS code. Therefore, the integrity of the ARTIST code was verified. 35 refs., 70 figs., 3 tabs. (Author)
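
The kinematic core of any drift-flux formulation is the relation between void fraction and the phase superficial velocities. A minimal sketch follows; the distribution parameter C0 and drift velocity Vgj are generic illustrative constants, whereas a correlation such as Chexal-Lellouche computes them from local flow conditions:

```python
def void_fraction(j_g, j_f, c0=1.2, v_gj=0.25):
    """Drift-flux void fraction: alpha = j_g / (C0*j + Vgj),
    where j = j_g + j_f is the total superficial velocity [m/s],
    C0 the distribution parameter, Vgj the drift velocity [m/s]."""
    return j_g / (c0 * (j_g + j_f) + v_gj)

# Same gas flux, two liquid fluxes: more liquid flow sweeps gas out faster
alpha_low_flow = void_fraction(0.5, 1.0)
alpha_high_flow = void_fraction(0.5, 2.0)
```

Because C0 > 1 and Vgj > 0, the drift-flux void fraction is always below the homogeneous-model value j_g/j, which is one reason it performs better at low pressure and low flow.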

  13. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to its many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient as well as steady-state conditions. If a fault or error exists in a control system, the plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and components should be completely verified through the power ascension tests of the startup period. However, there are frequent needs to replace control components, modify control logic, or change setpoints. To make such changes, it is important to verify the performance of the changed control system without redoing the power ascension tests. Up to now, simulation with the computer codes used for the design of nuclear power plants has commonly been employed to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors that might be introduced during hardware manufacturing or software coding. Even so, these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss of main feedwater pump test and caused an excess turbine runback

  14. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. This module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  15. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    NARCIS (Netherlands)

    Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20

  16. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

    This paper describes the development and verification of a reciprocating test rig designed to study piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded…, which is suitable for the study of piston ring tribology.

  17. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw attention to the need for verification of service life models for frost attack on concrete and for the collection of relevant data. To illustrate the type of data needed, the paper presents models for internal freeze/thaw damage (internal cracking including…

  18. Verification of Context-Dependent Channel-Based Service Models

    NARCIS (Netherlands)

    N. Kokash (Natallia); C. Krause (Christian, born Köhler); E.P. de Vink (Erik Peter)

    2010-01-01

    The paradigms of service-oriented computing and model-driven development are becoming of increasing importance in the field of software engineering. According to these paradigms, new systems are composed with added value from existing stand-alone services to support business processes

  19. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA), (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), (3) Mode Consolidation (MC), and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
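
Modified Guyan Reduction builds on classic Guyan static condensation, which can be sketched as follows. This is the generic textbook form, not the SLS-specific variant: slave degrees of freedom are eliminated through the static constraint, which is exact for static loads applied at the master DOFs (and only approximate for dynamics, hence the later refinements).

```python
import numpy as np

def guyan_reduce(K, M, masters):
    """Guyan static condensation to the master DOFs.

    T = [I; -Kss^{-1} Ksm];  K_r = T' K T,  M_r = T' M T (mass only approximate).
    `masters` is a list of retained DOF indices.
    """
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    T = np.zeros((n, len(masters)))
    T[masters, np.arange(len(masters))] = 1.0
    T[np.ix_(slaves, np.arange(len(masters)))] = -np.linalg.solve(
        K[np.ix_(slaves, slaves)], K[np.ix_(slaves, masters)])
    return T.T @ K @ T, T.T @ M @ T

# 3-DOF spring chain; condense out the middle DOF
K = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]])
M = np.eye(3)
K_r, M_r = guyan_reduce(K, M, [0, 2])

# Static check: loads applied at masters only give identical master displacements
u_full = np.linalg.solve(K, np.array([1.0, 0.0, 1.0]))
u_red = np.linalg.solve(K_r, np.array([1.0, 1.0]))
```

The static check is the standard verification that the condensation is implemented correctly; dynamic accuracy then depends on how well the retained master set captures the modes of interest.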

  20. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for cases where surface wave heights are small compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  1. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

    Based on conventional mass transfer models developed for oxygen, the non-linear ASCE method, the 2-point method, and a one-parameter linear-regression method were evaluated for carbon dioxide stripping data. For values of KLaCO2 down at higher values of KLaCO2. How to correct KLaCO2 for gas-phase enrichment remains to be determined. The one-parameter linear-regression model was used to vary C*CO2 over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed-C*CO2 assumptions.
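
Of the methods named above, the 2-point method has a closed form that is easy to sketch. It assumes the usual first-order transfer model C(t) = C* - (C* - C0)·exp(-KLa·t); the numbers below are synthetic illustrations, not data from the study:

```python
import math

def kla_two_point(c_star, c0, c1, t0, t1):
    """Two-point estimate of the volumetric mass-transfer coefficient KLa
    from concentrations c0 at time t0 and c1 at time t1, given the
    equilibrium concentration c_star:
      KLa = ln((C* - C0) / (C* - C1)) / (t1 - t0)
    Works for stripping too, where C > C* and both deficits are negative."""
    return math.log((c_star - c0) / (c_star - c1)) / (t1 - t0)

# Synthetic CO2 stripping data generated with KLa = 0.1 per minute
c_star, c0 = 0.5, 20.0  # mg/L; supersaturated CO2 being stripped toward c_star
c1 = c_star + (c0 - c_star) * math.exp(-0.1 * 10.0)  # concentration after 10 min
kla = kla_two_point(c_star, c0, c1, 0.0, 10.0)
```

The non-linear ASCE method fits the same model to the whole time series instead of two samples, which is why it is less sensitive to measurement noise in any single point.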

  2. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    At the Environmental Radioactivity Research Institute, verification and investigation of the wind velocity field forecast model 'EXPRESS-1' have been carried out since 1991. In fiscal year 1994, as the general analysis, the validity of the weather observation data, the local features of the wind field, and the validity of the positions of the monitoring stations were investigated. The EXPRESS mesh, previously 500 m, was refined to 250 m, the resulting improvement in forecast accuracy was examined, and a comparison with another wind velocity field forecast model, 'SPEEDI', was carried out. The results show that the correlation with other measurement points is high at some locations and low at others, and that the forecast accuracy of the wind velocity field can be improved by excluding the data of points with low correlation or by installing simplified observation stations and taking their data in. The outline of the investigation, the general analysis of the weather observation data, and the improvements of the wind velocity field forecast model and its forecast accuracy are reported. (K.I.)

  3. Specification, Model Generation, and Verification of Distributed Applications

    OpenAIRE

    Madelaine, Eric

    2011-01-01

    Since 2001, in the Oasis team, I have developed research on the semantics of applications based on distributed objects, applying my previous research in the field of process algebras in the context of a real language and of applications of realistic size. The various aspects of this work naturally include behavioral semantics and the definition of procedures for model generation, taking into account the different concepts of distributed applications, but also, upstream, static code analysis a...

  4. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. The detailed blading design is then carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions by the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
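
The free vortex design step fixes the swirl distribution as Cu·r = const, so the inlet velocity triangle, and hence the blade twist, at each radius follows directly. A minimal sketch with made-up numbers (speed, axial velocity, and swirl constant are illustrative, not from the study):

```python
import math

def relative_flow_angle(r, omega, c_x, k_swirl):
    """Free-vortex design: Cu*r = const, so Cu = k/r.

    Returns the relative flow angle beta (degrees from axial) at radius r [m]:
      tan(beta) = (U - Cu) / Cx,  with blade speed U = omega*r.
    """
    u = omega * r
    c_u = k_swirl / r
    return math.degrees(math.atan2(u - c_u, c_x))

# Hypothetical axial-flow pump: 1500 rpm, Cx = 5 m/s, swirl constant 0.2 m^2/s
omega = 1500 * 2 * math.pi / 60
beta_hub = relative_flow_angle(0.08, omega, 5.0, 0.2)
beta_tip = relative_flow_angle(0.15, omega, 5.0, 0.2)
```

The angle grows from hub to tip because blade speed increases with radius while the free-vortex swirl decreases, which is exactly the twist the three-dimensional blade shape must follow.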

  5. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects. PMID:26217586

  6. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β (+) emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  7. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker Verification

    OpenAIRE

    Sarkar, A. K.; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. ...
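
The verification decision in such systems is typically an average log-likelihood ratio of the test utterance against the target model versus the background model. A toy sketch with single diagonal Gaussians standing in for the GMMs actually used; all model parameters and feature values are illustrative:

```python
import math

def gauss_loglik(x, mean, var):
    """Log-likelihood of one frame under a diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def llr_score(frames, target, background):
    """Average per-frame log-likelihood ratio: target vs background model.
    Models are (mean, var) pairs; accept the claim when the score is high."""
    scores = [gauss_loglik(f, *target) - gauss_loglik(f, *background)
              for f in frames]
    return sum(scores) / len(scores)

target = ([0.0, 1.0], [1.0, 1.0])   # adapted target-speaker model
pbm = ([0.5, 0.0], [2.0, 2.0])      # pass-phrase dependent background model
genuine_frames = [[0.1, 0.9], [-0.2, 1.1]]  # features near the target model
score = llr_score(genuine_frames, target, pbm)
```

Using a pass-phrase dependent background model sharpens this ratio, because the denominator already accounts for the lexical content of the pass-phrase rather than for speech in general.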

  8. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
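
An automated consistency check of this kind typically compares secondary-particle histograms from the vectorized and the reference models bin by bin. A simple sketch of the standard two-histogram chi-square statistic (generic; the GeantV test suite itself is more elaborate):

```python
def chi2_statistic(obs1, obs2):
    """Homogeneity chi-square between two histograms with identical binning.

    Returns (chi2, ndf). Bins empty in both histograms are skipped; one
    degree of freedom is removed for the totals constraint."""
    n1, n2 = sum(obs1), sum(obs2)
    chi2, used = 0.0, 0
    for a, b in zip(obs1, obs2):
        if a + b == 0:
            continue
        # standard two-sample term; reduces to (a-b)^2/(a+b) when n1 == n2
        chi2 += (n2 * a - n1 * b) ** 2 / (n1 * n2 * (a + b))
        used += 1
    return chi2, used - 1

chi2_same, ndf_same = chi2_statistic([10, 20, 30], [10, 20, 30])
chi2_diff, _ = chi2_statistic([10, 20, 30], [30, 20, 10])
```

A p-value from the chi-square distribution with `ndf` degrees of freedom then gives the pass/fail decision, so the whole comparison can run unattended in continuous integration.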

  9. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  10. The Parametric Model for PLC Reference Channels and its Verification in a Real PLC Environment

    Directory of Open Access Journals (Sweden)

    Rastislav Roka

    2008-01-01

    For the expansion of PLC systems, it is necessary to have a detailed knowledge of the PLC transmission channel properties. This contribution briefly discusses characteristics of the PLC environment and a classification of PLC transmission channels. The main part is focused on the parametric model for PLC reference channels and its verification in the real PLC environment utilizing experimental measurements.
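
A widely used parametric form for PLC reference channels is the multipath echo model with frequency-dependent attenuation (the Zimmermann-Dostert form). The path gains, path lengths, and attenuation parameters below are illustrative, not values from the paper:

```python
import cmath
import math

def plc_channel(freq, paths, a0=0.0, a1=1e-9, k=1.0, v_p=1.5e8):
    """Multipath PLC transfer function (Zimmermann-Dostert form):

      H(f) = sum_i g_i * exp(-(a0 + a1*f^k) * d_i) * exp(-j*2*pi*f*d_i / v_p)

    paths: list of (g_i, d_i[m]) echo taps; a0, a1, k are attenuation
    parameters and v_p the propagation speed on the line [m/s]."""
    return sum(g * math.exp(-(a0 + a1 * freq ** k) * d) *
               cmath.exp(-2j * math.pi * freq * d / v_p)
               for g, d in paths)

paths = [(0.64, 200.0), (0.38, 222.4), (-0.15, 244.8)]  # illustrative taps
h_low = abs(plc_channel(1e6, paths))    # |H| at 1 MHz
h_high = abs(plc_channel(20e6, paths))  # |H| at 20 MHz
```

The exponential attenuation term makes |H| fall with frequency, while the delayed echoes produce the deep notches characteristic of measured PLC channels.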

  11. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  12. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from… the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification…

  13. Verification of geological/engineering model in waterflood areas

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, B.; Szpakiewicz, M.; Honarpour, M.; Schatzinger, R.A.; Tillman, R.

    1988-12-01

    The construction of a detailed geological/engineering model is the basis for development of the methodology for characterizing reservoir heterogeneity. The NIPER geological/engineering model is the subject of this report. The area selected for geological and production performance studies is a four-section area within the Powder River Basin which includes the Tertiary Incentive Project (TIP) pilot. Log, well test, production, and core data were acquired for construction of the geological model of a barrier island reservoir. In this investigation, emphasis was on the synthesis and quantification of the abundant geological information acquired from the literature and field studies (subsurface and outcrop) by mapping the geological heterogeneities that influence fluid flow. The geological model was verified by comparing it with the exceptionally complete production data available for Bell Creek field. This integration of new and existing information from various geological, geophysical, and engineering disciplines has enabled better definition of the heterogeneities that influence production during different recovery operations. 16 refs., 26 figs., 6 tabs.

  14. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.

  15. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
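
A key part of such grid-refinement studies is checking that the observed order of accuracy matches the formal order of the scheme. With solutions on three systematically refined grids with constant refinement ratio r, the observed order follows from Richardson extrapolation; a minimal sketch using a manufactured second-order error:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from three grid solutions with constant
    refinement ratio r:  p = ln((f3 - f2) / (f2 - f1)) / ln(r),
    where f1 is the finest-grid value and f3 the coarsest."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Manufactured example: f(h) = f_exact + C*h^2 on grids h = 1, 2, 4
f_exact, c = 1.0, 0.01
f1, f2, f3 = (f_exact + c * h ** 2 for h in (1, 2, 4))
p = observed_order(f3, f2, f1, r=2)
```

An observed p close to the formal order (here 2) is the verification evidence that the turbulence model is implemented consistently; a mismatch points to coding or formulation inconsistencies of the kind the paper describes.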

  16. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    Science.gov (United States)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on several problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code for two analytical problems has been performed. The first is a point heat source with exponentially decreasing heat generation; the second is a linear heat source with similar behavior. Analytical solutions have been obtained by the authors. These problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolutions. Good convergence between the analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste leads to heating of the containers, engineered safety barriers and host rock. Maximum temperatures and the corresponding times of their establishment have been determined.
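
The first verification problem, a point heat source with exponentially decreasing power, has an analytical solution obtained by superposing the instantaneous point-source Green's function in an infinite conducting medium. A numerical-quadrature sketch of that solution (generic conduction theory; the material values are illustrative, not the repository parameters):

```python
import math

def point_source_temperature(r, t, q0, lam, rho_c, alpha, n=2000):
    """Temperature rise [K] at radius r [m] and time t [s] for a point source
    of decaying power Q(tau) = q0*exp(-lam*tau) [W] in an infinite medium,
    by midpoint quadrature of the instantaneous-source Green's function:
      T(r,t) = integral_0^t Q(tau) * exp(-r^2/(4*alpha*(t-tau)))
               / (rho_c * (4*pi*alpha*(t-tau))^1.5) dtau
    rho_c: volumetric heat capacity [J/m^3/K]; alpha: diffusivity [m^2/s]."""
    dt = t / n
    total = 0.0
    for i in range(n):
        tau = (i + 0.5) * dt  # midpoint rule avoids the s -> 0 endpoint
        s = t - tau
        g = math.exp(-r * r / (4 * alpha * s)) / (
            rho_c * (4 * math.pi * alpha * s) ** 1.5)
        total += q0 * math.exp(-lam * tau) * g * dt
    return total

# Illustrative values: 1 kW source, slow decay, rock-like properties
t_near = point_source_temperature(0.5, 1e7, 1000.0, 1e-9, 2e6, 1e-6)
t_far = point_source_temperature(1.0, 1e7, 1000.0, 1e-9, 2e6, 1e-6)
```

Comparing a numerical temperature field against such a quadrature solution on successively refined meshes is exactly the kind of convergence check the paper reports.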

  17. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools, as well as subject matter expertise within the Project. The iMED houses the data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess the input information, framework and code, and output results of the IMM; to ensure that end user requests and requirements were considered during all stages of model development and implementation; and to lay the foundation for external review and application.

    METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and to include 7009 tools in reporting the VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria.

    RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of
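The stochastic forecasting approach this record describes can be illustrated with a toy Monte Carlo sketch. Every condition name, incidence rate, and evacuation probability below is an invented placeholder, not iMED data or the IMM's actual model structure:

```python
import math
import random
import statistics

def poisson_sample(rng, mean):
    """Draw a Poisson-distributed count (Knuth's algorithm, small means)."""
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def simulate_mission(n_trials=10_000, mission_days=180, seed=42):
    """Estimate toy risk metrics (mean medical events per mission and the
    probability of an evacuation) by sampling many simulated missions,
    mirroring the stochastic approach described above."""
    conditions = {
        "back_pain":    {"daily_rate": 1.0e-3, "p_evac": 0.001},
        "renal_stone":  {"daily_rate": 5.0e-5, "p_evac": 0.30},
        "dental_issue": {"daily_rate": 2.0e-4, "p_evac": 0.05},
    }
    rng = random.Random(seed)
    evacuations = 0
    event_counts = []
    for _ in range(n_trials):
        events, evacuated = 0, False
        for cond in conditions.values():
            # number of occurrences of this condition over the mission
            k = poisson_sample(rng, cond["daily_rate"] * mission_days)
            events += k
            # each occurrence may independently force an evacuation
            evacuated = evacuated or any(
                rng.random() < cond["p_evac"] for _ in range(k))
        event_counts.append(events)
        evacuations += evacuated
    return {"mean_events": statistics.mean(event_counts),
            "p_evacuation": evacuations / n_trials}

print(simulate_mission())
```

Because the outputs are sample statistics rather than point values, repeated runs with different seeds quantify the spread, which is exactly the uncertainty a deterministic calculation would hide.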

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    Science.gov (United States)

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  19. Weather forecast in north-western Greece: RISKMED warnings and verification of MM5 model

    Directory of Open Access Journals (Sweden)

    A. Bartzokas

    2010-02-01

    The meteorological model MM5 is applied operationally over north-western Greece for a one-year period (1 June 2007–31 May 2008). The model output is used for daily weather forecasting over the area. An early warning system is developed by dividing the study area into 16 sub-regions and defining specific thresholds for issuing alerts for adverse weather phenomena. The verification of the model is carried out by comparing the model results with observations from three automatic meteorological stations. For air temperature and wind speed, correlation coefficients and biases are calculated, revealing a significant overestimation of the early morning air temperature. For precipitation amount, yes/no contingency tables are constructed for 4 specific thresholds and categorical statistics are applied, showing that the prediction of precipitation in the area under study is generally satisfactory. Finally, the thunderstorm warnings issued by the system are verified against the observed lightning activity.

  20. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    This paper describes a method for the automatic verification of 3D building models using airborne oblique images. The problem tackled is using the images to identify buildings that have been demolished or changed since the models were constructed, or to identify wrong models. The models verified are CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory: corresponding variables between building models and oblique images are used to derive mutual information for individual edges, faces or whole buildings, combined over all perspective images available for the building. The wireframe model edges are projected to the images and verified using low-level image features, namely the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed from laser points against Pictometry images, which are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of Microsoft Bing Maps. The results show that nearly all buildings are correctly categorised as existing or demolished. Because we now concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, in particular, can be detected more reliably in oblique images because of the tilted view. Besides overall building verification, the results per individual edge can be used to improve the 3D building models.
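The mutual-information measure at the core of this method can be sketched with a plain joint-histogram estimator. Pairing quantized projected-edge directions with sampled gradient directions, as below, only illustrates the estimator itself, not the paper's actual implementation:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (in bits) between two equally long discrete
    sequences, estimated from their joint histogram. In the verification
    setting above, xs could hold quantized model-edge directions projected
    into an image and ys the image gradient directions sampled along those
    projected edges."""
    assert len(xs) == len(ys) and xs
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n                       # joint probability estimate
        # log of joint over product of marginals
        mi += pxy * math.log2(pxy * n * n / (px[x] * py[y]))
    return mi

a = [0, 1, 2, 3] * 25
# Perfectly correlated directions carry maximal information ...
print(mutual_information(a, a))                    # → 2.0 (4 equiprobable bins)
# ... while a constant sequence shares none with them.
print(round(mutual_information(a, [0] * 100), 6))  # → 0.0
```

High mutual information between projected edges and gradients supports an "existing" verdict; values near zero indicate a demolished or wrong model.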

  1. Development of genomic tools for verification of hybrids and selfed ...

    African Journals Online (AJOL)

    ajl yemi

    2011-11-30

    Nov 30, 2011 ... Increased attention would go to the development of more efficient markers such as single nucleotide polymorphisms (SNPs) and other gene-based markers using .... [pH 8], 500 mM NaCl, 50mM EDTA, 1.0% SDS, 2% PVP, 1% β- mercaptoethanol, and 0.05 mg/ml Proteinase K. Fresh young leaves were ...

  2. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    Science.gov (United States)

    Jairala, Juniper C.; Durkin, Robert; Marak, Ralph J.; Sipila, Stephanie A.; Ney, Zane A.; Parazynski, Scott E.; Thomason, Arthur H.

    2012-01-01

    As an early step in the preparation for future Extravehicular Activities (EVAs), astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. Neutral buoyancy demonstrations at NASA Johnson Space Center's Sonny Carter Training Facility to date have primarily evaluated assembly and maintenance tasks associated with several elements of the International Space Station (ISS). With the retirement of the Shuttle, completion of ISS assembly, and introduction of commercial players for human transportation to space, evaluations at the Neutral Buoyancy Laboratory (NBL) will take on a new focus. Test objectives are selected for their criticality, lack of previous testing, or design changes that justify retesting. Assembly tasks investigated are performed using procedures developed by the flight hardware providers and the Mission Operations Directorate (MOD). Orbital Replacement Unit (ORU) maintenance tasks are performed using a more systematic set of procedures, EVA Concept of Operations for the International Space Station (JSC-33408), also developed by the MOD. This paper describes the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated.

  3. Development of a test object for dental image verification

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Paula S.S. [Instituto de Pesquisas Energeticas e Nucleares, Comissao Nacional de Energia Nuclear/SP, C.P. 11049-CEP, 05422-970-Sao Paulo (Brazil)], E-mail: psasaki@ipen.br; Potiens, Maria da Penha A. [Instituto de Pesquisas Energeticas e Nucleares, Comissao Nacional de Energia Nuclear/SP, C.P. 11049-CEP, 05422-970-Sao Paulo (Brazil)], E-mail: mppalbu@ipen.br

    2010-04-15

    A test object was developed to determine the best dental image quality as a function of exposure time. It consists of perforated high-purity aluminium plates of different thicknesses, with dimensions of 3 x 4 cm. The plates were covered with acrylic, bringing the total thickness to 3.0 mm. The object was irradiated together with the film at exposure times from 0.1 to 0.5 s. The 0.2 s irradiation produced the best image, reducing the usual exposure time by 0.3 s.

  4. Modelling and Verification of Multiple UAV Mission Using SMV

    Directory of Open Access Journals (Sweden)

    Gopinadh Sirigineedi

    2010-03-01

    Model checking has been used to verify the correctness of digital circuits, security protocols and communication protocols, as these can be modelled by means of finite state transition models. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.
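The kind of temporal property SMV checks can be illustrated with a minimal explicit-state evaluation of the CTL operator EF ("along some path, eventually") on a tiny Kripke structure. The states, transitions, and labels below are invented for illustration and are unrelated to the paper's UAV model:

```python
def ef(states, transitions, labels, prop):
    """Least-fixpoint evaluation of the CTL formula EF prop: the set of
    states from which some path reaches a state labelled with prop."""
    sat = {s for s in states if prop in labels[s]}
    changed = True
    while changed:
        changed = False
        for s in states:
            # a state satisfies EF prop if any successor already does
            if s not in sat and any(t in sat for t in transitions[s]):
                sat.add(s)
                changed = True
    return sat

# Made-up UAV-flavoured Kripke structure ("rtb" = return to base).
states = {"search", "track", "lost", "rtb"}
transitions = {
    "search": {"search", "track"},
    "track":  {"track", "lost", "rtb"},
    "lost":   {"search"},
    "rtb":    {"rtb"},
}
labels = {"search": set(), "track": set(), "lost": set(), "rtb": {"home"}}

print(sorted(ef(states, transitions, labels, "home")))
# → ['lost', 'rtb', 'search', 'track']: every state can eventually reach home
```

Tools like SMV evaluate such fixpoints symbolically over much larger state spaces, but the underlying semantics is this set computation.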

  5. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  6. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
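The manufactured-solution approach to code verification recommended above can be sketched in a few lines: pick an exact solution, derive the source term analytically, and confirm that the observed convergence order matches the scheme's design order. The 1-D Poisson solver below is only an illustration of the workflow, not one of the paper's benchmarks:

```python
import math

def solve_poisson_1d(n, f, ua=0.0, ub=0.0):
    """Second-order finite differences for u'' = f on [0, 1] with Dirichlet
    boundary values ua, ub, using n interior points (Thomas algorithm)."""
    h = 1.0 / (n + 1)
    a = [1.0] * n            # sub-diagonal
    b = [-2.0] * n           # diagonal
    c = [1.0] * n            # super-diagonal
    d = [h * h * f((i + 1) * h) for i in range(n)]
    d[0] -= ua               # fold boundary values into the right-hand side
    d[-1] -= ub
    for i in range(1, n):    # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * n            # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

# Manufactured solution u(x) = sin(pi x)  =>  source f(x) = -pi^2 sin(pi x).
exact = lambda x: math.sin(math.pi * x)
f = lambda x: -math.pi ** 2 * math.sin(math.pi * x)

errors = []
for n in (15, 31, 63):       # successive halvings of the mesh spacing
    u = solve_poisson_1d(n, f)
    h = 1.0 / (n + 1)
    errors.append(max(abs(u[i] - exact((i + 1) * h)) for i in range(n)))

orders = [math.log2(errors[k] / errors[k + 1]) for k in range(2)]
print(orders)                # both observed orders close to the design order 2
```

An observed order that falls short of the design order is direct evidence of a coding or discretization defect, which is precisely what a code verification benchmark is meant to expose.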

  8. Development and Performance Verification of High Resistance Semiconducting Glaze Insulators

    Science.gov (United States)

    Shinoda, Akihide; Chiyajo, Kiyonobu; Okada, Hideyuki; Suzuki, Yoshihiro; Ito, Susumu; Naito, Katsuhiko

    In the case of transmission lines along the sea coast, audible noise due to partial discharges may occur from insulators when they are contaminated with sea salt and wetted. Out of consideration for residents, insulator washing has been performed periodically, but this increases the maintenance cost. As a countermeasure to reduce the audible noise, a semiconducting glaze insulator (DC resistance approx. 20MΩ) was developed and used. However, in a very special environment with direct spraying of sea water in seaside districts, there is some risk of thermal runaway, because voltage concentration on a very small number of insulator units in a string is possible and may raise the temperature of those units up to the critical point. In this paper, the thermal runaway mechanism is clarified from the viewpoint of the input and output energy of the semiconducting glaze insulator under contaminated and wetted conditions. The surface temperature at which thermal runaway starts is estimated from various experiments. As a result, a high resistance semiconducting glaze insulator, which has a higher thermal runaway withstand capacity and acceptable audible noise characteristics, was developed and subjected to field evaluation.

  9. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  10. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

    Automation is the field of engineering that deals with the development of control systems for operating industrial processes, railways, machinery or aircraft without human intervention. In most cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. For that reason, providing safe, reliable and robust control systems is a first-priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge on which industry and academia have been working and making progress in recent decades. This thesis focuses on one particular type of control system that operates industrial processes: PLC (Programmable Logic Controller)-based control systems. Moreover, it targets one of the main challenges for these systems, guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  11. Performance verification of Surface Mapping Instrument developed at CGM

    DEFF Research Database (Denmark)

    Bariani, Paolo

    of the instrument was the development of stitching software. Successful stitching of AFM scans is demonstrated in this report. Single data files in the millimetre range can be obtained which are entirely based on AFM probing. High definition of nanostructures can therefore be combined with a measuring range …
    • A synthetic image featuring a 2D pattern, for testing the stitching code performance
    • An optical standard containing several line patterns, for testing the whole measuring system (AFM-CMM-software)
    Quantitative analysis was based on:
    • Dimensional parameters based on the cross-correlation function; these were calculated over the synthetic image and over the result of several partitioning and stitching operations, respectively
    • FFT and dimensional measurements performed on the line patterns resulting from stitching of single AFM scans
    Deviations below 1% were experienced when measuring dimensions of several …

  12. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. Besides, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
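The Brier skill score used above to compare the post-processed models can be computed directly from forecast probabilities and binary observations. The sample values below are invented for illustration, not data from the study:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and binary
    outcomes (1 = precipitation above the threshold observed, 0 = not)."""
    assert len(probs) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to the climatological reference forecast (the constant
    base rate): BSS = 1 - BS / BS_ref. 1 is a perfect forecast, 0 matches
    climatology, and negative values are worse than climatology."""
    base_rate = sum(outcomes) / len(outcomes)
    bs = brier_score(probs, outcomes)
    bs_ref = brier_score([base_rate] * len(outcomes), outcomes)
    return 1.0 - bs / bs_ref

# Hypothetical verification sample: probability forecasts of exceeding a
# precipitation threshold vs. the observed exceedances.
probs    = [0.9, 0.8, 0.7, 0.2, 0.1, 0.3, 0.6, 0.1]
outcomes = [1,   1,   0,   0,   0,   1,   1,   0]
print(round(brier_skill_score(probs, outcomes), 3))   # → 0.375
```

Comparing such scores per model and lead time, as in the study, shows whether post-processing has actually extracted extra information from the higher-resolution forecasts.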

  13. Game Theory Models for the Verification of the Collective Behaviour of Autonomous Cars

    OpenAIRE

    Varga, László Z.

    2017-01-01

    The collective of autonomous cars is expected to generate almost optimal traffic. In this position paper we discuss the multi-agent models and the verification results of the collective behaviour of autonomous cars. We argue that non-cooperative autonomous adaptation cannot guarantee optimal behaviour. The conjecture is that intention aware adaptation with a constraint on simultaneous decision making has the potential to avoid unwanted behaviour. The online routing game model is expected to b...

  14. Further Development of Verification Check-Cases for Six- Degree-of-Freedom Flight Vehicle Simulations

    Science.gov (United States)

    Jackson, E. Bruce; Madden, Michael M.; Shelton, Robert; Jackson, A. A.; Castro, Manuel P.; Noble, Deleena M.; Zimmerman, Curtis J.; Shidner, Jeremy D.; White, Joseph P.; Dutta, Doumyo

    2015-01-01

    This follow-on paper describes the principal methods of implementing, and documents the results of exercising, a set of six-degree-of-freedom rigid-body equations of motion and planetary geodetic, gravitation and atmospheric models for simple vehicles in a variety of endo- and exo-atmospheric conditions with various NASA, and one popular open-source, engineering simulation tools. This effort is intended to provide an additional means of verification of flight simulations. The models used in this comparison, as well as the resulting time-history trajectory data, are available electronically for persons and organizations wishing to compare their flight simulation implementations of the same models.

  15. RAPID FREEFORM SHEET METAL FORMING: TECHNOLOGY DEVELOPMENT AND SYSTEM VERIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Kiridena, Vijitha [Ford Scientific Research Lab., Dearborn, MI (United States); Verma, Ravi [Boeing Research and Technology (BR& T), Seattle, WA (United States); Gutowski, Timothy [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Roth, John [Pennsylvania State Univ., University Park, PA (United States)

    2018-03-31

    The objective of this project is to develop a transformational RApid Freeform sheet metal Forming Technology (RAFFT) in an industrial environment, which has the potential to increase manufacturing energy efficiency by up to ten times at a fraction of the cost of conventional technologies. The RAFFT technology is a flexible and energy-efficient process that eliminates the need for geometry-specific forming dies. The innovation lies in the idea of applying the energy at the local deformation area, which provides greater formability, process control, and process flexibility relative to traditional methods. Double-Sided Incremental Forming (DSIF), the core technology in RAFFT, is a new concept for sheet metal forming. A blank sheet is clamped around its periphery and gradually deformed into a complex 3D freeform part by two strategically aligned stylus-type tools that follow a prescribed toolpath. The two tools, one on each side of the blank, can form a part with sharp features for both concave and convex shapes. Since deformation happens locally, the forming force at any instant is significantly lower than in traditional methods. The key advantages of DSIF are its high process flexibility, high energy efficiency, low capital investment, and the elimination of the need for massive amounts of die casting and machining. Additionally, the enhanced formability and process flexibility of DSIF can open up design spaces and result in greater weight savings.

  16. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  17. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS (Registered Trademark) finite element model, a verified MSC.Nastran (Trademark) model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran (Trademark) model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran (Trademark) single-bay model to Abaqus (Trademark) is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  18. Verification of aseismic design model by using experimental results

    International Nuclear Information System (INIS)

    Mizuno, N.; Sugiyama, N.; Suzuki, T.; Shibata, Y.; Miura, K.; Miyagawa, N.

    1985-01-01

    A lattice model is applied as the analysis model for the aseismic design of the Hamaoka nuclear reactor building. To verify the validity of this design model, two reinforced concrete blocks were constructed on the ground and forced vibration tests were carried out. The test results are well reproduced by simulation analyses using the lattice model. The damping value of the ground obtained from the tests is more conservative than the design value. (orig.)

  19. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer that takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles, used to experimentally verify the model, were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed.

  20. Are Australian and New Zealand trauma service resources reflective of the Australasian Trauma Verification Model Resource Criteria?

    Science.gov (United States)

    Leonard, Elizabeth; Curtis, Kate

    2014-01-01

    The Australasian Trauma Verification Program was developed in 2000 to improve the quality of care provided at services in Australia and New Zealand. The programme outlines resources required for differing levels of trauma services. This study compares the human resources in Australia and New Zealand trauma services with those recommended by the Australasian College of Surgeons Trauma Verification Program. In September 2011, all trauma nurse coordinators in Australia and New Zealand were invited to participate in an electronic survey endorsed by the Australasian Trauma Society. This study expands on previous bi-national research and aimed to identify demographic and trauma service human resource levels. Fifty-three surveys (78%) were completed and all 27 Level 1 trauma centres represented. Of the Level 1 trauma centres, a trauma director and fellow were available at 16 (51.8%) and 14 (40.7%) centres, respectively. The majority (93%) had a full-time trauma coordinator although a trauma case manager was only available at 14 (48.1%) of Level 1 trauma centres. Despite the large amount of data collection and extraction required, trauma services had limited access to a data manager (50.9%) or clerical staff (36.9%). Human resources in Australian and NZ trauma services are not reflective of those recommended by the Australasian Trauma Verification Program. This impacts on the ability to coordinate trauma monitoring and performance improvement. Review of the Australasian Trauma Verification Model Resource Criteria is required. Injury surveillance in Australia and NZ is hampered by insufficient trauma registry resources. © 2014 Royal Australasian College of Surgeons.

  1. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    This work concerns the formal verification of workflow-oriented software models using the deductive approach; the formal correctness of a model's behaviour is considered. Manually building logical specifications, regarded as sets of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is the standard and dominant notation for the modelling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitive which enables the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
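The pattern-to-formula idea can be sketched as string generation over a small pattern library. The pattern set, activity names, and exact formula shapes below are invented placeholders, not the paper's BPMN pattern definitions; G and F stand for the temporal operators "always" and "eventually":

```python
def seq(a, b):
    """Sequence pattern: whenever activity a completes, b eventually starts."""
    return f"G({a} -> F{b})"

def branch(a, b, c):
    """Exclusive-branch pattern: after a, eventually b or c, never both."""
    return f"G({a} -> F({b} | {c})) & G(!({b} & {c}))"

def specification(patterns):
    """A logical specification is the conjunction of the pattern formulas."""
    return " & ".join(patterns)

# A toy workflow expressed as composed patterns, then turned into a
# specification that a tableaux-based prover could take as input.
spec = specification([
    seq("receive_order", "check_stock"),
    branch("check_stock", "ship", "reject"),
])
print(spec)
```

Because each workflow pattern maps to a fixed formula template, the specification is assembled mechanically from the model structure, which is the automation the paper argues is needed to make deductive verification practical.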

  2. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    Linares Rosales, H. M.; Alfonso Laguardia, R.; Lara Mas, E.; Popescu, T.

    2015-01-01

    The analysis of some parameters of interest in radiotherapy medical physics, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator, was performed for 6 and 15 MV photon beams. The simulations were performed using the EGSnrc code. The optimal beam parameter values (energy and FWHM) obtained previously were used as the reference for the simulations. Deposited dose calculations in water phantoms were done for typical complex geometries commonly used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between them were analyzed from the calculations in water. Similarly, simulations were performed on phantoms obtained from CT studies of real patients, comparing the dose distribution calculated with EGSnrc against the dose distribution obtained from the computerized treatment planning systems (TPS) used in routine clinical plans. All the results showed very good agreement with measurements, all falling within tolerance limits. These results support using the developed model as a robust verification tool for validating calculations in very complex situations where the accuracy of the available TPS could be questionable. (Author)
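    The kind of MC-versus-TPS comparison this record describes can be illustrated with a simple percent-difference tolerance test. This is not the authors' code: clinical verification would typically use a gamma index, and the 3% tolerance and dose values below are assumed example numbers.

```python
# Illustrative sketch: compare a Monte Carlo dose grid against a TPS dose
# grid point by point, normalised to the maximum TPS dose, and report the
# fraction of points within an assumed 3% tolerance.

def percent_diff_pass_rate(mc_dose, tps_dose, tol_percent=3.0):
    """Fraction of points where |MC - TPS| / max(TPS) <= tolerance."""
    ref = max(tps_dose)
    n_pass = sum(
        1 for m, t in zip(mc_dose, tps_dose)
        if abs(m - t) / ref * 100.0 <= tol_percent
    )
    return n_pass / len(mc_dose)

mc  = [1.00, 0.98, 0.50, 0.20]   # hypothetical relative doses
tps = [1.00, 1.00, 0.52, 0.30]
rate = percent_diff_pass_rate(mc, tps)
```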

  3. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    Full Text Available A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters such as the creep coefficient, Poisson’s ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Projects of the Ministry of Science of the Republic of Serbia: no. ON 174027, Computational Mechanics in Structural Engineering, and no. TR 36017, Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications]

  4. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater-than-10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater-than-10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warning of them at least minutes in advance, based on both the soft X-ray flux and the integral proton flux taken by GOES. The quality of the forecast model was measured against verifications of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
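    The event-start rule quoted in this record (flux of >10 MeV protons reaching 10 pfu) is simple enough to sketch directly. The flux samples below are invented for illustration.

```python
# Minimal sketch of the event-onset definition given above: a solar
# energetic proton event begins when the >10 MeV proton flux equals or
# exceeds 10 proton flux units (pfu).

EVENT_THRESHOLD_PFU = 10.0

def event_start_index(flux_series):
    """Index of the first sample at or above 10 pfu, or None if no event."""
    for i, flux in enumerate(flux_series):
        if flux >= EVENT_THRESHOLD_PFU:
            return i
    return None

flux = [0.3, 0.5, 2.1, 9.9, 15.4, 80.0]   # hypothetical pfu values
start = event_start_index(flux)
```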

  5. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Full Text Available Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results in COMSOL Multiphysics obtained using the “Finer” grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm-1) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (load thickness) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that increasing the load thickness within the range from 0 to 95 mm decreased the calculated values of the natural resonance frequency of longitudinal fluctuations from 26.58 to 26.35 kHz and the electrical admittance from 0.86 to 0.44 mS.

  6. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extended Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services: it has to depend on formal verification methods to ensure the correctness of composed services. Only a few research works in the literature address verification of web services, and these target deterministic systems. Moreover, the existing models did not address verification properties like dead transition, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) has been proposed. The correctness properties of the non-deterministic system have been evaluated based on properties like dead transition, deadlock, safety, liveness and reachability. Initially web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language for the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the existing models.
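    Two of the properties listed in this record, deadlock and dead transitions, can be illustrated with a plain reachability check on an explicit state graph. This Python toy stands in for what SPIN does on the Promela translation; the automaton itself is invented.

```python
# Toy illustration: find deadlocked states (reachable states with no
# outgoing transition) and dead transitions (labels that can never fire
# from any reachable state) by breadth-first reachability.
from collections import deque

def analyze(transitions, initial):
    """transitions: dict state -> {label: next_state}.
    Returns (deadlocked_states, dead_transition_labels)."""
    reachable, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        for nxt in transitions.get(s, {}).values():
            if nxt not in reachable:
                reachable.add(nxt)
                queue.append(nxt)
    deadlocks = {s for s in reachable if not transitions.get(s)}
    fired = {lbl for s in reachable for lbl in transitions.get(s, {})}
    all_labels = {lbl for outs in transitions.values() for lbl in outs}
    return deadlocks, all_labels - fired

# 'stuck' has no outgoing transition (a deadlock); label 'u' is dead
# because its source state 'orphan' is unreachable from 'init'.
graph = {
    "init": {"invoke": "wait"},
    "wait": {"reply": "done", "fault": "stuck"},
    "done": {"restart": "init"},
    "stuck": {},
    "orphan": {"u": "init"},
}
deadlocks, dead_transitions = analyze(graph, "init")
```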

  7. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time. This is quite understandable given the size and complexity of soil-structure systems. Three important aspects of ESSI modeling are the consistent following of input seismic energy and the number of energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator. Work is being done on an extensively verified and validated suite for the ESSI Simulator. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns, applied in the layer of elements outside of the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results for different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models with high computational efficiency is developed and implemented in the ESSI Simulator.

  8. Evaluation and Verification of Thermal Stratification Models for ...

    African Journals Online (AJOL)

    Stratification is a usual phenomenon occurring in waste stabilization ponds which needs to be incorporated in prediction models. The occurrence of stratification in waste stabilization pond (WSP) alters the flow pattern of the pond. Hence, its study is important for complete and accurate modeling of the WSP. In this study, two ...

  9. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.
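    The record does not say which alternative analysis method was used, but one common back-of-the-envelope cross-check for a penetration depth is a diffusion-controlled scaling, x ≈ sqrt(2·D·t). The diffusivity and time below are assumed placeholder values, not data from the SRNL report.

```python
# Illustrative sketch only: diffusion-controlled penetration depth
# scaling, x ~ sqrt(2 * D * t), as one simple independent estimate.
import math

def penetration_depth_cm(D_cm2_per_s, years):
    seconds = years * 365.25 * 24 * 3600.0
    return math.sqrt(2.0 * D_cm2_per_s * seconds)

# Assumed effective diffusivity of 1e-12 cm^2/s over 1000 years.
depth = penetration_depth_cm(D_cm2_per_s=1.0e-12, years=1000.0)
```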

  10. Verification of Occupants’ Behaviour Models in Residential Buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm

    During the last decade, studies about stochastic models of occupants’ behaviour in relation to control of the indoor environment have been published. Often the overall aim of these models is to enable more reliable predictions of building performance using building performance simulations (BPS...... to determine if the event takes place or not. Finally, the simulated window position is compared to the measured ones and the True Positive Rate and False Positive Rate along with other metrics can be calculated and compared. The method evaluates the models abilities to predict the position of the window...... based on the dataset. In each time step, the probabilities are subtracted from the observed transitions, to find the residuals. Finally, the residuals can be averaged, and compared. The validation by simulation relies on detailed Building Performance Simulations (BPS) using models under evaluation...
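    The comparison loop this record describes, drawing against a model probability in each time step and scoring the simulated window state with True/False Positive Rates, can be sketched as follows. The probabilities and observations are invented for illustration.

```python
# Sketch of the evaluation method described above: simulate a binary
# window state from per-step model probabilities, then score the
# simulated trace against observations with TPR and FPR.
import random

def simulate(probs, seed=1):
    """Draw a 0/1 window state in each time step from probability p."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for p in probs]

def rates(simulated, observed):
    """Return (true positive rate, false positive rate)."""
    tp = sum(1 for s, o in zip(simulated, observed) if s == 1 and o == 1)
    fp = sum(1 for s, o in zip(simulated, observed) if s == 1 and o == 0)
    pos = sum(observed)
    neg = len(observed) - pos
    return tp / pos, fp / neg

probs    = [0.9, 0.8, 0.1, 0.2, 0.7, 0.1]   # hypothetical model output
observed = [1,   1,   0,   0,   1,   0]     # hypothetical measurements
tpr, fpr = rates(simulate(probs), observed)
```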

  11. Experimental Verification of the Transient Model in an Enrichment Circuit

    International Nuclear Information System (INIS)

    Fernandino, Maria; Brasnarof, Daniel; Delmastro, Dario

    2003-01-01

    In the present work an experimental closed loop representing a single stage of a uranium gaseous diffusion enrichment cascade is described, a loop that is used to experimentally validate an analytical model of the dynamics inside such a loop. The conditions established inside the experimental loop after a few working hours were reproduced by the analytical model, leaving the slower thermal phenomena for future studies. Two kinds of perturbations were experimentally introduced: a change in the operating range of one of the compressors, and the addition of mass into the loop. Numerical and experimental results are compared and presented in this work. The proposed analytical model was verified against these two changes, with very good agreement in the time response and measured values. This analytical model allows us to determine the characteristic time response of the system.

  12. A verification procedure for MSC/NASTRAN Finite Element Models

    Science.gov (United States)

    Stockwell, Alan E.

    1995-01-01

    Finite Element Models (FEMs) are used in the design and analysis of aircraft to mathematically describe the airframe structure for such diverse tasks as flutter analysis and actively controlled landing gear design. FEMs are used to model the entire airplane as well as airframe components. The purpose of this document is to describe recommended methods for verifying the quality of the FEMs and to specify a step-by-step procedure for implementing the methods.

  13. Predictions and Verification of an Isotope Marine Boundary Layer Model

    Science.gov (United States)

    Feng, X.; Posmentier, E. S.; Sonder, L. J.; Fan, N.

    2017-12-01

    A one-dimensional (1D), steady state isotope marine boundary layer (IMBL) model is constructed. The model includes meteorologically important features absent in Craig and Gordon type models, namely height-dependent diffusion/mixing and convergence of subsiding external air. Kinetic isotopic fractionation results from this height-dependent diffusion, which starts as pure molecular diffusion at the air-water interface and increases linearly with height due to turbulent mixing. The convergence permits dry, isotopically depleted air subsiding adjacent to the model column to mix into ambient air. In δD-δ18O space, the model results fill a quadrilateral, of which three sides represent 1) vapor in equilibrium with various sea surface temperatures (SSTs) (the high δ18O boundary of the quadrilateral); 2) mixtures of vapor in equilibrium with seawater and vapor in the subsiding air (the lower boundary, depleted in both D and 18O); and 3) vapor that has experienced the maximum possible kinetic fractionation (the high δD upper boundary). The results can also be plotted in d-excess vs. δ18O space, indicating that these processes all cause variations in the d-excess of MBL vapor. In particular, due to the relatively high d-excess in the descending air, mixing of this air into the MBL causes an increase in d-excess, even without kinetic isotope fractionation. The model is tested by comparison with seven datasets of marine vapor isotopic ratios, with excellent correspondence; >95% of observational data fall within the quadrilateral area predicted by the model. The distribution of observations also highlights the significant influence of vapor from the nearby converging descending air on isotopic variations in the MBL. At least three factors may affect the isotopic composition of precipitation. The model can be applied to modern as well as paleoclimate conditions.
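    The deuterium excess this record works with has the standard definition d = δD − 8·δ18O. A minimal helper, with invented vapor values, shows why mixing in depleted subsiding air can raise d-excess:

```python
# d-excess as conventionally defined: d = dD - 8 * d18O (values in permil).

def d_excess(delta_D, delta_18O):
    return delta_D - 8.0 * delta_18O

# Hypothetical MBL vapor: depleted in both D and 18O, yet with a
# positive d-excess, as the quadrilateral in the record allows.
vapor = d_excess(delta_D=-80.0, delta_18O=-11.5)
```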

  14. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure, and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results, even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  15. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  16. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  17. Model of PE-CVD Apparatus: Verification and Simulations

    Directory of Open Access Journals (Sweden)

    J. Geiser

    2010-01-01

    Full Text Available In this paper, we present a simulation of chemical vapor deposition with metallic bipolar plates. In chemical vapor deposition, a delicate optimization between temperature, pressure and plasma power is important to obtain homogeneous deposition. The aim is to reduce the number of real-life experiments in a given CVD plasma reactor. Given the large physical parameter space, there is a huge number of possible experiments. A detailed study of the physical experiments in a CVD plasma reactor makes it possible to reduce the problem to an approximate mathematical model, which is the underlying transport-reaction model. Significant regions of the CVD apparatus are approximated and the physical parameters are transferred to mathematical parameters. Such an approximation reduces the mathematical parameter space to a realistic number of numerical experiments. The numerical results are discussed alongside the physical experiments to establish a valid model for the assumed growth and to reduce the number of expensive physical experiments.

  18. Modelling, property verification and behavioural equivalence of lactose operon regulation.

    Science.gov (United States)

    Pinto, Marcelo Cezar; Foss, Luciana; Mombach, José Carlos Merino; Ribeiro, Leila

    2007-02-01

    Understanding biochemical pathways is one of the biggest challenges in the field of molecular biology nowadays. Computer science can contribute in this area by providing formalisms and tools to simulate and analyse pathways. One formalism that is suited for modelling concurrent systems is Milner's Calculus of Communicating Systems (CCS). This paper shows the viability of using CCS to model and reason about biochemical networks. As a case study, we describe the regulation of lactose operon. After describing this operon formally using CCS, we validate our model by automatically checking some known properties for lactose regulation. Moreover, since biological systems tend to be very complex, we propose to use multiple descriptions of the same system at different levels of abstraction. The compatibility of these multiple views can be assured via mathematical proofs of observational equivalence.

  19. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  20. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

    Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the PWR 5 experiment is calculated by means of a vertical and horizontal dynamic model using the CESHOCK code. The equations of motion are resolved by direct integration. (orig./RW) [de

  1. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch-All replaceable-core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
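    The Bernoulli calculation mentioned in this record can be sketched outside Simulink in a few lines. Incompressible, lossless flow at constant elevation is assumed, and all numbers are illustrative, not UCTS data.

```python
# Minimal sketch: Bernoulli's equation relating pressure and velocity
# across a restriction such as a filter-dryer core (same elevation):
#   p1 + rho*v1^2/2 = p2 + rho*v2^2/2

def downstream_pressure(p1, v1, v2, rho):
    """Solve Bernoulli's equation for the downstream pressure p2 (Pa)."""
    return p1 + 0.5 * rho * (v1**2 - v2**2)

# Coolant accelerating through a narrower core: pressure drops.
# Assumed values: 200 kPa upstream, 2 -> 6 m/s, water-like density.
p2 = downstream_pressure(p1=200_000.0, v1=2.0, v2=6.0, rho=1000.0)
```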

  2. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  3. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Modeling Language (UML).

  4. Stirling cryocooler test results and design model verification

    International Nuclear Information System (INIS)

    Shimko, M.A.; Stacy, W.D.; McCormick, J.A.

    1990-01-01

    This paper reports on progress in developing a long-life Stirling cycle cryocooler for spaceborne applications. It presents the results from tests on a preliminary breadboard version of the cryocooler used to demonstrate the feasibility of the technology and to validate the regenerator design code used in its development. This machine achieved a cold-end temperature of 65 K while carrying a 1/2 Watt cooling load. The basic machine is a double-acting, flexure-bearing, split Stirling design with linear electromagnetic drives for the expander and compressors. Flat metal diaphragms replace pistons for both sweeping and sealing the machine working volumes. In addition, the double-acting expander couples to a laminar-channel counterflow recuperative heat exchanger for regeneration. A PC-compatible design code was developed for this design approach that calculates regenerator loss, including heat transfer irreversibilities, pressure drop, and axial conduction in the regenerator walls.

  5. Verification of the Skorohod-Olevsky Viscous Sintering (SOVS) Model

    Energy Technology Data Exchange (ETDEWEB)

    Lester, Brian T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-16

    Sintering refers to a manufacturing process through which mechanically pressed bodies of ceramic (and sometimes metal) powders are heated to drive densification, thereby removing the inherent porosity of green bodies. As the body densifies through the sintering process, the ensuing material flow leads to macroscopic deformations of the specimen, and as such the final configuration differs from the initial one. Therefore, as with any manufacturing step, there is substantial interest in understanding and being able to model the sintering process to predict deformation and residual stress. Efforts in this regard have been pursued for face seals, gear wheels, and consumer products like wash-basins. To understand the sintering process, a variety of modeling approaches have been pursued at different scales.

  6. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    coastal ocean sufficiently to have a complete picture of the flow. The analysis will thus consist of comparing these incomplete pictures of the current...50 cm. This would suggest that tidal flats would exist at synoptic scales but not daily because there are expanses of the lagoon that are < 50 cm...historical daily data from the correct time of year but not from the correct day. This indicates that the model flow is generally correct at synoptic

  7. Design and verification of the 'GURI 01' bundle model

    International Nuclear Information System (INIS)

    Benito, G.D.

    1990-01-01

    This work presents a general description of the 'GURI 01' bundle model, designed by INVAP S.E. under international radioactive material transportation regulations as a B(U)-type bundle for the international transportation of up to 350,000 Ci of Co-60. Moreover, the methodologies used and the results obtained from the structural evaluation of the mechanical tests, and from the evaluation of the thermal behaviour under normal and accident conditions, are briefly discussed. (Author) [es

  8. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity, both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As a hypothesized structural sign pattern's entropy falls, it becomes more subject to Type I error and less subject to Type II error. Three cases illustrate the approach taken here.
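    The entropy measure discussed in this record can be illustrated with a deliberately simplified scoring rule: treat each sign entry a hypothesis leaves unrestricted as a 50/50 draw over {+, −}, so it contributes one bit. This is a simplifying assumption for illustration, not the paper's exact formula.

```python
# Illustrative sketch: entropy of a hypothesized structural sign pattern.
# '+' and '-' are restricted entries; '?' is unrestricted. With a 50/50
# prior over {+, -}, each '?' contributes -2*(1/2)*log2(1/2) = 1 bit.

def sign_pattern_entropy(pattern):
    """Entropy in bits of a sign-pattern hypothesis (rows of '+','-','?')."""
    return float(sum(row.count("?") for row in pattern))

tight = [["+", "-"], ["-", "+"]]   # fully signed: 0 bits, most a priori info
loose = [["+", "?"], ["?", "?"]]   # weakly restricted: 3 bits, least info
```

    The lower-entropy (tight) hypothesis proposes more about the reduced form, matching the record's trade-off between Type I and Type II error.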

  9. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1992-01-01

    Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modelling techniques and a knowledge of the incident radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well chosen measurements are required to confirm the theoretical models. Neutron doses and dose equivalents were measured in a RANDO phantom at specific locations using thermoluminescence dosemeters, etched track dosemeters, and a 1.27 cm (1/2 in) tissue-equivalent proportional counter. The phantom was exposed to a bare and a D2O-moderated 252Cf neutron source at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and to calculate the organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared with the calculations. (author)
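    The weighted sum underlying the effective dose equivalent can be sketched directly. The tissue weighting factors below are the ICRP Publication 26 values on which the effective dose equivalent was based; the organ dose equivalents are hypothetical stand-ins for values a code such as MCNP would produce:

```python
# ICRP Publication 26 tissue weighting factors (they sum to 1.0)
W_T = {
    "gonads": 0.25, "breast": 0.15, "red_bone_marrow": 0.12,
    "lung": 0.12, "thyroid": 0.03, "bone_surface": 0.03, "remainder": 0.30,
}

def effective_dose_equivalent(organ_H):
    """H_E = sum_T w_T * H_T, with organ dose equivalents H_T in sievert."""
    assert abs(sum(W_T.values()) - 1.0) < 1e-9
    return sum(W_T[t] * organ_H.get(t, 0.0) for t in W_T)

# Hypothetical organ dose equivalents (Sv), e.g. from an MCNP/MIRD-V run
H = {t: 1.0e-3 for t in W_T}          # uniform 1 mSv to every tissue
print(effective_dose_equivalent(H))   # 0.001 (1 mSv, since weights sum to 1)
```

    Because the weights sum to one, a uniform organ dose reproduces itself; any non-uniform field makes H_E a genuine weighted average, which is why it cannot be read off a single detector and must be modelled.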

  10. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1991-10-01

    Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modeling techniques and a knowledge of the radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well-chosen measurements are required to confirm the theoretical models. Neutron measurements were performed in a RANDO phantom using thermoluminescent dosemeters, track etch dosemeters, and a 1/2-in. (1.27-cm) tissue equivalent proportional counter in order to estimate neutron doses and dose equivalents within the phantom at specific locations. The phantom was exposed to bare and D2O-moderated 252Cf neutrons at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and calculate organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared to the calculations. 8 refs., 3 figs., 3 tabs

  11. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as the latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.

  12. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

    The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under the working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of temperature field formation in premises with gas infrared heater systems.

  13. TU Electric reactor physics model verification: Power reactor benchmark

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1988-01-01

    Power reactor benchmark calculations using the advanced code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles included gadolinia as a burnable absorber, natural uranium axial blankets and increased water-to-fuel ratio. The calculated results for both startup reactor physics tests (boron endpoints, control rod worths, and isothermal temperature coefficients) and full power depletion results were compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important measured parameters for power reactors

  14. Visual Sample Plan (VSP) Models and Code Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O.; Davidson, James R.; Wilson, John E.; Pulsipher, Brent A.

    2001-03-06

    VSP is an easy-to-use, visual and graphic software tool being developed to select the right number and location of environmental samples so that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. It is a significant help for implementing the 6th and 7th steps of the Data Quality Objectives (DQO) planning process ("Specify Tolerable Limits on Decision Errors" and "Optimize the Design for Obtaining Data," respectively).
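    The kind of sample-size calculation VSP automates for DQO steps 6 and 7 can be sketched with the classic z-test formula; VSP itself implements many more designs and finite-sample corrections, so this is an illustration only:

```python
import math
from statistics import NormalDist

def n_samples(sigma, delta, alpha=0.05, beta=0.10):
    """Approximate samples needed for a one-sample z-test of a mean against
    an action level, keeping the false rejection rate at alpha and the false
    acceptance rate at beta for a true shift of size delta:
        n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta) ** 2
    """
    z = NormalDist().inv_cdf
    return math.ceil(((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2)

# Standard deviation 2.0 units, smallest shift worth detecting 1.0 unit
print(n_samples(sigma=2.0, delta=1.0))  # 35
```

    Tightening either tolerable error rate, or shrinking the detectable shift delta, drives the required sample count up quadratically, which is why the "optimize the design" step matters.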

  15. Modelling of hydrodynamics and mercury transport in Lake Velenje. Part 2, Modelling and model verification

    OpenAIRE

    Kotnik, Jože; Žagar, Dušan; Rajar, Rudi; Horvat, Milena

    2004-01-01

    PCFLOW3D - a three-dimensional mathematical model that was developed at the Chair of Fluid Mechanics of the Faculty of Civil and Geodetic Engineering, University of Ljubljana, was used for hydrodynamic and Hg transport simulations in Lake Velenje. The model is fully non-linear and computes three velocity components, water elevation and pressure. Transport-dispersion equations for salinity and heat (and/or any pollutant) are further used to compute the distributions of these par...

  16. Development and verification of SAC-CORE code for reactor core seismic analysis

    International Nuclear Information System (INIS)

    Koo, K. H.; Lee, J. H.; Yu, B.

    1998-01-01

    The purpose of this paper is to develop the SAC-CORE code for the core seismic analysis of a liquid metal reactor. Using the SAC-CORE code, a core seismic analysis of the KALIMER reactor core is carried out to demonstrate its seismic isolation performance. For verification of the SAC-CORE code, a seismic analysis in air of the RAPSODIE core mock-up is performed and the results are compared with those of the experiments. In this benchmark, the SAC-CORE code gives good results.

  17. Autonomic networking-on-chip: bio-inspired specification, development, and verification

    CERN Document Server

    Cong-Vinh, Phan

    2011-01-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The first book to assess research results, opportunities, and trends in "BioChipNets". The third book in the Embedded Multi-Core Systems series from CRC Press, this is an advanced technical guide and reference composed of contributions from prominent re

  18. Metal Fuel Development and Verification for Prototype Generation IV Sodium-Cooled Fast Reactor

    OpenAIRE

    Chan Bock Lee; Jin Sik Cheon; Sung Ho Kim; Jeong-Yong Park; Hyung-Kook Joo

    2016-01-01

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U–Zr fuel is a driver for the initial core of the PGSFR, and U–transuranics (TRU)–Zr fuel will gradually replace U–Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U–Zr fuel, work on U–Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U–TRU–Zr fuel uses TRU recovered through pyroelectrochem...

  19. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still covering both the evaporation and condensation processes is built in ANSYS CFD. Simulation results such as water temperature, vapor temperature and distillate output are compared with experimental results for a hemispherical solar still obtained under the climate conditions of Mehsana (latitude 23° 59’, longitude 72° 38’). Water temperature and distillate output were in good agreement with the experimental results. The study shows that ANSYS CFD is a powerful and efficient tool for the design and comparison of hemispherical solar stills.

  20. Working Group 2: Future Directions for Safeguards and Verification, Technology, Research and Development

    International Nuclear Information System (INIS)

    Zykov, S.; Blair, D.

    2013-01-01

    For traditional safeguards it was recognized that the hardware presently available is, in general, addressing adequately fundamental IAEA needs, and that further developments should therefore focus mainly on improving efficiencies (i.e. increasing cost economies, reliability, maintainability and user-friendliness, keeping abreast of continual advancements in technologies and of the evolution of verification approaches). Specific technology areas that could benefit from further development include: -) Non-destructive measurement systems (NDA), in particular, gamma-spectroscopy and neutron counting techniques; -) Containment and surveillance tools, such as tamper indicating seals, video-surveillance, surface identification methods, etc.; -) Geophysical methods for design information verification (DIV) and safeguarding of geological repositories; and -) New tools and methods for real-time monitoring. Furthermore, the Working Group acknowledged that a 'building block' (or modular) approach should be adopted towards technology development, enabling equipment to be upgraded efficiently as technologies advance. Concerning non-traditional safeguards, in the area of satellite-based sensors, increased spatial resolution and broadened spectral range were identified as priorities. In the area of wide area surveillance, the development of LIDAR-like tools for atmospheric sensing was discussed from the perspective of both potential benefits and certain limitations. Recognizing the limitations imposed by the human brain in terms of information assessment and analysis, technologies are needed that will enable the more effective utilization of all information, regardless of its format and origin. The paper is followed by the slides of the presentation. (A.C.)

  1. Double piezoelectric energy harvesting cell: modeling and experimental verification

    Science.gov (United States)

    Wang, Xianfeng; Shi, Zhifei

    2017-06-01

    In this paper, a novel energy transducer named the double piezoelectric energy harvesting cell (DPEHC), consisting of two flex-compressive piezoelectric energy harvesting cells (F-C PEHCs), is proposed. Initially, the two F-C PEHCs, a kind of cymbal-type energy transducer, were assembled to share the same end simply so that the device could be placed steadily. However, an open-circuit voltage test revealed additional energy harvesting performance in the DPEHC prototype. Taking the interaction between the two F-C PEHCs into account, a mechanical model for analyzing the DPEHC is established. The electric output of the DPEHC under harmonic excitation is obtained theoretically and verified experimentally, with good agreement. In addition, as an inverse problem, a method for identifying the key mechanical parameters of the DPEHC is recommended. Finally, the additional energy harvesting performance of the DPEHC is discussed quantitatively. Numerical results show that the additional energy harvesting performance is correlated with the key mechanical parameters of the DPEHC. For the present DPEHC prototype, the energy harvesting gain is over 400% compared with two independent F-C PEHCs under the same load condition.

  2. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays a procedure on the computer screen in the form of a flow chart, and displays plant operating information alongside the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri nets when both are applied to an emergency operating computerized procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. Formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS grow.
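    The token semantics underlying Petri-net-based procedure models can be sketched with a minimal place/transition net; the actual STPN formalism adds state tokens for operator interruptions, which this toy model omits:

```python
class PetriNet:
    """Minimal place/transition net: places hold tokens, a transition is
    enabled when all its input places hold a token, and firing moves tokens
    from inputs to outputs (the basic mechanics formal verification builds on)."""

    def __init__(self, marking):
        self.marking = dict(marking)    # place -> token count
        self.transitions = {}           # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"{name} not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two procedure steps in sequence: step1 must complete before step2 can fire
net = PetriNet({"start": 1})
net.add_transition("step1", ["start"], ["mid"])
net.add_transition("step2", ["mid"], ["done"])
net.fire("step1")
net.fire("step2")
print(net.marking["done"])  # 1
```

    Verification then amounts to exploring the reachable markings and checking that no marking violates the intended procedure flow, e.g. that "done" is never reached while "start" still holds a token.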

  3. One-Quarter-Car Active Suspension Model Verification

    Directory of Open Access Journals (Sweden)

    Hyniova Katerina

    2017-01-01

    Suspension systems influence both the comfort and safety of passengers. This paper presents energy recuperation and management in automotive suspension systems with linear electric motors that are controlled by a designed H∞ controller to generate a variable mechanical force for a car damper. Vehicle shock absorbers in which forces are generated by active elements in response to feedback signals obviously offer increased design flexibility compared to conventional suspensions with passive elements (springs and dampers). The main advantage of the proposed solution, which uses a linear AC motor, is the possibility of generating the desired forces acting between the unsprung (wheel) and sprung (one-quarter of the car body) masses of the car, providing good insulation of the sprung mass from road surface roughness and load disturbances. As shown in the paper, under certain circumstances linear motors used as actuators make it possible to transform the mechanical energy of vertical car vibrations into electrical energy, accumulate it, and use it when needed. Energy flow control makes it possible to reduce or even eliminate the demands on the external power source. In particular, the paper focuses on experiments with an active shock absorber on the designed test bed, on the development of an appropriate input signal that acts on the vibration absorber as a real road disturbance would, and on the evaluation of the obtained results. Another important requirement the active suspension design should satisfy is energy supply control, implemented via a standard controller modification, which allows changing the amount of energy required by the system. The functionality of the designed controller modification was verified in various experiments on the experimental stand, as described in the paper.

  4. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    Science.gov (United States)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  5. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  6. Pneumatic Muscles Actuated Lower-Limb Orthosis Model Verification with Actual Human Muscle Activation Patterns

    Directory of Open Access Journals (Sweden)

    Dzahir M.A.M

    2017-01-01

    A review study was conducted on existing lower-limb orthosis systems for rehabilitation that implement pneumatic muscle actuators, with the aim of clarifying current and ongoing research in this field. The implementation of pneumatic artificial muscles will play an important role in the development of advanced robotic systems. In this research, a derivation model for the antagonistic mono- and bi-articular muscles of a lower-limb orthosis using pneumatic artificial muscles is verified against actual human muscle activity models. A healthy 29-year-old male subject (height 174 cm, weight 68 kg) served as the test subject. Two mono-articular muscles, Vastus Medialis (VM) and Vastus Lateralis (VL), were selected to verify the mono-articular muscle models and the muscle synergy between anterior muscles. Two bi-articular muscles, Rectus Femoris (RF) and Biceps Femoris (BF), were selected to verify the bi-articular muscle models and the muscle co-contraction between anterior-posterior muscles. The test was carried out on a treadmill at a speed of 4.0 km/h, approximately 1.25 m/s for completing one cycle of walking motion. The data were collected for about one minute on the treadmill, and 20 complete cycles of walking motion were successfully recorded. For the evaluations, the mathematical model obtained from the derivation and the actual human muscle activation patterns obtained using a surface electromyography (sEMG) system were compared and analysed. The results showed that high correlation values, ranging from 0.83 up to 0.93, were obtained between the derivation model and the actual human muscle model for both mono- and bi-articular muscles. In conclusion, based on the verification against the sEMG muscle activity data and the correlation values, the proposed derivation models of the antagonistic mono- and bi-articular muscles are suitable for simulating and controlling the pneumatic-muscle-actuated lower-limb orthosis.
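    The reported correlation values (0.83 up to 0.93) come from comparing two activation time series; that comparison can be sketched with a plain Pearson correlation (the data below are hypothetical, not the study's measurements):

```python
import math

def pearson_r(a, b):
    """Pearson correlation coefficient between two equal-length series,
    the kind of score used to compare a derived muscle model with the
    measured sEMG activation pattern."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

model = [0.1, 0.4, 0.8, 0.5, 0.2]    # hypothetical normalized model activation
semg  = [0.2, 0.5, 0.7, 0.5, 0.1]    # hypothetical normalized sEMG envelope
print(round(pearson_r(model, semg), 3))  # 0.932
```

    In practice the sEMG signal would be rectified and low-pass filtered into an envelope before such a comparison, and the correlation would be computed per gait cycle.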

  7. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: The purpose of the study is to verify Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with a newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plans in radiotherapy. A human upper-body-shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using the Extend™ system of the Gamma Knife. The central component of the phantom aids in performing radiological precision tests, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to place various dosimetric devices using suitable adaptors. The phantom is made of poly(methyl methacrylate) (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place a thimble ionization chamber inside the phantom for point dose recording along the xz axis. EBT3 Gafchromic films were used to analyze and map the radiation field. The focal precision test was performed using a 4 mm collimator shot in the phantom to check the radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of the repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in the reference treatment position using an X-ray computed tomography (CT) machine, and the acquired stereotactic images were transferred into Leksell Gammaplan (LGP). A patient treatment plan with a hypo-fractionated regimen was delivered, and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film

  8. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using … and the selected PBM is then used for the log likelihood ratio (LLR) calculation with respect to the claimant model. The proposed method incorporates the pass-phrase identification step in the LLR calculation, which is not considered in conventional standalone TD-SV systems. The performance of the proposed method is compared to conventional text-independent background model based TD-SV systems using either Gaussian mixture model (GMM)-universal background model (UBM) or hidden Markov model (HMM)-UBM or i-vector paradigms. In addition, we consider two approaches to build PBMs: speaker-independent and speaker…
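    The LLR decision score at the heart of a TD-SV system can be sketched with single 1-D Gaussians standing in for the claimant and background (PBM/UBM) models; this is a simplification, since real systems score full GMMs over cepstral feature vectors:

```python
import math

def log_gauss(x, mu, var):
    """Log density of a 1-D Gaussian at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def llr(frames, claimant, background):
    """Average log likelihood ratio of a test utterance between the claimant
    model and a background model; a positive score favours acceptance.
    Each 'model' here is a single (mu, var) Gaussian standing in for a GMM."""
    s = sum(log_gauss(x, *claimant) - log_gauss(x, *background) for x in frames)
    return s / len(frames)

frames = [0.9, 1.1, 1.0, 0.8]                          # toy feature values
score = llr(frames, claimant=(1.0, 0.1), background=(0.0, 1.0))
print(score > 0)  # True: the claimant model fits these frames better
```

    The contribution described above is that the background term is not a single universal model but the best-matching pass-phrase dependent model, so the score also penalizes a wrong pass-phrase, not just a wrong speaker.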

  9. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP-model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
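    The part-selection step of the PEP representation can be sketched as follows. With spherical Gaussian components sharing one isotropic variance, the maximum-likelihood descriptor for each component is simply the one nearest its mean (the data are toy values; the real model trains the GMM with EM over LBP/SIFT descriptors augmented with location):

```python
import numpy as np

def pep_representation(descriptors, part_means):
    """For each spherical Gaussian part (given here by its mean over the
    augmented descriptor space), select the maximum-likelihood input
    descriptor and concatenate the winners. With a shared isotropic
    variance, likelihood ranking reduces to Euclidean distance."""
    reps = []
    for mu in part_means:
        d2 = ((descriptors - mu) ** 2).sum(axis=1)   # squared distances
        reps.append(descriptors[np.argmin(d2)])      # max likelihood winner
    return np.concatenate(reps)

descs = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])   # toy descriptors
parts = np.array([[0.1, 0.1], [1.9, 2.1]])               # two part means
print(pep_representation(descs, parts))  # [0. 0. 2. 2.]
```

    The resulting fixed-length vector is what makes elastic matching possible: two images of one subject select comparable descriptors for each part regardless of where the part appears in the image.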

  10. Extended verification of the model of dynamic near-surface layer of the atmosphere

    Science.gov (United States)

    Polnikov, V. G.

    2013-07-01

    This paper formulates the most general principles for verifying models of the dynamic near-water layer of the atmosphere (DNWLA) and performs an advanced verification of the model proposed by the author earlier [6]. Based on empirical wave spectra from the studies by Donelan [15], Elfouhaily [14], and Kudryavtsev [13] and well-known empirical laws describing the wave-age dependence of the friction coefficient, we adjusted the original version of the model. It was shown that the improvement of model reliability is most dependent on the adequacy of the parameterization of the tangential portion of the total momentum flux to the wavy surface. Then the new version of the model was verified on the basis of field data from two different groups of authors. It was found that the new version of the model is consistent with empirical data with an error not exceeding the measurement error of near-water layer parameters.

  11. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
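    What the generated formal models are ultimately used for can be sketched with a brute-force reachability check of a safety property; real model checkers such as NuSMV and UPPAAL use far more scalable symbolic and timed techniques, so this is an illustration of the idea only:

```python
def check_invariant(initial, step, bad, limit=10000):
    """Explicit-state reachability check: explore all states reachable from
    the initial state via the transition function and report whether any
    violates the safety predicate 'bad'."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        if bad(state):
            return False                    # safety property violated
        for nxt in step(state):
            if nxt not in seen and len(seen) < limit:
                seen.add(nxt)
                frontier.append(nxt)
    return True                             # bad state unreachable

# Toy PLC interlock over (valve_open, pump_on): the pump may start only
# when the valve is open, and closing the valve also stops the pump.
def step(s):
    valve, pump = s
    succs = [(True, pump), (False, False)]  # open valve / close valve
    if valve:
        succs.append((valve, True))         # start pump (valve is open)
    return succs

# Safety property: never "pump on with valve closed"
print(check_invariant((False, False), step,
                      bad=lambda s: s[1] and not s[0]))  # True
```

    This is exactly the kind of property (here a liveness-free safety invariant) that is hard to establish by testing alone, since testing only exercises a sample of the execution paths.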

  12. Verification of a coupled atmosphere-ocean model using satellite observations over the Adriatic Sea

    Directory of Open Access Journals (Sweden)

    V. Djurdjevic

    2008-07-01

    Verification of the EBU-POM regional atmosphere-ocean coupled model (RAOCM) was carried out using satellite observations of SST and surface winds over the Adriatic Sea. The atmospheric component has a horizontal resolution of 0.125 degrees (approximately 10 km) and 32 vertical levels, while the ocean component has a horizontal resolution of approximately 4 km with 21 sigma vertical levels.

    Verification of the forecasted SST was performed for 15 forecasts during 2006, each of them seven days long. These forecasts coincide with the operating cycle of the Adriatic Regional Model (AREG), which provided the initial fields and boundary conditions for the ocean component of EBU-POM. Two sources of data were used for the initial and boundary conditions of the atmosphere: primary data were obtained from the European Center for Medium-Range Weather Forecasting (ECMWF), while data from the National Centers for Environmental Prediction (NCEP) were used to test the sensitivity to boundary conditions.

    Forecast skill was expressed in terms of BIAS and root mean square error (RMSE). During most of the verification period, the model had a negative BIAS of approximately −0.3°, while RMSE varied between 1.1° and 1.2°. Interestingly, these errors did not increase over time, which means that the forecast skill did not decline during the integrations.

    The 10-m wind verification was conducted for one period of 17 days in February 2007, during a strong bora episode, for which satellite estimates of surface winds were available. During the same period, SST measurements were conducted twice a day, which enabled us to verify diurnal variations of SST simulated by the RAOCM model. Since ECMWF's deterministic forecasts do not cover such a long period, we decided to use the ECMWF analysis, i.e. we ran the model in hindcast mode. The winds simulated in this analysis were weaker than the satellite estimates, with a mean BIAS of −0.8 m/s.
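    The two verification scores used above are straightforward to compute (the SST values below are hypothetical, chosen only to illustrate the scores):

```python
import math

def bias_and_rmse(forecast, observed):
    """Standard forecast verification scores:
    BIAS = mean(forecast - observed),
    RMSE = sqrt(mean((forecast - observed)^2))."""
    errs = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errs) / len(errs)
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    return bias, rmse

sst_model = [14.2, 14.0, 13.8, 14.1]   # hypothetical model SST (deg C)
sst_sat   = [14.5, 14.3, 14.2, 14.3]   # hypothetical satellite SST (deg C)
bias, rmse = bias_and_rmse(sst_model, sst_sat)
print(round(bias, 2), round(rmse, 2))  # -0.3 0.31
```

    A negative BIAS, as reported for the RAOCM SST, means the model runs systematically cooler than the satellite observations; RMSE additionally captures the scatter around that systematic offset.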

  14. Verification and validation plan for the SFR system analysis module

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  15. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL) in collaboration with the University of Florida is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template or the template may be generated directly by the user The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)

  16. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  17. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.

  18. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  19. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm{sup –3} in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  20. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Science.gov (United States)

    Chukbar, B. K.

    2015-12-01

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  1. Additional Model Datasets and Results to Accelerate the Verification and Validation of RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-11-01

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  2. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    at the various specified incident angles provide model verification for the investigation into causes of ray attenuation and provide accounts for rays that escape. Two fourteen tube modules were tested on Sandia National Laboratory’s two-axis tracking (AZTRAK) platform. By adjusting the tracking of the platform...... at the corresponding specified incident angles are compared to the Sandia results. A 100 m2 336 Novel ICPC evacuated tube solar collector array has been in continuous operation at a demonstration project in Sacramento California since 1998. Data from the initial operation of the array are used to further validate...

  3. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint Program. The KAERI made fuel bundle was tested at the KAERI Hot Test Loop for the performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI also has developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in the CANDU reactors. For the purpose of proving safety of the RU handling techniques and appraising feasibility of the CANFLEX-RU fuel fabrication in near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs

  4. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint Program. The KAERI made fuel bundle was tested at the KAERI Hot Test Loop for the performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI also has developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in the CANDU reactors. For the purpose of proving safety of the RU handling techniques and appraising feasibility of the CANFLEX-RU fuel fabrication in near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  5. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...... the boundaries of automatic verification and the connections between TAPN and the model of timed automata. Finally, we will mention the tool TAPAAL that supports modelling, simulation and verification of TAPN and discuss a small case study of alternating bit protocol....

  6. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare...... these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  7. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although the FPGAs give a high performance than PLC (Programmable Logic Controller), the platform change from PLC to FPGA impose all PLC software engineers give up their experience, knowledge and practices accumulated over decades, and start a new FPGA-based hardware development from scratch. We have researched to fine the solution to this problem reducing the risk and preserving the experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates the FBD programs into behavior-preserving Verilog programs. In general, the PLCs are usually designed with an FBD, while the FPGAs are described with a HDL (Hardware Description Language) such as Verilog or VHDL. Once PLC designer designed the FBD programs, the FBDtoVerilog translates the FBD into Verilog, mechanically. The designers, therefore, need not consider the rest of FPGA development process (e.g., Synthesis and Place and Routing) and can preserve the accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly since it is used in nuclear power plants, which is one of the most safety critical systems. While the designer develops the FPGA software with the FBD program translated by the translator, there are other translation tools such as synthesis tool and place and routing tool. This paper also focuses to verify them rigorously and thoroughly. There are several verification techniques for correctness of translator, but they are hard to apply because of the outrageous cost and performance time. Instead, this paper tries to use an indirect verification technique for demonstrating the correctness of translator using the co-simulation technique. We intend to prove only against specific inputs which are under development for a target I and C system, not against all possible input cases.

  8. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture

  9. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    Full Text Available This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs that are considered as reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during a system design stage. This paper extends the formalism reconfigurable timed net condition/event systems (R-TNCESs in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of a software tool SESA for functional, temporal, and energy-efficient properties. This paper is illustrated by an automatic assembly system.

  10. Managing the Verification Trajectory

    NARCIS (Netherlands)

    Ruys, T.C.; Brinksma, Hendrik

    In this paper we take a closer look at the automated analysis of designs, in particular of verification by model checking. Model checking tools are increasingly being used for the verification of real-life systems in an industrial context. In addition to ongoing research aimed at curbing the

  11. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der [LMU Munich, Garching (Germany); TU Delft (Netherlands); Castelhano, I. [LMU Munich, Garching (Germany); University of Lisbon, Lisbon (Portugal); Schaart, D.R. [TU Delft (Netherlands)

    2015-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron therapy in cancer treatment. An imaging system is being developed in Garching aiming to detect promptγ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm{sup 2}, 0.5 mm thick, 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr{sub 3}:Ce scintillator crystal (50 x 50 x 30 mm{sup 3}) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). The on going characterization of the Compton camera properties and its individual components both offline in the laboratory as well as online using proton beam are presented.

  12. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid system and explore the correlation of key parameters for distributed control system. For improving the reusability of control model, the proposed approach provides the support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model when satisfying the safety requirements of the train control system. Then, in order to solve the state space explosion problem, the online verification method is proposed to monitor the operating status of high-speed trains online because of the real-time property of the train control system. Furthermore, we construct the LHA formal models of train tracking model and movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters to avoid collision between trains. In the second case, the correlation of position report cycle and MA generation cycle is analyzed under both the normal and the abnormal condition influenced by packet-loss factor. Finally, considering stochastic characterization of time distributions and real-time feature of moving block control system, the transient probabilities of wireless communication process are obtained by stochastic time petri nets. - Highlights: • We solve the parameters uncertainty problem by using model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor the high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  13. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  14. From model conception to verification and validation, a global approach to multiphase Navier-Stoke models with an emphasis on volcanic explosive phenomenology

    Energy Technology Data Exchange (ETDEWEB)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurement: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds—namely, momentum-driven supersonic jet and buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key-phenomenology. Key

  15. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  16. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within the ESA's Aurora Exploration Programme, that will perform for the first time in an out planetary mission Raman spectroscopy. RLS is composed by SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit). iOH focuses the excitation laser on the samples (excitation path), and collects the Raman emission from the sample (collection path, composed on collimation system and filtering system). Its original design presented a high laser trace reaching to the detector, and although a certain level of laser trace was required for calibration purposes, the high level degrades the Signal to Noise Ratio confounding some Raman peaks. So, after the bread board campaign, some light design modifications were implemented in order to fix the desired amount of laser trace, and after the fabrication and the commitment of the commercial elements, the assembly and integration verification process was carried out. A brief description of the iOH design update for the engineering and qualification model (iOH EQM) as well as the assembly process are briefly described in this papers. In addition, the integration verification and the first functional tests, carried out with the RLS calibration target (CT), results are reported on.

  17. Development of a test system for verification and validation of nuclear transport simulations

    Energy Technology Data Exchange (ETDEWEB)

    White, Morgan C [Los Alamos National Laboratory; Triplett, Brian S [GENERAL ELECTRIC; Anghaie, Samim [UNIV OF FL

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V and V). This automated V and V process can efficiently test a number of data libraries using well defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V and V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings by using this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries leading to higher fidelity data in the future.
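The template-and-script workflow described above can be sketched as follows; the deck format, placeholder names, and the `keff` output line are hypothetical stand-ins for the real benchmark templates and transport-code output:

```python
import re
from string import Template

# Hypothetical deck template; real templates encode full benchmark geometry
DECK_TEMPLATE = Template(
    "TITLE $benchmark\n"
    "LIBRARY $library\n"
    "MATERIAL $material\n"
)

def build_deck(benchmark, library, material):
    """Render one transport-code input deck from a benchmark template."""
    return DECK_TEMPLATE.substitute(benchmark=benchmark, library=library,
                                    material=material)

def parse_keff(output_text):
    """Extract k-effective and its uncertainty from (hypothetical) code
    output of the form 'keff = 0.99876 +/- 0.00045'."""
    m = re.search(r"keff\s*=\s*([\d.]+)\s*\+/-\s*([\d.]+)", output_text)
    return (float(m.group(1)), float(m.group(2))) if m else None

deck = build_deck("HEU-MET-FAST-001", "ENDF/B-VII.1", "HEU sphere")
result = parse_keff("final keff = 0.99876 +/- 0.00045")
```

Looping `build_deck` over the cross product of templates and libraries, submitting each deck, and feeding the parsed results into a database is the essence of the automated V and V pipeline.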

  18. Development of a test suite for nuclear data verification and validation

    International Nuclear Information System (INIS)

    Triplett, Brian; Anghaie, Samim; White, Morgan

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory in collaboration with the University of Florida has developed a methodology to automate the process of nuclear data verification and validation (V and V). This automated V and V process can efficiently test a number of data libraries using well defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V and V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings by using this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries leading to higher fidelity data in the future. (authors)

  19. Verification of the Hydrodynamic and Sediment Transport Hybrid Modeling System for Cumberland Sound and Kings Bay Navigation Channel, Georgia

    Science.gov (United States)

    1989-07-01

    [Plot residue from scanned comparison plates: physical model versus numerical model water-surface elevations for the 1985-geometry model verification at Stations 843 and 396. The surviving caption text notes that the data are from different locations and should not be directly evaluated.]

  20. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Science.gov (United States)

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  1. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Routing data packets in a dynamic network is a difficult and important problem in computer networks. Because the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth constraints. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply the model-based Q-Routing technique for routing in a dynamic network. To analyze the correctness of Q-Routing algorithms mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for given metrics.
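The Q-Routing delivery-time update the paper analyzes can be sketched as below, following the standard Boyan-Littman rule; the toy topology and numeric values are illustrative:

```python
def q_routing_update(Q, node, dest, neighbor, queue_time, transit_time,
                     neighbors_of_neighbor, alpha=0.5):
    """One Q-Routing update at `node` after forwarding a packet bound for
    `dest` via `neighbor`: move the delivery-time estimate toward
    (queue time + transit time + neighbor's best remaining estimate)."""
    t_best = min(Q[(neighbor, dest, z)] for z in neighbors_of_neighbor)
    old = Q[(node, dest, neighbor)]
    Q[(node, dest, neighbor)] = old + alpha * (queue_time + transit_time
                                               + t_best - old)
    return Q[(node, dest, neighbor)]

# Toy line network A-B-C with destination C (illustrative values)
Q = {("A", "C", "B"): 5.0, ("B", "C", "C"): 1.0}
new_estimate = q_routing_update(Q, "A", "C", "B", queue_time=0.5,
                                transit_time=1.0,
                                neighbors_of_neighbor=["C"])
```

Because each update uses only locally observable delays plus the neighbor's own estimate, the table adapts as topology and congestion change, which is the property the SPIN model checks.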

  2. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out to verify existing dynamic radioecological models, especially the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137 and I-131 measured in foodstuffs and environmental samples after the Chernobyl reactor accident, and the results of field experiments on radionuclide translocation after foliar uptake or absorption by the roots of edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were determined experimentally by a single sprinkling with Chernobyl rainwater and, as a function of sprinkling time, were measured to be: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides on lettuce was determined to be ten days. Transfer factors for root absorption of Cs-137 averaged 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. The ECOSYS model predictions agreed with the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de
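The foliar pathway these experiments parameterize can be sketched with the reported ten-day weathering half-life; the function and its arguments are an illustrative simplification, not the ECOSYS implementation:

```python
import math

def foliar_activity(deposit, translocation_factor, days_since_deposition,
                    weathering_half_life_days=10.0):
    """Activity retained in the edible plant part: deposited activity times
    a translocation factor, reduced by weathering losses with the ten-day
    half-life reported in the study (illustrative simplification)."""
    lam = math.log(2.0) / weathering_half_life_days
    return deposit * translocation_factor * math.exp(-lam * days_since_deposition)

# With a translocation factor of 0.1, one weathering half-life halves
# the retained activity
a0 = foliar_activity(1000.0, 0.1, 0.0)
a10 = foliar_activity(1000.0, 0.1, 10.0)
```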

  3. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model is described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited to surface wave heights that are significant compared with the mean water depth, such as estuaries and coastal regions. The latter is suited to surface wave heights that are small compared with the depth, because surface elevation is removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  4. A systematic approach for model verification: application on seven published activated sludge models.

    Science.gov (United States)

    Hauduc, H; Rieger, L; Takács, I; Héduit, A; Vanrolleghem, P A; Gillot, S

    2010-01-01

    The quality of simulation results can be significantly affected by errors in the published model (typing, inconsistencies, gaps or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to point out the typing errors, inconsistencies and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verify models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Then, stoichiometry and kinetic rate expressions are checked for each model and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks.
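A continuity check of the kind proposed can be sketched as follows; the toy one-process matrix and composition factors are illustrative, not taken from any of the seven models:

```python
def continuity_check(stoichiometry, composition, tol=1e-8):
    """Gujer-matrix continuity check: for every process row and every
    conserved quantity, the stoichiometric coefficients weighted by the
    components' composition factors must sum to zero. Returns the list of
    (process_index, conservative, residual) violations."""
    failures = []
    for conservative, factors in composition.items():
        for p, row in enumerate(stoichiometry):
            residual = sum(nu * factors[j] for j, nu in enumerate(row))
            if abs(residual) > tol:
                failures.append((p, conservative, residual))
    return failures

# Toy single-process example: growth consumes 1/Y substrate (COD), produces
# 1 unit of biomass (COD), and the rest appears as oxygen consumption,
# which carries a composition factor of -1 so the COD balance closes.
Y = 0.67
stoich = [[-1.0 / Y, 1.0, -(1.0 - Y) / Y]]
comp = {"COD": [1.0, 1.0, -1.0]}
errors = continuity_check(stoich, comp)
```

Running such a check on every published matrix is exactly how typing errors in stoichiometric coefficients are tracked down systematically.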

  5. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and the error of a computational program's solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady-state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
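The Richardson extrapolation procedure at the heart of such a code can be sketched for three systematically refined grids; the manufactured data below assume a second-order scheme with refinement ratio 2:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three grids refined by
    a constant ratio r (the core of the Richardson procedure)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_estimate(f_medium, f_fine, r, p):
    """Extrapolated estimate of the exact value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Manufactured second-order data: f(h) = 1 + 0.5*h**2, refinement ratio 2
f_coarse = 1.0 + 0.5 * 0.04 ** 2
f_medium = 1.0 + 0.5 * 0.02 ** 2
f_fine = 1.0 + 0.5 * 0.01 ** 2
p = observed_order(f_coarse, f_medium, f_fine, 2.0)
f_exact = richardson_estimate(f_medium, f_fine, 2.0, p)
```

Comparing the observed order `p` against the scheme's formal order is the verification test; the gap between `f_fine` and `f_exact` gives the discretization-error estimate.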

  6. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to surface wave heights that are significant compared with the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to surface wave heights that are small compared with the depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  7. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is part of an international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, conducted to validate the effectiveness of the SARDA concept as a controller decision-support tool for departure and surface management at ICN. This paper presents the preliminary results of the collaboration. It includes an investigation of the operational environment of ICN, data analysis to identify the operational characteristics of the airport, and construction and verification of an airport simulation model using the Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  8. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1] operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports the JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring of long-term trends, and satellite retrieved products [3]. The CRTM is used daily at NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements, as time series and as a function of viewing angle. Monitoring the data assimilation system ensures its proper performance and helps diagnose problems for future improvements. The CRTM is also a very useful tool for cross-sensor verification. Using the double-difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between the measurements of two instruments, which is particularly useful for reducing inter-instrument differences in climate studies [4]. In this study, we assess Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive band data. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high-quality radiosondes were launched when Suomi NPP flew over the NOAA Ship Ronald H. Brown in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data include air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q
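The double-difference method mentioned above can be sketched in a few lines; the brightness-temperature values are illustrative:

```python
def double_difference(obs_a, sim_a, obs_b, sim_b):
    """Inter-sensor double difference: differencing each instrument against
    its own radiative-transfer simulation first, then differencing the
    residuals, cancels the part of the bias driven by spectral-response and
    viewing-geometry differences between the two instruments."""
    return (obs_a - sim_a) - (obs_b - sim_b)

# Illustrative brightness temperatures (K) for two sensors viewing one scene
dd = double_difference(obs_a=287.6, sim_a=287.1, obs_b=287.9, sim_b=287.7)
```

A nonzero `dd` that persists across many matched scenes points to a relative calibration bias between the instruments rather than to scene or geometry effects.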

  9. [Verification of the VEF photon beam model for dose calculations by the Voxel-Monte-Carlo-Algorithm].

    Science.gov (United States)

    Kriesen, Stephan; Fippel, Matthias

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning.

  10. Verification of the VEF photon beam model for dose calculations by the voxel-Monte-Carlo-algorithm

    International Nuclear Information System (INIS)

    Kriesen, S.; Fippel, M.

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tuebingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning. (orig.)

  11. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    Customer: NASA's Launch Services Program (LSP), Ground Systems Development and Operations (GSDO), and Space Launch System (SLS) programs. NASA's LSP, GSDO, SLS and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). For example, to determine if they need to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as a guide for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67 kilometer Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the AMU high-resolution WRF Environmental Modeling System (EMS) model (Watson 2013) in real-time. The tasking group agreed to this proposal; therefore the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The model was set up with a triple-nested grid configuration over KSC/CCAFS based on previous AMU work (Watson 2013). The outer domain (D01) has 12-kilometer grid spacing, the middle domain (D02) has 4-kilometer grid spacing, and the inner domain (D03) has 1.33-kilometer grid spacing. The model runs a 12-hour forecast every hour, D01 and D02 domain outputs are available once an hour and D03 is every 15 minutes during the forecast period. The AMU assessed the WRF-EMS 1

  12. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impact of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic-model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of this assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be defined through diagrams. Software verification was performed by comparing its results with SRS-19 and with calculations by the CROM software. Doses estimated by SRS-19 are higher than those of the developed software, but this is acceptable since the SRS-19 estimates are based on a conservative approach. Compared with the CROM software, identical results were obtained for three scenarios, and a non-significant difference of 2.25% was found in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features will be added and new models developed to improve the capability of the software. (author)
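A screening calculation in the spirit of the SRS-19 generic models can be sketched as below; the function name, parameter values, and dose coefficient are illustrative assumptions, not SRS-19 tabulated data:

```python
def annual_inhalation_dose(air_concentration, breathing_rate,
                           dose_coefficient, exposure_fraction=1.0):
    """Screening-level annual dose (Sv) from inhalation at a constant air
    concentration: concentration (Bq/m^3) x breathing rate (m^3/yr) x
    inhalation dose coefficient (Sv/Bq) x fraction of the year exposed."""
    return (air_concentration * breathing_rate * dose_coefficient
            * exposure_fraction)

# Illustrative inputs (not SRS-19 tabulated values): 1 Bq/m^3 of a nuclide
# with a 1e-9 Sv/Bq coefficient, adult breathing rate 8400 m^3/yr
dose = annual_inhalation_dose(1.0, 8400.0, 1.0e-9)
```

Chaining such pathway terms (dispersion, deposition, ingestion, inhalation) scenario by scenario is what the diagram-driven assessment tool automates.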

  13. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Science.gov (United States)

    Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.

    2013-09-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.

  14. Metal Fuel Development and Verification for Prototype Generation IV Sodium-Cooled Fast Reactor

    Directory of Open Access Journals (Sweden)

    Chan Bock Lee

    2016-10-01

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U–Zr fuel is the driver for the initial core of the PGSFR, and U–transuranics (TRU)–Zr fuel will gradually replace U–Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experience with U–Zr fuel, work on U–Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U–TRU–Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system was developed that can prevent vaporization of volatile elements by controlling the atmospheric pressure of the casting chamber and can handle chemically active lanthanide elements using protective coatings in the casting crucible. Fuel cladding of the ferritic–martensitic steel FC92, which has higher mechanical strength at high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  15. Metal fuel development and verification for prototype generation- IV Sodium- Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bock; Cheon, Jin Sik; Kim, Sung Ho; Park, Jeong Yong; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U-Zr fuel is a driver for the initial core of the PGSFR, and U-transuranics (TRU)-Zr fuel will gradually replace U-Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U-Zr fuel, work on U-Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U-TRU-Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic-martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  16. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)

  17. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performance under simulated space conditions. Two methods are developed at CSL. The first is an IR phase-shifting interferometer with high spatial resolution, to be used specifically for the verification of high-precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal-vacuum conditions. The second, presented hereafter, is a holographic method for relative shape measurement. The proposed holographic solution makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot; an iterative process allows the measurement of up to several millimetres of total deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.
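Phase-shifting interferometry of the kind used by the first instrument can be sketched with the standard four-step algorithm; the double-pass displacement conversion is an assumption about the setup, not a detail from the record:

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Recover interferometric phase from four frames shifted by pi/2:
    I_k = a + b*cos(phi + k*pi/2)  =>  phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

def displacement_from_phase(phi, wavelength):
    """Out-of-plane displacement assuming a double-pass geometry, where one
    full fringe corresponds to half a wavelength."""
    return phi * wavelength / (4.0 * math.pi)

# Synthesize four frames for a known phase and recover it
a, b, phi_true = 2.0, 1.0, 0.7
frames = [a + b * math.cos(phi_true + k * math.pi / 2.0) for k in range(4)]
phi = four_step_phase(*frames)
```

Because the recovered phase is wrapped to (-pi, pi], accumulating many small-displacement measurements iteratively, as the record describes, is what extends the range to millimetres.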

  18. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)
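A word-level syntax check of the kind such a verifier might perform before execution can be sketched as follows; the regular expression and the rule it encodes are hypothetical, not the paper's character-recognition or fuzzy-nets system:

```python
import re

# One plausible word-level rule: every word in a block must be a letter
# followed by a well-formed signed number (hypothetical simplification)
WORD = re.compile(r"^[A-Z]-?\d+(\.\d+)?$")

def check_block(line):
    """Return the malformed words found in one G-code block."""
    return [w for w in line.split() if not WORD.match(w)]

good = check_block("N10 G01 X12.5 Y-3.0 F150")
bad = check_block("N20 G0X 12..5")
```

Flagging `G0X` and `12..5` before the machine ever moves is the point of pre-execution verification; the real system additionally checks word semantics and renders the tool path.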

  19. Modeling and verification of the diffraction-limited visible light telescope aboard the solar observing satellite HINODE

    Science.gov (United States)

    Katsukawa, Y.; Suematsu, Y.; Tsuneta, S.; Ichimoto, K.; Shimizu, T.

    2011-09-01

    HINODE, Japanese for "sunrise", is a spacecraft dedicated to observations of the Sun, launched in 2006 to study the Sun's magnetic fields and how their explosive energies propagate through the different atmospheric layers. The spacecraft carries the Solar Optical Telescope (SOT), which has a 50 cm diameter clear aperture and provides a continuous series of diffraction-limited visible light images from space. The telescope was developed through international collaboration between Japan and the US. In order to achieve the diffraction-limited performance, thermal and structural modeling of the telescope was used extensively in the development phase to predict how the optical performance changes with the thermal conditions in orbit. Beyond the modeling, we devoted considerable effort to verifying the optical performance in ground tests before launch. This verification revealed many issues, such as temperature-dependent focus shifts, that were not identified through the thermal-structural modeling alone. Another critical issue was micro-vibration induced by the internal disturbances of the mechanical gyroscopes and momentum wheels used for attitude control of the spacecraft. Because the structural modeling was not accurate enough to predict how much the image quality would be degraded by the micro-vibrations, we measured their transmission in a spacecraft-level test.

  20. Statistical Modeling, Simulation, and Experimental Verification of Wideband Indoor Mobile Radio Channels

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ma

    2018-01-01

This paper focuses on the modeling, simulation, and experimental verification of wideband single-input single-output (SISO) mobile fading channels for indoor propagation environments. The indoor reference channel model is derived from a geometrical rectangle scattering model, which consists of an infinite number of scatterers. It is assumed that the scatterers are exponentially distributed over the two-dimensional (2D) horizontal plane of a rectangular room. Analytical expressions are derived for the probability density function (PDF) of the angle of arrival (AOA), the PDF of the propagation path length, the power delay profile (PDP), and the frequency correlation function (FCF). An efficient sum-of-cisoids (SOC) channel simulator is derived from the nonrealizable reference model by employing the SOC principle. It is shown that the SOC channel simulator closely approximates the reference model with respect to the FCF. The SOC channel simulator enables the performance evaluation of wideband indoor wireless communication systems with reduced realization expenditure. Moreover, the soundness and usefulness of the derived indoor channel model are confirmed by various measurements at 2.4, 5, and 60 GHz.
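
The sum-of-cisoids principle can be sketched in a few lines. The parameter choices below (equal-power cisoids with randomly drawn Doppler frequencies, phases, and delays, and the names `soc_channel`, `f_max`, `tau_max`) are illustrative assumptions, not the paper's parameter-computation method:

```python
import cmath
import math
import random

def soc_channel(N, f_max, tau_max, seed=0):
    """Draw parameters for N equal-power cisoids: Doppler frequency,
    phase and propagation delay per cisoid (illustrative choices)."""
    rng = random.Random(seed)
    gains = [1.0 / math.sqrt(N)] * N
    dopplers = [f_max * math.cos(2 * math.pi * rng.random()) for _ in range(N)]
    phases = [2 * math.pi * rng.random() for _ in range(N)]
    delays = [tau_max * rng.random() for _ in range(N)]
    return gains, dopplers, phases, delays

def transfer_function(params, f_prime, t):
    """Time-variant transfer function H(f', t) of the SOC simulator."""
    gains, dopplers, phases, delays = params
    return sum(c * cmath.exp(1j * (2 * math.pi * (fn * t - f_prime * tau) + th))
               for c, fn, th, tau in zip(gains, dopplers, phases, delays))

def fcf(params, delta_f):
    """Frequency correlation function of the simulation model: only the
    cisoid powers and delays enter, so it can be checked against a
    reference FCF directly."""
    gains, _, _, delays = params
    return sum(c ** 2 * cmath.exp(-2j * math.pi * delta_f * tau)
               for c, tau in zip(gains, delays))
```

At zero frequency lag the FCF equals the total power (here normalized to 1), which is one quick sanity check when comparing a simulator against a reference model.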

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE — BAYSAVER TECHNOLOGIES, INC. BAYSAVER SEPARATION SYSTEM, MODEL 10K

    Science.gov (United States)

Verification testing of the BaySaver Separation System, Model 10K was conducted on a 10 acre drainage basin near downtown Griffin, Georgia. The system consists of two watertight pre-cast concrete manholes and a high-density polyethylene BaySaver Separator Unit. The BaySaver Mod...

  2. Implementation and verification of a coupled fire model as a thermal boundary condition within P3/THERMAL

    International Nuclear Information System (INIS)

    Hensinger, D.M.; Gritzo, L.A.; Koski, J.A.

    1996-01-01

A user-defined boundary condition subroutine has been implemented within P3/THERMAL to represent the heat flux between a noncombusting object and an engulfing fire. The heat flux calculation includes a simple 2D fire model in which energy and radiative heat transport equations are solved to produce estimates of the heat fluxes at the fire-object interface. These estimates reflect the radiative coupling between a cold object and the flow of hot combustion gases that has been observed in fire experiments. The model uses a database of experimental pool fire measurements for far-field boundary conditions and volumetric heat release rates. Taking into account the coupling between a structure and the fire is an improvement over the σT⁴ approximation frequently used as a boundary condition for engineered system response, and is a preliminary step in the development of a fire model with predictive capability. This paper describes the implementation of the fire model as a P3/THERMAL boundary condition and presents the results of a verification calculation carried out using the model
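
The difference between the σT⁴ approximation and a coupled boundary condition can be illustrated with a gray-body sketch; the emissivity value and function names are assumptions, and this is not the paper's 2D fire model:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def uncoupled_flux(t_fire):
    """The sigma*T^4 approximation: incident flux depends only on the
    fire temperature, ignoring the object's own state."""
    return SIGMA * t_fire ** 4

def coupled_flux(t_fire, t_surface, emissivity=0.9):
    """Net radiative flux at the fire-object interface when the (cold)
    object is allowed to feed back on the exchange. Illustrative
    gray-body form; temperatures in kelvin."""
    return emissivity * SIGMA * (t_fire ** 4 - t_surface ** 4)
```

For a cold object in a hot fire the coupled estimate is lower than the uncoupled one, which is the qualitative effect the paper's boundary condition captures.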

  3. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
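
The core idea of an FFM, propagating failure effects to observation points, can be sketched as a reachability computation over a directed graph. The graph encoding, node names, and diagnosis step below are toy assumptions, not the AGSM/IHM tooling:

```python
from collections import deque

def propagate(graph, failure_mode):
    """Everything a failure mode can affect: the set of nodes reachable
    from it. `graph` maps each node to the nodes its effect reaches."""
    seen = {failure_mode}
    queue = deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_failures(graph, failure_modes, observed):
    """Diagnosis step: keep the failure modes whose effect set covers
    every tripped observation point."""
    return [f for f in failure_modes if observed <= propagate(graph, f)]
```

With a common graph encoding like this, component models built from a generic library naturally share appearance and diagnostic behavior, which is the reuse argument the paper makes.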

  4. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibr...... and to incorporate that in the design, operation and control of urban drainage structures. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  5. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, security, adaptability to small changes in the environment, portability, and compatibility. The methods differ both in how they operate and in how they reach their results. The article describes static and dynamic methods of software verification, with particular attention to symbolic execution. In its review of static analysis it discusses the deductive method and model-checking techniques, weighing the pros and cons of each method and classifying the associated testing techniques. The paper presents and analyzes the characteristics and mechanisms of static dependency analysis, including the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or while working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified with static methods. The article also addresses the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, together with some of the tools that can be applied when using them.
Based on this work a conclusion is drawn, describing the most relevant problems of the analysis techniques, methods for their solution and

  6. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type, and such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
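
The idea can be sketched as many independently seeded, randomized searches over the same state space; each member explores differently, so the swarm covers more than any single bounded search. This is a conceptual sketch (run sequentially for clarity), not the SPIN implementation:

```python
import random

def random_dfs(successors, start, is_target, seed, max_steps=10_000):
    """One swarm member: depth-first search that shuffles successor
    order with its own seed, giving each member a different trajectory
    through the state space."""
    rng = random.Random(seed)
    stack, seen = [start], {start}
    steps = 0
    while stack and steps < max_steps:
        state = stack.pop()
        steps += 1
        if is_target(state):
            return state
        succ = list(successors(state))
        rng.shuffle(succ)
        for s in succ:
            if s not in seen:
                seen.add(s)
                stack.append(s)
    return None

def swarm(successors, start, is_target, n_members=8):
    """Run the members one after another here; in practice each runs as
    an independent process with a small, fixed memory budget."""
    for seed in range(n_members):
        hit = random_dfs(successors, start, is_target, seed)
        if hit is not None:
            return hit
    return None
```

The key design point is that members share nothing: no partitioning of the haystack is needed, only diversity of search order.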

  7. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

The primary objective of this project was to apply tropospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources--for example, via the vertical convection of boundary-layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, freedom from chemical production or loss, and freedom from removal by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.
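
Why radon works as a transport tracer can be seen in a one-box sketch: a constant surface flux balanced only by radioactive decay, with no chemistry or scavenging. The column geometry and units below are illustrative assumptions, not the Core model:

```python
import math

RN222_HALF_LIFE_H = 3.824 * 24  # 222Rn half-life in hours

def radon_column(flux, mixing_height, hours, dt=0.1):
    """Radon concentration in a well-mixed boundary-layer column fed by
    a constant surface flux (per unit area per hour) and depleted only
    by decay. Any mismatch between a model and observations must then
    come from transport, which is the tracer's whole appeal."""
    lam = math.log(2) / RN222_HALF_LIFE_H  # decay constant, 1/h
    c, t = 0.0, 0.0
    while t < hours:
        c += dt * (flux / mixing_height - lam * c)
        t += dt
    return c
```

The concentration relaxes toward the steady state flux / (lambda * mixing_height) on the ~3.8-day decay timescale.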

  8. The middle range verification of numerical model performance for heavy rainfall in North China

    Science.gov (United States)

    Zhang, Bo; Zhao, Bin; Niu, Ruoyun

    2017-04-01

The forecasting of heavy rainfall in North China is a focus, and a difficulty, of medium-range numerical weather prediction. Seventy typical heavy-precipitation cases in North China in summer from 2010 to 2016 were selected and divided into a vortex type and a west-trough/shear-line type according to the atmospheric circulation. Based on the ECMWF model and the Chinese operational model T639, a spatial verification method, MODE, is used to evaluate the medium-range forecast ability for summer heavy rain in North China by comparing differences in centroid distance, axis angle, and aspect ratio. Both the ECMWF and T639 models show weak predictive ability for low-vortex-type heavy rainfall in North China. When the rainfall area is large, the precipitation patterns of both models are oriented mostly northeast-southwest, consistent with observations. For large precipitation areas, both models predict an aspect ratio of less than 1, indicating a long, narrow precipitation region, which is also consistent with observations. However, both T639 and ECMWF show systematic deviations in the precipitation area: the predicted area lies to the southwest of the observed one, and for smaller/larger precipitation areas the predicted area is larger/smaller than observed. In addition, a test of the regional heavy-precipitation process in North China (including Huanghuai and other regions) from July 18 to 20, 2016 showed that neither model predicted the process successfully. Further research is therefore needed on correcting the systematic biases of numerical models for regional heavy precipitation in medium-range forecasting.
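
Object attributes of the kind MODE compares can be computed from the grid cells of a rain object. The bounding-box aspect ratio below is a simplified stand-in for MODE's fitted-ellipse attributes, and the function names are assumptions:

```python
def object_attributes(cells):
    """Centroid and bounding-box aspect ratio of one rain object, given
    its grid cells as (row, col) pairs. Aspect ratio <= 1; small values
    mean a long, narrow object."""
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    centroid = (sum(rows) / len(cells), sum(cols) / len(cells))
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    aspect = min(width, height) / max(width, height)
    return centroid, aspect

def centroid_distance(a, b):
    """Grid-cell distance between the centroids of two rain objects,
    e.g. a forecast object and the matching observed object."""
    (ra, ca), _ = object_attributes(a)
    (rb, cb), _ = object_attributes(b)
    return ((ra - rb) ** 2 + (ca - cb) ** 2) ** 0.5
```

A systematic southwest displacement of the forecast, as reported above, would show up as a consistently signed centroid offset across cases.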

  9. Development of verification program for safety evaluation of KNGR on-site and off-site power system design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kem Joong; Ryu, Eun Sook; Choi, Jang Hong; Lee, Byung Il; Han, Hyun Kyu; Oh, Seong Kyun; Kim, Han Kee; Park, Chul Woo; Kim, Min Jeong [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-04-15

In order to verify the adequacy of the design and analysis of the on-site and off-site power systems, we developed a regulatory analysis program. We established a methodology for electric power systems and implemented algorithms for steady-state load flow analysis, fault analysis, and transient stability analysis. The program takes advantage of a GUI and C++ programming techniques. The input was designed for easy access to the commonly used PSS/E format, and the output lets users work with Excel spreadsheets. The performance of the program was verified by comparison with PSS/E results. The case studies are as follows: verification of the load flow analysis of the KNGR on-site power system; evaluation of the load flow and transient stability analysis of the KNGR off-site power system; verification of the load flow and transient stability analysis; and frequency drop analysis for loss of generation.
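
The last case study, frequency drop after loss of generation, can be sketched with the aggregated swing equation and no governor response. This is a textbook first-swing sketch under those assumptions, not the program's algorithm:

```python
def frequency_drop(delta_p_pu, inertia_h, f0=60.0, t_end=2.0, dt=0.01):
    """Frequency trajectory after a loss of generation, from
    df/dt = -dP * f0 / (2H) with no governor action.
    delta_p_pu: lost generation as a fraction of system capacity;
    inertia_h: aggregated inertia constant H in seconds."""
    f, t = f0, 0.0
    trace = [f]
    while t < t_end:
        f += dt * (-delta_p_pu * f0 / (2.0 * inertia_h))
        t += dt
        trace.append(f)
    return trace
```

With a 10% generation loss and H = 5 s the frequency ramps down by about 1.2 Hz over the first two seconds; governor and load-frequency response, omitted here, would arrest the decline.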

  10. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

Currently the ATHLET code is widely applied to the modelling of several WWER-type power plants with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents nodalization schemes for the secondary side of the steam generator, the results of steady-state calculations, and preliminary comparisons with experimental data. Accounting for circulation in the water inventory of the secondary side is shown to be necessary. (orig.). 3 refs.

  11. Thermal Pollution Mathematical Model. Volume 2; Verification of One-Dimensional Numerical Model at Lake Keowee

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1980-01-01

    A one dimensional model for studying the thermal dynamics of cooling lakes was developed and verified. The model is essentially a set of partial differential equations which are solved by finite difference methods. The model includes the effects of variation of area with depth, surface heating due to solar radiation absorbed at the upper layer, and internal heating due to the transmission of solar radiation to the sub-surface layers. The exchange of mechanical energy between the lake and the atmosphere is included through the coupling of thermal diffusivity and wind speed. The effects of discharge and intake by power plants are also included. The numerical model was calibrated by applying it to Cayuga Lake. The model was then verified through a long term simulation using Lake Keowee data base. The comparison between measured and predicted vertical temperature profiles for the nine years is good. The physical limnology of Lake Keowee is presented through a set of graphical representations of the measured data base.
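
The finite-difference core of such a 1-D lake model can be sketched for one time step. Constant diffusivity and constant area are simplifying assumptions here (the paper's model also varies area with depth and couples diffusivity to wind speed):

```python
def step_profile(temps, dz, dt, diffusivity, surface_heat):
    """One explicit finite-difference step of the 1-D heat equation for
    a lake column. temps[0] is the surface layer; the bottom boundary is
    insulated; surface heating is deposited in the top cell."""
    n = len(temps)
    new = temps[:]
    for i in range(n):
        up = temps[i - 1] if i > 0 else temps[0]        # zero-flux top
        dn = temps[i + 1] if i < n - 1 else temps[-1]   # zero-flux bottom
        new[i] = temps[i] + dt * diffusivity * (up - 2 * temps[i] + dn) / dz ** 2
    new[0] += dt * surface_heat  # solar input absorbed at the surface layer
    return new
```

The explicit scheme is stable only for dt <= dz**2 / (2 * diffusivity); sub-surface absorption of transmitted solar radiation would add a source term to the deeper cells as well.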

  12. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

As a verification test of the SURF and SURFplus models in the xRage code we use a propagating underdriven detonation wave in 1-D. This is about the only test case for which an accurate solution can be determined from the theoretical structure of the solution. The solution consists of a steady ZND reaction-zone profile joined with a scale-invariant rarefaction (Taylor wave) and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock-detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
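
The overall-accuracy measure used in such verification tests can be written down directly; the grid-weighted form and the helper for the observed order of accuracy below are standard, though the exact weighting in the report may differ:

```python
import math

def l2_error(numerical, exact, dx):
    """Grid-weighted L2 norm of the difference between a numerical and
    an exact solution on a uniform grid:
    ||e||_2 = sqrt(dx * sum_i (p_i - p_exact_i)^2)."""
    return math.sqrt(dx * sum((n, e) == (n, e) and (n - e) ** 2
                              for n, e in zip(numerical, exact)))

def convergence_order(err_coarse, err_fine, refinement=2.0):
    """Observed order of accuracy from the errors on two grids related
    by the given refinement factor."""
    return math.log(err_coarse / err_fine) / math.log(refinement)
```

Halving the grid spacing and seeing the L2 error drop by a factor of four, for example, indicates second-order convergence away from the discontinuities.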

  13. Development and verification of a modular program system for calculation of the long-time behavior of pressurized water reactors

    International Nuclear Information System (INIS)

    Woerner, A.

    1984-01-01

For the fuel management of nuclear power plants, the time-dependent fuel composition and power distribution within the reactor core must be determined with high accuracy. The calculation of this long-time behaviour of pressurized water reactors, taking into account the thermohydraulic feedback on the above-mentioned distributions, has been integrated into a modular program system. The essential part of this program system is a newly developed nodal diffusion theory program which allows the local variation of cross sections within a node to be described approximately. This computational method, making use of the latest cross-section data, permits reactor cycle calculations with high accuracy, as demonstrated by the verification of two reactor cycles of the Kernkraftwerk Obrigheim. In addition to this physical verification, a benchmark problem comparison was carried out to prove the efficiency of the modular program system. (orig./HP) [de

  14. Mathematical modeling of a fluidized bed rice husk gasifier: Part 3 -- Model verification

    Energy Technology Data Exchange (ETDEWEB)

    Mansaray, K.G.; Ghaly, A.E.; Al-Taweel, A.M.; Ugursal, V.I.; Hamdullahpur, F.

    2000-04-01

    The validity of the two-compartment model developed for fluidized bed gasification of biomass was tested using experimental data obtained from a dual-distributor-type fluidized bed gasifier. The fluidized bed was operated on rice husks at various bed heights (19.5, 25.5, and 31.5 cm), fluidization velocities (0.22, 0.28, and 0.33 m/s), and equivalence ratios (0.25, 0.30, and 0.35). The model gave reasonable predictions of the core, annulus, and exit temperatures as well as the mole fractions of the combustible gas components and product gas higher heating value, except for the overall carbon conversion, which was overestimated. This could be attributed to uncertainties in the sampling procedure.

  16. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  17. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered one of the most promising radiation therapy methods because of the physical characteristics of its dose distribution: most of the dose is delivered just before the protons come to rest, at the so-called Bragg peak. Proton therapy therefore allows a very high radiation dose to the tumor volume while effectively sparing adjacent critical organs. However, uncertainty in the location of the Bragg peak, arising not only from the beam delivery system and the treatment planning method but also from anatomical changes and organ motion of the patient, can be a critical problem in proton therapy. Despite the importance of in vivo dose verification for preventing misapplication of the Bragg peak and guaranteeing both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation proposes the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of establishing the utility of this method, the clear relationship between the prompt gamma distribution and the proton dose distribution was verified by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly in proton-induced nuclear interactions and are emitted isotropically, in less than 10^-9 s, at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of the human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the

  18. Developing Radioactive Carbon Isotope Tagging for Monitoring, Verification and Accounting in Geological Carbon Storage

    Science.gov (United States)

    Ji, Yinghuang

In the wake of concerns about the long-term integrity and containment of sub-surface CO2 sequestration reservoirs, many efforts have been made to improve the monitoring, verification, and accounting methods for geo-sequestered CO2. This Ph.D. project has been part of a larger U.S. Department of Energy (DOE) sponsored research project to demonstrate the feasibility of a system designed to tag CO2 with radiocarbon at a concentration of one part per trillion, which is the ambient concentration of 14C in the modern atmosphere. Because carbon found at depth is naturally free of 14C, this tag easily differentiates pre-existing carbon in the underground from anthropogenic, injected carbon and provides an excellent handle for monitoring its whereabouts in the subsurface. It also provides an excellent handle for carbon accounting: future inventories would in effect count 14C atoms. Accordingly, we developed a 14C tagging system suitable for use at the part-per-trillion level. This tagging system uses small containers of tracer fluid of 14C-enriched CO2. The content of these containers is transferred into a CO2 stream readied for underground injection in a controlled manner so as to tag it at the part-per-trillion level. Because of their shape, these containers are referred to in this document as tracer loops. The demonstration of the tracer injection involved three steps. First, a tracer-loop filling station was designed and constructed featuring a novel membrane-based gas exchanger, which degassed the fluid and then equilibrated it with CO2 at fixed pressure and fixed temperature. It was demonstrated that this approach could achieve uniform solutions and prevent the formation of bubbles and degassing downstream; the difference between the measured and expected CO2 content in the tracer loop was below 1%. Second, a high-pressure flow loop was built for injecting, mixing, and sampling of the fast flowing stream of
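
The tagging arithmetic is linear when the tracer mass is tiny compared with the stream and the stream is 14C-free, so the required tracer mass follows from the ratio of the two isotope ratios. The enrichment value and function name below are purely illustrative assumptions:

```python
def tracer_mass_needed(co2_stream_kg, target_ratio=1e-12, enrichment=1e-6):
    """Mass of enriched-tracer CO2 (kg) to meter into a 14C-free CO2
    stream so the mixture reaches `target_ratio` (one part per trillion,
    the modern-atmosphere 14C/C level), given the 14C/C ratio of the
    tracer fluid. Valid while the tracer mass is negligible relative to
    the stream, so m_tracer * enrichment ≈ m_stream * target_ratio."""
    return co2_stream_kg * target_ratio / enrichment
```

At these assumed ratios, tagging 1000 t of CO2 takes about 1 kg of tracer fluid, which is why small tracer loops suffice.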

  19. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced into the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scales using a spatially based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
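
One of the traditional scores named above, the Equitable Threat Score, is computed from a 2x2 contingency table; the formula below is the standard Gilbert skill score, with hits expected by chance removed from the raw threat score:

```python
def equitable_threat_score(hits, false_alarms, misses, correct_negatives):
    """Equitable Threat Score (Gilbert skill score) from a contingency
    table: 1 is a perfect forecast, 0 is no skill beyond chance."""
    total = hits + false_alarms + misses + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + false_alarms + misses - hits_random
    return (hits - hits_random) / denom if denom else 0.0
```

A perfect forecast scores 1.0, while a forecast whose hits equal the chance expectation scores 0.0, which is what makes the score "equitable" across event frequencies.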

  20. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

In this paper the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed on the basis of simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context we study the potential of the agent-based modelling technique for the verification of people trajectories

  1. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    Science.gov (United States)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure of surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Because proton dose verification systems are not commercially available, the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system are the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols, and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  2. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized...... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and of time and memory consumption. Three of these cooperate by exchanging the costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm

  3. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    Science.gov (United States)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

A semiconductor laser with a two-external-cavity feedback structure is investigated and analyzed for the laser self-mixing interference (SMI) phenomenon. An SMI model with two directions, based on the F-P cavity, is derived, and numerical simulation and experimental verification were conducted. Experimental results show that SMI with the two-external-cavity feedback structure under weak optical feedback resembles the sum of two SMIs.

  4. Modeling radon entry into houses with basements: Model description and verification

    International Nuclear Information System (INIS)

    Revzan, K.L.; Fisk, W.J.; Gadgil, A.J.

    1991-01-01

We model radon entry into basements using a previously developed three-dimensional steady-state finite-difference model that has been modified in the following ways: first, cylindrical coordinates are used to take advantage of the symmetry of the problem in the horizontal plane; second, the configuration of the basement has been made more realistic by incorporating the concrete footer; third, a quadratic relationship between pressure and flow in the L-shaped gap between slab, footer, and wall has been employed; fourth, the natural convection of the soil gas that follows from the heating of the basement in winter has been taken into account. The temperature field in the soil is determined from the equation of energy conservation, using the basement, surface, and deep-soil temperatures as boundary conditions. The pressure field is determined from Darcy's law and the equation of mass conservation (continuity), assuming that there is no flow across any boundary except the soil surface (atmospheric pressure) and the opening in the basement shell (fixed pressure). After the pressure and temperature fields have been obtained, the velocity field is found from Darcy's law. Finally, the radon concentration field is found from the equation of mass transport. The convective radon entry rate through the opening or openings is then calculated. In this paper we describe the modified model, compare the predicted radon entry rates with and without thermal convection, and compare the predicted rates with those determined from data from 7 houses in the Spokane River valley of Washington and Idaho. Although the predicted rate is much lower than the mean of the measured rates, errors in the measurement of soil permeability and variations in the permeability of the region immediately under the basement slab, which has a significant influence on the pressure field, can account for the range of entry rates inferred from the data. 25 refs., 8 figs
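
The quadratic pressure-flow relation for the entry gap can be inverted in closed form; the coefficient names `a` and `b` below stand for gap-geometry properties and are assumptions for illustration:

```python
import math

def gap_flow(dp, a, b):
    """Soil-gas flow Q through the L-shaped slab/footer/wall gap for a
    driving pressure dp, from the quadratic relation dp = a*Q + b*Q**2.
    The linear term models viscous losses, the quadratic term inertial
    losses; the positive root is the physical one."""
    if b == 0:
        return dp / a  # purely linear (Darcy-like) limit
    return (-a + math.sqrt(a * a + 4.0 * b * dp)) / (2.0 * b)
```

At small depressurizations the linear term dominates and the flow is nearly proportional to dp, while at larger dp the quadratic term makes the gap admit less flow than a linear extrapolation would predict.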

  5. Development and verification of a high-performance CFRP structure for the space-borne lidar instrument ALADIN

    Science.gov (United States)

    Kaiser, Clemens; Widani, Christoph; Härtel, Klaus; Haberler, Peter; Lecrenier, Olivier; Buvat, Daniel; Labruyere, Gilles

    2017-11-01

    The paper gives an overview of the development of a high-performance space structure achieving an optimum combination of mass, stiffness and stability to cope with the very stringent performance requirements of the ALADIN instrument, the space-borne LIDAR built for ESA's ADM-AEOLUS Earth Explorer Mission. Kayser-Threde was contracted in 2003 by EADS Astrium Toulouse, the ALADIN instrument Prime Contractor, for the Phase C/D of the ALADIN Structure PFM. The contract with ASF comprised the detailed design, development, verification, PMP qualification, and MAIV program, including ALADIN Structure qualification using a complete and fully representative STM.

  6. Interpreter composition issues in the formal verification of a processor-memory module

    Science.gov (United States)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  7. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed-loop systems. For these systems the dynamic stability and function over long durations are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial-and-error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand, water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is a reactant and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those
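
    The three-compartment structure described above lends itself to a simple balance sketch: three well-mixed pools exchanging water, integrated with forward Euler. All volumes and rate constants below are illustrative placeholders, not V-HAB parameters.

    ```python
    # Toy three-compartment water balance (plasma, interstitial,
    # intracellular), forward-Euler integration. Nominal volumes (L) and
    # exchange rate constants [1/min] are made-up illustrative values.

    def step(state, intake=0.0, urine=0.0, dt=1.0):
        plasma, interstitial, intracellular = state
        k_pi, k_ic = 0.05, 0.02
        # passive exchange driven by deviation from nominal volumes
        f_pi = k_pi * ((plasma - 3.0) - (interstitial - 11.0))
        f_ic = k_ic * ((interstitial - 11.0) - (intracellular - 28.0))
        plasma += dt * (intake - urine - f_pi)
        interstitial += dt * (f_pi - f_ic)
        intracellular += dt * f_ic
        return (plasma, interstitial, intracellular)

    state = (3.0, 11.0, 28.0)           # ~42 L total, ~60% of 70 kg body
    for _ in range(60):                  # one hour, 1-minute steps
        state = step(state, intake=0.005)   # 5 mL/min drinking, no urine
    total = sum(state)                   # water is conserved: 42 + 0.3 L
    ```

    Mass conservation is built in by construction: every exchange flux leaves one compartment and enters another, so total body water changes only through intake and urine.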

  8. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed

  9. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    developed, all models constructed in each phase are verifiable. This requires that the modelling notations are formally defined and related in order to have tool support developed for the integration of sophisticated checkers, generators and transformations. This paper summarises our research on the method...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  10. Optimal Calculation of Residuals for ARMAX Models with Applications to Model Verification

    DEFF Research Database (Denmark)

    Knudsen, Torben

    1997-01-01

    Residual tests for sufficient model orders are based on the assumption that prediction errors are white when the model is correct. If an ARMAX system has zeros in the MA part which are close to the unit circle, then the standard predictor can have large transients. Even when the correct model...
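
    As a concrete illustration of the whiteness assumption mentioned above, a minimal residual test computes the sample autocorrelation of the prediction errors and checks that it stays within the approximate 95% bounds of ±2/√N. This is a generic stdlib sketch, not the paper's optimal residual calculation.

    ```python
    # Whiteness check for model residuals via the sample autocorrelation.
    # For a correct model the lag-k autocorrelations of the prediction
    # errors should stay within roughly +/- 2/sqrt(N).
    import math
    import random

    def autocorr(residuals, max_lag=10):
        n = len(residuals)
        mean = sum(residuals) / n
        c0 = sum((r - mean) ** 2 for r in residuals) / n
        return [sum((residuals[t] - mean) * (residuals[t + k] - mean)
                    for t in range(n - k)) / (n * c0)
                for k in range(1, max_lag + 1)]

    def is_white(residuals, max_lag=10):
        bound = 2.0 / math.sqrt(len(residuals))
        return all(abs(rho) <= bound for rho in autocorr(residuals, max_lag))

    # usage: genuine white noise should typically stay inside the bound
    random.seed(0)
    noise = [random.gauss(0.0, 1.0) for _ in range(500)]
    result = is_white(noise)
    ```

    A strongly negatively correlated residual sequence (e.g. an alternating sign pattern, as an underestimated MA part can produce) fails this check immediately at lag 1.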

  11. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  12. A verification of the bakery protocol combining algebraic and model-oriented techniques

    NARCIS (Netherlands)

    C. Brovedani; A.S. Klusener (Steven)

    1996-01-01

    In this paper we give a specification of the so-called Bakery protocol in an extension of the process algebra ACP with abstract datatypes. We prove that this protocol is equal to a Queue, modulo branching bisimulation equivalence. The verification is as follows. First we give a linear

  13. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  14. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection and monitoring systems throughout the world over the last decades have been structured by drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace inspectors in the field; their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation from taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and provide them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  15. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose. However, this technique is not yet widely used in industry, due to some obstacles. The main obstacles encountered when trying to apply formal verification techniques to industrial installations are the difficulty of creating models out of PLC programs and of formally defining the specification requirements. In addition, models produced from real-life programs have a huge state space, preventing verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool that automatically generates formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...

  16. Substantiation and verification of the heat exchange crisis model in a rod bundles by means of the KORSAR thermohydraulic code

    International Nuclear Information System (INIS)

    Bobkov, V.P.; Vinogradov, V.N.; Efanov, A.D.; Sergeev, V.V.; Smogalev, I.P.

    2003-01-01

    The results of verifying the model for calculating the heat exchange crisis in uniformly heated rod bundles, implemented in the KORSAR best-estimate code, are presented. The model for calculating the critical heat fluxes in this code is based on the tabular method. The experimental data bank of the Branch base center of thermophysical data GNTs RF - FEhI for rod bundles structurally similar to WWER fuel assemblies was used for the verification within a wide range of parameters: pressure from 0.11 up to 20 MPa and mass velocity from 5 up to 5000 kg/(m²·s)

  17. Gas Chromatographic Verification of a Mathematical Model: Product Distribution Following Methanolysis Reactions.

    Science.gov (United States)

    Lam, R. B.; And Others

    1983-01-01

    Investigated application of binomial statistics to equilibrium distribution of ester systems by employing gas chromatography to verify the mathematical model used. Discusses model development and experimental techniques, indicating the model enables a straightforward extension to symmetrical polyfunctional esters and presents a mathematical basis…

  18. Software verification, model validation, and hydrogeologic modelling aspects in nuclear waste disposal system simulations. A paradigm shift

    International Nuclear Information System (INIS)

    Sheng, G.M.

    1994-01-01

    This work reviewed the current concept of nuclear waste disposal in stable, terrestrial geologic media with a system of natural and man-made multi-barriers. Various aspects of this concept and supporting research were examined with the emphasis on the Canadian Nuclear Fuel Waste Management Program. Several of the crucial issues and challenges facing the current concept were discussed. These include: The difficulties inherent in a concept that centres around lithologic studies; the unsatisfactory state of software quality assurance in the present computer simulation programs; and the lack of a standardized, comprehensive, and systematic procedure to carry out a rigorous process of model validation and assessment of simulation studies. An outline of such an approach was presented and some of the principles, tools and techniques for software verification were introduced and described. A case study involving an evaluation of the Canadian performance assessment computer program is presented. A new paradigm to nuclear waste disposal was advocated to address the challenges facing the existing concept. The RRC (Regional Recharge Concept) was introduced and its many advantages were described and shown through a modelling exercise. (orig./HP)

  19. 78 FR 58492 - Generator Verification Reliability Standards

    Science.gov (United States)

    2013-09-24

    ... model verifications needed to support reliability and enhance the coordination of generator protection... equipment needed to support Bulk-Power System reliability and enhance coordination of important protection... Generating Unit or Plant Capabilities, Voltage Regulating Controls, and Protection) Develop coordination and...

  20. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. An important impact of this factor on pre-volitional stages of aggressive behavior was also identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not of negative emotions. These results correspond with the theory of self-regulation, in which behavior focused on goal attainment is accompanied by positive emotions (see for example Cacioppo, Gardner & Berntson, 1999; Carver, 2004). The results confirmed not only a sufficient model fit, but also explained 53% of the variance of behavioral desire, 68% of intention and 37% of behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample; some results could also be affected by the smaller sample size. For future work we recommend using other types of aggressive behavior in verifying the EMGB and incorporating inhibition into the model in a more complex way. Finally, this study is correlational in character, so further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  1. 3D MODELING WITH PHOTOGRAMMETRY BY UAVS AND MODEL QUALITY VERIFICATION

    Directory of Open Access Journals (Sweden)

    V. Barrile

    2017-11-01

    This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria) concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years this area has been affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images was acquired with a UAV equipped with a digital camera and GPS. The processing for the production of a 3D georeferenced model was then performed using the commercial software Agisoft PhotoScan. Similarly, a terrestrial laser scanning technique was used to produce dense point clouds and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between the image- and range-based methods was performed.
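
    A common way to perform such an accuracy assessment is a cloud-to-cloud comparison: for each point of the UAV-derived cloud, the distance to its nearest neighbour in the laser-scan cloud. A brute-force stdlib sketch with made-up coordinates (real pipelines use k-d trees and the survey tools' own comparison functions):

    ```python
    # Cloud-to-cloud distance between two 3-D point sets: for every point
    # of cloud A, find the distance to its nearest neighbour in cloud B,
    # then report the mean and maximum deviation. O(len(A)*len(B)), fine
    # for small illustrative clouds only.
    import math

    def cloud_to_cloud(cloud_a, cloud_b):
        def nearest(p, cloud):
            return min(math.dist(p, q) for q in cloud)
        d = [nearest(p, cloud_b) for p in cloud_a]
        return sum(d) / len(d), max(d)

    # hypothetical coordinates (metres), not survey data
    uav = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.1)]
    tls = [(0.0, 0.0, 0.05), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    mean_d, max_d = cloud_to_cloud(uav, tls)
    ```

    The mean and maximum deviations summarize how closely the image-based model matches the range-based reference.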

  2. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need high accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is a real challenge. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for the modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of the methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  3. Verification and validation of models: far-field modelling of radionuclide migration

    International Nuclear Information System (INIS)

    Porter, J.D.; Herbert, A.W.; Clarke, D.S.; Roe, P.; Vassilic Melling, D.; Einfeldt, B.; Mackay, R.; Glendinning, R.

    1992-01-01

    The aim of this project was to improve the capability, efficiency and realism of the NAMMU and NAPSAC codes, which simulate groundwater flow and solute transport. Using NAMMU, various solution methods for non-linear problems were investigated. The Broyden method gave a useful reduction in computing time and appeared robust. The relative saving obtained with this method increased with the problem size. This was also the case when parameter stepping was used. The existing empirical sorption models in NAMMU were generalized and a ternary heterogeneous ion exchange model was added. These modifications were tested and gave excellent results. The desirability of coupling NAMMU to an existing geochemical speciation code was assessed

  4. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    sources. Combining these three source types it is possible to model huge machinery in an easy and visually clear way. Traditionally room acoustic simulations have been aimed at auditorium acoustics. The aim of the simulations has been to model the room acoustic measuring setup consisting...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with the ones measured in real rooms. However when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point like, as they can be distributed over a large space...

  5. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to the Spin format, and counterexamples expressed in terms of automata. Interactive verification makes it possible to decrease verification time and to increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is treated as a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run another state machine in a new thread or have a nested state machine. The method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.
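
    The kind of safety check Spin performs on such models can be illustrated with a toy explicit-state search: two hypothetical processes interacting through a shared lock variable, with mutual exclusion checked in every reachable state of the product automaton. The model below is purely illustrative and unrelated to the Stater tool's actual encoding.

    ```python
    # Explicit-state reachability check for a two-process model with a
    # shared variable: breadth-first search over the product state space,
    # verifying the safety property "never both processes in 'crit'".
    from collections import deque

    # Each process: 'idle' -> 'wait' -> 'crit' -> 'idle'; entry to 'crit'
    # is guarded by a shared lock (-1 = free, otherwise owner's index).
    def successors(state):
        p0, p1, lock = state
        procs = [p0, p1]
        for i in range(2):
            loc = procs[i]
            if loc == 'idle':
                yield tuple(procs[:i] + ['wait'] + procs[i+1:]) + (lock,)
            elif loc == 'wait' and lock == -1:
                yield tuple(procs[:i] + ['crit'] + procs[i+1:]) + (i,)
            elif loc == 'crit' and lock == i:
                yield tuple(procs[:i] + ['idle'] + procs[i+1:]) + (-1,)

    def check_mutex(initial):
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if s[0] == 'crit' and s[1] == 'crit':
                return False, s          # counterexample state
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return True, None                # property holds everywhere

    ok, bad = check_mutex(('idle', 'idle', -1))
    ```

    Tools like Spin add the pieces this sketch omits: state compression, partial-order reduction, and LTL properties checked against the full interleaving semantics.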

  6. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, D [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Testa, M; Park, Y [Massachusetts General Hospital, Boston, MA (United States); Schneider, R; Moteabbed, M [General Hospital, Boston, MA (United States); Janssens, G; Prieels, D [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Orban de Xivry, J [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Lu, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Bentefour, E

    2014-06-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for an anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
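
    Using the distal fall-off as ground truth amounts to locating a reference level on a measured depth-dose curve. A minimal sketch, assuming a sampled curve and an R80 convention (depth at 80% of maximum on the distal side); the depth-dose values and the choice of level are illustrative, not data from the study.

    ```python
    # Locate the R80 range on the distal fall-off of a sampled depth-dose
    # curve: walk distally from the dose maximum until the signal drops
    # below 80% of maximum, then linearly interpolate the crossing depth.

    def r80(depths, doses):
        dmax = max(doses)
        level = 0.8 * dmax
        i = doses.index(dmax)
        while doses[i] >= level:      # walk distally past the 80% level
            i += 1
        d0, d1 = depths[i - 1], depths[i]
        y0, y1 = doses[i - 1], doses[i]
        return d0 + (y0 - level) * (d1 - d0) / (y0 - y1)

    depths = [0, 5, 10, 15, 20, 21, 22, 23]       # mm (made-up samples)
    doses  = [30, 32, 35, 40, 100, 90, 60, 10]    # arbitrary units
    range_mm = r80(depths, doses)
    ```

    Real range scans use much finer sampling and noise handling, but the principle, referencing a fixed fraction of the distal fall-off, is the same.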

  8. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    Science.gov (United States)

    Tippett, Christine Diane

    Scientific knowledge is constructed and communicated through a range of forms in addition to verbal language. Maps, graphs, charts, diagrams, formulae, models, and drawings are just some of the ways in which science concepts can be represented. Representational competence---an aspect of visual literacy that focuses on the ability to interpret, transform, and produce visual representations---is a key component of science literacy and an essential part of science reading and writing. To date, however, most research has examined learning from representations rather than learning with representations. This dissertation consisted of three distinct projects that were related by a common focus on learning from visual representations as an important aspect of scientific literacy. The first project was the development of an exploratory framework that is proposed for use in investigations of students constructing and interpreting multimedia texts. The exploratory framework, which integrates cognition, metacognition, semiotics, and systemic functional linguistics, could eventually result in a model that might be used to guide classroom practice, leading to improved visual literacy, better comprehension of science concepts, and enhanced science literacy because it emphasizes distinct aspects of learning with representations that can be addressed through explicit instruction. The second project was a metasynthesis of the research that was previously conducted as part of the Explicit Literacy Instruction Embedded in Middle School Science project (Pacific CRYSTAL, http://www.educ.uvic.ca/pacificcrystal). Five overarching themes emerged from this case-to-case synthesis: the engaging and effective nature of multimedia genres, opportunities for differentiated instruction using multimodal strategies, opportunities for assessment, an emphasis on visual representations, and the robustness of some multimodal literacy strategies across content areas. The third project was a mixed

  9. Modeling of containment response for Krsko NPP Full Scope Simulator verification

    International Nuclear Information System (INIS)

    Kljenak, I.; Skerlavaj, A.

    2000-01-01

    Containment responses during the first 10000 s of Anticipated Transient Without Scram and Small Break Loss-of-Coolant Accident scenarios in the Krsko two-loop Westinghouse pressurized water reactor nuclear power plant were simulated with the CONTAIN computer code. Sources of coolant were obtained from simulations with the RELAP5 code. The simulations were carried out so that the results could be used for the verification of the Krsko Full Scope Simulator. (author)

  10. Storage and growth of denitrifiers in aerobic granules: part II. model calibration and verification.

    Science.gov (United States)

    Ni, Bing-Jie; Yu, Han-Qing; Xie, Wen-Ming

    2008-02-01

    A mathematical model describing the simultaneous storage and growth activities of denitrifiers in aerobic granules under anoxic conditions was developed in an accompanying article. The sensitivity of the nitrate uptake rate (NUR) to the stoichiometric and kinetic coefficients is analyzed in this article. The model parameter values are estimated by minimizing the sum of squares of the deviations between the measured and model-predicted values. The model is successfully calibrated, and a set of stoichiometric and kinetic parameters for the anoxic storage and growth of the denitrifiers is obtained. Thereafter, the established model is verified with three sets of experimental data. A comparison of the established model with the ASM1 and ASM3 models shows that the present model is appropriate for simulating and predicting the performance of a granule-based denitrification system. (c) 2007 Wiley Periodicals, Inc.
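
    The calibration step described, choosing parameters that minimize the sum of squared deviations between measured and model-predicted values, can be sketched generically. The first-order uptake model N(t) = N0·e^(−kt) and all numbers below are hypothetical placeholders, not the paper's denitrification kinetics.

    ```python
    # Least-squares calibration by grid search: pick the rate constant k
    # that minimizes the sum of squared deviations between "measured"
    # and model-predicted nitrate concentrations.
    import math

    def sse(k, times, measured, n0):
        return sum((n0 * math.exp(-k * t) - m) ** 2
                   for t, m in zip(times, measured))

    def calibrate(times, measured, n0, k_grid):
        return min(k_grid, key=lambda k: sse(k, times, measured, n0))

    # synthetic "measurements" generated with k = 0.30 (illustrative)
    times = [0, 1, 2, 4, 6]
    measured = [50 * math.exp(-0.30 * t) for t in times]
    k_grid = [i / 100 for i in range(1, 101)]     # candidate k: 0.01..1.00
    k_hat = calibrate(times, measured, 50.0, k_grid)
    ```

    With noisy real data the same objective is typically minimized with a gradient-based or simplex optimizer rather than a grid, but the criterion is identical.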

  11. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  12. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is therefore a rather difficult process. Attempts to identify assumptions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, the theory of sports training, and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain the mutual relationships among the observed variables. For statistical data processing we used structural equation modeling (SEM) with the Lisrel L88 software. RESULTS: The study confirms the best structural model for talent selection in the triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in the triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0). Running predispositions were measured as a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63 and cycling (0.53; 0

  13. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  14. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    Science.gov (United States)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  15. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    Science.gov (United States)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  16. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real-world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification, validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that are continuing to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling, so questions about continuum mechanical theory and the mathematical analysis of multiphase flow still require attention. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we will present a publicly available verification and validation test set for air-water-structure interaction problems as well as computational and physical model results, including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats will also be presented in order to facilitate future comparative studies and reproduction of scientific results.
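The two interface representations named above store the air-water interface differently. A one-dimensional, purely illustrative sketch (grid size, interface position, and phase labels are invented for this example) contrasts a VOF volume-fraction field with a level-set signed-distance field for a static interface:

```python
# 1D sketch: interface at x = 0.32 on a 10-cell grid of spacing 0.1.
# "Water" occupies x < x_int. Real VOF/LS methods advect and reconstruct
# these fields every time step; here we only build them once.
n, dx, x_int = 10, 0.1, 0.32
centers = [(i + 0.5) * dx for i in range(n)]

# VOF: fraction of each cell [i*dx, (i+1)*dx] occupied by water
vof = [min(max((x_int - i * dx) / dx, 0.0), 1.0) for i in range(n)]

# Level set: signed distance from each cell center to the interface
# (positive on the water side, negative on the air side)
phi = [x_int - xc for xc in centers]
```

Note the trade-off visible even here: the VOF field conserves the water volume exactly (summing `vof[i] * dx` recovers `x_int`), while the level-set field gives the interface position and orientation directly via its zero crossing.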

  17. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use.
The functional validation can

  18. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC (North American Electric Reliability Corporation) MOD reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  19. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for a CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of the CPS is described by an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the HP to KeYmaera. The advantage of the approach is that it models the CPS intuitively and verifies its safety rigorously, avoiding state space explosion
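A hybrid system combines continuous dynamics with discrete mode switches, and safety means an invariant holds along every trajectory. As a minimal numeric illustration (the thermostat model, its rates, and its thresholds are invented for this sketch, and sampling one trajectory is far weaker than the deductive proof a tool like KeYmaera provides over all trajectories):

```python
# Numeric sketch of checking a safety property on a simple hybrid system:
# a thermostat with "heat"/"cool" modes and guard-triggered mode jumps.
def simulate_thermostat(t_end=50.0, dt=0.01, T0=20.0):
    T, mode, trace = T0, "heat", []
    for _ in range(int(t_end / dt)):
        dT = 2.0 if mode == "heat" else -3.0   # mode-specific continuous dynamics
        T += dT * dt
        if mode == "heat" and T >= 22.0:       # discrete jump (guard)
            mode = "cool"
        elif mode == "cool" and T <= 18.0:
            mode = "heat"
        trace.append(T)
    return trace

trace = simulate_thermostat()
# Safety invariant: temperature stays within the safe band on this trajectory
safe = all(17.0 <= T <= 23.0 for T in trace)
```

Deductive verification would instead prove the invariant symbolically for all initial states and switching times, which is what avoids enumerating (and exploding) the state space.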

  20. Tracer experiment data sets for the verification of local and meso-scale atmospheric dispersion models including topographic effects

    International Nuclear Information System (INIS)

    Sartori, E.; Schuler, W.

    1992-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article proposes more specifically a scheme for acquiring, storing and distributing atmospheric tracer experiment (ATE) data required for the verification of atmospheric dispersion models, especially the most advanced ones, which include topographic effects and are specific to the local and meso-scale. These well-documented data sets will form a valuable complement to the set of atmospheric dispersion computer codes distributed internationally. Modellers will be able to gain confidence in the predictive power of their models or to verify their modelling skills. (au)
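A common baseline against which tracer-experiment data are compared is the flat-terrain Gaussian plume model (the advanced topographic models above go well beyond it). A minimal sketch, with illustrative source and dispersion parameters (real models derive the sigmas from stability class and terrain rather than taking them as inputs):

```python
import math

def gaussian_plume(Q, u, x, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration with ground reflection (units of Q per m^3).
    Q: source strength, u: wind speed (m/s), H: effective release height (m),
    (x, y, z): downwind / crosswind / vertical receptor position (m).
    sigma_y, sigma_z: lateral/vertical dispersion parameters at distance x,
    passed in directly here for simplicity."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # mirror source
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration 1 km downwind of a 50 m stack
c = gaussian_plume(Q=100.0, u=5.0, x=1000.0, y=0.0, z=0.0, H=50.0,
                   sigma_y=80.0, sigma_z=40.0)
```

Verification against an ATE data set would compare such predicted concentrations with the measured tracer concentrations at the sampler locations.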

  1. Verification of WIMS-ANL to be used as supporting code for WIMS-CANDU development

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Dai Hai; Kim, Won Young; Park, Joo Hwan

    2007-08-15

    The lattice code WIMS-ANL has been tested in order to assess its qualification to be used as a supporting code to aid the WIMS-CANDU development. A series of calculations has been performed to determine lattice physics parameters such as multiplication factors, isotopic number densities and coolant void reactivity. The WIMS-ANL results are compared with the predictions of WIMS-AECL/D4/D5 and PPV (POWDERPUFS-V), and the comparisons indicate that WIMS-ANL can be used not only as a supporting code to aid the WIMS-CANDU development, but also as a starting point for developing a detailed model that could delineate realistic situations as they might occur during a LOCA, such as the asymmetric flux distribution across the lattice cell.

  2. Modeling Tourism Sustainable Development

    Science.gov (United States)

    Shcherbina, O. A.; Shembeleva, E. A.

    The basic approaches to decision making and modeling of sustainable tourism development are reviewed. The dynamics of sustainable development are considered within Forrester's system dynamics. The multidimensionality of sustainable tourism development and the multicriteria issues of sustainable development are analyzed. Decision Support Systems (DSS) and Spatial Decision Support Systems (SDSS) are discussed as effective techniques for examining and visualizing the impacts of policies and sustainable tourism development strategies within an integrated and dynamic framework. Main modules that may be utilized for integrated modeling of sustainable tourism development are proposed.
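Forrester-style system dynamics represents such a model as stocks updated by flows over time. A minimal stock-and-flow sketch with two stocks (tourist numbers and an environmental quality index); every rate constant below is an invented placeholder, not a value from the reviewed literature:

```python
# Toy system-dynamics model: tourist arrivals grow with destination
# attractiveness (proxied by environmental quality), while tourist load
# erodes the environment and the environment slowly regenerates.
def simulate(years=30, dt=0.1):
    tourists, environment = 1000.0, 1.0   # stocks (env. index in [0, 1])
    history = []
    for _ in range(int(years / dt)):
        growth = 0.10 * tourists * environment        # arrivals inflow
        degradation = 1e-5 * tourists * environment   # environmental erosion
        recovery = 0.02 * (1.0 - environment)         # regeneration
        tourists += (growth - 0.08 * tourists) * dt   # 8%/yr departures
        environment += (recovery - degradation) * dt
        environment = max(environment, 0.0)
        history.append((tourists, environment))
    return history

history = simulate()
final_tourists, final_env = history[-1]
```

The feedback loop is the point: growth depends on the same environmental stock that growth degrades, which is how such models exhibit overshoot-and-decline trajectories under some parameterizations.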

  3. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies how it differs from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  4. Verification of a TRACE EPRTM model on the basis of a scaling calculation of an SBLOCA ROSA test

    International Nuclear Information System (INIS)

    Freixa, J.; Manera, A.

    2011-01-01

    Research highlights: → Verification of a TRACE input deck for the EPR™ generation III PWR. → Scaling simulation of an SBLOCA experiment of the integral test facility ROSA/LSTF. → The EPR™ model was compared with the TRACE results of the ROSA/LSTF model. - Abstract: In cooperation with the Finnish Radiation and Nuclear Safety Authority (STUK), a project has been launched at the Paul Scherrer Institute (PSI) aimed at performing safety evaluations of the Olkiluoto-3 nuclear power plant (NPP), the first EPR™, a generation III pressurized water reactor (PWR), with particular emphasis on small- and large-break loss-of-coolant accidents (SB/LB-LOCAs) and main steam-line breaks. As a first step of this work, the best-estimate system code TRACE has been used to develop a model of Olkiluoto-3. In order to test the nodalization, a scaling calculation from the rig of safety assessment (ROSA) test facility has been performed. The ROSA large scale test facility (LSTF) was built to simulate Westinghouse-design pressurized water reactors (PWRs) with a four-loop configuration. Even though there are differences between the EPR™ and the Westinghouse designs, the number of similarities is large enough to carry out scaling calculations on SBLOCA and LOCA cases from the ROSA facility; as a matter of fact, the main differences are located in the secondary side. Test 6-1 of the ROSA 1 programme, an SBLOCA with the break situated in the upper head of the reactor pressure vessel (RPV), was of special interest since very good agreement with the experiment was obtained with a TRACE input deck. In order to perform this scaling calculation, the set-points of the secondary relief and safety valves in the EPR™ nodalization had to be changed to those used in the ROSA facility, the break size and the core power had to be scaled by a factor of 60 (according to the core power and core volume), and the pump coast-down had to be adapted to that of the test. The calculation showed

  5. Model-Based Interpretation and Experimental Verification of ECT Signals of Steam Generator Tubes

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young Hwan; Kim, Eui Lae; Yim, Chang Jae; Lee, Jin Ho

    2004-01-01

    Model-based inversion tools for eddy current signals have been developed by combining neural networks and finite element modeling, for quantitative flaw characterization in steam generator tubes. In the present work, interpretation of experimental eddy current signals was carried out in order to validate the developed inversion tools. A database was constructed using the synthetic flaw signals generated by the finite element model. The hybrid neural networks, composed of a PNN classifier and BPNN size estimators, were trained using the synthetic signals. Experimental eddy current signals were obtained from axisymmetric artificial flaws. Interpretation of flaw signals was conducted by feeding the experimental signals into the neural networks. The interpretation was excellent, which shows that the developed inversion tools would be applicable to the interpretation of real eddy current signals
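The train-on-synthetic, apply-to-measured workflow above can be sketched with off-the-shelf components. The following substitutes a scikit-learn MLP for the paper's PNN classifier and uses a fabricated two-feature "signal" whose sign pattern encodes flaw type; the real ECT signals, FEM database, and network architectures are not reproduced:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the FEM-generated signal database: feature 1 tracks
# flaw depth, feature 2's sign encodes flaw type (0 = OD, 1 = ID flaw).
rng = np.random.default_rng(0)
n = 400
depth = rng.uniform(0.1, 1.0, n)       # flaw depth fraction of wall thickness
flaw_type = rng.integers(0, 2, n)
X = np.column_stack([
    depth * (1.0 + 0.05 * rng.normal(size=n)),
    (2 * flaw_type - 1) * depth + 0.05 * rng.normal(size=n),
])

# "Classifier" half of the hybrid scheme: predict flaw type from the signal
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, flaw_type)
acc = clf.score(X, flaw_type)
```

In the paper's scheme a second network (a BPNN regressor) would then estimate flaw size for the predicted class; validation consists of feeding measured, not synthetic, signals through the trained networks.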

  6. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the ...

  7. Adaptive multi-rate interface: development and experimental verification for real-time hybrid simulation

    DEFF Research Database (Denmark)

    Maghareh, Amin; Waldbjørn, Jacob Paamand; Dyke, Shirley J.

    2016-01-01

    frequencies and/or introduce delays that can degrade its stability and performance. In this study, the Adaptive Multi-rate Interface rate-transitioning and compensation technique is developed to enable the use of more complex numerical models. Such a multi-rate RTHS is executed strictly in real time, although...... frequency between the numerical and physical substructures and for input signals with high-frequency content. Further, it does not induce signal chattering at the coupling frequency. The effectiveness of AMRI is also verified experimentally....

  8. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  9. Quality assurance of sterilized products: verification of a model relating frequency of contaminated items and increasing radiation dose

    International Nuclear Information System (INIS)

    Khan, A.A.; Tallentire, A.; Dwyer, J.

    1977-01-01

    Values of the γ-radiation resistance parameters (k and n of the 'multi-hit' expression) for Bacillus pumilus E601 spores and Serratia marcescens cells have been determined and the constancy of values for a given test condition demonstrated. These organisms, differing by a factor of about 50 in k value, have been included in a laboratory test system for use in verification of a model describing the dependence of the proportion of contaminated items in a population of items on radiation dose. The proportions of contaminated units of the test system at various γ-radiation doses have been measured for different initial numbers and types of organisms present in units either singly or together. Using the model, the probabilities of contaminated units for corresponding sets of conditions have been evaluated together with associated variances. Measured proportions and predicted probabilities agree well, showing that the model holds in a laboratory contrived situation. (author)
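The model verified above follows from two standard expressions: the multi-hit survival probability of a single organism, S(D) = 1 − (1 − e^(−kD))^n, and, by independence, the probability that an item initially carrying N₀ organisms remains contaminated, p(D) = 1 − (1 − S(D))^N₀. A sketch with illustrative parameter values (the paper's measured k and n for B. pumilus spores and S. marcescens cells are not reproduced here):

```python
import math

def survival(D, k, n):
    """Multi-hit probability that a single organism survives dose D."""
    return 1.0 - (1.0 - math.exp(-k * D)) ** n

def p_contaminated(D, k, n, N0):
    """Probability an item with N0 initial organisms is still contaminated
    after dose D, assuming independent inactivation."""
    return 1.0 - (1.0 - survival(D, k, n)) ** N0

# Illustrative parameters only (k in inverse dose units)
doses = [0.0, 5.0, 10.0, 20.0]
probs = [p_contaminated(D, k=0.5, n=3, N0=100) for D in doses]
```

Verification of the model amounts to comparing such predicted proportions with the measured fractions of contaminated units at each dose, for each organism and initial bioburden.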

  10. Experimental verification of mathematical model of the heat transfer in exhaust system

    OpenAIRE

    Petković Snežana; Pešić Radivoje; Lukić Jovanka

    2011-01-01

    A catalytic converter reaches maximal efficiency when it reaches its working temperature. In the cold-start phase, catalyst efficiency is low and exhaust emissions contain high levels of air pollutants. Optimizing the exhaust system to shorten the time needed to reach the catalyst working temperature reduces total vehicle emissions. Implementation of mathematical models in the development of exhaust systems decreases total costs and development time. The mathematical model has to be...

  11. Verification of Nine-phase PMSM Model in d-q Coordinates with Mutual Couplings

    OpenAIRE

    Kozovský, Matúš; Blaha, Petr; Václavek, Pavel

    2016-01-01

    Electric motors with more than three phases have many advantages compared to an ordinary three-phase motor. For this reason it is natural to pay attention to them and to work on advanced control methods. Control algorithm development requires a model of the motor to work with. This paper presents the modeling concept of a nine-phase permanent magnet synchronous motor (PMSM) in a three times three-phase arrangement fed by a nine-phase voltage source inverter (VSI). Magnetic interaction between...

  12. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  13. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    OpenAIRE

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to the Spin format, and presentation of counterexamples in terms of automata. Interactive verification gives the possibility to decrease verification time and increase the maxi...

  14. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
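The Monte Carlo step of such a methodology samples event and trajectory variables and counts impacts on a target. A toy sketch in the spirit of (but far simpler than) TORMIS; the landing-point distribution, target geometry, and all numbers are invented for illustration:

```python
import random

# Toy Monte Carlo: sample missile landing points around a tornado track
# and estimate the probability of impacting a structure's footprint.
def estimate_impact_probability(trials=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Illustrative landing-point scatter (m) about the track center
        x = rng.gauss(0.0, 200.0)
        y = rng.gauss(0.0, 200.0)
        # Hypothetical 40 m x 40 m target footprint offset from the center
        if 100.0 <= x <= 140.0 and -20.0 <= y <= 20.0:
            hits += 1
    return hits / trials

p_hit = estimate_impact_probability()
```

A full methodology chains many more sampled stages (tornado occurrence and intensity, missile injection, transport aerodynamics, impact damage), each drawn from data-based distributions, before the hit-counting step shown here.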

  15. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  16. Developing reliable safeguards seals for application verification and removal by State operators

    Energy Technology Data Exchange (ETDEWEB)

    Finch, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smartt, Heidi A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Haddal, Risa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Once a geological repository has begun operations, the encapsulation and disposal of spent fuel will be performed as a continuous, industrial-scale series of processes, during which time safeguards seals will be applied to transportation casks before shipment from an encapsulation plant, and then verified and removed following receipt at the repository. These operations will occur approximately daily during several decades of Sweden's repository operation; however, requiring safeguards inspectors to perform the application, verification, and removal of every seal would be an onerous burden on the International Atomic Energy Agency's (IAEA's) resources. Current IAEA practice includes allowing operators to either apply seals or remove them, but not both, so the daily task of either applying or verifying and removing seals would still require the continuous presence of IAEA inspectors at at least one site. Of special importance is the inability to re-verify casks or canisters from which seals have been removed and which have been emplaced underground. Successfully designing seals that can be applied, verified and removed by an operator with IAEA approval could have impact beyond repository shipments, extending to other applications as well and potentially reducing inspector burdens for a wide range of such duties.

  17. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO₂ as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel

  18. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the planning-phase V and V guideline for the NPP safety system together with critical safety items, for example independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities with task entrance and exit criteria; review and audit; testing and QA records of V and V material and configuration management; software verification and validation plan production, etc.; and safety-critical software V and V methodology. (author). 11 refs

  19. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    International Nuclear Information System (INIS)

    Yang, Jun Un; Jung, Kwang Sup; Park, Chang Gyu

    1992-08-01

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the status of the critical safety functions (CSFs) and suggests an overall response strategy with a set of success paths which restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of systems by a general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation and user acceptance testing. Among them, in this report, we have focused our attention on verification and validation (V and V) of expert systems. We have assessed the general V and V procedures and tried to develop a specific V and V procedure for COSMOS. (Author)
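Forward chaining, the inference strategy adopted above, repeatedly fires any rule whose premises are all established until no new facts appear. A tiny sketch; the rules and symptoms below are invented placeholders, not COSMOS knowledge-base content:

```python
# Minimal forward-chaining engine. Each rule is (set of premises, conclusion).
# The example rules are hypothetical stand-ins for CSF-status knowledge.
rules = [
    ({"subcooling_low", "pressurizer_level_low"}, "core_cooling_challenged"),
    ({"core_cooling_challenged"}, "initiate_safety_injection"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

facts = forward_chain({"subcooling_low", "pressurizer_level_low"}, rules)
```

Note the second rule fires only because the first rule's conclusion becomes a fact, which is the chaining that lets symptom-level observations cascade up to a response strategy.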

  20. Development and verification of unstructured adaptive mesh technique with edge compatibility

    International Nuclear Information System (INIS)

    Ito, Kei; Ohshima, Hiroyuki; Kunugi, Tomoaki

    2010-01-01

In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is suppression of gas entrainment (GE) phenomena at the gas-liquid interface. The authors have therefore been developing a high-precision CFD algorithm to evaluate the GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to establish accurate modeling of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of the GE phenomena in JSFR: an unstructured adaptive mesh technique which can dynamically apply fine cells to the region where GE occurs. In this paper, as part of this development, a two-dimensional unstructured adaptive mesh technique is discussed. In the two-dimensional technique, each cell is refined isotropically to reduce distortions of the mesh. In addition, connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells. The two-dimensional unstructured adaptive mesh technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the technique succeeds in providing a high-precision solution, even when a poor-quality, distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much smaller than the error on a structured mesh with a larger number of cells. (author)
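The isotropic refinement step described above can be sketched for square cells: a flagged cell is split into four equal children, which keeps the children undistorted. The interface criterion and cell representation below are illustrative assumptions, not the paper's actual data structures (and the connection-cell construction is omitted):

```python
# Minimal sketch of isotropic refinement of square cells.
# A cell is (x, y, size); splitting yields four equal children.
def split(cell):
    x, y, s = cell
    h = s / 2.0
    return [(x, y, h), (x + h, y, h), (x, y + h, h), (x + h, y + h, h)]

def refine(mesh, needs_refinement):
    out = []
    for cell in mesh:
        out.extend(split(cell) if needs_refinement(cell) else [cell])
    return out

# Refine cells crossing a (hypothetical) gas-liquid interface at y = 0.5.
mesh = [(i * 0.25, j * 0.25, 0.25) for i in range(4) for j in range(4)]
def near_interface(c):
    return c[1] <= 0.5 < c[1] + c[2]

fine = refine(mesh, near_interface)  # refined cells replace their parents
```

Because each parent is replaced by four children of half the size, the total covered area is conserved exactly.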

  1. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

the verification and "lessons learned" from the semantic and technical issues we discovered as we implemented the approach. ... Subject terms: any programming language in use at CAA for modeling or other data analysis applications, including R, Python, Scheme, Common Lisp, Julia, Mathematica

  2. Modelling of near-field radionuclide transport phenomena in a KBS-3V type of repository for nuclear waste with Goldsim Code - and verification against previous methods

    International Nuclear Information System (INIS)

    Pulkkanen, V.-M.; Nordman, H.

    2010-03-01

Traditional radionuclide transport models significantly overestimate some phenomena, or ignore them completely. This motivates the development of new, more precise models. This work describes the commissioning of a new KBS-3V near-field radionuclide transport model, implemented with commercial software called GoldSim. Like earlier models, the GoldSim model uses rz coordinates, but the solubilities of radionuclides are treated more precisely. To begin with, the physical phenomena concerning near-field transport are introduced in the GoldSim way of thinking. The computational methods of GoldSim are also introduced and compared to methods used earlier. The actual verification of the GoldSim model was carried out by comparing GoldSim results for simple cases with the corresponding results obtained with REPCOM, a code developed by VTT and used in several safety assessments. The results agree well. Finally, a few complicated cases were studied. In these cases, REPCOM's limitations in handling some phenomena become evident. The differences in the results are caused especially by the extension of the solubility limit to the whole computational domain, and by the element-wise treatment of solubilities used instead of nuclide-wise treatment. This work was carried out as a special assignment to the former laboratory of Advanced Energy Systems at Helsinki University of Technology. The work was done at VTT. (orig.)

  3. Modelling of near-field radionuclide transport phenomena in a KBS-3V type of repository for nuclear waste with Goldsim Code - and verification against previous methods

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkanen, V.-M.; Nordman, H. (VTT Technical Research Centre, Espoo (Finland))

    2010-03-15

Traditional radionuclide transport models significantly overestimate some phenomena, or ignore them completely. This motivates the development of new, more precise models. This work describes the commissioning of a new KBS-3V near-field radionuclide transport model, implemented with commercial software called GoldSim. Like earlier models, the GoldSim model uses rz coordinates, but the solubilities of radionuclides are treated more precisely. To begin with, the physical phenomena concerning near-field transport are introduced in the GoldSim way of thinking. The computational methods of GoldSim are also introduced and compared to methods used earlier. The actual verification of the GoldSim model was carried out by comparing GoldSim results for simple cases with the corresponding results obtained with REPCOM, a code developed by VTT and used in several safety assessments. The results agree well. Finally, a few complicated cases were studied. In these cases, REPCOM's limitations in handling some phenomena become evident. The differences in the results are caused especially by the extension of the solubility limit to the whole computational domain, and by the element-wise treatment of solubilities used instead of nuclide-wise treatment. This work was carried out as a special assignment to the former laboratory of Advanced Energy Systems at Helsinki University of Technology. The work was done at VTT. (orig.)

  4. Development of a GoldSim Biosphere Model, Evaluation, and Its Verification

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Hwang, Yong Soo

    2009-12-01

For the purpose of evaluating the dose rate to an individual due to long-term release of nuclides from an HLW or pyroprocessing repository, a biosphere assessment model and an implementing program based on the BIOMASS methodology have been developed with GoldSim, a general model-development tool. To show its practicability and usability, as well as the sensitivity of parametric and scenario variations on the annual exposure, some probabilistic calculations are made and investigated. For cases where the exposure groups and associated GBIs are changed and selected input values are varied, all of which seem important for the biosphere evaluation, the dose rate per unit nuclide release rate is probabilistically calculated and analyzed. A series of comparison studies with JAEA, Japan, has also been carried out to verify the model.
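The quantity reported above, dose rate per unit release rate, can be sketched as a simple probabilistic calculation. The single drinking-water pathway and every parameter value below are illustrative assumptions, not those of the actual GoldSim biosphere model:

```python
import random

# Hypothetical single-pathway biosphere calculation:
# dose rate [Sv/y] = release rate [Bq/y] / dilution [m3/y]
#                    * water intake [m3/y] * dose coefficient [Sv/Bq]
def annual_dose(release_Bq_per_y, rng):
    dilution_m3_per_y = rng.uniform(1e5, 1e6)  # well/aquifer dilution (assumed)
    water_intake_m3_per_y = 0.73               # annual drinking water (assumed)
    dose_coeff_Sv_per_Bq = 6.9e-8              # ingestion coefficient (assumed)
    conc_Bq_per_m3 = release_Bq_per_y / dilution_m3_per_y
    return conc_Bq_per_m3 * water_intake_m3_per_y * dose_coeff_Sv_per_Bq

# Probabilistic run: sample the uncertain dilution, report the mean
# dose rate per unit release rate (release = 1 Bq/y).
rng = random.Random(1)
samples = [annual_dose(1.0, rng) for _ in range(1000)]
mean_dose = sum(samples) / len(samples)
```

Sampling the uncertain parameters and averaging over realizations is the essence of the probabilistic calculations described in the abstract.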

  5. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  6. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novice

  7. Development, Verification and Validation of Parallel, Scalable Volume of Fluid CFD Program for Propulsion Applications

    Science.gov (United States)

    West, Jeff; Yang, H. Q.

    2014-01-01

There are many instances involving liquid/gas interfaces and their dynamics in the design of liquid-engine-powered rockets such as the Space Launch System (SLS). Some examples of these applications are: propellant tank draining and slosh, subcritical-condition injector analysis for gas generators, preburners and thrust chambers, water deluge mitigation for launch-induced environments, and even solid rocket motor liquid slag dynamics. Commercially available CFD programs simulating gas/liquid interfaces using the Volume of Fluid approach are currently limited in their parallel scalability. In 2010, for instance, an internal NASA/MSFC review of three commercial tools revealed that parallel scalability was seriously compromised at 8 CPUs and no additional speedup was possible beyond 32 CPUs. Other non-interface CFD applications at the time were demonstrating useful parallel scalability up to 4,096 processors or more. Based on this review, NASA/MSFC initiated an effort to implement a Volume of Fluid capability within the unstructured-mesh, pressure-based CFD program Loci-STREAM. After verification was achieved by comparing results to the commercial CFD program CFD-Ace+, and validation by direct comparison with data, Loci-STREAM-VoF is now the production CFD tool for propellant slosh force and slosh damping rate simulations at NASA/MSFC. In these applications, good parallel scalability has been demonstrated for problem sizes of tens of millions of cells and thousands of CPU cores. Ongoing efforts are focused on the application of Loci-STREAM-VoF to predict the transient flow patterns of water on the SLS Mobile Launch Platform, in order to support the phasing of water for launch environment mitigation so that detrimental effects on the vehicle are avoided.
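The Volume of Fluid idea, transporting a liquid volume fraction while conserving it exactly, can be illustrated in one dimension with first-order upwind fluxes on a periodic grid. This is only a sketch of the concept; the actual Loci-STREAM-VoF interface-capturing scheme is far more sophisticated:

```python
import numpy as np

# 1-D Volume of Fluid advection sketch: transport a liquid "slug"
# (volume fraction f in [0, 1]) at uniform velocity using first-order
# upwind fluxes on a periodic grid.
n = 100          # cells
cfl = 0.5        # u * dt / dx
f = np.zeros(n)
f[10:30] = 1.0   # initial liquid slug occupies 20 cells

for _ in range(40):
    flux = cfl * f                   # upwind flux leaving each cell (u > 0)
    f = f - flux + np.roll(flux, 1)  # outflow to the right, inflow from the left

total_liquid = f.sum()               # conserved liquid volume, in cell units
```

The update is a convex combination of neighboring values, so f stays in [0, 1] and the total liquid volume is conserved to machine precision, the defining property of VoF methods.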

  8. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
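The per-requirement verification structure described above can be mirrored in a small data model. The field names follow the paper's terminology, but the classes and example values are an illustrative sketch, not LSST's actual SysML model:

```python
from dataclasses import dataclass, field

@dataclass
class VerificationActivity:
    """One Verification Method applied to one Requirement."""
    requirement_id: str
    method: str            # e.g. "Test", "Analysis", "Inspection"
    level: str             # e.g. "Component", "System"
    owner: str
    success_criteria: str

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: list = field(default_factory=list)

    def requirements_covered(self):
        return {a.requirement_id for a in self.activities}

# Hypothetical example entries.
act1 = VerificationActivity("REQ-001", "Test", "System", "J. Doe",
                            "Image quality within budget")
act2 = VerificationActivity("REQ-002", "Analysis", "Component", "J. Roe",
                            "Tracking error below specification")
event = VerificationEvent("Commissioning Run 1", [act1, act2])
covered = event.requirements_covered()
```

Grouping activities into events while retaining the requirement identifiers is what enables the end-to-end traceability the paper emphasizes.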

  9. Helicopter stability and control modeling improvements and verification on two helicopters

    Science.gov (United States)

    Schrage, D. P.; Peters, D. A.; Prasad, J. V. R.; Stumpf, W. F.; He, Chengjian

    1988-01-01

    A linearized model of helicopter flight dynamics is developed which includes the flapping, lead-lag, and dynamic inflow degrees of freedom (DOF). The model is a combination of analytical terms and numerically determined stability derivatives, and is used to investigate the importance of the rotor DOF to stability and control modeling. The results show that the rotor DOF can have a significant impact on some of the natural modes in a linear model. The flap and dynamic inflow DOF show the greatest influence. Flapping exhibits strong coupling to the body, dynamic inflow, and to lead-lag to a lesser extent. Dynamic inflow tends to damp the high-frequency flapping modes, and reduces the damping on coupled body-flap motion. Dynamic inflow also couples to the flapping motion to produce complex roots. With body-flap and lag regressing modes as exceptions, the results show essentially similar behavior for most modes of articulated and hingeless rotor helicopters.
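Conclusions about mode damping and coupling of this kind come from the eigenvalues of the linearized state matrix. A minimal sketch follows; the matrix entries are arbitrary stand-ins, not actual helicopter stability derivatives:

```python
import numpy as np

# Linearized dynamics x' = A x. Each eigenvalue gives one mode:
# real part = damping (negative means damped), imaginary part = frequency.
# The matrix below is illustrative only.
A = np.array([
    [-0.5,  1.0,  0.0],
    [-1.0, -0.5,  0.2],
    [ 0.0, -0.3, -2.0],
])

eigvals = np.linalg.eigvals(A)
damping = eigvals.real            # negative => damped mode
frequency = np.abs(eigvals.imag)  # nonzero => oscillatory (complex) mode
stable = bool(np.all(damping < 0.0))
```

The complex-conjugate pair produced by the off-diagonal coupling terms corresponds to the kind of coupled oscillatory mode (e.g. body-flap) discussed in the abstract.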

  10. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

When a decommissioning plan for facilities such as nuclear fuel cycle facilities and small-scale research reactors is examined, it is necessary to select the technology and the sequence of the work, and to optimize the indices concerning dismantling the facility (such as radiation dose, cost, amount of waste, number of workers, and duration of the work). In our waste management section, development of the decommissioning management system DECMAN, which supports making the decommissioning plan, is in progress. DECMAN automatically calculates the indices using the facility data and the dismantling method. This paper describes the porting of the program to a personal computer and the verification of the system by evaluation of real work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping, 2) evaluation of the radiation dose using the air dose rate, 3) I/O of data using EXCEL. (2) Comparison of work amount between calculated and actual values: the calculated value is 222.67 man-hours against the actual value of 249.40 man-hours in the old JWTF evaluation. (3) Accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new method of estimating the amount of work was constructed using the calculated values of DECMAN. (author)
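The index calculation DECMAN performs, work amount from facility data plus a coefficient for accompanying work, can be sketched as follows. The unit factors and inventory are hypothetical; the accompanying-work coefficient is likewise assumed, chosen near the reported actual/calculated ratio (249.40/222.67 ≈ 1.12):

```python
# Hypothetical unit work factors [man-hours per unit quantity].
UNIT_FACTORS = {"glove_box": 40.0, "piping_m": 1.5, "tank": 120.0}

def direct_work(inventory):
    """Sum man-hours over dismantled equipment: quantity * unit factor."""
    return sum(UNIT_FACTORS[kind] * qty for kind, qty in inventory.items())

def total_work(inventory, accompanying_coeff=1.12):
    """Scale direct work to include accompanying work (setup, surveys, ...).
    The coefficient 1.12 is an assumption, near the reported ratio."""
    return direct_work(inventory) * accompanying_coeff

# Hypothetical facility inventory.
inventory = {"glove_box": 3, "piping_m": 80, "tank": 1}
calc = direct_work(inventory)   # 3*40 + 80*1.5 + 1*120 = 360 man-hours
total = total_work(inventory)   # 360 * 1.12 = 403.2 man-hours
```

Calibrating such a coefficient against measured dismantling results is exactly the comparison step (2)-(3) above describes.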

  11. Development of the SEAtrace{trademark} barrier verification and validation technology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, S.D.; Lowry, W.; Walsh, R.; Rao, D.V. [Science and Engineering Associates, Santa Fe, NM (United States); Williams, C. [Sandia National Labs., Albuquerque, NM (United States). Underground Storage Technology Dept.

    1998-08-01

In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Science and Engineering Associates, Inc. (SEA) and Sandia National Laboratories (SNL) have developed a quantitative subsurface barrier assessment system using gaseous tracers in support of the Subsurface Contaminants Focus Area barrier technology program. Called SEAtrace(TM), this system integrates an autonomous, multi-point soil vapor sampling and analysis system with a global optimization modeling methodology to locate and size barrier breaches in real time. The methodology for the global optimization code was completed and a prototype code written using simplifying assumptions. Preliminary modeling work to validate the code assumptions was performed using the T2VOC numerical code. A multi-point field sampling system was built to take soil gas samples and analyze for tracer gas concentration. The tracer concentration histories were used in the global optimization code to locate and size barrier breaches. SEAtrace(TM) was consistently able to detect and locate leaks, even under very adverse conditions. The system was able to locate the leak to within 0.75 m of the actual value, and was able to determine the size of the leak to within 0.15 m.
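The breach-location step, fitting a source position to multi-point tracer concentrations, can be sketched as a toy inverse problem. The 1/r forward model, sensor layout, and grid search below are assumptions for illustration; SEAtrace's actual global optimization methodology is more elaborate:

```python
import math

# Toy forward model: steady tracer concentration ~ q / r from a point leak.
def conc(leak, sensor, q=1.0):
    r = math.dist(leak, sensor)
    return q / max(r, 1e-6)  # guard against r = 0 at a sensor location

# Hypothetical sensor layout and synthetic "measurements" from a known leak.
sensors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0), (2.0, 1.0)]
true_leak = (2.5, 3.0)
measured = [conc(true_leak, s) for s in sensors]

# Global search over a coarse grid, minimizing the squared misfit
# between modeled and measured concentrations.
best, best_err = None, float("inf")
for i in range(81):
    for j in range(81):
        cand = (i * 0.05, j * 0.05)  # 0..4 m in 5 cm steps
        err = sum((conc(cand, s) - m) ** 2 for s, m in zip(sensors, measured))
        if err < best_err:
            best, best_err = cand, err

located = best  # should fall close to the true leak position
```

With noiseless synthetic data the misfit minimum coincides with the true leak; with field data, the residual misfit also constrains the breach size.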

  12. Development and preliminary verification of 2-D transport module of radiation shielding code ARES

    International Nuclear Information System (INIS)

    Zhang Penghe; Chen Yixue; Zhang Bin; Zang Qiyong; Yuan Longjun; Chen Mengteng

    2013-01-01

The 2-D transport module of the radiation shielding code ARES is a two-dimensional neutron transport and radiation shielding code. Its theoretical model is based on the first-order steady-state neutron transport equation, adopting the discrete ordinates method to discretize the direction variables. A set of difference equations is then obtained and solved with the source iteration method. The 2-D transport module of ARES is capable of calculating k_eff and fixed source problems with isotropic or anisotropic scattering in x-y geometry. The theoretical model is briefly introduced and a series of benchmark problems are verified in this paper. Compared with the results given by the benchmarks, the maximum relative deviation of k_eff is 0.09% and the average relative deviation of flux density is about 0.60% in the BWR cells benchmark problem. As for the fixed source problems with isotropic and anisotropic scattering, the results of the 2-D transport module of ARES agree very well with DORT. These numerical results on benchmark problems preliminarily demonstrate that the development of the 2-D transport module of ARES is correct and that it is able to provide high-precision results. (authors)
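Discrete ordinates with source iteration can be illustrated in one dimension: S2 quadrature, step (upwind) differencing, isotropic scattering, and vacuum boundaries. The slab parameters are arbitrary, and the actual ARES module is two-dimensional with richer quadrature and differencing options:

```python
import math

# 1-D slab, S2 discrete ordinates solved by source iteration.
n, L = 50, 10.0
dx = L / n
sigma_t, sigma_s, src = 1.0, 0.5, 1.0  # total, scattering, external source
mu = 1.0 / math.sqrt(3.0)              # S2 ordinates: +/- mu, weights 1, 1

phi = [0.0] * n                        # scalar flux
for _ in range(200):                   # source iterations
    q = [(sigma_s * p + src) / 2.0 for p in phi]  # emission per half-space
    phi_new = [0.0] * n
    psi = 0.0                          # vacuum boundary: no incoming flux
    for i in range(n):                 # sweep left -> right (mu > 0)
        psi = (mu / dx * psi + q[i]) / (mu / dx + sigma_t)
        phi_new[i] += psi
    psi = 0.0
    for i in reversed(range(n)):       # sweep right -> left (mu < 0)
        psi = (mu / dx * psi + q[i]) / (mu / dx + sigma_t)
        phi_new[i] += psi
    converged = max(abs(a - b) for a, b in zip(phi_new, phi)) < 1e-8
    phi = phi_new
    if converged:
        break
```

The scattering ratio here is 0.5, so the iteration converges geometrically; the flux peaks at the slab center and stays below the infinite-medium bound src / (sigma_t - sigma_s) = 2.
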

  13. The Development and Verification of a Novel ECMS of Hybrid Electric Bus

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2014-01-01

Full Text Available This paper presents the system modeling, control strategy design, and hardware-in-the-loop testing of a series-parallel hybrid electric bus. First, the powertrain mathematical models and the system architecture are proposed. Then an adaptive ECMS is developed for the real-time control of a hybrid electric bus, which is investigated and verified in a hardware-in-the-loop simulation system. Through driving cycle recognition, the ECMS updates the equivalent charge and discharge coefficients and extracts optimized rules for real-time control. This method not only solves the problem of frequent mode transitions and improves fuel economy, but also simplifies control strategy design and provides new ideas for designing the energy management strategy and gear-shifting rules. Finally, the simulation results show that the proposed real-time A-ECMS can coordinate the overall hybrid electric powertrain to optimize fuel economy and sustain the battery SOC level.
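The core ECMS idea, pricing battery power in fuel terms through an equivalence factor s so that a pointwise minimization picks the power split, can be sketched as follows. The efficiencies and power levels are illustrative; the paper's adaptive ECMS additionally updates s online from driving-cycle recognition:

```python
# ECMS sketch: at each instant choose the engine/battery power split that
# minimizes equivalent fuel power = engine chemical power + s * battery power.
# All parameter values below are illustrative assumptions.
ENGINE_EFF = 0.35  # engine tank-to-shaft efficiency (assumed)
MOTOR_EFF = 0.90   # battery-to-shaft efficiency (assumed)

def equivalent_fuel_power(p_engine, p_batt, s):
    fuel = p_engine / ENGINE_EFF  # chemical power burned by the engine
    # Discharging costs s * electrical power drawn from the battery;
    # charging (p_batt < 0) is credited at the recuperation efficiency.
    elec = s * p_batt / MOTOR_EFF if p_batt > 0 else s * p_batt * MOTOR_EFF
    return fuel + elec

def best_split(p_demand, s, steps=101):
    candidates = []
    for k in range(steps):
        p_batt = p_demand * (k / (steps - 1))  # battery share 0..100 %
        p_engine = p_demand - p_batt
        candidates.append((equivalent_fuel_power(p_engine, p_batt, s),
                           p_engine, p_batt))
    return min(candidates)

# A low equivalence factor (battery "cheap") favors electric drive...
_, pe_low, pb_low = best_split(50e3, s=1.0)
# ...a high equivalence factor (battery "expensive") favors the engine.
_, pe_high, pb_high = best_split(50e3, s=5.0)
```

Adapting s with the recognized driving cycle is what keeps the battery SOC sustained over the route while the pointwise minimization preserves fuel economy.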

  14. Modeling and experimental verification of thermally induced residual stress in RF-MEMS

    International Nuclear Information System (INIS)

    Somà, Aurelio; Saleem, Muhammad Mubasher

    2015-01-01

Electrostatically actuated radio frequency microelectromechanical systems (RF-MEMS) generally consist of microcantilevers and clamped–clamped microbeams. The presence of residual stress in these microstructures affects the static and dynamic behavior of the device. In this study, nonlinear finite element method (FEM) modeling and the experimental validation of the residual stress induced in clamped–clamped microbeams and the symmetric toggle RF-MEMS switch (STS) are presented. The formation of residual stress due to plastic deformation during the thermal loading-unloading cycle in the plasma etching step of the microfabrication process is explained and modeled using the Bauschinger effect. The difference between the designed and the measured natural frequency and pull-in voltage values for the clamped–clamped microbeams is explained by the presence of nonhomogeneous tensile residual stress. For the STS switch specimens, three-dimensional (3D) FEM models are developed and the initial deflection at zero bias voltage, observed during the optical profile measurements, is explained by the residual stress developed during the plasma etching step. The simulated residual stress due to the plastic deformation is included in the STS models to obtain the switch pull-in voltage. At the end of the simulation process, good correspondence is obtained between the FEM model results and the experimental measurements for both the clamped–clamped microbeams and the STS switch specimens. (paper)

  15. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (object500 connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was used partially, with an air layer. To verify material equivalence, photon attenuation characteristics were compared with those of a Super-Flex bolus. In the case of the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and experimentally verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to advance, 3D-printer-based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies

  16. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (object500 connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was used partially, with an air layer. To verify material equivalence, photon attenuation characteristics were compared with those of a Super-Flex bolus. In the case of the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and experimentally verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to advance, 3D-printer-based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies.

  17. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates where the experimental effort could be focused. In this contribution a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its tools integration and model import and export capabilities, is presented. Modelling templates ... provide building blocks for the templates (generic models previously developed); 3) computer aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation. In this work, the integrated use of all three ...

  18. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

Process models were developed to characterize the waste vitrification at West Valley in terms of process operating constraints and achievable glass compositions. The need to verify compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to it will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  19. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip-theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
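The Craig-Bampton reduction used by SubDyn, retaining the interface degrees of freedom plus a few fixed-interface modes, can be sketched on a small spring-mass chain. The 4-DOF system below is an arbitrary illustration; SubDyn applies the same reduction to full finite-element beam models:

```python
import numpy as np

# Craig-Bampton sketch on a 4-DOF spring-mass chain (unit masses/springs).
# DOF 0 is the "boundary" (interface); DOFs 1-3 are interior.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
M = np.eye(4)
bdof, idof = [0], [1, 2, 3]

Kib = K[np.ix_(idof, bdof)]
Kii = K[np.ix_(idof, idof)]

# Constraint modes: static interior response to unit boundary motion.
Phi_c = -np.linalg.solve(Kii, Kib)
# Fixed-interface normal modes (boundary clamped); keep only the lowest.
w2, V = np.linalg.eigh(Kii)          # Mii is identity here
Phi_k = V[:, :1]

# Reduction basis T maps (boundary DOF, modal DOF) -> physical DOFs.
T = np.zeros((4, 2))
T[0, 0] = 1.0
T[1:, 0] = Phi_c[:, 0]
T[1:, 1] = Phi_k[:, 0]

K_red = T.T @ K @ T                  # reduced stiffness (2 x 2)
M_red = T.T @ M @ T                  # reduced mass (2 x 2)

# Rayleigh-Ritz: the reduced lowest eigenvalue bounds the true one from above.
full_w2 = np.linalg.eigh(K)[0].min()
red_w2 = np.linalg.eigvals(np.linalg.solve(M_red, K_red)).real.min()
```

Even with a single retained mode, the reduced model reproduces the lowest natural frequency closely, which is why the modal properties of the substructure can be synthesized cheaply before coupling to hydrodynamics and tower dynamics.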

  20. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    Science.gov (United States)

    Janek, S.; Svensson, R.; Jonsson, C.; Brahme, A.

    2006-11-01

    A method for dose delivery monitoring after high energy photon therapy has been investigated based on positron emission tomography (PET). The technique is based on the activation of body tissues by high energy bremsstrahlung beams, preferably with energies well above 20 MeV, resulting primarily in 11C and 15O but also 13N, all positron-emitting radionuclides produced by photoneutron reactions in the nuclei of 12C, 16O and 14N. A PMMA phantom and animal tissue, a frozen hind leg of a pig, were irradiated to 10 Gy and the induced positron activity distributions were measured off-line in a PET camera a couple of minutes after irradiation. The accelerator used was a Racetrack Microtron at the Karolinska University Hospital using 50 MV scanned photon beams. From photonuclear cross-section data integrated over the 50 MV photon fluence spectrum the predicted PET signal was calculated and compared with experimental measurements. Since measured PET images change with time post irradiation, as a result of the different decay times of the radionuclides, the signals from activated 12C, 16O and 14N within the irradiated volume could be separated from each other. Most information is obtained from the carbon and oxygen radionuclides which are the most abundant elements in soft tissue. The predicted and measured overall positron activities are almost equal (-3%) while the predicted activity originating from nitrogen is overestimated by almost a factor of two, possibly due to experimental noise. Based on the results obtained in this first feasibility study the great value of a combined radiotherapy-PET-CT unit is indicated in order to fully exploit the high activity signal from oxygen immediately after treatment and to avoid patient repositioning. With an RT-PET-CT unit a high signal could be collected even at a dose level of 2 Gy and the acquisition time for the PET could be reduced considerably. Real patient dose delivery verification by means of PET imaging seems to be
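Separating the carbon and oxygen contributions from the time behaviour of the PET signal amounts to fitting known exponential decays. A minimal sketch with synthetic, noiseless data follows; the half-lives are the physical values for 11C and 15O, while the amplitudes and sampling times are invented:

```python
import math

# Decompose a measured activity curve A(t) = A_C*exp(-lC*t) + A_O*exp(-lO*t)
# into 11C and 15O contributions by linear least squares, since the decay
# constants are known. Amplitudes below are synthetic.
HALF_C11 = 20.4 * 60.0   # 11C half-life [s]
HALF_O15 = 122.0         # 15O half-life [s]
lC = math.log(2) / HALF_C11
lO = math.log(2) / HALF_O15

true_AC, true_AO = 3.0, 10.0
times = [30.0 * k for k in range(20)]   # samples over ~10 minutes
activity = [true_AC * math.exp(-lC * t) + true_AO * math.exp(-lO * t)
            for t in times]

# Two-parameter linear fit via the normal equations.
bc = [math.exp(-lC * t) for t in times]   # 11C basis
bo = [math.exp(-lO * t) for t in times]   # 15O basis
scc = sum(x * x for x in bc)
soo = sum(x * x for x in bo)
sco = sum(x * y for x, y in zip(bc, bo))
sc = sum(x * a for x, a in zip(bc, activity))
so = sum(x * a for x, a in zip(bo, activity))
det = scc * soo - sco * sco
fit_AC = (sc * soo - so * sco) / det
fit_AO = (so * scc - sc * sco) / det
```

Because 15O decays an order of magnitude faster than 11C, the two basis curves are well separated over a ten-minute acquisition, which is what makes the decomposition described in the abstract possible.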

  1. Construction and verification of a model of passenger response to STOL aircraft characteristics

    Science.gov (United States)

    Jacobson, I. D.

    1976-01-01

    A technique for evaluating passenger acceptance of a transportation system's environment has been developed. This includes a model of passenger reaction to the vehicle, as well as the relative satisfaction compared to other system attributes. The technique is applied to two commercial airline operations - a U.S. commuter, and the Canadian Airtransit STOL system. It is demonstrated that system convenience and aircraft interior seating can play a large role in satisfying the passenger.

  2. Fiscal 1997 report of the verification research on geothermal prospecting technology. Theme 5-2. Development of a reservoir change prospecting method (reservoir change prediction technique (modeling support technique)); 1997 nendo chinetsu tansa gijutsu nado kensho chosa. 5-2. Choryuso hendo tansaho kaihatsu (choryuso hendo yosoku gijutsu (modeling shien gijutsu)) hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    To evaluate geothermal reservoirs in the initial stage of development, to maintain stable output in service operation, and to develop a technology effective for extraction from peripheral reservoirs, a reservoir change prediction technique, in particular a modeling support technique, was studied. This paper describes the results for fiscal 1997. Among the core fault system measurement techniques, an underground temperature estimation technique using the homogenization temperatures of fluid inclusions was applied to the Wasabizawa field. The effect of stretching is important when estimating reservoir temperatures, and use of the minimum homogenization temperature of fluid inclusions in quartz was found suitable. Even where no quartz was present in hydrothermal veins, measured data from quartz (secondary fluid inclusions) in parent rocks adjacent to the veins agreed well with measured temperature data. The development potential of a new modeling support technique was sufficiently confirmed through the collection of documents and information. Based on these results, measurement equipment suitable for R and D was selected, and a measurement system was established through preliminary experiments. 39 refs., 35 figs., 6 tabs.

  3. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model are compared with experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. All predicted data lie inside the {+-}10% error band, with the mean error below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a {+-}30% error band. (author)

  4. Experimental verification of a radiofrequency power model for Wi-Fi technology.

    Science.gov (United States)

    Fang, Minyu; Malone, David

    2010-04-01

    When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors of less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels, and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
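    The closing point of this abstract, estimating emitted power from traffic levels, can be illustrated to first order as transmit power scaled by the duty cycle. A hedged sketch (the function name, frame airtime and packet rate below are illustrative assumptions, not the authors' model):

    ```python
    def avg_emitted_power_mw(tx_power_mw, airtime_per_packet_s, packets_per_s):
        """Time-averaged emitted power as transmit power times duty cycle.
        First-order illustration only; the paper's model is more detailed."""
        duty_cycle = min(1.0, airtime_per_packet_s * packets_per_s)
        return tx_power_mw * duty_cycle

    # e.g. a 100 mW transmitter sending 1500-byte frames at 54 Mb/s
    # (~0.25 ms airtime each) at 200 frames/s -> ~5% duty cycle
    print(avg_emitted_power_mw(100.0, 0.00025, 200))  # ~5.0 mW
    ```

    This captures why a saturated network (duty cycle approaching 1) emits the greatest time-averaged power.
    
    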

  5. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks are limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed based on the symbolic model checking tool NuSMV. A dual PID system is used as an example, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  6. A Modified Approach to Modeling of Diffusive Transformation Kinetics from Nonisothermal Data and Experimental Verification

    Science.gov (United States)

    Chen, Xiangjun; Xiao, Namin; Cai, Minghui; Li, Dianzhong; Li, Guangyao; Sun, Guangyong; Rolfe, Bernard F.

    2016-09-01

    An inverse model is proposed to construct the mathematical relationship between continuous cooling transformation (CCT) kinetics at constant cooling rates and isothermal kinetics. The kinetic parameters of the JMAK equations for isothermal kinetics can be deduced from the experimental CCT kinetics. Furthermore, a generalized model with a new additivity rule is developed for predicting the kinetics of nucleation and growth during diffusional phase transformation with arbitrary cooling paths, based only on the CCT curve. A generalized contribution coefficient is introduced into the new additivity rule to describe the influence of the current temperature and cooling rate on the incubation time of nuclei. Finally, the reliability of the proposed model is validated using dilatometry experiments with various cooling routes on a microalloyed steel with a fully bainitic microstructure.
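    The classical additivity rule underlying this kind of model is usually implemented with a fictitious-time construction: at each temperature step the current transformed fraction is converted into an equivalent isothermal time before the JMAK equation X = 1 - exp(-k(T) t^n) is advanced. A minimal sketch under assumed toy kinetics (the rate function k(T), exponent n and cooling path below are invented for illustration and are not the paper's calibrated model):

    ```python
    import math

    def jmak_additive(path, k_of_T, n=2.0):
        """Transformed fraction X along a cooling path [(t, T), ...] using the
        classical additivity (fictitious-time) rule with isothermal JMAK
        kinetics X = 1 - exp(-k(T) * t**n)."""
        X = 0.0
        for (t0, T0), (t1, _) in zip(path, path[1:]):
            dt, k = t1 - t0, k_of_T(T0)
            # Equivalent isothermal time that would have produced the current X
            t_star = (-math.log(max(1.0 - X, 1e-12)) / k) ** (1.0 / n)
            X = 1.0 - math.exp(-k * (t_star + dt) ** n)
        return X

    # Toy rate constant peaking at an assumed transformation "nose" near 600 K
    k = lambda T: 1e-3 * math.exp(-((T - 600.0) / 80.0) ** 2)
    cooling = [(t, 900.0 - 10.0 * t) for t in range(61)]  # 10 K/s, 900 K -> 300 K
    X_final = jmak_additive(cooling, k)
    print(round(X_final, 3))
    ```

    For constant n this stepping is equivalent to X = 1 - exp(-(∫ k(T(t))^(1/n) dt)^n), which is the usual statement of the additivity rule.
    
    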

  7. Simulating and Predicting Cereal Crop Yields in Ethiopia: Model Calibration and Verification

    Science.gov (United States)

    Yang, M.; Wang, G.; Ahmed, K. F.; Eggen, M.; Adugna, B.; Anagnostou, E. N.

    2017-12-01

    Agriculture in developing countries is extremely vulnerable to climate variability and change. In East Africa, most people live in rural areas with outdated agricultural techniques and infrastructure. Smallholder agriculture continues to play a key role in this area, and the rate of irrigation is among the lowest in the world. As a result, seasonal and inter-annual weather patterns play an important role in the spatiotemporal variability of crop yields. This study investigates how various climate variables (e.g., temperature, precipitation, sunshine) and agricultural practices (e.g., fertilization, irrigation, planting date) influence cereal crop yields using a process-based model (DSSAT) and statistical analysis, focusing on the Blue Nile Basin of Ethiopia. The DSSAT model is driven with meteorological forcing from the ECMWF's latest reanalysis product covering the past 35 years; the statistical model will be developed by linking the same meteorological reanalysis data with harvest data at the woreda level from the Ethiopian national dataset. Results from this study will set the stage for the development of a seasonal prediction system for weather and crop yields in Ethiopia, which will serve multiple sectors in coping with the agricultural impact of climate variability.

  8. A fission gas release model for MOX fuel and its verification

    International Nuclear Information System (INIS)

    Koo, Y.H.; Sohn, D.S.; Strijov, P.

    2000-01-01

    A fission gas release model for MOX fuel has been developed based on a model for UO2 fuel. Using the concept of equivalent cell, the model considers the uneven distribution of Pu within the fuel matrix and a number of Pu-rich particles that could lead to a non-uniform fission rate and fission gas distribution across the fuel pellet. The model has been incorporated into a code, COSMOS, and some parametric studies were made to analyze the effect of the size and Pu content of Pu-rich agglomerates. The model was then applied to the experimental data obtained from the FIGARO program, which consisted of the base irradiation of MOX fuels in the BEZNAU-1 PWR and the subsequent irradiation of four refabricated fuel segments in the Halden reactor. The calculated gas releases show good agreement with the measured ones. In addition, the present analysis indicates that the microstructure of the MOX fuel used in the FIGARO program is such that it has produced little difference in terms of gas release compared with UO2 fuel. (author)

  9. New droplet model developments

    International Nuclear Information System (INIS)

    Dorso, C.O.; Myers, W.D.; Swiatecki, W.J.; Moeller, P.; Treiner, J.; Weiss, M.S.

    1985-09-01

    A brief summary is given of three recent contributions to the development of the Droplet Model. The first concerns the electric dipole moment induced in octupole deformed nuclei by the Coulomb redistribution. The second concerns a study of squeezing in nuclei and the third is a study of the improved predictive power of the model when an empirical "exponential" term is included. 25 refs., 3 figs.

  10. Plan for a laser weapon verification research program

    Energy Technology Data Exchange (ETDEWEB)

    Karr, T.J.

    1990-03-01

    Currently there is great interest in the question of how, or even whether, a treaty limiting the development and deployment of laser weapons could be verified. The concept of cooperative laser weapon verification is that each party would place monitoring stations near the other party's declared or suspect laser weapon facilities. The monitoring stations would measure the "primary laser observables" such as power or energy, either directly or by collecting laser radiation scattered from the air or the target, and would verify that the laser is operated within treaty limits. This concept is modeled along the lines of the seismic network recently activated in the USSR as a joint project of the United States Geologic Survey and the Soviet Academy of Sciences. The seismic data, gathered cooperatively, can be used by each party as it wishes, including to support verification of future nuclear test ban treaties. For laser weapon verification the monitoring stations are envisioned as ground-based, and would verify treaty limitations on ground-based laser anti-satellite (ASAT) weapons and on the ground-based development of other laser weapons. They would also contribute to verification of limitations on air-, sea- and space-based laser weapons, and the technology developed for cooperative verification could also be used in national technical means of verification. 2 figs., 4 tabs.

  11. Development of procedures for calculating stiffness and damping properties of elastomers in engineering applications. Part 1: Verification of basic methods

    Science.gov (United States)

    Chiang, T.; Tessarzik, J. M.; Badgley, R. H.

    1972-01-01

    The primary aim of this investigation was verification of basic methods which are to be used in cataloging elastomer dynamic properties (stiffness and damping) in terms of viscoelastic model constants. These constants may then be used to predict dynamic properties for general elastomer shapes and operating conditions, thereby permitting optimum application of elastomers as energy absorption and/or energy storage devices in the control of vibrations in a broad variety of applications. The efforts reported involved: (1) literature search; (2) the design, fabrication and use of a test rig for obtaining elastomer dynamic test data over a wide range of frequencies, amplitudes, and preloads; and (3) the reduction of the test data, by means of a selected three-element elastomer model and specialized curve fitting techniques, to material properties. Material constants thus obtained have been used to calculate stiffness and damping for comparison with measured test data. These comparisons are excellent for a number of test conditions and only fair to poor for others. The results confirm the validity of the basic approach of the overall program and the mechanics of the cataloging procedure, and at the same time suggest areas in which refinements should be made.

  12. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  13. Field verification of linear and nonlinear hybrid wave models for offshore tower response prediction

    Energy Technology Data Exchange (ETDEWEB)

    Couch, A.T. [Hudson Engineering, Houston, TX (United States). Offshore Structural Div.; Conte, J.P. [Rice Univ., Houston, TX (United States). Dept. of Civil Engineering

    1996-12-31

    Accuracy of the prediction of the dynamic response of deepwater fixed offshore platforms to irregular sea waves depends very much on the theory used to determine water kinematics from the mudline to the free surface. A common industry practice consists of using linear wave theory, which assumes infinitesimal wave steepness, in conjunction with empirical wave stretching techniques to provide a more realistic representation of near surface kinematics. The current velocity field is then added to the wave-induced fluid velocity field and the wave-and-current forces acting on the structure are computed via Morison's equation. The first objective of this study is to compare the predicted responses of Cognac, a deepwater fixed platform, obtained from various empirically stretched linear wave models with the response of Cognac predicted by the Hybrid Wave Model. The latter is a recently developed higher-order, and therefore more accurate, wave model which satisfies, up to second order in wave steepness, local mass conservation and the free-surface boundary conditions up to the actual free surface. The second objective of this study is to compare the various analytical response predictions with the measured response of the Cognac platform. The availability of a set of oceanographic and structural vibration data for Cognac provides a unique opportunity to evaluate the predictive ability of traditional analytical models used in designing such structures. The results of this study indicate that (1) the use of the Hybrid Wave Model provides a predicted platform response which is in closer agreement with the measured response than the predictions based on the various stretched linear wave models; and (2) the Wheeler stretching technique produces platform response results which are more accurate than those obtained by using the other stretching schemes considered here.
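    The Wheeler stretching mentioned in this abstract maps the instantaneous water column onto the still-water column so that linear-theory kinematics can be evaluated up to the moving free surface. A minimal sketch of the classical mapping (sign conventions assumed: z measured upward from the mean water level, water depth d > 0, surface elevation eta):

    ```python
    def wheeler_stretch(z, eta, d):
        """Map a point z in the instantaneous water column [-d, eta] to an
        effective coordinate z' in [-d, 0], at which linear wave kinematics
        are evaluated (classical Wheeler stretching)."""
        return d * (z - eta) / (d + eta)

    # The instantaneous free surface maps to z' = 0, the mudline stays at -d:
    print(wheeler_stretch(2.0, 2.0, 100.0))     # 0.0
    print(wheeler_stretch(-100.0, 2.0, 100.0))  # -100.0
    ```

    The effect is that near-surface velocities under a crest are taken from the mean-water-level solution rather than from the exponentially extrapolated (and over-predicted) linear profile.
    
    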

  14. Migration of 90Sr, 137Cs and Pu in soils. Verification of a computer model on the behaviour of these radiocontaminants in soils of Western Europe

    International Nuclear Information System (INIS)

    Frissel, M.J.; Poelstra, P.; Klugt, N. van der.

    1980-01-01

    The main emphasis in 1979 was on the 239,240Pu model for simulating translocations in soil. The verification was hampered because data for 239Pu were available from only two locations. A comparison between the observed and predicted Pu distribution however indicated the possibility of using the available simulation approach for 239,240Pu. (Auth.)

  15. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  16. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    International Nuclear Information System (INIS)

    Stephens, Michael B.; Simeonov, Assen; Isaksson, Hans

    2008-12-01

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  17. The verification methodologies for a software modeling of Engineered Safety Features- Component Control System (ESF-CCS)

    International Nuclear Information System (INIS)

    Lee, Young-Jun; Cheon, Se-Woo; Cha, Kyung-Ho; Park, Gee-Yong; Kwon, Kee-Choon

    2007-01-01

    The safety of software is not guaranteed by simple testing, which reviews only the static functions of the software; its dynamic behavior is not examined. The Ariane 5 rocket accident and the failure of the Virtual Case File project were caused by software faults; although the software had been tested thoroughly, potential errors remained internally. There are many methods to address these problems. One of them is formal methodology, which describes the software requirements as a formal specification during the software life cycle and verifies the specified design. This paper suggests methods for verifying a design described as a formal specification. We apply these methods to the software of an ESF-CCS (Engineered Safety Features-Component Control System) and use the SCADE (Safety Critical Application Development Environment) tool to support the suggested verification methods.

  18. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes cover a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model RSMASS-T is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  19. Flow aerodynamics modeling of an MHD swirl combustor - calculations and experimental verification

    International Nuclear Information System (INIS)

    Gupta, A.K.; Beer, J.M.; Louis, J.F.; Busnaina, A.A.; Lilley, D.G.

    1981-01-01

    This paper describes a computer code for calculating the flow dynamics of constant density flow in the second stage trumpet-shaped nozzle section of a two stage MHD swirl combustor for application to a disk generator. The primitive-variable (pressure-velocity), finite-difference computer code has been developed to allow the computation of inert, nonreacting turbulent swirling flows in an axisymmetric MHD model swirl combustor. The method and program involve a staggered grid system for axial and radial velocities, and a line relaxation technique for efficient solution of the equations. The code produces as output a flow field map of the non-dimensional stream function and the axial and swirl velocities. 19 refs

  20. A Runtime Verification System for Developing, Analyzing and Controlling Complex Safety-Critical Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A comprehensive commercial-grade system for the development of safe parallel and serial programs is developed. The system has the ability to perform efficient...

  1. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    International Nuclear Information System (INIS)

    Amna, S; Samreen, N; Khalid, B; Shamim, A

    2013-01-01

    Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events with a significant impact on humans, including socioeconomic activities and health, that varies according to the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. This study used the Ensemble Prediction System (EPS) to model seasonal weather hind-casts for three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. The research was carried out to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five general circulation models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data comprised 32 years of temperature records for the months of June, July and August. The study was based on the EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts using the standard probabilistic measures of the Brier score, the Brier skill score, cross validation and the relative operating characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models gave significant results when particular initial conditions were omitted.
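    Two of the verification measures named in this abstract, the Brier score and the Brier skill score, are simple to compute for probabilistic forecasts of a binary event. A short illustrative sketch (the forecast and observation values are invented):

    ```python
    def brier_score(probs, outcomes):
        """Mean squared difference between forecast probabilities and binary
        outcomes (0 = perfect; 0.25 for a constant 0.5 forecast of a 50/50 event)."""
        return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

    def brier_skill_score(probs, outcomes, ref_probs):
        """BSS = 1 - BS / BS_ref; positive values beat the reference forecast."""
        return 1.0 - brier_score(probs, outcomes) / brier_score(ref_probs, outcomes)

    forecasts = [0.9, 0.8, 0.1, 0.3]        # forecast probabilities of a heat-wave day
    observed = [1, 1, 0, 0]                 # 1 = event occurred
    climatology = [0.5] * len(observed)     # reference: 50% every day
    print(round(brier_score(forecasts, observed), 4))                     # 0.0375
    print(round(brier_skill_score(forecasts, observed, climatology), 4))  # 0.85
    ```

    Using climatology as the reference is the usual convention, so a positive skill score means the ensemble adds value over simply forecasting the long-term frequency.
    
    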

  2. Theoretical model and experimental verification on the PID tracking method using liquid crystal optical phased array

    Science.gov (United States)

    Wang, Xiangru; Xu, Jianhua; Huang, Ziqiang; Wu, Liang; Zhang, Tianyi; Wu, Shuanghong; Qiu, Qi

    2017-02-01

    Liquid crystal optical phased array (LC-OPA) is considered to have great potential as a non-mechanical laser deflector because it is fabricated using photolithographic patterning technology, which has been well advanced by the electronics and display industry. Free space laser communication, a vital application of LC-OPA, has demonstrated its merits in communication bandwidth. Before data communication, however, the ATP (acquisition, tracking and pointing) process takes a relatively long time, creating a bottleneck for free space laser communication; meanwhile, dynamic, accurate real-time tracking is essential to maintain a stable communication link. The electro-optic medium liquid crystal, with its low driving voltage, can be used as the laser beam deflector. This paper presents a fast tracking method using a liquid crystal optical phased array as the beam deflector and a CCD as the beacon light detector. A PID (proportional-integral-derivative) loop algorithm is introduced as the control algorithm to generate the corresponding steering angle. Toward the goal of fast and accurate tracking, theoretical analysis and experimental verification demonstrate that the PID closed-loop system can suppress random attitude vibration. Theoretical analysis further shows that the tracking accuracy can be better than 6.5 μrad, in reasonable agreement with experimental results obtained after 10 adjustments, which show a tracking accuracy of better than 12.6 μrad.
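    The PID control law at the heart of such a tracking loop can be sketched in a few lines. The gains, time step and first-order plant below are illustrative assumptions, not the values used in the paper:

    ```python
    class PID:
        """Minimal discrete PID controller (illustrative sketch)."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Drive a simulated steering angle toward a 100 urad setpoint
    pid, angle = PID(kp=0.8, ki=2.0, kd=0.02, dt=0.01), 0.0
    for _ in range(2000):
        angle += pid.update(100.0, angle) * 0.01  # toy first-order plant
    print(round(angle, 2))
    ```

    The integral term removes the steady-state pointing offset, while the derivative term damps the transient, which is the qualitative mechanism behind the vibration suppression reported above.
    
    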

  3. SKA aperture array verification system: electromagnetic modeling and beam pattern measurements using a micro UAV

    Science.gov (United States)

    de Lera Acedo, E.; Bolli, P.; Paonessa, F.; Virone, G.; Colin-Beltran, E.; Razavi-Ghods, N.; Aicardi, I.; Lingua, A.; Maschio, P.; Monari, J.; Naldi, G.; Piras, M.; Pupillo, G.

    2017-12-01

    In this paper we present the electromagnetic modeling and beam pattern measurements of a 16-element ultra-wideband sparse random test array for the low frequency instrument of the Square Kilometre Array telescope. We discuss the importance of a small array test platform for the development of technologies and techniques towards the final telescope, highlighting the most relevant aspects of its design. We also describe the electromagnetic simulations and modeling work as well as the embedded-element and array pattern measurements using an unmanned aerial vehicle (UAV) system. The latter are helpful both for the validation of the models and the design and for the future instrumental calibration of the telescope, thanks to the stable, accurate and strong radio frequency signal transmitted by the UAV. At this stage of the design, these measurements have shown a general agreement between experimental results and numerical data and have revealed the localized effect of uncalibrated cable lengths in the inner side-lobes of the array pattern.

  4. SKA aperture array verification system: electromagnetic modeling and beam pattern measurements using a micro UAV

    Science.gov (United States)

    de Lera Acedo, E.; Bolli, P.; Paonessa, F.; Virone, G.; Colin-Beltran, E.; Razavi-Ghods, N.; Aicardi, I.; Lingua, A.; Maschio, P.; Monari, J.; Naldi, G.; Piras, M.; Pupillo, G.

    2018-03-01

    In this paper we present the electromagnetic modeling and beam pattern measurements of a 16-element ultra-wideband sparse random test array for the low frequency instrument of the Square Kilometre Array telescope. We discuss the importance of a small array test platform for the development of technologies and techniques towards the final telescope, highlighting the most relevant aspects of its design. We also describe the electromagnetic simulations and modeling work as well as the embedded-element and array pattern measurements using an unmanned aerial vehicle (UAV) system. The latter are helpful both for the validation of the models and the design and for the future instrumental calibration of the telescope, thanks to the stable, accurate and strong radio frequency signal transmitted by the UAV. At this stage of the design, these measurements have shown a general agreement between experimental results and numerical data and have revealed the localized effect of uncalibrated cable lengths in the inner side-lobes of the array pattern.

  5. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To ease the flow-through from modeling in CPN to mathematical proof in PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  6. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    Science.gov (United States)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF); it aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions. They range from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
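A deterministic skill metric of the kind computed by plugins such as "MurCSS" can be illustrated with the mean squared error skill score (MSESS) against a climatological reference forecast; the following is a minimal sketch with invented sample data, not the plugin's actual implementation:

```python
import numpy as np

def msess(hindcast, observation, reference):
    """Mean squared error skill score of a hindcast against observations,
    relative to a reference forecast (e.g. climatology): 1 - MSE_h / MSE_r."""
    mse_h = np.mean((hindcast - observation) ** 2)
    mse_r = np.mean((reference - observation) ** 2)
    return 1.0 - mse_h / mse_r

# Invented annual-mean anomalies for five verification years
obs = np.array([0.1, 0.3, 0.2, 0.5, 0.4])
hc = np.array([0.15, 0.25, 0.25, 0.45, 0.35])
clim = np.full_like(obs, obs.mean())  # climatological reference forecast
print(msess(hc, obs, clim))  # positive -> skill over climatology
```

A score of 1 indicates a perfect hindcast, 0 no improvement over the reference, and negative values a hindcast worse than climatology.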

  7. Investigation, development and verification of printed circuit board embedded air-core solenoid transformers

    DEFF Research Database (Denmark)

    Mønster, Jakob Døllner; Madsen, Mickey Pierre; Pedersen, Jeppe Arnsdorf

    2015-01-01

A new printed circuit board embedded air-core transformer/coupled inductor is proposed and presented. The transformer is intended for power converter applications operating at very high frequencies, from 30 MHz to 300 MHz. The transformer is based on two or more solenoid structures...... in different configurations. The different configurations are compared for usefulness as a transformer solution, and an analytical model of the inductive parameters of the most suitable configuration is derived for design purposes. The analytical model is verified by comparing calculations and measurements...... of prototypes. The analytical model shows good agreement with the measured results and can predict the inductive parameters of the transformer within a deviation range of approximately 3% to 22%. Lastly, a prototype is used in a VHF converter to achieve a rise of 2.2 percentage points in efficiency....
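The verification step of comparing model-predicted inductive parameters with measurements can be sketched as follows. The series-aiding/series-opposing identity used here to extract the mutual inductance is a standard measurement technique, not necessarily the one used in the paper, and all numerical values are hypothetical:

```python
import math

def mutual_inductance(L_aiding, L_opposing):
    """M from series-aiding and series-opposing inductance measurements:
    L_aid = L1 + L2 + 2M, L_opp = L1 + L2 - 2M  =>  M = (L_aid - L_opp) / 4."""
    return (L_aiding - L_opposing) / 4.0

def coupling_coefficient(M, L1, L2):
    """Coupling coefficient k = M / sqrt(L1 * L2), between 0 and 1."""
    return M / math.sqrt(L1 * L2)

# Hypothetical measured values in nH (not taken from the paper)
L1, L2 = 120.0, 118.0
L_aid, L_opp = 330.0, 146.0
M = mutual_inductance(L_aid, L_opp)
k = coupling_coefficient(M, L1, L2)
print(f"M = {M:.1f} nH, k = {k:.3f}")
```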

  8. HUMTRN: documentation and verification for an ICRP-based age- and sex-specific human simulation model for radionuclide dose assessment

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Wenzel, W.J.

    1984-06-01

The dynamic human simulation model HUMTRN is designed as a major module of BIOTRAN to integrate climatic, hydrologic, atmospheric, food crop, and herbivore simulation with human dietary, physiological, and metabolic characteristics and radionuclide transport, in order to predict radiation doses to selected organs of both sexes in different age groups. The model is based on age- and weight-specific equations developed for predicting human radionuclide transport from metabolic and physical characteristics. These characteristics are modeled from studies documented by the International Commission on Radiological Protection (ICRP 23). HUMTRN allows cumulative doses from uranium or plutonium radionuclides to be predicted by modeling age-specific anatomical, physiological, and metabolic properties of individuals between 1 and 70 years of age, and it can track radiation exposure and radionuclide metabolism for any age group over specified daily or yearly time periods. The simulated daily dose integration of eight or more simultaneous air, water, and food intakes gives a new, comprehensive, dynamic picture of radionuclide intake, uptake, and hazard analysis for complex scenarios. A detailed example using site-specific data based on the Pantex studies is included for verification. 14 references, 24 figures, 10 tables

  9. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides (H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238) were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code began with a check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-Particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  10. Development of a 2D array silicon detector magic plate for the dosimetric verification of IMRT treatment delivery

    International Nuclear Information System (INIS)

    Wong, J.H.D.; Fuduli, Iolanda; Lerch, M.L.F.; Petasecca, Marco; Metcalfe, P.E.

    2011-01-01

Full text: We have developed an IMRT and VMAT dosimetry system for pre-treatment and during-treatment verification. The 'Magic Plate' (MP) diode array was designed and prototyped by the CMRP and ICCe. It is a 2D diode array that can be used for in-phantom dose measurement and can also function as a transmission detector for in vivo measurements during patient treatment. The prototype MP comprises 11 x 11 silicon diodes mounted on a 0.6 mm Kapton substrate. Detectors are spaced 1 cm apart with sensitive volumes of 0.5 x 0.5 x 0.05 mm³. Phantom measurements were performed using the MP located at the isocentre in the cavity of an I'mRT phantom. For fluence measurements in transmission mode, the MP was mounted on the linac accessory slot. The detector was characterized and a nine-field head and neck IMRT test plan was delivered. Measurements were compared with EBT2 films and Pinnacle predicted dose distributions using the 3%/3 mm gamma criterion. Average pass rates for the MP versus Pinnacle and MP versus EBT2 were 82.9% and 88.1% in the phantom. Gamma analysis of MP versus EBT2 when used as a transmission detector gave a pass rate of 94.8%. The prototype MP design shows promise as an IMRT dosimetry system. Future work involves refining the acquisition system, further detailed characterization, Monte Carlo simulation of the detector, and its application to VMAT.

  11. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    International Nuclear Information System (INIS)

    Zhao, J; Hu, W; Xing, Y; Wu, X; Li, Y

    2016-01-01

Purpose: Existing plan verification systems for particle therapy are designed to verify plans before treatment, so the actual dose distributions delivered during patient treatment are not known. This study develops an online 2D dose verification tool to check daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. To perform online dose verification, we wrote a program that reconstructs the delivered 2D dose distributions from the daily treatment log files and depth dose distributions. From the log files we obtain the focus size, position, and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured with a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For the patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The lower passing rate for the proton beam arises because its focus size deviations were larger than for the carbon beam: the average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements for checking daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
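The gamma analysis used above (e.g. the 3%/3 mm criterion) can be sketched with a brute-force global gamma computation on a 2D dose grid; this is a minimal illustration with invented doses, not the authors' software:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0,
                    threshold=0.1):
    """Brute-force global 2D gamma analysis (default 3%/3 mm).
    Dose differences are normalized to the reference maximum; points below
    `threshold` of the maximum dose are excluded, as is common practice."""
    d_max = ref.max()
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    passed = total = 0
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < threshold * d_max:
                continue  # skip low-dose region
            # squared distance (mm^2) and squared dose term at every evaluated point
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = ((ev - ref[iy, ix]) / (dose_tol * d_max)) ** 2
            gamma = np.sqrt(dist2 / dist_tol_mm ** 2 + dose2).min()
            total += 1
            passed += gamma <= 1.0
    return 100.0 * passed / total

ref = np.array([[1.0, 0.9], [0.8, 0.7]])  # invented reference dose grid
ev = ref * 1.02                           # 2% uniform overdose
print(gamma_pass_rate(ref, ev, spacing_mm=1.0))
```

For each reference point above the low-dose threshold, the minimum of the combined distance/dose metric over all evaluated points is taken; the pass rate is the fraction of points with γ ≤ 1.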

  12. Development and Verification of the Soil-Pile Interaction Extension for SubDyn

    Energy Technology Data Exchange (ETDEWEB)

    Damiani, Rick R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-02

SubDyn is the substructure structural-dynamics module for the aero-hydro-servo-elastic tool FAST v8. SubDyn uses a finite-element model (FEM) to simulate complex multimember lattice structures connected to conventional turbines and towers, and it can make use of Craig-Bampton model reduction. Here we describe the newly added capability to handle soil-pile stiffness and compare results for monopile and jacket-based offshore wind turbines as obtained with FAST v8, SACS, and EDP (the latter two are modeling software packages commonly used in the offshore oil and gas industry). The level of agreement in modal properties and loads across the offshore wind turbine components is excellent, allowing SubDyn and FAST v8 to accurately simulate offshore wind turbines on fixed-bottom structures while accounting for the effect of soil dynamics, thus reducing risk to the project.

  13. Analytical model for performance verification of liquid poison injection system of a nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kansal, Anuj Kumar, E-mail: anujkumarkansal@gmail.com; Maheshwari, Naresh Kumar, E-mail: nmahesh@barc.gov.in; Vijayan, Pallippattu Krishnan, E-mail: vijayanp@barc.gov.in

    2014-08-15

Highlights: • One-dimensional modelling of shut down system-2. • Semi-empirical correlation for poison jet progression. • Validation of the code. - Abstract: Shut down system-2 (SDS-2) in an advanced vertical pressure tube type reactor provides rapid reactor shutdown by high pressure injection of a neutron-absorbing liquid, called poison, into the moderator in the calandria. Poison inside the calandria is distributed by poison jets issued from holes provided in the injection tubes. The effectiveness of the system depends on the rate and spread of the poison in the moderator. In this study, a transient one-dimensional (1D) hydraulic code, COPJET, is developed to predict the performance of the system by predicting the progression of the poison jet with time. COPJET is validated against data available in the literature and thereafter applied to the advanced vertical pressure tube type reactor.

  14. Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    J. Coetzer

    2004-04-01

We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT) and a hidden Markov model (HMM). Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER) of 18% when only high-quality forgeries (skilled forgeries) are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
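The equal error rate (EER) reported above is the operating point at which the false rejection rate (genuine signatures rejected) equals the false acceptance rate (forgeries accepted). A minimal sketch of its computation from score samples, using synthetic scores rather than the paper's DRT/HMM outputs:

```python
import numpy as np

def equal_error_rate(genuine_scores, forgery_scores):
    """Sweep candidate thresholds and return the EER (as a fraction) at the
    threshold where FRR and FAR are closest; higher score = more genuine."""
    thresholds = np.sort(np.concatenate([genuine_scores, forgery_scores]))
    best_gap, eer = 1.0, None
    for t in thresholds:
        frr = np.mean(genuine_scores < t)   # genuine rejected
        far = np.mean(forgery_scores >= t)  # forgeries accepted
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2.0
    return eer

# Synthetic, well-separated score distributions for illustration
rng = np.random.default_rng(0)
genuine = rng.normal(2.0, 1.0, 500)
forgery = rng.normal(0.0, 1.0, 500)
print(f"EER ≈ {equal_error_rate(genuine, forgery):.3f}")
```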

  15. Model for Sulfate Diffusion Depth in Concrete under Complex Aggressive Environments and Its Experimental Verification

    Directory of Open Access Journals (Sweden)

    Yingwu Zhou

    2015-01-01

Sulfate attack is one of the most important factors leading to the performance deterioration of concrete materials. The sulfate diffusion depth in concrete is an important index that quantitatively characterizes the rate of concrete damage, cracking, and spalling due to sulfate attack. The progress of the sulfate diffusion depth is systematically investigated in this paper by both theoretical and experimental study. A new time-varying model of the diffusion depth is developed which, for the first time, comprehensively considers a large set of parameters describing complex environments. On this basis, a method is proposed for effectively predicting the residual life of in-service concrete structures subject to sulfate attack. Using data from a self-designed high-temperature dry-wet accelerated corrosion test together with a large amount of experimental data reported in the existing literature, the effectiveness and accuracy of the time-varying diffusion depth model are verified.

  16. Verification of high voltage rf capacitive sheath models with particle-in-cell simulations

    Science.gov (United States)

    Wang, Ying; Lieberman, Michael; Verboncoeur, John

    2009-10-01

Collisionless and collisional high voltage rf capacitive sheath models were developed in the late 1980s [1]. Given the external parameters of a single-frequency capacitively coupled discharge, plasma parameters including sheath width, electron and ion temperature, plasma density, power, and ion bombarding energy can be estimated. The one-dimensional electrostatic PIC codes XPDP1 [2] and OOPD1 [3] are used to investigate plasma behavior within the rf sheaths and the bulk plasma. Only electron-neutral collisions are considered for collisionless sheaths, while ion-neutral collisions are also taken into account for collisional sheaths. The collisionless sheath model agrees well with PIC simulations for the rf current-driven and voltage-driven cases. Results for collisional sheaths will also be reported. [1] M. A. Lieberman, IEEE Trans. Plasma Sci. 16 (1988) 638; 17 (1989) 338 [2] J. P. Verboncoeur, M. V. Alves, V. Vahedi, and C. K. Birdsall, J. Comp. Phys. 104 (1993) 321 [3] J. P. Verboncoeur, A. B. Langdon and N. T. Gladd, Comp. Phys. Comm. 87 (1995) 199

  17. Verification and Validation of Encapsulation Flow Models in GOMA, Version 1.1; TOPICAL

    International Nuclear Information System (INIS)

    MONDY, LISA ANN; RAO, REKHA R.; SCHUNK, P. RANDALL; SACKINGER, PHILIP A.; ADOLF, DOUGLAS B.

    2001-01-01

Encapsulation is a common process used in manufacturing most non-nuclear components, including firing sets, neutron generators, trajectory sensing signal generators (TSSGs), arming, fusing and firing devices (AF&Fs), radars, programmers, connectors, and batteries. Encapsulation is used to contain high voltage, to mitigate stress and vibration, and to protect against moisture. The purpose of the ASCI Encapsulation project is to develop a simulation capability to aid in the encapsulation design process, especially for neutron generators. The introduction of an encapsulant poses many problems because of the need to balance ease of processing against the properties necessary to achieve the design benefits, such as tailored encapsulant properties, an optimized cure schedule, and reduced failure rates. Encapsulants can fail through fracture or delamination as a result of cure shrinkage, thermally induced residual stresses, voids or incomplete component embedding, and particle gradients. Manufacturing design requirements include (1) maintaining uniform composition of particles in order to maintain the desired thermal coefficient of expansion (CTE) and density, (2) mitigating void formation during mold fill, (3) mitigating cure and thermally induced stresses during cure and cool down, and (4) eliminating delamination and fracture due to cure shrinkage/thermal strains. The first two require modeling of the fluid phase, and it is proposed to use the finite element code GOMA to accomplish this. The latter two require modeling of the solid state; ideally, however, the effects of particle distribution would be included in the calculations, with initial conditions set from GOMA predictions. These models, once verified and validated, will be transitioned into the SIERRA framework and the ARIA code, facilitating the exchange of data with the solid mechanics calculations in SIERRA/ADAGIO.

  18. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  19. Integrated Ray Tracing Model for End-to-end Performance Verification of Amon-Ra Instrument

    Science.gov (United States)

Lee, Jae-Min; Park, Won Hyun; Ham, Sun-Jeong; Yi, Hyun-Su; Yoon, Jee Yeon; Kim, Sug-Whan; Choi, Ki-Hyuk; Kim, Zeen Chul; Lockwood, Mike

    2007-03-01

The international EARTHSHINE mission aims to measure the 1% anomaly of the Earth's global albedo and the total solar irradiance (TSI) using the Amon-Ra instrument around Lagrange point 1. We developed a new ray-tracing-based integrated end-to-end simulation tool that overcomes the shortcomings of existing end-to-end performance simulation techniques, and used it to study the in-orbit radiometric performance of the breadboard Amon-Ra visible channel optical system. The TSI variation and the Earth albedo anomaly, reported elsewhere, were used as the key input variables in the simulation. The output flux at the instrument focal plane confirms that the integrated ray-tracing-based end-to-end science simulation delivers the correct level of incident power to the Amon-Ra instrument, well within the required measurement error budget of ±0.28%. Using the global angular distribution model (ADM), the incident flux is then used to estimate the Earth's global albedo and the TSI variation, confirming the validity of the primary science cases at the L1 halo orbit. These results imply that the integrated end-to-end ray tracing technique reported here can serve as an effective and powerful building block of the on-line science analysis tool in support of the international EARTHSHINE mission currently being developed.

  20. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code to extract the relevant quantities (e.g., cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses while users write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
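The input-validation behavior described, rapid and helpful responses while an input file is being written, can be illustrated with a toy card checker that reports every problem rather than failing on the first one. This is a hypothetical sketch; the module names and card layout do not reflect NJOY21's actual interface:

```python
def validate_card(card, known_modules=("reconr", "broadr", "heatr", "groupr")):
    """Toy input-card checker: returns a list of diagnostics (empty if the
    card is valid), so the user sees all problems at once."""
    errors = []
    tokens = card.split()
    if not tokens:
        return ["empty card"]
    if tokens[0] not in known_modules:
        errors.append(f"unknown module '{tokens[0]}'")
    for pos, tok in enumerate(tokens[1:], start=1):
        try:
            float(tok)  # every remaining field must be numeric in this toy format
        except ValueError:
            errors.append(f"field {pos}: '{tok}' is not numeric")
    return errors

print(validate_card("reconr 20 21"))        # [] -> card is valid
print(validate_card("recnr 20 twenty-one")) # two diagnostics reported
```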

  1. Development, implementation, and verification of multicycle depletion perturbation theory for reactor burnup analysis

    Energy Technology Data Exchange (ETDEWEB)

    White, J.R.

    1980-08-01

    A generalized depletion perturbation formulation based on the quasi-static method for solving realistic multicycle reactor depletion problems is developed and implemented within the VENTURE/BURNER modular code system. The present development extends the original formulation derived by M.L. Williams to include nuclide discontinuities such as fuel shuffling and discharge. This theory is first described in detail with particular emphasis given to the similarity of the forward and adjoint quasi-static burnup equations. The specific algorithm and computational methods utilized to solve the adjoint problem within the newly developed DEPTH (Depletion Perturbation Theory) module are then briefly discussed. Finally, the main features and computational accuracy of this new method are illustrated through its application to several representative reactor depletion problems.

2. Achievement report for fiscal 1998 on the development of superconductor power application technology. 2. Research and development of superconducting wire and superconductive power generator, research of total system, research and development of refrigeration system, and verification test; 1998 nendo chodendo denryoku oyo gijutsu kaihatsu seika hokokusho. 2. Chodendo senzai no kenkyu kaihatsu, chodendo hatsudenki no kenkyu kaihatsu, total system no kenkyu, reito system no kenkyu kaihatsu, jissho shiken

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

For the slow excitation response type power generator, the rotor and stator of a 70,000 kW-class model are jointly subjected to an on-site verification test, with good results. The rotor is disassembled for inspection, and its members are found to be sound, with no problems in mechanical strength. For the quick excitation response type, a 70,000 kW model is built and subjected to an on-site verification test after a rotation and excitation test in the factory, and the pilot machine concept design is reviewed. In the study of a total system, efforts continue on reviewing the model machine test method, improving generator design and analytical methods, developing operating methods, and assessing the effect of introduction into the power system. Since a He-refrigerated system for power equipment must exhibit high reliability and be capable of continuous long-period operation, a system with reliability-enhanced constituents and an appropriate redundant design is developed, and a verification study planned to continue for more than 10,000 hours is under way. An oil-free low-temperature turbo refrigerator is also described. The latest quick excitation response type rotor is also tested for verification. (NEDO)

  3. Preliminary Study for Development of Welds Integrity Verification Equipment for the Small Bore Piping

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Geun Suk; Lee, Jong Eun; Ryu, Jung Hoon; Cho, Kyoung Youn; Sohn, Myoung Sung [KEA, Seoul (Korea, Republic of); Lee, Sanghoon [Korea Institute of Materials Science, Changwon (Korea, Republic of); Sung, Gi Ho [SUNG IL(SIM)Co., Busan (Korea, Republic of); Cho, Hong Seok [KEPCO KPS, Naju (Korea, Republic of)

    2016-10-15

Leakage accidents involving small-bore piping have been reported in Korea, and such accidents are expected to increase as nuclear power plants age. Repairing a leaking pipe in place with a clamping device when an accident occurs is economically beneficial. A clamping device is a fastening device used to hold or secure objects tightly together, preventing movement or separation through the application of inward pressure. At present, however, an immediate response to an accident is not possible because the relevant maintenance and repair technology is not institutionalized in KEPIC, which results in economic loss. Technology for responding to such events is necessary for the safe operation of nuclear power plants. The purpose of this research is to develop an online repair technology for socket welded pipe and a vibration monitoring system for small-bore pipe in nuclear power plants. The detailed studies are as follows: • Development of a weld overlay method for safety-class socket welded connections • Development of mechanical clamping devices for Safety Class 2 and 3 small-bore pipe. Fatigue crack testing of socket welded overlays will be performed and a fatigue life evaluation method will be developed in the second year; prototype fabrication of the mechanical clamping device will also be completed. Based on the final goal, the intent is to propose practical evaluation tools and design and fabrication methods for socket welded connection integrity. The result of this study is the development of a KEPIC code case approved technology for an online repair system for socket welded connections and the fabrication of a mechanical clamping device.

  4. Development and verification of ground-based tele-robotics operations concept for Dextre

    Science.gov (United States)

    Aziz, Sarmad

    2013-05-01

The Special Purpose Dexterous Manipulator (Dextre) is the latest addition to the on-orbit segment of the Mobile Servicing System (MSS); Canada's contribution to the International Space Station (ISS). Launched in March 2008, the advanced two-armed robot is designed to perform various ISS maintenance tasks on robotically compatible elements and on-orbit replaceable units using a wide variety of tools and interfaces. The addition of Dextre has increased the capabilities of the MSS, and has introduced significant complexity to ISS robotics operations. While the initial operations concept for Dextre was based on human-in-the-loop control by the on-orbit astronauts, the complexities of robotic maintenance and the associated costs of training and maintaining the operator skills required for Dextre operations demanded a reexamination of the old concepts. A new approach to ISS robotic maintenance was developed in order to utilize the capabilities of Dextre safely and efficiently, while at the same time reducing the costs of on-orbit operations. This paper will describe the development, validation, and on-orbit demonstration of the operations concept for ground-based tele-robotics control of Dextre. It will describe the evolution of the new concepts from the experience gained from the development and implementation of the ground control capability for the Space Station Remote Manipulator System, Canadarm2. It will discuss the various technical challenges faced during the development effort, such as requirements for high positioning accuracy, force/moment sensing and accommodation, failure tolerance, complex tool operations, and the novel operational tools and techniques developed to overcome them. The paper will also describe the work performed to validate the new concepts on orbit and will discuss the results and lessons learned from the on-orbit checkout and commissioning of Dextre using the newly developed tele-robotics techniques and capabilities.

  5. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA(R) next-generation core simulation software package. ARCADIA(R) will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, the steady-state and transient core simulation part of ARCADIA(R). To enable high computational performance, the SPN calculations are accelerated by multi-level coarse-mesh re-balancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full core computations with ARCADIA(R). With the entire numerical workload being highly parallelizable through domain decomposition techniques, the associated CPU-time requirements can be expected to meet the efficiency needs of the nuclear industry in the near future. The accuracy enhancement obtained by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCART reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  6. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    retopologized geometry (green) was aligned with the models segmented from CT data (peach). From each Mimics file, landmarks were exported to align scale and...created using a series of density selection tools as well as manual selection tools. The masks often have high resolution and artifact interference that...systems,6 as shown in Fig. 4. Fig. 4 To ensure proper fit and placement of the final geometry, the retopologized geometry (green) was aligned with

  7. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

Full Text Available The simulation of robot systems is becoming very popular, especially with the lowering of the cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to solve the inverse kinematics problem. This angle can be optimized with respect to a given key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle in the task space). Modelling and simulation of robots can be achieved using any of the following models: the geometric model (positions, postures), the kinematic model and the dynamic model. To do so, the modelling of a 2-R robot type is implemented. Our main tasks are comparing two robot postures following the same trajectory (path) in the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. This approach can easily be generalized to a 3-R robot, and hence to any serial robot (SCARA, PUMA, etc.). The verification of the results obtained with both packages allows us to qualitatively evaluate and underline the validity of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and good agreement between the two packages is obtained.
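The closed-form inverse kinematics underlying such a 2-R planar manipulator can be sketched as follows (a minimal Python sketch, not the authors' code; the link lengths and the elbow-up/elbow-down convention are illustrative assumptions):

```python
import numpy as np

def forward_kinematics(theta1, theta2, l1=1.0, l2=0.8):
    """End-effector position of a planar 2-R arm with link lengths l1, l2."""
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=1.0, l2=0.8, elbow_up=True):
    """Closed-form inverse kinematics; elbow_up selects between the two
    possible postures that reach the same end-effector point."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target outside the workspace")
    theta2 = np.arccos(c2) if elbow_up else -np.arccos(c2)
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2
```

The two `elbow_up` choices correspond to the two postures the paper compares: both reach the same end-effector point along the same path.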

  8. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    Science.gov (United States)

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  9. Preliminary verification results of the DWD limited area model LME and evaluation of its storm forecasting skill over the area of Cyprus

    Directory of Open Access Journals (Sweden)

    A. Orphanou

    2006-01-01

Full Text Available A preliminary verification and evaluation is made of the forecast fields of the non-hydrostatic limited-area model LME of the German Weather Service (DWD) for a recent three-month period. For this purpose, observations from two synoptic stations in Cyprus are utilized. In addition, days with depressions over the area were selected in order to evaluate the model's skill in storm forecasting.

  10. Safety verification of radiation shielding and heat transfer for a model for dry

    International Nuclear Information System (INIS)

    Yu, Haiyan; Tang, Xiaobin; Wang, Peng; Chen, Feida; Chai, Hao; Chen, Da

    2015-01-01

Highlights: • A new type of dry spent fuel storage was designed. • The MC method and FEM were used to verify the reliability of the new storage. • Radiation shielding and heat transfer both meet the IAEA standards: 2 mSv/h, 0.1 mSv/h and 190 °C, 85 °C. • Provided possibilities for future implementation of this type of dry storage. - Abstract: The goal of this research is to develop a type of dry spent fuel storage called the CHN-24 container, which can hold an equivalent load of 45 GWD/MTU of spent fuel after 10 years of cooling. Radiation shielding performance and safe removal of decay heat, which play important roles in the safety performance, were checked and validated using the Monte Carlo method and finite element analysis to establish the radiation dose rate calculation model and the three-dimensional heat transfer model for the CHN-24 container. The dose rates at the surface of the container and at a distance of 1 m from the surface were 0.42 mSv/h and 0.06 mSv/h, respectively. These conform to the International Atomic Energy Agency (IAEA) radioactive material transportation safety standards of 2 mSv/h and 0.1 mSv/h. The results show that the CHN-24 container maintains its structural and material integrity under normal steady-state heat transfer conditions as well as in case of extreme fire, as evinced by transient-state analysis. The temperatures inside and on the surface of the container were 150.91 °C and 80 °C under normal storage conditions, which indicates that the design also conforms to the IAEA heat transfer safety standards of 190 °C and 85 °C.
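The Monte Carlo shielding calculation described above rests on sampling photon free paths from an exponential distribution. A minimal, hypothetical sketch (not the CHN-24 model; the attenuation coefficient and slab thickness are invented for illustration) estimates the uncollided transmission through a slab and compares it with the analytic value:

```python
import numpy as np

def slab_transmission(mu, thickness, n_photons=200_000, seed=1):
    """Monte Carlo estimate of the uncollided fraction of photons
    penetrating a slab: sample exponential free paths and count the
    photons whose first collision lies beyond the slab."""
    rng = np.random.default_rng(seed)
    free_paths = rng.exponential(1.0 / mu, size=n_photons)
    return float(np.mean(free_paths > thickness))

mu = 0.5   # hypothetical attenuation coefficient, 1/cm
t = 3.0    # hypothetical slab thickness, cm
mc = slab_transmission(mu, t)
analytic = np.exp(-mu * t)  # closed-form uncollided transmission
```

For a production dose-rate model, buildup from scattered photons must be added on top of this uncollided estimate.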

  11. Development and verification of remote research environment based on 'Fusion research grid'

    International Nuclear Information System (INIS)

    Iba, Katsuyuki; Ozeki, Takahisa; Totsuka, Toshiyuki; Suzuki, Yoshio; Oshima, Takayuki; Sakata, Shinya; Sato, Minoru; Suzuki, Mitsuhiro; Hamamatsu, Kiyotaka; Kiyono, Kimihiro

    2008-01-01

'Fusion research grid' is a concept that unites scientists and lets them collaborate effectively despite differences in time zone and location in nuclear fusion research. Fundamental technologies of 'Fusion research grid' have been developed at JAEA in the VizGrid project under the e-Japan project at the Ministry of Education, Culture, Sports, Science and Technology (MEXT). We are conscious of the need to create new systems that assist researchers in their research activities, because remote collaborations have been increasing in international projects. Therefore we have developed prototype remote research environments for experiments, diagnostics, analyses and communications based on 'Fusion research grid'. All users can access these environments from anywhere because 'Fusion research grid' does not require a closed network like Super SINET to maintain security. The prototype systems were verified in experiments at JT-60U and their availability was confirmed.

  12. Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification

    Science.gov (United States)

    Dennehy, Cornelius J.

    2013-01-01

The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for the Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as an orbiter development test objective on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.

  13. Development and verification of a taxonomy of assessment metrics for surgical technical skills.

    Science.gov (United States)

    Schmitz, Connie C; DaRosa, Debra; Sullivan, Maura E; Meyerson, Shari; Yoshida, Ken; Korndorffer, James R

    2014-01-01

    To create and empirically verify a taxonomy of metrics for assessing surgical technical skills, and to determine which types of metrics, skills, settings, learners, models, and instruments were most commonly reported in the technical skills assessment literature. In 2011-2012, the authors used a rational analysis of existing and emerging metrics to create the taxonomy, and used PubMed to conduct a systematic literature review (2001-2011) to test the taxonomy's comprehensiveness and verifiability. Using 202 articles identified from the review, the authors classified metrics according to the taxonomy and coded data concerning their context and use. Frequencies (counts, percentages) were calculated for all variables. The taxonomy contained 12 objective and 4 subjective categories. Of 567 metrics identified in the literature, 520 (92%) were classified using the new taxonomy. Process metrics outnumbered outcome metrics by 8:1. The most frequent metrics were "time," "manual techniques" (objective and subjective), "errors," and "procedural steps." Only one new metric, "learning curve," emerged. Assessments of basic motor skills and skills germane to laparoscopic surgery dominated the literature. Novices, beginners, and intermediate learners were the most frequent subjects, and box trainers and virtual reality simulators were the most frequent models used for assessing performance. Metrics convey what is valued in human performance. This taxonomy provides a common nomenclature. It may help educators and researchers in procedurally oriented disciplines to use metrics more precisely and consistently. Future assessments should focus more on bedside tasks and open surgical procedures and should include more outcome metrics.

  14. Presentation and verification of a simple mathematical model for identification of the areas behind noise barriers with the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

Full Text Available Background and aims: Traffic noise barriers are the most important measure for controlling environmental noise pollution. Diffraction from the top edge of a noise barrier is the most important path by which indirect sound waves move towards the receiver. Therefore, most studies focus on improvements of this kind. Methods: T-shape profile barriers are among the most successful of the many different profiles. This investigation uses the theory of destructive interference between the wave diffracted from the real edge of the barrier and the wave diffracted from the image of the barrier, with a phase difference of π radians. Firstly, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss, based on the destructive effect of the indirect path via the barrier image, is introduced; two different profile barriers, reflective and absorptive, are then used for verification of the introduced model. Results: The results are compared with the results of a verified two-dimensional boundary element method at 1/3-octave band frequencies in a wide field behind those barriers. A very good agreement between the results has been achieved. In this method an effective height is used for any different profile barrier. Conclusion: The introduced model is very simple, flexible and fast, and can be used for choosing the best location of rigid and absorptive profile barriers to achieve the highest performance.
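The destructive-interference idea above — the wave diffracted from the real edge cancelling the wave diffracted from the barrier's image when their phase difference is π radians — can be illustrated with a simple path-length calculation (a hedged 2-D sketch; the geometry and the speed of sound are illustrative assumptions, not values from the paper):

```python
import math

C = 343.0  # speed of sound in air, m/s (illustrative)

def path_via_edge(src, edge, rec):
    """Length of the source -> diffracting edge -> receiver path (2-D)."""
    return math.dist(src, edge) + math.dist(edge, rec)

# Hypothetical 2-D layout: source and receiver on opposite sides of the wall.
src, rec = (-5.0, 0.5), (10.0, 1.5)
real_edge = (0.0, 3.0)    # top edge of the physical barrier
image_edge = (0.0, 4.0)   # edge of the barrier image (effective height)

# Path difference between the wave diffracted via the image edge and the
# wave diffracted via the real edge.
delta = path_via_edge(src, image_edge, rec) - path_via_edge(src, real_edge, rec)

# Destructive interference (phase difference of pi radians) occurs where the
# path difference equals half a wavelength: f = c / (2 * delta).
f_destructive = C / (2.0 * delta)
phase = 2.0 * math.pi * f_destructive * delta / C  # equals pi by construction
```

Repeating this for a grid of receiver positions maps out the zones behind the barrier where the cancellation, and hence the insertion loss, is greatest.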

  15. Kinematic Modeling and Simulation of a 2-R Robot by Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

Full Text Available The simulation of robot systems, which is getting very popular, especially with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Object staging modelling using robots involves, whether for the object or the robot, the following models: the geometric one, the kinematic one and the dynamic one. To do so, the modelling of a 2-R robot type is implemented. Comparing two robot postures with the same trajectory (path) over the same length of time and establishing a computing code to obtain the kinematic and dynamic parameters are the main tasks. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. The verification of the results obtained with both packages allows us to qualitatively evaluate and underline the rightness of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two packages is obtained.

  16. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  17. Development and verification of a coupled code system RETRAN-MASTER-TORC

    International Nuclear Information System (INIS)

    Cho, J.Y.; Song, J.S.; Joo, H.G.; Zee, S.Q.

    2004-01-01

Recently, coupled thermal-hydraulics (T-H) and three-dimensional kinetics codes have been widely used for best-estimate simulations such as main steam line break (MSLB) and locked-rotor problems. This work develops and verifies one such code by coupling the system T-H code RETRAN, the 3-D kinetics code MASTER and the sub-channel analysis code TORC. The MASTER code has already been applied to such simulations after coupling with the MARS or RETRAN-3D multi-dimensional system T-H codes. The MASTER code contains a sub-channel analysis code, COBRA-III C/P, and the coupled systems MARS-MASTER-COBRA and RETRAN-MASTER-COBRA had already been developed and verified. Building on these previous studies, a new coupled system, RETRAN-MASTER-TORC, is developed and verified as the standard best-estimate simulation code package in Korea. The TORC code has already been applied to the thermal-hydraulics design of several ABB/CE type plants and Korean Standard Nuclear Power Plants (KSNP). This justifies the choice of TORC rather than COBRA. Because the coupling between the RETRAN and MASTER codes is already established and verified, this work reduces to coupling the TORC sub-channel T-H code with the MASTER neutronics code. The TORC code is a standalone code that solves the T-H equations for a given core problem by reading the input file and finally printing the converged solutions. However, in the coupled system, because TORC receives the pin power distributions from the neutronics code MASTER and transfers the T-H results back to MASTER iteratively, TORC needs to be controlled by the MASTER code and does not need to solve the given problem completely at each iteration step. For this reason, the coupling of the TORC code with the MASTER code requires several modifications in the I/O treatment, flow iteration and calculation logics. The next section of this paper describes the modifications in the TORC code. The TORC control logic of the MASTER code then follows.

  18. The development of verification and validation technology for instrumentation and control in NPPs - A study on the software development methodology of a highly reliable software

    Energy Technology Data Exchange (ETDEWEB)

Kwon, Yong Rae; Cha, Sung Deok; Lee, Woo Jin; Chae, Hong Seok; Yoon, Kwang Sik; Jeong, Ki Suk [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

Nuclear industries have tried to use digital I and C technology in developing advanced nuclear power plants. However, because the industries did not establish highly reliable software development methodologies and standards for developing highly reliable and safe software for digital I and C systems, they were confronted with difficulties in avoiding software common-mode failures. To mitigate these difficulties, highly reliable software development environments, methodologies, and validation and verification techniques should be the cornerstone of all digital implementation in nuclear power plants. The objective of this project is to establish a highly reliable software development methodology to support developing digital instrumentation and control systems in nuclear power plants. In this project, we have investigated business-oriented and real-time software development methods and techniques for ensuring the safety and reliability of the software. We have also studied standards related to licensing the software for digital I and C systems. 50 refs., 51 figs. (author)

  19. Development of novel EMAT-ECT multi-sensor and verification of its feasibility

    International Nuclear Information System (INIS)

    Suzuki, Kenichiro; Uchimoto, Tetsuya; Takagi, Toshiyuki; Sato, Takeshi; Guy, Philippe; Casse, Amelie

    2006-01-01

In this study, we propose a novel EMAT-ECT multi-sensor aiming at advanced structural health monitoring. For this purpose, a prototype EMAT-ECT multi-sensor was developed and its functions as both an ECT and an EMAT probe were evaluated. Experimental results of pulsed ECT using the EMAT-ECT multi-sensor showed that the proposed sensor is capable of detecting and sizing flaws. Experimental results of the EMAT evaluation showed that an ultrasonic wave was transmitted by the EMAT-ECT multi-sensor and a flaw echo was observed. These results imply that the EMAT-ECT multi-sensor can serve for both pulsed ECT and EMAT. (author)

  20. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operates properly while the engine is running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  1. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

Seismic information systems are of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method to construct function models and the IDEF1x method to make information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  2. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.
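The two notions of trust in the abstract can be illustrated with a hedged sketch (not the paper's actual models): independent trust as an exponential moving average over experienced performance, and relative trust as each trustee's share of the total. The trustees, outcomes and the smoothing factor `gamma` are invented for illustration:

```python
def update_trust(trust, trustee, outcome, gamma=0.8):
    """Independent trust: an exponential moving average of experiences,
    where outcome in [0, 1] rates the trustee's latest performance."""
    trust = dict(trust)  # keep the update pure
    trust[trustee] = gamma * trust[trustee] + (1.0 - gamma) * outcome
    return trust

def relative_trust(trust):
    """Relative trust: each trustee's share of the total trust mass."""
    total = sum(trust.values())
    return {name: value / total for name, value in trust.items()}

trust = {"A": 0.5, "B": 0.5}          # two hypothetical trustees, equal priors
trust = update_trust(trust, "A", 1.0)  # A performs well
trust = update_trust(trust, "B", 0.0)  # B performs poorly
```

The relative variant couples the trustees: a good experience with one trustee lowers the others' shares even without new evidence about them, which is the behavioral difference the validation experiment probes.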

  3. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin

    2012-01-01

In this study, we proposed a continuous-time Markov chain model for the availability of n-node clusters of a Distributed Rendering System. The chain is an infinite one; we formalized it and, based on the model, implemented a software tool which can automatically build models in the PRISM language. With the tool, whe...
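The availability analysis sketched in the abstract can be illustrated with a finite analogue (the paper's chain is infinite and is built in the PRISM language; the birth-death structure, failure rate `lam` and repair rate `mu` below are hedged assumptions for illustration):

```python
import numpy as np

def cluster_availability(n, lam, mu):
    """Steady-state availability of an n-node cluster modeled as a
    birth-death CTMC: state k is the number of working nodes, each
    working node fails at rate lam, and a single repair facility
    restores one node at rate mu. The system counts as available
    while at least one node is working."""
    Q = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        if k > 0:              # one of the k working nodes fails
            Q[k, k - 1] = k * lam
        if k < n:              # the repair facility fixes a node
            Q[k, k + 1] = mu
        Q[k, k] = -Q[k].sum()
    # Stationary distribution: solve pi @ Q = 0 with sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(n + 1)])
    b = np.zeros(n + 2)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return 1.0 - pi[0]
```

For a single node this reduces to the textbook value mu / (lam + mu), and adding nodes raises availability, which is the kind of property a PRISM model would check automatically.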

  4. Development and verification of child observation sheet for 5-year-old children.

    Science.gov (United States)

    Fujimoto, Keiko; Nagai, Toshisaburo; Okazaki, Shin; Kawajiri, Mie; Tomiwa, Kiyotaka

    2014-02-01

The aim of the study was to develop a newly devised child observation sheet (COS-5) as a scoring sheet, based on the Childhood Autism Rating Scale (CARS), for use in the developmental evaluation of 5-year-old children, especially focusing on children with autistic features, and to verify its validity. Seventy-six children were studied. The children were recruited among participants of the Japan Children's Cohort Study, a research program implemented by the Research Institute of Science and Technology for Society (RISTEX) from 2004 to 2009. The developmental evaluation procedure was performed by doctors, clinical psychologists, and public health nurses. The COS-5 was also partly based on the Kyoto Scale of Psychological Development 2001 (Kyoto Scale 2001). Further, the Developmental Disorders Screening Questionnaire for 5-Year-Olds, the PDD-Autism Society Japan Rating Scale (PARS), doctor interview questions and neurological examination for 5-year-old children, and the Draw-a-Man Test (DAM) were used as evaluation scales. Eighteen (25.4%) children were rated as Suspected, including Suspected PDD, Suspected ADHD and Suspected MR. The COS-5 was suggested to be valid, with favorable reliability (α=0.89) and correlation with other evaluation scales. The COS-5 may be useful, with the following advantages: it can be performed within a shorter time frame; it facilitates the maintenance of observation quality; it facilitates sharing information with other professions; and it reliably identifies the autistic features of 5-year-old children. In order to verify its wider applications, including the screening of infants (18 months to 3 years old) by adjusting the items for younger ages, additional study is needed. Copyright © 2013 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  5. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications that have safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, the toolset performs model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  6. Experimental Verification of Some Simple Equilibrium Models of Masonry Shear Walls

    Science.gov (United States)

    Radosław, Jasiński

    2017-10-01

This paper contains the theoretical fundamentals of strut-and-tie models used in unreinforced horizontal shear walls. Depending on the support conditions and wall loading, we can distinguish models with discrete bars, when a point load is applied to the wall (type I model), or with continuous bars (type II model), when the load is uniformly distributed at the wall boundary. The main part of this paper compares calculated results with the author's own tests on horizontal shear walls made of solid brick, silicate elements and autoclaved aerated concrete. The tests were performed in Poland. The model required some modifications due to the specific load and static diagram.

  7. Development and Verification of Novel Porous Titanium Metaphyseal Cones for Revision Total Knee Arthroplasty.

    Science.gov (United States)

    Faizan, Ahmad; Bhowmik-Stoker, Manoshi; Alipit, Vincent; Kirk, Amanda E; Krebs, Viktor E; Harwin, Steven F; Meneghini, R Michael

    2017-06-01

Porous metaphyseal cones are widely used in revision knee arthroplasty. A new system of porous titanium metaphyseal cones has been designed based on the femoral and tibial morphology derived from a computed tomography-based anatomical database. The purpose of this study is to evaluate the initial mechanical stability of the new porous titanium revision cone system by measuring the micromotion under physiologic loading compared with a widely used existing porous tantalum metaphyseal cone system. The new cones were designed to precisely fit the femoral and tibial anatomy, and 3D printing technology was used to manufacture these porous titanium cones. The stability of the new titanium cones and the widely used tantalum cones were compared under physiologic loading conditions in a benchtop test model. The stability of the new titanium cones was either equivalent or better than that of the tantalum cones. The new titanium femoral cone construct had significantly less micromotion compared with the traditional femoral cone construct in 5 of the 12 directions measured (P < .05). The new titanium metaphyseal tibial cones demonstrated less micromotion in medial varus/valgus (P = .004) and posterior compressive micromotion (P = .002) compared with the traditional porous tantalum system. The findings of this biomechanical study demonstrate satisfactory mechanical stability of an anatomically based porous titanium metaphyseal cone system for femoral and tibial bone loss, as measured by micromotion under physiologic loading. The new cone design, in combination with instrumentation that facilitates surgical efficiency, is encouraging. Long-term clinical follow-up is warranted. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Automated single-voxel proton MRS: technical development and multisite verification.

    Science.gov (United States)

    Webb, P G; Sailasuta, N; Kohler, S J; Raidy, T; Moats, R A; Hurd, R E

    1994-04-01

To improve clinical utility, an integrated method has been developed to automatically acquire and process single-voxel in vivo proton spectra on a 1.5 T clinical scanner. This method includes automated adjustment of linear shims using a very rapid modified simplex method and automated water suppression, and applies a water-referencing scheme to correct for phase and residual eddy current effects. No operator intervention is required for the acquisition and processing of these pure-absorption spectra. This method was tested in a preliminary multisite trial to determine intersite and intrasite variability of metabolite ratio measurements. In a sample of over 100 examinations, the standard deviations of the ratios NAA:Cr, Cho:Cr, and mI:Cr were found to be under 15% when using this method, a substantially narrower range than has been found in studies relying on manual adjustment of the instrument and/or manual processing. This result indicates that automated setting of acquisition and processing parameters is of critical importance in the clinical application of in vivo spectroscopy.
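The simplex-based shim adjustment described above can be illustrated with a hedged sketch: a Nelder-Mead search over three hypothetical linear shim currents that maximizes a simulated water-signal peak height. The response model and its optimum are invented; the point is the method's gradient-free nature, which suits noisy scanner measurements:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical response surface: the water-signal peak height decays as the
# three linear shim currents move away from an (invented) optimal setting.
OPTIMUM = np.array([1.2, -0.4, 0.7])

def peak_height(shims):
    return np.exp(-np.sum((np.asarray(shims) - OPTIMUM) ** 2))

# Simplex search maximizes the peak height by minimizing its negative;
# each function evaluation would be one quick measurement on the scanner.
res = minimize(lambda s: -peak_height(s), x0=np.array([0.5, 0.5, 0.5]),
               method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-12})
```

A "modified" simplex in practice also bounds the step sizes and stops early once the lineshape is good enough, trading precision for scan time.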

  9. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    International Nuclear Information System (INIS)

    Carlson, T.M.; Gregory, S.D.

    1995-01-01

The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs); however, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  10. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  11. Development of a three-dimensionally movable phantom system for dosimetric verifications

    International Nuclear Information System (INIS)

    Nakayama, Hiroshi; Mizowaki, Takashi; Narita, Yuichiro; Kawada, Noriyuki; Takahashi, Kunio; Mihara, Kazumasa; Hiraoka, Masahiro

    2008-01-01

    The authors developed a three-dimensionally movable phantom system (3D movable phantom system) that can reproduce three-dimensional movements, in order to experimentally verify the impact of radiotherapy treatment-related movements on dose distribution. The phantom system consists of three integrated components: a three-dimensional driving mechanism (3D driving mechanism), a computer control system, and phantoms for film dosimetry. The 3D driving mechanism is the quintessential part of this system. It is composed of three linear-motion tables (single-axis robots) joined orthogonally to each other. This mechanism has a motion range of 100 mm, a maximum velocity of 200 mm/s in each dimension, and the ability to execute arbitrary 3D motion patterns. These attributes are sufficient to reproduce almost all organ movements. The positional accuracy of the 3D movable phantom system when stationary is better than 0.1 mm. The maximum absolute positional error during movement was 0.56 mm, and the positional reproducibility error during movement was up to 0.23 mm. The observed timing fluctuation was 0.012 s over a 4.5 s oscillation cycle. These results suggest that the 3D movable phantom system exhibits a sufficient level of geometric and timing accuracy to reproduce interfractional organ movement or setup errors, in order to assess the influence of these errors on high-precision radiotherapy such as stereotactic irradiation and intensity-modulated radiotherapy. In addition, the authors' 3D movable phantom system will also be useful in evaluating the adequacy and efficacy of new treatment techniques such as gating or tracking radiotherapy.

  12. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    Science.gov (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

    The planned orbit of the AXAF-I spacecraft will subject the spacecraft to both short eclipses (less than 30 minutes for solar and less than 2 hours for lunar) and long earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to an eclipse may cause loss of mission. To avoid this problem for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The DVT panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing will be presented.

  13. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical property inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR at high resolution (1 nm, 0.05°, 15 min) for use in spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) over temporal scales ranging from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), a 5-6 % underestimation for the integrated NN, and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 W m-2 for the 15 min global horizontal irradiance (GHI) averages and from -20 to 20 W m-2 for the monthly means, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI was of the order of 10 % as compared to the ground-based measurements. The proposed system is intended for studies and real-time applications related to solar energy production planning and use.
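
    The kind of verification statistics reported above (modelled vs. ground-based irradiance) reduces to simple bias and RMSE arithmetic. A minimal sketch; the GHI values below are invented for illustration and do not come from the study:

    ```python
    # Hypothetical 15 min GHI samples in W m-2: modelled vs. ground-measured.
    model = [410.0, 505.0, 620.0, 298.0]
    ground = [400.0, 520.0, 600.0, 310.0]

    # Signed errors (model minus measurement), mean bias, and RMSE.
    errors = [m - g for m, g in zip(model, ground)]
    bias = sum(errors) / len(errors)
    rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5

    # Relative error per sample, in percent of the measured value.
    rel = [100.0 * e / g for e, g in zip(errors, ground)]

    print(round(bias, 2), round(rmse, 2))  # → 0.75 14.74
    ```

    Averaging over longer windows (e.g. monthly means) lets random errors cancel, which is why the monthly uncertainty band quoted above is much narrower than the 15 min one.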

  14. Process verification of a hydrological model using a temporal parameter sensitivity analysis

    OpenAIRE

    M. Pfannerstill; B. Guse; D. Reusser; N. Fohrer

    2015-01-01

    To ensure reliable results of hydrological models, it is essential that the models reproduce the hydrological process dynamics adequately. Information about simulated process dynamics is provided by looking at the temporal sensitivities of the corresponding model parameters. For this, the temporal dynamics of parameter sensitivity are analysed to identify the simulated hydrological processes. Based on these analyses it can be verified if the simulated hydrological processes ...

  15. Experimental verification of computational model for wind turbine blade geometry design

    Directory of Open Access Journals (Sweden)

    Štorch Vít

    2015-01-01

    A 3D potential flow solver with an unsteady force-free wake model, intended for optimization of blade shape for wind power generation, is applied to a test case formed by a wind turbine with a vertical axis of rotation. The calculation is sensitive to correct modelling of the wake and its interaction with the blades. The validity of the flow solver is verified by comparing experimentally obtained performance data of a model rotor with the numerical results.

  16. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into a fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
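
    The final decision stage of such a pipeline, thresholding matching scores into accept/reject and reporting a verification rate, can be sketched in a few lines. The `verification_rate` helper, the scores, and the threshold are all illustrative, not the paper's actual classifier:

    ```python
    # Toy decision stage: accept a genuine attempt if its matching score
    # reaches the operating threshold; the verification rate is the
    # percentage of genuine attempts accepted.
    def verification_rate(genuine_scores, threshold):
        accepted = sum(1 for s in genuine_scores if s >= threshold)
        return 100.0 * accepted / len(genuine_scores)

    scores = [0.91, 0.88, 0.97, 0.62, 0.85]  # invented matching scores
    print(verification_rate(scores, 0.80))   # → 80.0
    ```

    In practice the threshold trades off false rejects against false accepts, which is why the abstract's cost-sensitive classifier (weighting the two error types differently) can outperform a plain one.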

  17. Induction Heating of Carbon-Fiber Composites: Experimental Verification of Models

    National Research Council Canada - National Science Library

    Fink, Bruce

    2000-01-01

    .... The validity of the global thermal generation model is established through an experimental test matrix in which various specimen configurations are evaluated and compared with theoretical predictions...

  18. Use of the Long Duration Exposure Facility's thermal measurement system for the verification of thermal models

    Science.gov (United States)

    Berrios, William M.

    1992-01-01

    The Long Duration Exposure Facility (LDEF) postflight thermal model predicted temperatures were matched to flight temperature data recorded by the Thermal Measurement System (THERM), LDEF experiment P0003. Flight temperatures, recorded at intervals of approximately 112 minutes for the first 390 days of LDEF's 2105 day mission were compared with predictions using the thermal mathematical model (TMM). This model was unverified prior to flight. The postflight analysis has reduced the thermal model uncertainty at the temperature sensor locations from +/- 40 F to +/- 18 F. The improved temperature predictions will be used by the LDEF's principal investigators to calculate improved flight temperatures experienced by 57 experiments located on 86 trays of the facility.
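
    The correlation idea above, bounding the thermal model's uncertainty by the residuals between predicted and recorded temperatures, can be sketched as follows. The temperatures are invented for illustration; only the +/- band computation is the point:

    ```python
    # Hypothetical sensor temperatures in deg F: TMM predictions vs. the
    # values recorded in flight at the same locations and times.
    predicted = [72.0, 95.0, -10.0, 40.0]
    measured = [80.0, 90.0, -25.0, 55.0]

    # The worst absolute residual gives a +/- band that covers every sensor.
    residuals = [p - m for p, m in zip(predicted, measured)]
    band = max(abs(r) for r in residuals)
    print(band)  # → 15.0
    ```

    Tuning model parameters to shrink these residuals is, in essence, how the post-flight analysis narrowed the uncertainty from +/- 40 F to +/- 18 F at the sensor locations.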

  19. Synthesis of reference compounds related to Chemical Weapons Convention for verification and drug development purposes – a Brazilian endeavour

    Science.gov (United States)

    Cavalcante, S. F. A.; de Paula, R. L.; Kitagawa, D. A. S.; Barcellos, M. C.; Simas, A. B. C.; Granjeiro, J. M.

    2018-03-01

    This paper deals with challenges that Brazilian Army Organic Synthesis Laboratory has been going through to access reference compounds related to the Chemical Weapons Convention in order to support verification analysis and for research of novel antidotes. Some synthetic procedures to produce the chemicals, as well as Quality Assurance issues and a brief introduction of international agreements banning chemical weapons are also presented.

  20. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation, and extreme control of these at each stage of the integrated OPC/MDP/mask/silicon lithography flow. The important potential sources of variation we focus on here originate in VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early, and thus reduces the likelihood of costly long-loop iterations between the OPC, MDP, and wafer fabrication flows. The paper moreover describes how to detect regions of a layout or mask where hotspots may occur, or where robustness to intrinsic variations may be improved by modification of the OPC, choice of mask technology, or judicious design of VSB shots and dose assignment.
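
    The image edge log-slope analysis mentioned above has a simple mathematical core: the log-slope is d(ln D)/dx evaluated at the pattern edge, where D is the local dose. A minimal sketch, assuming an idealized error-function edge profile as a stand-in for a real deposited dose distribution:

    ```python
    import math

    def dose(x, sigma=0.02):
        # Idealized blurred edge: normalized dose rising from 0 to 1,
        # with the nominal edge at x = 0 (units, e.g., um).
        return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

    def log_slope(x, h=1e-4):
        # Central-difference estimate of d(ln D)/dx at position x.
        return (math.log(dose(x + h)) - math.log(dose(x - h))) / (2.0 * h)

    # A steeper log-slope at the edge means better edge definition and
    # more margin against dose variation.
    print(round(log_slope(0.0), 1))
    ```

    Grading edges of a simulated dose map by this quantity is one way to flag the "hotspot" regions the abstract refers to: locations where a shallow log-slope makes the printed edge sensitive to intrinsic process variation.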