WorldWideScience

Sample records for modeling development verification

  1. VARTM Model Development and Verification

    Science.gov (United States)

    Cano, Roberto J. (Technical Monitor); Dowling, Norman E.

    2004-01-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform.

  2. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  3. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, Richard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
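
    As a concrete illustration of the fine-grained testing advocated here, a physical parameterization can be checked in isolation with an ordinary unit-testing framework. The sketch below is hypothetical (the routine and its Magnus-formula coefficients are illustrative stand-ins, not code from any climate model):

```python
import math
import pytest

def saturation_vapor_pressure(temp_c: float) -> float:
    """Magnus-type approximation for saturation vapor pressure (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

@pytest.mark.parametrize("temp_c, expected_hpa", [
    (0.0, 6.112),    # value at 0 deg C, by construction of the formula
    (20.0, 23.33),   # commonly tabulated value near 20 deg C
])
def test_saturation_vapor_pressure(temp_c, expected_hpa):
    # A fine-grained check of one parameterization, runnable in
    # milliseconds and independent of any full climate simulation.
    assert saturation_vapor_pressure(temp_c) == pytest.approx(expected_hpa, rel=1e-3)
```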

  4. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

    An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations...

  5. Vacuum assisted resin transfer molding (VARTM): Model development and verification

    Science.gov (United States)

    Song, Xiaolan

    2003-06-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform. Flow of resin through the preform is modeled as flow through porous media. Darcy's law combined with the continuity equation for an incompressible Newtonian fluid forms the basis of the flow model. During the infiltration process, it is well accepted that the total pressure is shared by the resin pressure and the pressure supported by the fiber network. With the progression of the resin, the net pressure applied to the preform decreases as a result of increasing local resin pressure. This leads to springback of the preform, and is called the springback mechanism. On the other hand, the lubrication effect of the resin causes rearrangement of the fiber network and an increase in preform compaction. This is called the wetting compaction mechanism. The thickness change of the preform is determined by the relative magnitude of the springback and wetting deformation mechanisms. In the compaction model, the transverse equilibrium equation is used to calculate the net compaction pressure applied to the preform, and the compaction test results are fitted to give the compressive constitutive law of the preform. The Finite Element/Control Volume (FE/CV) method is adopted to find the flow front location and the fluid pressure. The code features simultaneous integration of 1-D, 2-D and 3-D element types in a single simulation, and thus enables efficient modeling of the flow in complex mold
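
    The governing relations named above can be stated compactly. In the usual porous-media notation (a generic statement, not necessarily the author's exact formulation):

$$\mathbf{v} = -\frac{\mathbf{K}}{\mu}\,\nabla P, \qquad \nabla \cdot \mathbf{v} = 0, \qquad P_{\text{total}} = P_{\text{resin}} + \sigma_{\text{fiber}},$$

    where v is the volume-averaged resin velocity, K the preform permeability tensor, μ the resin viscosity, P the resin pressure, and σ_fiber the effective stress carried by the fiber network; the last relation is the transverse equilibrium used in the compaction model.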

  6. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

    With the semiconductor industry's trend of “smaller the better”, taking an idea to a final product while expanding the product portfolio and remaining competitive and profitable puts increasing pressure on CAD flows, process management and the project execution cycle, and demands continual innovation in all three. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment, with wider test vectors, without waiting for RTL to be available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow by addressing four major activities of verification: (1) early creation of an executable specification, (2) early creation of the verification environment, (3) early development of test vectors, and (4) better and increased re-use of blocks. Although this paper focuses on the early development of a UVM-based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image-signal-processing designs.
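
    The central idea, checking design outputs against an executable reference model, can be illustrated independently of UVM itself (which is SystemVerilog-based). Below is a minimal, hypothetical Python scoreboard in the spirit of the paper's TLM reference model:

```python
from collections import deque

class Scoreboard:
    """Compares DUT outputs against a golden executable reference model."""

    def __init__(self, reference_model):
        self.reference_model = reference_model  # the executable specification
        self.expected = deque()
        self.mismatches = 0

    def on_stimulus(self, txn):
        # Predict the expected result before the DUT produces anything.
        self.expected.append(self.reference_model(txn))

    def on_dut_output(self, result):
        golden = self.expected.popleft()
        if result != golden:
            self.mismatches += 1
            print(f"MISMATCH: dut={result} expected={golden}")

# Hypothetical usage with a trivial image-processing reference (pixel invert).
sb = Scoreboard(lambda pixel: 255 - pixel)
sb.on_stimulus(10)
sb.on_dut_output(245)   # matches the reference model's prediction
```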

  7. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous datasets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  8. SYSTEM-COGNITIVE MODEL OF FORECASTING THE DEVELOPMENT OF DIVERSIFIED AGRO-INDUSTRIAL CORPORATIONS. PART II. SYNTHESIS AND MODEL VERIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-11-01

    In this article, in accordance with the methodology of Automated System-Cognitive analysis (ASC-analysis), we examine the implementation of the third stage of ASC-analysis: synthesis and verification of forecasting models of the development of diversified agro-industrial corporations. In this step, 3 statistical and 7 system-cognitive models are synthesized and verified: ABS – matrix of the absolute frequencies; PRC1 and PRC2 – matrices of the conditional and unconditional distributions; INF1 and INF2 – private criterion: the amount of knowledge according to A. Kharkevich; INF3 – private criterion: the Chi-square test, i.e., the difference between the actual and the theoretically expected absolute frequencies; INF4 and INF5 – private criterion: ROI (Return On Investment); INF6 and INF7 – private criterion: the difference between conditional and unconditional probability (coefficient of relationship). The reliability of the created models was assessed with a proposed metric that is similar to the well-known F-criterion but does not assume a normal distribution, linearity of the modeled object, or independence and additivity of the acting factors. The accuracy of the obtained models was high enough to address the subsequent problems of identification, forecasting and decision making, as well as the study of the modeled object by means of its model, scheduled for consideration in future articles.

  9. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

    In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
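
    Sequential operator splitting of the kind described above alternates a transport solve and a reaction solve within each time step. A schematic sketch (first-order upwind advection plus linear decay; illustrative only, not ParFlow's actual numerics):

```python
import numpy as np

def step_operator_split(c, velocity, k_decay, dx, dt):
    """One time step: advect, then react (sequential operator splitting)."""
    # Transport sub-step: explicit first-order upwind advection.
    c_new = c.copy()
    c_new[1:] -= velocity * dt / dx * (c[1:] - c[:-1])
    # Reaction sub-step: exact solution of dc/dt = -k*c over dt.
    return c_new * np.exp(-k_decay * dt)

c = np.zeros(100)
c[0] = 1.0                      # hypothetical inlet concentration pulse
for _ in range(50):             # CFL number = 0.5, stable for upwinding
    c = step_operator_split(c, velocity=0.5, k_decay=0.1, dx=1.0, dt=1.0)
```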

  10. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the verification of a high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION® and Online Dynamics' Autolev™. ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and specifically to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while yielding a closed-form solution easily employed using COTS control hardware.

  11. A mathematical model of the nickel converter: Part I. Model development and verification

    Science.gov (United States)

    Kyllo, A. K.; Richards, G. G.

    1991-04-01

    A mathematical model of the nickel converter has been developed. The primary assumption of the model is that the three phases in the converter are in thermal and chemical equilibrium. All matte, slag, and gas in the converter is brought to equilibrium at the end of each of a series of short time steps throughout an entire charge. An empirical model of both the matte and slag is used to characterize the activity coefficients in each phase. Two nickel sulfide species were used to allow for the modeling of sulfur-deficient mattes. A heat balance is carried out over each time step, considering the major heat flows in the converter. The model was validated by a detailed comparison with measured data from six industrial charges. The overall predicted mass balance was shown to be close to that seen in actual practice, and the heat balance gave a good fit of converter temperature up to the last two or three blows of a charge. At this point, reactions in the converter begin to deviate strongly from “equilibrium,” probably due to the converter reactions coming under liquid-phase mass-transfer control. While the equilibrium assumption does work, it is not strictly valid, and the majority of the charge is probably under gas-phase mass-transfer control.

  12. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan)]; Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  13. Development and verification of a screening model for surface spreading of petroleum

    Science.gov (United States)

    Hussein, Maged; Jin, Minghui; Weaver, James W.

    2002-08-01

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transport of these chemicals is required for environmental risk assessment and for remedial measure design. The present paper discusses the formulation and application of the Oil Surface Flow Screening Model (OILSFSM) for predicting the surface flow of oil by taking into account infiltration and evaporation. Surface flow is simulated using a semi-analytical model based on the lubrication theory approximation of viscous flow. Infiltration is simulated using a version of the Green and Ampt infiltration model, which is modified to account for oil properties. Evaporation of volatile compounds is simulated using a compositional model that accounts for the changes in the fraction of each compound in the spilled oil. The coupling between surface flow, infiltration and evaporation is achieved by incorporating the infiltration and evaporation fluxes into the global continuity equation of the spilled oil. The model was verified against numerical models for infiltration and analytical models for surface flow. The verification study demonstrates the applicability of the model.
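
    The infiltration submodel is built on the Green–Ampt relation, which in its standard form (with the conductivity and suction head rescaled for oil properties; the paper's exact modification is not reproduced here) reads:

$$f(t) = K_s\left(1 + \frac{\psi_f\,\Delta\theta}{F(t)}\right),$$

    where f is the infiltration rate, K_s the saturated hydraulic conductivity (scaled for the oil's viscosity), ψ_f the wetting-front suction head, Δθ the moisture deficit, and F(t) the cumulative infiltrated depth.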

  14. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
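
    The point-verification statistics that MET produces, and that the SPoRT scripts orchestrate, reduce to simple aggregates over matched forecast-observation pairs. A toy illustration with hypothetical data (not the scripts themselves):

```python
import numpy as np

def verify(forecast: np.ndarray, observed: np.ndarray) -> dict:
    """Basic continuous verification statistics for matched pairs."""
    error = forecast - observed
    return {
        "bias": float(error.mean()),                  # mean error
        "rmse": float(np.sqrt((error ** 2).mean())),  # root-mean-square error
        "mae":  float(np.abs(error).mean()),          # mean absolute error
    }

# Hypothetical 2-m temperature pairs (deg C): local WRF run vs. observations.
print(verify(np.array([12.1, 14.0, 15.2]), np.array([11.5, 14.3, 16.0])))
```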

  15. Vacuum-assisted resin transfer molding (VARTM) model development, verification, and process analysis

    Science.gov (United States)

    Sayre, Jay Randall

    2000-12-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional, Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, where this tool would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the highly permeable media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process

  16. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6. Temperature Modeling of Applegate Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model. Environmental Laboratory. Tammy L. Threadgill, Daniel F. Turner, Laurie A. Nicholas, Barry W. Bunch, Dorothy H. Tillman, and David L. Smith. May 2017. Approved for public release; distribution is unlimited. The U.S. Army Engineer Research and

  17. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise a gas field, a continuous liquid field, and an entrained liquid field. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. The fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving the wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below. It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between the flow conditions. The pilot code was programmed so that the source terms of the governing equations and numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect the code stability. A further study would be required to enhance the code capability in this regard.
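
    In standard multi-field notation, each field k (gas, continuous liquid, entrained liquid) carries its own balance equations; the generic one-dimensional mass equation, written here for orientation rather than as the code's exact formulation, is:

$$\frac{\partial(\alpha_k \rho_k)}{\partial t} + \frac{\partial(\alpha_k \rho_k u_k)}{\partial x} = \Gamma_k, \qquad \sum_k \alpha_k = 1,$$

    where α_k is the volume fraction, ρ_k the density, u_k the field velocity, and Γ_k the interphase mass-transfer rate into field k.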

  18. USER CONTEXT MODELS : A FRAMEWORK TO EASE SOFTWARE FORMAL VERIFICATIONS

    OpenAIRE

    2010-01-01

    This article is accepted to appear in the ICEIS 2010 proceedings. Several works emphasize the difficulties of software verification applied to embedded systems. In past years, formal verification techniques and tools were widely developed and used by the research community. However, the use of formal verification at industrial scale remains difficult, expensive and requires a lot of time. This is due to the size and the complexity of the manipulated models, but also, to the impo...

  19. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification...

  20. Security Policy Development: Towards a Life-Cycle and Logic-Based Verification Model

    Directory of Open Access Journals (Sweden)

    Luay A. Wahsheh

    2008-01-01

    Although security plays a major role in the design of software systems, security requirements and policies are usually added to an already existing system, not created in conjunction with the product. As a result, there are often numerous problems with the overall design. In this paper, we discuss the relationship between software engineering, security engineering, and policy engineering and present a security policy life-cycle: an engineering methodology for policy development in high-assurance computer systems. The model provides system security managers with a procedural engineering process to develop security policies. We also present an executable Prolog-based model as a formal specification and knowledge representation method, using a theorem prover to verify system correctness with respect to security policies in their life-cycle stages.

  1. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are suggested every year by industry and academic researchers, claiming to improve the accuracy of measurements [8], [9]. In the absence of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time-consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as a productivity-enhancement tool during the research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.
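
    At the core of any pulse-oximeter behavioural model is the "ratio of ratios" computation relating the red and infrared photoplethysmogram (PPG) waveforms to SpO2. A minimal sketch (the linear calibration constants are illustrative textbook values, not those of the authors' Simulink model):

```python
import numpy as np

def spo2_from_ppg(red: np.ndarray, infrared: np.ndarray) -> float:
    """Estimate SpO2 (%) from windows of red and infrared PPG samples."""
    def perfusion(signal: np.ndarray) -> float:
        # AC/DC ratio of the pulsatile waveform.
        return (signal.max() - signal.min()) / signal.mean()

    r = perfusion(red) / perfusion(infrared)  # the "ratio of ratios"
    return 110.0 - 25.0 * r                   # common empirical calibration

# Synthetic PPG windows: a 1.2 Hz pulse riding on a DC baseline.
t = np.linspace(0.0, 1.0, 250)
red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
infrared = 1.0 + 0.03 * np.sin(2 * np.pi * 1.2 * t)
print(f"SpO2 ~ {spo2_from_ppg(red, infrared):.1f}%")
```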

  2. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
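
    The deliverability check described above ultimately reduces to a nearest-point query between the moving gantry geometry and the couch/patient surface, padded by a safety buffer. A schematic point-cloud version (hypothetical coordinates; the buffer value echoes the worst-case 2.97 cm gantry-to-phantom discrepancy reported in the abstract):

```python
import numpy as np

def min_clearance(gantry_pts: np.ndarray, patient_pts: np.ndarray) -> float:
    """Minimum Euclidean distance (cm) between two surface point clouds."""
    # Brute-force pairwise distances; a KD-tree is preferable for large clouds.
    d = np.linalg.norm(gantry_pts[:, None, :] - patient_pts[None, :, :], axis=2)
    return float(d.min())

def is_deliverable(gantry_pts, patient_pts, buffer_cm=2.97) -> bool:
    # A beam orientation is kept only if the clearance exceeds the buffer.
    return min_clearance(gantry_pts, patient_pts) > buffer_cm

gantry = np.random.rand(200, 3) * 100.0         # hypothetical surface (cm)
patient = np.random.rand(200, 3) * 100.0 + 50.0
print(is_deliverable(gantry, patient))
```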

  3. Development and verification of the modified dynamic two-fluid model GOPS

    Science.gov (United States)

    Song, Chengyi; Li, Yuxing; Meng, Lan; Wang, Haiyan

    2013-07-01

    In the oil and gas industry, many versions of software have been developed to calculate the flow parameters of multiphase flow. However, the existing software is not perfect. To improve the accuracy, a new version of the software GOPS has been developed by the Daqing Oilfield Construction Design and Research Institute and China University of Petroleum. GOPS modifies the general extended two-fluid model and considers a gas bubble phase in the liquid and a liquid droplet phase in the gas. There are four continuity equations, two momentum equations, one mixture energy-conservation equation and one pressure-conservation equation in the governing equations of GOPS. These governing equations are combined with a flow pattern transition model and closure relationships for every flow pattern. In this way, GOPS can simulate the dynamic variation of multiphase flow. To verify GOPS, experiments were performed at the Surface Engineering Pilot Test Center, CNPC. The experimental pressure gradients are compared with the results from GOPS, and the accuracy of GOPS is high.

  4. Development and verification of fuel burn-up calculation model in a reduced reactor geometry

    Energy Technology Data Exchange (ETDEWEB)

    Sembiring, Tagor Malem [Center for Reactor Technology and Nuclear Safety (PTKRN), National Nuclear Energy Agency (BATAN), Kawasan PUSPIPTEK Gd. No. 80, Serpong, Tangerang 15310 (Indonesia)], E-mail: tagorms@batan.go.id; Liem, Peng Hong [Research Laboratory for Nuclear Reactor (RLNR), Tokyo Institute of Technology (Tokyo Tech), O-okayama, Meguro-ku, Tokyo 152-8550 (Japan)

    2008-02-15

    A fuel burn-up model in a reduced reactor geometry (2-D) is successfully developed and implemented in the Batan in-core fuel management code, Batan-FUEL. Considering the bank mode operation of the control rods, several interpolation functions are investigated which best approximate the 3-D fuel assembly radial power distributions across the core as a function of the insertion depth of the control rods. Concerning the applicability of the interpolation functions, it can be concluded that the optimal coefficients of the interpolation functions are not very sensitive to the core configuration and core or fuel composition in the RSG GAS (MPR-30) reactor. Consequently, once the optimal interpolation function and its coefficients are derived, they can be used for routine 2-D operational in-core fuel management without repeating the expensive 3-D neutron diffusion calculations. At the selected fuel elements (at the H-9 and G-6 core grid positions), the discrepancy of the FECFs (fuel element channel power peaking factors) between the 2-D and 3-D models is within the range of 3.637 × 10⁻⁴, 3.241 × 10⁻⁴ and 7.556 × 10⁻⁴ for the oxide core, the silicide core with 250 g ²³⁵U/FE and the silicide core with 300 g ²³⁵U/FE, respectively.

  5. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  6. METHANOGENESIS AND SULFATE REDUCTION IN CHEMOSTATS: II. MODEL DEVELOPMENT AND VERIFICATION

    Science.gov (United States)

    A comprehensive dynamic model is presented that simulates methanogenesis and sulfate reduction in a continuously stirred tank reactor (CSTR). This model incorporates the complex chemistry of anaerobic systems. A salient feature of the model is its ability to predict the effluent ...

  7. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by the steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
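
    The core hydrostatic relation is a sum of ρgh terms over the stacked fluid columns between the wellhead and the cavern. A simplified sketch with single-valued densities and hypothetical depths (the actual HCM additionally resolves interface movement and fluid compressibility):

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_cavern_pa: float, columns) -> float:
    """Cavern pressure minus the hydrostatic head of the fluid columns.

    columns: list of (density_kg_m3, height_m), wellhead down to cavern.
    """
    head = sum(rho * G * h for rho, h in columns)
    return p_cavern_pa - head

# Hypothetical well: 300 m nitrogen over 400 m crude oil over 100 m brine.
columns = [(90.0, 300.0), (850.0, 400.0), (1200.0, 100.0)]
print(f"{wellhead_pressure(12.0e6, columns) / 1e6:.2f} MPa")  # ~7.22 MPa
```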

  8. Development and experimental verification of a genome-scale metabolic model for Corynebacterium glutamicum

    Directory of Open Access Journals (Sweden)

    Hirasawa Takashi

    2009-08-01

    Background: In silico genome-scale metabolic models enable the analysis of the characteristics of metabolic systems of organisms. In this study, we reconstructed a genome-scale metabolic model of Corynebacterium glutamicum on the basis of genome sequence annotation and physiological data. The metabolic characteristics were analyzed using flux balance analysis (FBA), and the results of FBA were validated using data from culture experiments performed at different oxygen uptake rates. Results: The reconstructed genome-scale metabolic model of C. glutamicum contains 502 reactions and 423 metabolites. We collected the reactions and biomass components from the database and literature, and made the model available for flux balance analysis by filling gaps in the reaction networks and removing inadequate loop reactions. Using the framework of FBA and our genome-scale metabolic model, we first simulated the changes in the metabolic flux profiles that occur on changing the oxygen uptake rate. The predicted production yields of carbon dioxide and organic acids agreed well with the experimental data. The metabolic profiles of amino acid production phases were also investigated. A comprehensive gene deletion study was performed in which the effects of gene deletions on metabolic fluxes were simulated; this helped in the identification of several genes whose deletion resulted in an improvement in organic acid production. Conclusion: The genome-scale metabolic model provides useful information for the evaluation of the metabolic capabilities and prediction of the metabolic characteristics of C. glutamicum. This can form a basis for the in silico design of C. glutamicum metabolic networks for improved bioproduction of desirable metabolites.
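
    FBA itself is a linear program: maximize an objective flux c·v subject to the steady-state constraint S v = 0 and flux bounds. A toy three-reaction network (hypothetical stoichiometry, solved with SciPy) shows the mechanics:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 = uptake of A, v2 = A -> B (objective), v3 = A -> C.
# Steady state is imposed on the internal metabolite A only.
S_internal = np.array([[1.0, -1.0, -1.0]])   # row: metabolite A
bounds = [(0, 10), (0, None), (0, None)]     # uptake capped at 10 units

# linprog minimizes, so negate the objective to maximize v2.
res = linprog(c=[0.0, -1.0, 0.0], A_eq=S_internal, b_eq=[0.0], bounds=bounds)
print("objective flux:", -res.fun)           # -> 10.0 (all uptake goes to B)
```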

  9. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  10. Vacuum-Assisted Resin Transfer Molding (VARTM) Model Development, Verification, and Process Analysis

    OpenAIRE

    Sayre, Jay Randall

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this techni...

  11. Development and experimental verification of a model for an air jet penetrated by plumes

    Directory of Open Access Journals (Sweden)

    Xin Wang

    2015-03-01

    This article presents the fluid mechanics of a ventilation system formed by a momentum source and buoyancy sources. We investigate the interaction between plumes and a non-isothermal air jet for separate sources of buoyancy produced by the plume and the momentum of the air jet. The mathematical model represents the situation in which a plume rises from two heat sources causing buoyancy. The model is used to discuss the interactions involved. The effects of parameters such as the power of the source and the air-flow volume used in the mathematical-physical model are also discussed. An expression is deduced for the trajectory of the non-isothermal air jet penetrated by plumes. Experiments were also carried out to illustrate the effect on the flow of the air jet and to validate the theoretical work. The results show that the buoyancy sources can impede the descent of the cold air and even reverse the direction of its trajectory. However, increasing the distance between the plumes can reduce the effect of the plumes on the jet curve. It is also apparent that when the velocity of the air supply increases, the interference caused by the plumes is reduced.

  12. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
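
    The cross-technology synergy that IVSEM quantifies can be illustrated with the standard combination rule for independent sensors (an assumption made here for illustration; the real model is more detailed):

```python
def combined_detection(p_by_technology: dict) -> float:
    """P(at least one technology detects), assuming independent subsystems."""
    p_miss = 1.0
    for tech, p_detect in p_by_technology.items():
        p_miss *= (1.0 - p_detect)
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for a single event.
print(combined_detection({
    "seismic": 0.80,
    "infrasound": 0.40,
    "radionuclide": 0.55,
    "hydroacoustic": 0.10,
}))  # -> 0.9514, higher than any single technology alone
```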

  13. Calibration and verification of environmental models

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  14. Verification strategies for fluid-based plasma simulation models

    Science.gov (United States)

    Mahadevan, Shankar

    2012-10-01

    Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by the authors of these programs and not openly discussed. Several professional research bodies, including the IEEE, AIAA, ASME and others, have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. The steps in the verification process (consistency checks; examination of iterative, spatial and temporal convergence; and comparison with exact solutions) are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced. An example of the application of MMS to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
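
    MMS works by picking a solution in advance, deriving the forcing term symbolically, and then checking that the discrete solver reproduces the chosen solution at the scheme's formal order of accuracy. A minimal 1-D steady-diffusion example (illustrative, not from the work above):

```python
import sympy as sp

x = sp.symbols("x")
u_exact = sp.sin(sp.pi * x)   # manufactured solution, u(0) = u(1) = 0
D = 0.7                       # diffusion coefficient

# For the model equation -D u'' = f, manufacture the source term f.
f = sp.simplify(-D * sp.diff(u_exact, x, 2))
print(f)                      # -> 0.7*pi**2*sin(pi*x)

# Any discrete solver for -D u'' = f can now be verified: its error
# against u_exact must shrink at the expected rate under mesh refinement.
```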

  15. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, Kienwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  16. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kumazaki, Y; Miyaura, K; Hirai, R; Miyazawa, K; Makino, S; Tamaki, T; Shikama, N; Kato, S [Saitama Medical University International Medical Center, Hidaka, Saitama (Japan)

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30 × 30 × 3 cm cuboid with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; the positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and the center position of the two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using the in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. The applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: The source position accuracy of the applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.

  17. ENROLMENT MODEL STABILITY IN STATIC SIGNATURE VERIFICATION

    NARCIS (Netherlands)

    Allgrove, C.; Fairhurst, M.C.

    2004-01-01

    The stability of enrolment models used in a static verification system is assessed, in order to provide an enhanced characterisation of signatures through the validation of the enrolment process. A number of static features are used to illustrate the effect of the variation in enrolment model size on

  18. Development of a Model for the Simulation of ROPS Tests on Agricultural Tractors Cabin: Numerical Models and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Sergio Baragetti

    2015-09-01

    A methodology is proposed here for the simulation of ROPS tests (ROPS = Roll Over Protective Structure) on agricultural tractor cabins. The work is based on the resolution of this problem through the use of the finite element method. In order to limit the number of nodes of the model and thus speed up the resolution, a two-dimensional finite element model has been chosen. The method presented here solves even very complex structures with relative ease. There are also simpler methods in the literature, where purpose-built software based on the finite element method is used to simulate approval tests on ROPS structures. In these cases, codes developed just for this purpose are available; they are very simple to use and are characterized by a high speed of model preparation once a small number of parameters has been defined. On the other hand, these are codes designed for structures with a specific geometric shape, in which the user is not free to set all the parameters available in commercial structural-analysis software, and they are not very suitable for complex or unconventional structures. The methodology proposed by the authors, although not automated, allows any type of structure to be simulated in acceptable times. The results were validated by full-scale experimental tests. Through the interpretation of the results it is possible to identify which area is the most critical for the structure and to evaluate any design change, something which is not easy to do through expensive tests.

  1. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is proposed to read, analyze and verify signatures from the SUSig online database. First, the testing and reference samples have to be normalized, re-sampled and smoothed in the pre-processing stage. In the verification stage, the difference between the reference and testing signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reported a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
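
    FRR and FAR are simple threshold statistics over the match scores of genuine and forged attempts. A toy computation (hypothetical distance scores, where a lower score means a better match):

```python
import numpy as np

def frr_far(genuine: np.ndarray, forged: np.ndarray, threshold: float):
    """Error rates for a distance-based verifier: accept if score <= threshold."""
    frr = float((genuine > threshold).mean())  # genuine attempts rejected
    far = float((forged <= threshold).mean())  # forgeries accepted
    return frr, far

genuine = np.array([0.8, 1.1, 0.9, 1.6, 0.7])  # hypothetical distances
forged = np.array([2.1, 1.4, 2.8, 3.0, 1.9])
print(frr_far(genuine, forged, threshold=1.5))  # -> (0.2, 0.2)
```

    Sweeping the threshold trades FRR against FAR; a reported pair such as 14.8%/2.64% corresponds to one chosen operating point on that curve.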

  2. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, Angelika H.; Wupper, H.; Boon, Mieke

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, we

  3. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  4. MACCS2 development and verification efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
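    The table-lookup option for the plume expansion parameters can be pictured as interpolation over user-supplied incremental downwind distances; the table below is purely illustrative and is not MACCS2 input data.

    ```python
    import numpy as np

    # Illustrative lookup table: downwind distance (m) vs. sigma-y, sigma-z (m).
    distance = np.array([100.0, 500.0, 1000.0, 5000.0])
    sigma_y = np.array([12.0, 55.0, 100.0, 400.0])
    sigma_z = np.array([7.0, 30.0, 60.0, 200.0])

    def plume_sigmas(x):
        """Linearly interpolate plume-expansion parameters at distance x."""
        return np.interp(x, distance, sigma_y), np.interp(x, distance, sigma_z)

    print(plume_sigmas(2000.0))
    ```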

  5. SU-E-T-265: Development of Dose-To-Water Conversion Models for Pre-Treatment Verification with the New AS1200 Imager

    Energy Technology Data Exchange (ETDEWEB)

    Miri, N [University of Newcastle, Newcastle, NSW (Australia); Baltes, C; Keller, P [Varian Medical Systems Imaging, Baden-Dättwil (Switzerland); Greer, P [Newcastle Mater Hospital, Newcastle, NSW (Australia)

    2015-06-15

    Purpose: To develop and evaluate models for dose verification of flattened (FF) and flattening filter free (FFF) beams for the new Varian aS1200 backscatter-shielded electronic portal imaging device (EPID). Methods: The model converts EPID images to incident energy fluence using deconvolution of EPID scatter kernels and fluence to dose in water using convolution with dose-to-water kernels. Model parameters were optimized using non-transmission EPID images of varying jaw defined field sizes for energies of 6 and 10 MV FF and FFF beams. Energy fluence was obtained from the Acuros planning system and reference dose profiles and output factors were measured at depths of 5, 10, 15 and 20 cm in a water phantom. Images for 34 IMRT fields acquired at 6 and 10 MV FF energy were converted to dose at 10 cm depth in water and compared to treatment planning system dose plane calculations using gamma criteria. Results: Gamma evaluations for the IMRT fields had mean (1 standard deviation) pass rates of 99.4% (0.8%) and mean gamma scores of 0.32 (0.06) with 2%, 2 mm criteria and 10% of maximum dose threshold. Conclusion: The developed model has been shown to be highly accurate for pre-treatment verification with the new aS1200 imager which does not display support-arm backscatter artefact and has improved dosimetric properties. Further investigation of FFF modes is in progress. The model is currently being evaluated at sites for potential clinical release.
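    The gamma evaluation mentioned above combines a dose-difference criterion (2% of maximum dose) with a distance-to-agreement criterion (2 mm). A simplified one-dimensional, global-gamma sketch of that comparison, not the clinical implementation used in the study, might look like:

    ```python
    import numpy as np

    def gamma_1d(x, dose_eval, dose_ref, dd=0.02, dta=2.0):
        """1-D global gamma index: dd is the dose-difference criterion as a
        fraction of the maximum reference dose, dta the distance-to-agreement
        in the units of x (a brute-force sketch, not a clinical tool)."""
        norm = dose_ref.max()
        gamma = np.empty_like(dose_eval)
        for i, (xi, de) in enumerate(zip(x, dose_eval)):
            term = np.sqrt(((x - xi) / dta) ** 2
                           + ((dose_ref - de) / (dd * norm)) ** 2)
            gamma[i] = term.min()
        return gamma

    x = np.linspace(0.0, 100.0, 201)             # position, mm
    ref = np.exp(-((x - 50.0) / 20.0) ** 2)      # reference dose profile
    ev = 1.01 * ref                              # evaluated profile, 1% high
    print((gamma_1d(x, ev, ref) <= 1.0).mean())  # fraction of points passing
    ```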

  6. Development and verification of a model for estimating the screening utility in the detection of PCBs in transformer oil.

    Science.gov (United States)

    Terakado, Shingo; Glass, Thomas R; Sasaki, Kazuhiro; Ohmura, Naoya

    2014-01-01

    A simple new model for estimating the screening performance (false positive and false negative rates) of a given test for a specific sample population is presented. The model is shown to give good results on a test population, and is used to estimate the performance on a sampled population. Using the model developed in conjunction with regulatory requirements and the relative costs of the confirmatory and screening tests allows evaluation of the screening test's utility in terms of cost savings. Testers can use the methods developed to estimate the utility of a screening program using available screening tests with their own sample populations.
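    The utility argument can be made concrete with a small expected-cost comparison for a screen-then-confirm program; the prevalence, error rates and costs below are invented for illustration, and the paper's model additionally weighs regulatory requirements.

    ```python
    def screening_cost(prevalence, fpr, fnr, c_screen, c_confirm):
        """Expected per-sample cost when every sample is screened and all
        screen-positives (true positives plus false positives) go on to the
        confirmatory test. A minimal sketch of the cost-savings argument."""
        p_screen_positive = prevalence * (1.0 - fnr) + (1.0 - prevalence) * fpr
        return c_screen + p_screen_positive * c_confirm

    # 5% of transformer oils contaminated; illustrative rates and costs.
    with_screen = screening_cost(0.05, fpr=0.10, fnr=0.02, c_screen=5.0, c_confirm=80.0)
    confirm_only = 80.0   # confirmatory test on every sample
    print(with_screen, confirm_only)
    ```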

  7. Verification of pneumatic railway brake models

    Science.gov (United States)

    Piechowiak, Tadeusz

    2010-03-01

    The article presents a survey of diverse methods for validation of pneumatic train brake modelling. Various experimental measurements of railway pneumatic brakes were made chiefly on a test stand at Poznań University of Technology; other test stands and some results have been taken from the literature. The measurements, some of them unconventional, were performed on separate pneumatic elements, brake devices, the brake pipe and fragments thereof. Mechanical devices were also included. The experimental measurement results were used for the verification of numerical models and for the determination of parameters. The latter was partially performed using an optimisation method.

  8. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in modern train technology. How to ensure the security of MVB has become an important issue, and traditional testing cannot ensure system correctness. This paper is concerned with modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchical Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.
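    For orientation, the firing rule at the core of any Petri net model can be sketched in a few lines; the paper's hierarchical coloured nets extend this basic place/transition semantics with data types and hierarchy, and the place names below are made up.

    ```python
    def enabled(marking, pre):
        """A transition is enabled when every input place holds at least
        the required number of tokens (basic Petri net firing rule)."""
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        """Fire a transition: consume tokens from input places, produce
        tokens in output places."""
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    m0 = {"master_idle": 1, "bus_free": 1}
    pre, post = {"master_idle": 1, "bus_free": 1}, {"master_polling": 1}
    if enabled(m0, pre):
        print(fire(m0, pre, post))
    ```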

  9. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose, such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model ... and implementation. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the WebSocket protocol, which is currently under development by the Internet Engineering Task Force (IETF), and we show ...

  10. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often not adequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga Reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed that the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
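    Verification measures such as the mean error and the standard error of the residuals can be computed directly from paired observed and predicted values; the numbers below are illustrative, not Chickamauga Reservoir data.

    ```python
    import numpy as np

    def residual_stats(observed, predicted):
        """Mean error (bias) and standard error of the residuals, two of
        the statistics commonly used to quantify model performance."""
        r = np.asarray(observed) - np.asarray(predicted)
        return {"mean_error": r.mean(),
                "std_error": r.std(ddof=1) / np.sqrt(r.size)}

    obs = np.array([8.1, 7.9, 6.5, 5.2, 4.8])    # e.g. dissolved oxygen, mg/L
    pred = np.array([8.0, 7.5, 6.9, 5.0, 5.1])
    print(residual_stats(obs, pred))
    ```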

  11. Development and Kinematic Verification of a Finite Element Model for the Lumbar Spine: Application to Disc Degeneration

    Directory of Open Access Journals (Sweden)

    Elena Ibarz

    2013-01-01

    Full Text Available Knowledge of lumbar spine biomechanics is essential for clinical applications. Due to the difficulty of experimenting on living people and the inconsistency of published results, simulation based on finite elements (FE) has been developed, making it possible to adequately reproduce the biomechanics of the lumbar spine. A 3D FE model of the complete lumbar spine (vertebrae, discs, and ligaments) has been developed. To verify the model, radiological images (X-rays) were taken of a group of 25 healthy male individuals with an average age of 27.4 years and an average weight of 78.6 kg, with the corresponding informed consent. A maximum angle of 34.40° is achieved in flexion and 35.58° in extension, with a flexion-extension angle of 69.98°; the radiological measurements were 33.94 ± 4.91°, 38.73 ± 4.29°, and 72.67°, respectively. In lateral bending, the maximum angles were 19.33° (model) and 23.40 ± 2.39° (radiological). In rotation, a maximum angle of 9.96° was obtained. The model incorporates a precise geometrical characterization of several elements (vertebrae, discs, and ligaments), respecting anatomical features and being capable of reproducing a wide range of physiological movements. Application to disc degeneration (L5-S1) makes it possible to predict the effect on the mobility of the different lumbar segments by means of parametric studies for different degrees of degeneration.

  12. Development, Verification and Use of Gust Modeling in the NASA Computational Fluid Dynamics Code FUN3D

    Science.gov (United States)

    Bartels, Robert E.

    2012-01-01

    This paper presents the implementation of a gust modeling capability in the CFD code FUN3D. The gust capability is verified by computing the response of an airfoil to a sharp-edged gust and comparing with the theoretical result; the present simulations will also be compared with other CFD gust simulations. This paper also serves as a user's manual for FUN3D gust analyses using a variety of gust profiles. Finally, the development of an Auto-Regressive Moving-Average (ARMA) reduced-order gust model using a gust with a Gaussian profile in the FUN3D code is presented. ARMA-simulated results for a sequence of one-minus-cosine gusts are shown to compare well with the same gust profile computed with FUN3D. Proper Orthogonal Decomposition (POD) is combined with the ARMA modeling technique to predict the time-varying pressure coefficient increment distribution due to a novel gust profile. The aeroelastic response of a pitch/plunge airfoil to a gust environment is computed with a reduced-order model and compared with a direct simulation of the system in the FUN3D code. The two results are found to agree very well.
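    The one-minus-cosine gust mentioned here is a standard profile, u(t) = (u_max/2)(1 - cos(2*pi*t/t_g)) over the gust duration t_g. A minimal sketch for generating such a profile (the parameter values are arbitrary):

    ```python
    import numpy as np

    def one_minus_cosine_gust(t, u_max, t_g):
        """Standard one-minus-cosine gust velocity profile of duration t_g:
        u(t) = (u_max / 2) * (1 - cos(2*pi*t / t_g)) for 0 <= t <= t_g,
        and zero outside the gust."""
        u = 0.5 * u_max * (1.0 - np.cos(2.0 * np.pi * t / t_g))
        return np.where((t >= 0.0) & (t <= t_g), u, 0.0)

    t = np.linspace(-0.1, 0.5, 601)  # time, s
    gust = one_minus_cosine_gust(t, u_max=10.0, t_g=0.25)
    print(gust.max())                # peaks at u_max = 10 m/s
    ```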

  13. Development of a Wearable Instrumented Vest for Posture Monitoring and System Usability Verification Based on the Technology Acceptance Model.

    Science.gov (United States)

    Lin, Wen-Yen; Chou, Wen-Cheng; Tsai, Tsai-Hsuan; Lin, Chung-Chih; Lee, Ming-Yih

    2016-12-17

    Body posture and activity are important indices for assessing health and quality of life, especially for elderly people. Therefore, an easily wearable device or instrumented garment would be valuable for monitoring elderly people's postures and activities to facilitate healthy aging. In particular, such devices should be accepted by elderly people so that they are willing to wear them all the time. This paper presents the design and development of a novel, textile-based, intelligent wearable vest for real-time posture monitoring and emergency warnings. The vest provides a highly portable and low-cost solution that can be used both indoors and outdoors in order to provide long-term care at home, including health promotion, healthy aging assessments, and health abnormality alerts. The usability of the system was verified in a technology acceptance model-based study of 50 elderly people. The results indicated that although elderly people are anxious about some newly developed wearable technologies, they look forward to wearing this instrumented posture-monitoring vest in the future.

  14. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  15. [Development and verification of a 3-dimensional finite element model of the human neck based on CT images].

    Science.gov (United States)

    Lu, Chang; Han, Ke; Li, Jing; Wang, Bing; Lu, Guo-hua

    2008-05-01

    To establish a 3-dimensional finite element model of the human neck. The coordinate data of the vertebrae were obtained from CT scan images of the cervical spine of Chinese 50th-percentile healthy male adult volunteers, converted into point cloud data, and stored as ASCII files using Mimics software. CATIA software was used for preprocessing and Geomagic software was used to establish the geometry model of the C0-C7 cervical spine. The geometry model was meshed with Hypermesh software; a mapped mesh method was used to mesh the cortical bone, trabecular bone, intervertebral disks, ligaments, etc. Some material parameters were derived from other available material parameters using proportion and function scaling methods. The model has 22,512 solid elements and 14,180 shell/membrane elements and was validated against a cervical spine drop test. The model has good biofidelity and can be used to study the dynamic response and injury mechanism of the cervical spine in car accidents.

  16. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  17. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN's industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, a precise, mathematically based method to check formalized requirements automatically against the system.

  18. Development and Verification of MCCI Process Model

    Institute of Scientific and Technical Information of China (English)

    魏巍; 齐克林; 万舒; 陈艳芳; 郭富德

    2014-01-01

    A mechanistic model of the molten core-concrete interaction (MCCI) process is described and used to calculate and analyze the MCCI process at the Daya Bay Nuclear Power Plant under severe accidents initiated by a station blackout (SBO) or by a large-break loss of coolant accident (LLOCA) with failure of safety injection. The calculation results of this procedure were compared with those of the large-scale analysis program MELCOR for the same accident sequences to verify the reasonableness and correctness of the model. The results indicate that the model presented in this paper can simulate the MCCI process correctly and reasonably under the given severe accidents, and that the calculation is fast enough to meet the application requirements of simulators.

  19. Effect of Terrestrial and Marine Organic Aerosol on Regional and Global Climate: Model Development, Application, and Verification with Satellite Data

    Energy Technology Data Exchange (ETDEWEB)

    Meskhidze, Nicholas; Zhang, Yang; Kamykowski, Daniel

    2012-03-28

    In this DOE project, improved parameterizations of marine primary organic matter (POM) emissions, the hygroscopic properties of marine POM, marine isoprene-derived secondary organic aerosol (SOA) emissions, surfactant effects, and a new cloud droplet activation parameterization were implemented in the Community Atmosphere Model (CAM 5.0), with the seven-mode aerosol module from the Pacific Northwest National Laboratory (PNNL) Modal Aerosol Model (MAM7). The effects of marine aerosols derived from sea spray and ocean-emitted biogenic volatile organic compounds (BVOCs) on the microphysical properties of clouds were explored by conducting 10-year CAM5.0-MAM7 simulations at a grid resolution of 1.9° by 2.5° with 30 vertical layers. The model-predicted relationship between ocean physical and biological systems and the abundance of CCN in the remote marine atmosphere was compared to data from the A-Train satellites (MODIS, CALIPSO, AMSR-E). Model simulations show that, on average, primary and secondary organic aerosol emissions from the ocean can yield up to a 20% increase in cloud condensation nuclei (CCN) at 0.2% supersaturation, and up to 5% increases in the droplet number concentration of global maritime shallow clouds. Marine organics were treated as internally or externally mixed with sea salt. The associated changes in cloud properties reduced the magnitude of the model-predicted shortwave cloud forcing from -1.35 W m-2 to -0.25 W m-2. By using different emission scenarios and droplet activation parameterizations, this study suggests that the addition of marine primary aerosols and biologically generated reactive gases makes an important difference in radiative forcing assessments. All baseline and sensitivity simulations for 2001 and 2050 using global-through-urban WRF/Chem (GU-WRF) were completed. The main objective of these simulations was to evaluate the capability of GU-WRF for an accurate representation of the global atmosphere by exploring the most accurate

  20. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  1. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  2. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  6. Development and verification of simplified prediction models for enhanced oil recovery applications. CO2 (miscible flood) predictive model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Paul, G.W.

    1984-10-01

    A screening model for CO2 miscible flooding has been developed, consisting of a reservoir model for oil rate and recovery and an economic model. The reservoir model includes the effects of viscous fingering, reservoir heterogeneity, gravity segregation and areal sweep. The economic model includes methods to calculate various profitability indices, the windfall profits tax, and provides for CO2 recycle. The model is applicable to secondary or tertiary floods, and to solvent slug or WAG processes. The model does not require detailed oil-CO2 PVT data for execution, and is limited to five-spot patterns. A pattern schedule may be specified to allow economic calculations for an entire project to be made. Models of similar architecture have been developed for steam drive, in-situ combustion, surfactant-polymer flooding, polymer flooding and waterflooding. 36 references, 41 figures, 4 tables.

  7. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementation level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NuSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks.

  8. Development of radioisotope tracer technique; development of verification method for hydraulic model using radioisotope tracer techniques in the municipal wastewater treatment plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. W.; Kim, S. H.; Kim, J. W.; Yun, J. S.; Wo, S. B. [Pusan National University, Pusan (Korea)

    2001-04-01

    This study focuses on the development of a computational fluid dynamics (CFD) model that can be used for secondary clarifiers in wastewater treatment plants. The model could describe the internal flow characteristics and predicted results similar to those of the isotopic tracer experiment; it was thereby demonstrated that the isotopic tracer method is a powerful hydrodynamic tool for understanding the internal hydraulics. Generally, a secondary clarifier can be improved by special design, by changing coagulation characteristics through the addition of coagulation chemicals, or by good management by an experienced operator. Because coagulation chemicals are expensive and experienced operators are in limited supply, improving the design is the feasible way to upgrade a secondary clarifier. Though the fluid dynamics are complex and difficult to model, a CFD model can correctly describe density flow, short-circuiting, turbulent dispersion and settling characteristics. There are few trustworthy methods for verifying such a hydrodynamic model, and it is very difficult to probe the flow experimentally in a secondary sedimentation tank because the experimental equipment disturbs the flow. The isotope tracer experiment, however, is known as a useful tool for studying the hydraulic characteristics and floc movement in the sedimentation tank, because the isotope tracer does not disturb the internal flow and provides data quickly through an on-line system. Therefore, the CFD model was developed so that the isotope tracer experiment could serve as a model verification method. Predicted results from the model simulation showed the same pattern over time as the experimental on-line data, and the results were compared with each other. The model also explained the detailed flow pattern in areas of the sedimentation tank without monitoring, and visualized the internal flow and concentration distribution over time using graphic software. Because of the complicated

  10. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model-Driven Engineering practitioners already benefit from many well-established verification tools, for instance for the Object Constraint Language (OCL). Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation, centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.
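    In the spirit of checking a declarative constraint against a model instance, for example an OCL invariant such as `self.trains->size() <= self.capacity`, a minimal Python analogue might look like the following; the classes, attribute names and domain are invented for illustration.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Track:
        capacity: int
        trains: list = field(default_factory=list)

    def check_invariant(model):
        """Validate the capacity invariant on every Track instance of a
        model, mirroring how an OCL checker evaluates a context invariant
        over all instances of its context class."""
        return all(len(t.trains) <= t.capacity for t in model)

    # One conforming track and one violating track -> prints False.
    print(check_invariant([Track(2, ["ICE-1"]), Track(1, ["RE-7", "RB-3"])]))
    ```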

  11. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will use the recommendations from this study in determining whether to proceed with development of the model.

  12. Verification of tropical cyclone using the KIAPS Integration Model (KIM)

    Science.gov (United States)

    Lim, S.; Seol, K. H.

    2015-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) is a government-funded non-profit research and development institute located in Seoul, South Korea. KIAPS is developing a global model, the backbone for the next-generation operational global numerical weather prediction (NWP) system, in three phases: Establishment and R&D Planning (2011-2013), Test Model Development (2014-2016), and Operational Model Development (2017-2019). As the second phase, we have a beta version of the KIAPS Integration Model (KIM) that can produce reasonable global forecasts. Using the KIM model, we evaluate tropical cyclone forecasts in the global model. To objectively provide a best estimate of the storm's central position, we use the Geophysical Fluid Dynamics Laboratory (GFDL) vortex tracker, a widely used tracking algorithm. It gives the track and intensity of the storm throughout the duration of the forecast. As a verification tool, we use the Model Evaluation Tools - Tropical Cyclone (MET-TC) package, which produces statistical evaluations. We expect these results to indicate the current capability of the KIM model for tropical cyclone forecasting.

  13. Development of Palmprint Verification System Using Biometrics

    Institute of Scientific and Technical Information of China (English)

    G. Shobha; M. Krishna; S.C. Sharma

    2006-01-01

    Palmprint verification using biometrics is one of the emerging technologies; it recognizes a person based on the principal lines, wrinkles and ridges on the surface of the palm. These line structures are stable and remain unchanged throughout the life of an individual. More importantly, no two palmprints from different individuals are the same, and normally people do not feel uneasy having their palmprint images taken for testing. Palmprint recognition therefore offers a promising future for medium-security access control systems. In this paper, a new approach for personal authentication using hand images is discussed. Gray-scale palm images are captured using a digital camera at a resolution of 640×480. Each of these gray-scale images is aligned and then used to extract palmprint and hand geometry features, which are then used for authenticating users. The image acquisition setup used here is inherently simple: it does not employ any special illumination, nor does it use any pegs that might cause inconvenience to users. Experimental results show that the designed system achieves an acceptable level of performance.

  14. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Young; Kim, Eung Soo [Seoul National University, Seoul (Korea, Republic of)

    2014-10-15

    The VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in the fission reactor system, and a tool for understanding its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible application, and the adoption of a distributed permeation model. Due to these features, BOTANIC can analyze a wide range of tritium-level systems with higher accuracy, as it has the capacity to solve distributed models. BOTANIC was verified using analytical solutions and benchmark codes, namely the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and with the TPAC and COMSOL calculations. Future work will focus on total system verification.

  15. Verification of Embedded Memory Systems using Efficient Memory Modeling

    CERN Document Server

    Ganai, Malay K; Ashar, Pranav

    2011-01-01

    We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port single memory system, to more commonly occurring systems with multiple memories, having multiple read and write ports. More importantly, we augment such EMM to providing correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and b) modeling arbitrary initial memory state precisely and thereby, providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...
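    The intuition behind EMM, representing a memory by its write history and read-forwarding semantics rather than by explicit bits, can be caricatured as follows; this is only a conceptual sketch, not the SAT-level constraint encoding the authors describe.

    ```python
    class AbstractMemory:
        """Model a memory by its write history instead of explicit bits:
        a read returns the most recent matching write, mirroring the
        forwarding semantics that EMM encodes as constraints."""
        def __init__(self):
            self.writes = []          # (address, data) in program order

        def write(self, addr, data):
            self.writes.append((addr, data))

        def read(self, addr, default=0):
            # Search backwards for the latest write to this address;
            # fall back to an arbitrary initial value if none exists.
            for a, d in reversed(self.writes):
                if a == addr:
                    return d
            return default

    m = AbstractMemory()
    m.write(0x10, 42)
    m.write(0x20, 7)
    print(m.read(0x10), m.read(0x30))   # 42 0
    ```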

  16. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  17. Verification of a Probabilistic Model for A Distribution System with Integration of Dispersed Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte;

    2008-01-01

    In order to assess the present and predict the future distribution system performance using a probabilistic model, verification of the model is crucial. This paper illustrates the error caused by using traditional Monte Carlo (MC) based probabilistic load flow (PLF) when involving tap...... obtained from the developed probabilistic model....

  18. LithoScope: Simulation Based Mask Layout Verification with Physical Resist Model

    Science.gov (United States)

    Qian, Qi-De

    2002-12-01

    Simulation-based mask layout verification and optimization is a cost-effective way to ensure high mask performance in wafer lithography. Because mask layout verification serves as a gateway to the expensive manufacturing process, the model used for verification must have superior accuracy to the models used upstream. In this paper, we demonstrate, for the first time, a software system for mask layout verification and optical proximity correction that employs a physical resist development model. The new system, LithoScope, predicts wafer patterning by solving optical and resist processing equations on a scale that was until recently considered impractical. Leveraging the predictive capability of the physical model, LithoScope can perform mask layout verification and optical proximity correction under a wide range of processing conditions and for any reticle enhancement technology, without the need for multiple model development. We show the ability of the physical resist model to change iso-focal bias by optimizing resist parameters, which is critical for matching the experimental process window. We present line width variation statistics and chip-level process window predictions using a practical cell layout, and show that the LithoScope model can accurately describe resist-intensive poly gate layer patterning. This system can be used to pre-screen mask data problems before manufacturing, reducing the overall cost of the mask and the product.

  19. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  1. Correction, improvement and model verification of CARE 3, version 3

    Science.gov (United States)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document Correction, Improvement, and Model Verification of CARE 3, Version 3 was written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, 'Correction and Improvement of CARE 3, Version 3', April 1983.

  2. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release.

  3. Demonstration of Design Verification Model of Rubidium Frequency Standard

    CERN Document Server

    Ghosal, Bikash; Nandanwar, Satish; Banik, Alak; Dasgupta, K S; Saxena, G M

    2011-01-01

    In this paper we report the development of the design verification model (DVM) of a Rb atomic frequency standard. The Rb atomic frequency standard, or clock, has two distinct parts: the physics package, where the hyperfine transitions produce the clock signal in the integrated filter cell configuration, and the electronic circuits, which generate the resonant microwave hyperfine frequency and implement the phase modulator and phase-sensitive detector. In this paper the details of the Rb physics package and the electronic circuits are given. The effect of putting the photodetector inside the microwave cavity is studied, and its effect on the resonance signal profile is reported. The Rb clock frequency stability measurements are also discussed.

  4. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
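    One of the simplest checks such a toolkit automates is a bit-for-bit comparison of a regression run against reference output, with a difference report as fallback; the following is a minimal sketch of that idea only (LIVVkit itself operates on model output files and configuration sets).

    ```python
    import numpy as np

    def compare_fields(test, ref):
        """Report whether two model output fields match bit-for-bit and,
        if not, the largest absolute difference."""
        exact = np.array_equal(test, ref)
        max_diff = 0.0 if exact else float(np.max(np.abs(test - ref)))
        return {"bit_for_bit": exact, "max_abs_diff": max_diff}

    ref = np.linspace(0.0, 1.0, 5)
    test = ref + 1e-12 * np.array([0, 1, 0, -1, 0])  # tiny perturbation
    print(compare_fields(test, ref))
    ```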

  5. Sensor Fusion and Model Verification for a Mobile Robot

    OpenAIRE

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck; Bendtsen, Jan Dimon; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practi...

  6. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  7. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  8. High temperature furnace modeling and performance verifications

    Science.gov (United States)

    Smith, James E., Jr.

    1992-01-01

    Analytical, numerical, and experimental studies were performed on two classes of high-temperature materials processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high-temperature furnace using a zirconia ceramic tube as the heating element, and on an Arc Furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct the parts needed to perform experiments successfully. The second objective was to evaluate the zirconia furnace's performance as a directional solidification furnace element. The third objective was to establish a database on the materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. One- and two-dimensional spectral radiation heat transfer models were developed for comparison with standard modeling techniques and were used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the Arc Furnace, which was used to design sample holders and to estimate cooling media temperatures for steady-state operation of the furnace. The fifth objective addressed the initial performance evaluation of the Arc Furnace and associated equipment for directional solidification. Results for these objectives are presented.

  9. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the open issues in the field of verification/validation of model transformations.

  10. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of the verification of a dynamic model of a drive system with gears. Tests were carried out on the real object under different operating conditions, and simulation studies were carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research with simulations using the dynamic model.

  11. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable a better understanding of the system, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data etc.), and a behavioural or dynamic part (states, transitions, activities, sequences etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  12. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
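
    The Dempster-Shafer fusion step described above can be made concrete with a small sketch. The mass assignment below (module applicability scales belief, with the remainder assigned to the whole frame as "unknown") is an assumed reading of the record, not the authors' implementation; the Python code only illustrates Dempster's rule of combination for two modules.

        def module_mass(p_correct, p_applicable):
            """Mass over focal sets {C}, {I} and the frame Theta = {C, I} ("unknown")."""
            return {"C": p_applicable * p_correct,
                    "I": p_applicable * (1.0 - p_correct),
                    "Theta": 1.0 - p_applicable}

        def combine(m1, m2):
            """Dempster's rule of combination on the two-element frame {C, I}."""
            conflict = m1["C"] * m2["I"] + m1["I"] * m2["C"]
            k = 1.0 - conflict  # normalization constant
            return {"C": (m1["C"] * m2["C"] + m1["C"] * m2["Theta"] + m1["Theta"] * m2["C"]) / k,
                    "I": (m1["I"] * m2["I"] + m1["I"] * m2["Theta"] + m1["Theta"] * m2["I"]) / k,
                    "Theta": m1["Theta"] * m2["Theta"] / k}

        # Two hypothetical modules judging the same database road object:
        fused = combine(module_mass(0.9, 0.8), module_mass(0.6, 0.3))
        print(fused)  # belief masses for correct / incorrect / unknown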

  13. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2:A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-04-01

    Shortwave solar radiation was available, but ERDC chose to let the model calculate it internally because this produced better results (SROC = OFF...Although ERDC was provided with hourly meteorological data, W2 was still allowed to interpolate the input data to correspond to the model time-step by...the dam, ERDC has found in previous studies that the model performs better when W2 is allowed to calculate SRO (shortwave solar radiation

  14. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  15. Formal Specifications and Verification of a Secure Communication Protocol Model

    Institute of Scientific and Technical Information of China (English)

    夏阳; 陆余良; 蒋凡

    2003-01-01

    This paper presents a secure communication protocol model, EABM, by which network security communication can be realized easily and efficiently. First, the paper gives a thorough analysis of the protocol system, systematic construction and state transition of EABM. Then, it describes the channels and the process of state transition of EABM in terms of ESTELLE. Finally, it offers a verification of the accuracy of the EABM model.

  16. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    In this book, the authors present current research on the types, design and safety issues of railways. Topics discussed include the acoustic characteristics of noise in train stations; monitoring railway structure conditions and opportunities to use wireless sensor networks as tools to improve the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity of the Portuguese broad gauge railway network (1948-2012).

  18. Gaia challenging performances verification: combination of spacecraft models and test results

    Science.gov (United States)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  19. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V&V). Here, we focus on V&V of high explosive models. Typically, model developers implement their models in their own hydro codes and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V&V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  20. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  1. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by a slow, laminar porous medium with small fissures, and usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed, exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure, which improves our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of heterogeneity knowledge (especially of conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility to verify karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and drainage plastic pipes (1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as conduit location and geometry. Moreover, the geometry of the conduit perforations is known, which enables analysis of the interaction between matrix and conduits. In addition, pressure and precipitation distributions and discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  2. Assessment of Galileo modal test results for mathematical model verification

    Science.gov (United States)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  3. Verification of A Numerical Harbour Wave Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A numerical model for wave propagation in a harbour is verified by use of physical models. The extended time-dependent mild slope equation is employed as the governing equation, and the model is solved by use of the ADI method containing a relaxation factor. Firstly, the reflection coefficient of waves in front of rubble-mound breakwaters under oblique incident waves is determined through physical model tests, and it is regarded as the basis for simulating partial reflection boundaries of the numerical model. Then model tests on refraction, diffraction and reflection of waves in a harbour are performed to measure the wave height distribution. Comparative results between physical and numerical model tests show that the present numerical model can satisfactorily simulate the propagation of regular and irregular waves in a harbour with complex topography and boundary conditions.
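
    The record does not reproduce the governing equation; one standard form of the time-dependent mild slope equation (the "extended" variant solved in the paper may carry additional higher-order bottom-slope and curvature terms) is

        \frac{\partial^2 \eta}{\partial t^2} - \nabla \cdot \left( c\, c_g\, \nabla \eta \right) + \left( \omega^2 - k^2 c\, c_g \right) \eta = 0,

    where \eta is the free-surface elevation, c the phase celerity, c_g the group celerity, k the wavenumber and \omega the angular frequency.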

  4. Spatial Error Metrics for Oceanographic Model Verification

    Science.gov (United States)

    2012-02-01

    quantitatively and qualitatively for this oceanographic data and successfully separates the model error into displacement and intensity components. This... oceanographic models as well, though one would likely need to make special modifications to handle the often-used nonuniform spacing between depth layers

  5. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. Mainly, when using simulation and formal verification analysis techniques, it is necessary to develop plant models in order to describe the plant behavior of those systems. However, developing a plant model implies that the designer makes decisions concerning the granularity and level of abstraction of the models, the approach to consider for modeling (global or modular), and the definition of strategies for simulation and formal verification tasks. This paper intends to highlight some aspects that can be considered when making those decisions. For this purpose, a case study is presented, and very important aspects concerning the above issues are illustrated and discussed.

  6. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    Energy Technology Data Exchange (ETDEWEB)

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria is indicated by matrices with verification discussion, analysis, and enclosed test results.

  7. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement.

    Science.gov (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L

    2015-02-01

    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.

  8. Superelement Verification in Complex Structural Models

    Directory of Open Access Journals (Sweden)

    B. Dupont

    2008-01-01

    Full Text Available The objective of this article is to propose decision indicators to guide the analyst in the optimal definition of an ensemble of superelements in a complex structural assembly. These indicators are constructed based on comparisons between the unreduced physical model and the approximate solution provided by a nominally reduced superelement model. First, the low contribution substructure slave modes are filtered. Then, the minimum dynamical residual expansion is used to localize the superelements which are the most responsible for the response prediction errors. Moreover, it is shown that static residual vectors, which are a natural result of these calculations, can be included to represent the contribution of important truncated slave modes and consequently correct the deficient superelements. The proposed methodology is illustrated on a subassembly of an aeroengine model.

  9. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    It becomes more complicated when considering the shape and phase of the ground below the seawater. Therefore, some different attempts are required to precisely analyze the behavior of tsunami. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH) and the associated verification work with some practice simulations. This paper summarizes the on-going development and verification activities on the Lagrangian mesh-free SPH code in SNU. The newly developed code covers the equations of motion and the heat conduction equation so far, and verification of each model is completed. In addition, parallel computation using GPU is now possible, and a GUI is also prepared. If users change the input geometry or input values, they can run simulations for various conditions and geometries. The SPH method has large advantages and potential in modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulties analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, application of the current SPH code is expected to be much extended, including molten fuel behavior in severe accidents.
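
    The core of any such SPH code is a kernel-weighted particle summation. The following minimal sketch (standard cubic-spline kernel and density estimate, not the SNU code itself) illustrates the idea in Python:

        import numpy as np

        def w_cubic_spline(r, h):
            """Standard 2-D cubic-spline SPH kernel, normalization 10/(7*pi*h^2)."""
            sigma = 10.0 / (7.0 * np.pi * h * h)
            q = r / h
            w = np.zeros_like(q)
            inner = q <= 1.0
            outer = (q > 1.0) & (q < 2.0)
            w[inner] = sigma * (1.0 - 1.5 * q[inner]**2 + 0.75 * q[inner]**3)
            w[outer] = sigma * 0.25 * (2.0 - q[outer])**3
            return w

        def density(positions, masses, h):
            """SPH density estimate: rho_i = sum_j m_j W(|x_i - x_j|, h)."""
            diff = positions[:, None, :] - positions[None, :, :]
            r = np.linalg.norm(diff, axis=-1)
            return (masses[None, :] * w_cubic_spline(r, h)).sum(axis=1)

        # A small block of fluid particles on a regular 0.1 m lattice:
        xy = 0.1 * np.stack(np.meshgrid(np.arange(10), np.arange(10)), -1).reshape(-1, 2).astype(float)
        print(density(xy, np.full(len(xy), 0.01), h=0.15).mean())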

  10. Verification of the Chesapeake Bay Model.

    Science.gov (United States)

    1981-12-01

    line of the five cups was about 0.045 ft above the bottom of the meter frame; ...about 0.1 ft in the model, represented a horizontal width of about 100 ft in the prototype. The height of the meter cups, about 0.04 ft, represented...the entire bay. Although station-to-station wind magnitude comparisons cannot be made due to variations in anemometer height and exposure, wind-field

  11. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

    Full Text Available Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as a UPPAAL model. The model is used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
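
    The quantity the model estimates has a well-known closed-form approximation from Nakamoto's original paper; the sketch below computes it for an attacker holding a fraction q of the hash power after z confirmations. It is offered only as an illustration of the probability in question; the UPPAAL model's semantics may differ.

        from math import exp, factorial

        def double_spend_probability(q, z):
            """Nakamoto's approximation of eventual attacker success."""
            p = 1.0 - q  # honest fraction of hash power
            if q >= p:
                return 1.0  # a majority attacker always succeeds
            lam = z * q / p
            s = 1.0
            for k in range(z + 1):
                poisson = exp(-lam) * lam**k / factorial(k)
                s -= poisson * (1.0 - (q / p)**(z - k))
            return s

        for z in (1, 2, 6):
            print(z, round(double_spend_probability(0.1, z), 6))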

  12. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
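
    As a hedged sketch of the "traditional accuracy-based measures" mentioned above (illustrative Python, not LVT's actual interface or metric list):

        import numpy as np

        def verification_metrics(model, obs):
            """Bias, RMSE, unbiased RMSE and correlation between model and observations."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            err = model - obs
            return {"bias": err.mean(),
                    "rmse": np.sqrt((err**2).mean()),
                    "ubrmse": np.sqrt(((err - err.mean())**2).mean()),
                    "r": np.corrcoef(model, obs)[0, 1]}

        print(verification_metrics([0.21, 0.25, 0.30], [0.20, 0.27, 0.33]))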

  13. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-02-01

    Full Text Available Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  15. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb₃Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb₃Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb₃Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  16. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. 1 Project background and state of the art: Over the next 10 years all Danish railway signalling systems are going to be completely replaced with modern, computer based railway control systems based on the European standard ERTMS/ETCS [3, 4] by the Danish Signaling Programme [1]. The purpose of these systems is to control the railway traffic such that unsafe situations, like train collisions, are avoided. Central parts of these new systems consist of safety-critical software, the functional correctness of which is one of the key requisites for a reliable operation of the traffic and in particular for the safety of passengers. Until now the development of railway control software has typically been...

  17. Weather model verification using Sodankylä mast measurements

    Directory of Open Access Journals (Sweden)

    M. Kangas

    2015-12-01

    Full Text Available Sodankylä, in the heart of the Arctic Research Centre of the Finnish Meteorological Institute (FMI ARC) in northern Finland, is an ideal site for atmospheric and environmental research in the boreal and sub-arctic zone. With temperatures ranging from −50 to +30 °C, it provides a challenging testing ground for numerical weather prediction (NWP) models as well as weather forecasting in general. An extensive set of measurements has been carried out in Sodankylä for more than 100 years. In 2000, a 48 m high micrometeorological mast was erected in the area. In this article, the use of Sodankylä mast measurements in NWP model verification is described. Started in 2000 with the NWP model HIRLAM and Sodankylä measurements, the verification system has now been expanded to include comparisons between 12 NWP models and seven measurement masts. A case study, comparing forecast and observed radiation fluxes, is also presented. It was found that three different radiation schemes, applicable in the NWP model HARMONIE-AROME, produced somewhat different downwelling long-wave radiation fluxes during cloudy days, which however did not change the overall cold bias of the predicted screen-level temperature.

  18. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language, but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the functional mock-up interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
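
    The statistical model checking idea itself is simple to sketch: run N simulations, check the contract on each trace, and choose N from a Chernoff-Hoeffding bound for the desired precision. The mechanics below are generic and assumed; PLASMA/DESYRE internals are not reproduced.

        import math, random

        def required_samples(epsilon, delta):
            """Chernoff-Hoeffding bound: estimate within epsilon with confidence 1 - delta."""
            return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon**2))

        def estimate_satisfaction(simulate_trace, contract_holds, epsilon=0.02, delta=0.01):
            n = required_samples(epsilon, delta)
            hits = sum(contract_holds(simulate_trace()) for _ in range(n))
            return hits / n, n

        # Toy stand-ins for a stochastic SoS simulation and a contract check:
        sat_prob, n = estimate_satisfaction(lambda: random.gauss(0.0, 1.0),
                                            lambda trace: trace < 1.64)
        print(f"P(contract) ~ {sat_prob:.3f} from {n} runs")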

  19. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release – a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.
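
    The BMC mechanics can be illustrated on a toy transition system using the z3py SMT bindings (a minimal sketch; the paper's interlocking models and invariants are far richer):

        from z3 import Int, Solver, sat

        def bmc(k):
            """Unroll a counter system k steps and look for a violation of x <= step."""
            s = Solver()
            x = [Int(f"x_{i}") for i in range(k + 1)]
            s.add(x[0] == 0)  # initial state
            for i in range(k):
                s.add(x[i + 1] == x[i] + 1)  # transition relation
            s.add(x[k] > k)  # negation of the safety invariant
            return s.check() == sat  # sat => counterexample of length k

        print(any(bmc(k) for k in range(1, 10)))  # False: invariant holds up to depth 9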

  1. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the analyses of the model in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  2. In-Space Engine (ISE-100) Development - Design Verification Test

    Science.gov (United States)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad

    2017-01-01

    In the past decade, NASA has formulated science mission concepts in anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered for maximizing science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster design is based on heritage Missile Defense Agency (MDA) technology aimed at a lightweight and efficient system in terms of volume and packaging. It runs on a hypergolic bi-propellant system, MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2), for NASA spacecraft applications. This propellant system provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with the capability of pulse-mode operation for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign was dedicated to design verification of the thruster. This presentation reports the efforts of the design verification hot-fire test program of the ISE-100 thruster, a collaboration between NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advance Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.

  3. Development and verification of a time delivery model for prostate intensity modulated radiotherapy using a Siemens® Artiste™ 160 Multi-leaf Collimator Linac.

    Science.gov (United States)

    Fourie, Nicola; Ali, Omer A; Rae, William I D

    2017-03-01

    Time delivery models thus far proposed for prediction of radiotherapy delivery times are not applicable to all makes of Linac. Our purpose was to develop a time delivery model that would also be applicable to a Siemens® ARTISTE™ 160 Multi-leaf Collimator (MLC) linear accelerator (Linac) and to validate the model using prostate Intensity Modulated Radiation Therapy (IMRT) treatment plans. To our knowledge, a time delivery model has not yet been proposed for a Siemens® ARTISTE™ 160 MLC Linac. We used the principles of the time delivery model created for a Varian® Linac and added the radio frequency (RF) wave component, and the MLC delay time to the MLC travel time component. Machine input parameters were confirmed using a WIN® stopwatch. We tested our derived model by selecting ten random 15 MV prostate IMRT treatment plans from our clinic. The delivery time was measured three times, once per day on three different days. The calculated and measured times were compared by means of correlation. The delivery time ranged between 314 and 480 s. The largest percentage difference was 3.3% (16 s) and the smallest 0.2% (1 s); the mean percentage difference was 1.9%. MLC delay and MLC speed, representing segment delivery, had the greatest uncertainties. From the successfully verified time delivery model, it is concluded that the inter-segmental component of the process is the most time-consuming. In order to decrease delivery time it is proposed that the total number of segments in a treatment plan be decreased.
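
    A minimal sketch of such a step-and-shoot delivery-time model: beam-on time per segment plus the inter-segment overheads named above (RF wave, MLC delay, MLC travel). All machine parameter values here are made-up placeholders, not measured Siemens ARTISTE values.

        def delivery_time(segments, dose_rate_mu_per_s=5.0, rf_wave_s=0.6,
                          mlc_delay_s=0.4, mlc_speed_cm_per_s=2.0):
            total = 0.0
            for mu, max_leaf_travel_cm in segments:
                total += mu / dose_rate_mu_per_s  # beam-on time for the segment
                total += rf_wave_s + mlc_delay_s  # per-segment overheads
                total += max_leaf_travel_cm / mlc_speed_cm_per_s  # slowest-leaf travel
            return total

        # Ten hypothetical segments of (monitor units, largest leaf travel in cm):
        plan = [(12.0, 3.5)] * 10
        print(f"{delivery_time(plan):.1f} s")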

  4. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate those uncertainties through the model, so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification
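
    One common parameter-selection heuristic of the kind verified in such work ranks parameters by the column norms of a finite-difference sensitivity matrix; parameters with negligible influence are candidates for fixing. The model below is a toy stand-in, not the HIV model of the dissertation.

        import numpy as np

        def sensitivity_matrix(model, theta, h=1e-6):
            y0 = model(theta)
            cols = []
            for j in range(len(theta)):
                tp = theta.copy()
                tp[j] += h * max(1.0, abs(theta[j]))
                cols.append((model(tp) - y0) / (tp[j] - theta[j]))
            return np.column_stack(cols)

        def rank_parameters(model, theta):
            S = sensitivity_matrix(model, np.asarray(theta, float))
            influence = np.linalg.norm(S, axis=0)  # per-parameter sensitivity norm
            return sorted(enumerate(influence), key=lambda t: -t[1])

        toy = lambda th: np.array([th[0] * np.exp(-th[1] * t) + 1e-4 * th[2] for t in range(5)])
        print(rank_parameters(toy, [1.0, 0.3, 2.0]))  # th[2] ranks last: weakly identifiable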

  5. Verification and Validation of the Coastal Modeling System. Report 3: CMS-Flow: Hydrodynamics

    Science.gov (United States)

    2011-12-01

    ERDC/CHL TR-11-10, December 2011: Verification and Validation of the Coastal Modeling System, Report 3, CMS-Flow: Hydrodynamics. Alejandro Sánchez, Weiming Wu... of four reports toward the Verification and Validation (V&V) of the Coastal Modeling System (CMS). The details of the V&V study specific to the

  6. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

  7. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  8. Development and Verification of 3000Rpm 48Inch Integral Shroud Blade for Steam Turbine

    Science.gov (United States)

    Kaneko, Yasutomo; Mori, Kazushi; Ohyama, Hiroharu

    The 3000rpm 48inch blade for steam turbines was developed as one of the new standard series of LP end blades. The new LP end blades are characterized by the ISB (Integral Shroud Blade) structure. In the ISB structure, blades are continuously coupled by blade untwist due to centrifugal force when the blades rotate at high speed. Therefore, the number of resonant vibration modes can be reduced by virtue of the vibration characteristics of the circumferentially continuous blades, and the resonant stress can be decreased by the additional friction damping generated at shrouds and stubs. In order to develop the 3000rpm 48inch blade, the latest analysis methods for predicting the vibration characteristics of the ISB structure were applied, after confirming their validity for the blade design. Moreover, verification tests such as rotational vibration tests and model turbine tests were carried out in the shop to confirm the reliability of the developed blade. As the final verification test, a field test of the actual steam turbine was carried out at the site during trial operation, and the vibration stress of the 3000rpm 48inch blade was measured by use of a telemetry system. In the field test, the vibratory stress of the blade was measured under various operating conditions for more than one month. This paper first presents the up-to-date design technology applied to the design of the 3000rpm 48inch blade. Second, the results of the various verification tests carried out in the shop are presented together with their procedures. Lastly, the results of the final verification tests of the 3000rpm 48inch blade carried out at the site are presented.

  9. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

    However much its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated for diseases such as the recently soaring malignant tumors is the most important factor in radiotherapy practice. In reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment is a pressing issue (doses administered to patients in radiotherapy are very large, about three times higher than lethal doses). Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, neither are the legal and regulatory systems to implement a quality assurance program sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered to patients, together with the procedures needed to maintain the continuing performance of radiotherapy machines and equipment. The QA program and procedures should ensure proper calibration of the machines and equipment and definitely establish the safety of patients in radiotherapy. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients, and verification of the performance of the therapy machines and equipment are performed.

  10. Development and Verification of a Numerical Forecast Model for Road Meteorological Services

    Institute of Scientific and Technical Information of China (English)

    孟春雷; 张朝林

    2012-01-01

    Accurate road weather forecasts and road traffic information are very important to road transportation safety. Road surface temperature is a crucial parameter in traffic weather forecasting. Three main kinds of road surface parameter forecast models exist: statistical models, GIS-based models and physical models. Physical models are widely used and mainly consider the road surface energy balance and the effect of anthropogenic heat. In 2008, based on the rapid update cycle forecast system (BJ-RUC), a road weather information system was developed and run operationally by the Institute of Urban Meteorology. Since 2007, the Beijing Meteorological Bureau has established 18 weather stations along expressways using apparatus manufactured by Vaisala in Finland, and 8 visibility observation stations using digital visibility sensors. These all make fine traffic weather forecasting and operational runs possible. A fine numerical model for urban road surface temperature (RST), snow depth, ice depth and water depth prediction (BJ-ROME) was developed based on the Common Land Model (CoLM). The model accounts not only for the imperviousness, relatively low albedo, low heat capacity and high thermal conductivity of road surfaces, but also for the influence of urban anthropogenic heat. It is driven by meteorological forcing fields produced by BJ-RUC, with a forecast span of 24 h updated every 3 h. Forecasts were verified against road surface temperature observations from the Vaisala road weather stations in Beijing for 9-24 August 2009 and snow depth observations for 3-4 January 2010, and sensitivity experiments were performed. The results show that BJ-ROME accurately predicts road surface temperature extremes and diurnal variation under both clear-sky and precipitation conditions, and satisfactorily reproduces the maximum snow depth and its evolution in time.
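
    The surface energy balance such physical models solve can be sketched as follows; every coefficient below is illustrative only, not a CoLM/BJ-ROME parameter value.

        SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W m-2 K-4)

        def surface_step(T_s, sw_down, lw_down, T_air, dt=60.0, albedo=0.10,
                         emiss=0.95, h_c=15.0, g_flux=30.0, q_anthro=20.0):
            """One explicit update of road surface temperature T_s (K)."""
            net = ((1.0 - albedo) * sw_down              # absorbed shortwave (low albedo)
                   + emiss * (lw_down - SIGMA * T_s**4)  # longwave budget
                   - h_c * (T_s - T_air)                 # sensible heat to the air
                   - g_flux                              # conduction into the road body
                   + q_anthro)                           # anthropogenic (traffic) heat
            heat_capacity = 2.0e6 * 0.05                 # J m-2 K-1 for a 5 cm surface slab
            return T_s + dt * net / heat_capacity

        T = 285.0
        for _ in range(60):  # one hour of 60 s steps
            T = surface_step(T, sw_down=600.0, lw_down=320.0, T_air=288.0)
        print(round(T, 2))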

  11. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of Amchitka underground nuclear tests conducted in 2002 is verified and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure of the subsurface, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output verification. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
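
    The backward-then-forward propagation described above rests on a Metropolis-type sampler; a generic sketch (toy Gaussian likelihood and prior, not the Amchitka groundwater model) is:

        import math, random

        def log_post(theta, data):
            log_prior = -0.5 * theta**2  # standard normal prior
            log_like = sum(-0.5 * (d - theta)**2 for d in data)  # unit-variance Gaussian
            return log_prior + log_like

        def metropolis(data, n=5000, step=0.5, theta=0.0):
            samples = []
            for _ in range(n):
                prop = theta + random.gauss(0.0, step)
                if math.log(random.random()) < log_post(prop, data) - log_post(theta, data):
                    theta = prop  # accept the proposal
                samples.append(theta)
            return samples

        draws = metropolis(data=[1.8, 2.2, 2.0, 1.9])
        print(sum(draws[1000:]) / len(draws[1000:]))  # posterior mean after burn-in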

  12. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that by combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements to data collection and analysis, improved prediction of damage costs will be possible, e.g. based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is necessary to improve inundation modelling and economic assessments for urban drainage designs.

  13. Verification of flood damage modelling using insurance data.

    Science.gov (United States)

    Zhou, Q; Panduro, T E; Thorsen, B J; Arnbjerg-Nielsen, K

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that by combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements to data collection and analysis, improved prediction of damage costs will be possible, for example based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is needed to improve inundation modelling and economic assessments for urban drainage designs.
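
    The paper's statistical machinery is not reproduced in the record; a minimal sketch of the kind of relationship it describes, daily claim counts modelled on simple rainfall statistics with a Poisson GLM, might look as follows. The data are synthetic and the covariates illustrative, not the Danish insurance data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)

        # Synthetic daily data: rainfall depth (mm) and peak intensity (mm/h).
        n = 365
        rain = rng.gamma(shape=0.4, scale=12.0, size=n)
        intensity = rain * rng.uniform(0.1, 0.5, size=n)
        claims = rng.poisson(np.exp(-2.0 + 0.04 * rain + 0.05 * intensity))

        # Poisson regression of daily claim counts on the rainfall statistics.
        X = sm.add_constant(np.column_stack([rain, intensity]))
        fit = sm.GLM(claims, X, family=sm.families.Poisson()).fit()
        print(fit.summary())

        # Expected claims on a day with 30 mm of rain and 10 mm/h peak intensity.
        print(fit.predict([[1.0, 30.0, 10.0]]))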

  14. Development and Preliminary Verification of GASFLOW Code Coupled with Film Model

    Institute of Scientific and Technical Information of China (English)

    王方年; 沈峰; 程旭; 黄兴冠

    2015-01-01

    A model of heat-structure wall surface film coverage and evaporation was developed based on the three-dimensional CFD containment code GASFLOW. The containment temperature and pressure response and the passive containment cooling system (PCS) performance of the AP1000 during a large-break LOCA were analyzed with the GASFLOW code coupled with the film model. The calculated results were compared with those of the containment codes WGOTHIC, MELCOR and CONTAIN for the same accident scenario. The results show that the modified GASFLOW code coupled with the film model is feasible for analyzing the thermal-hydraulic behavior of the PCS of a PWR, and that its basic functions meet the requirements of the calculation.

  15. Development of a Scalable Testbed for Mobile Olfaction Verification.

    Science.gov (United States)

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when the sensors are exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments.

  16. Analog Video Authentication and Seal Verification Equipment Development

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Lancaster

    2012-09-01

    Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory, in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory and Milagro Consulting, LLC, developed equipment for use within a chain of custody regime. This paper discusses two specific devices, the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained there is also discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain of custody applications. However, in some applications analog cameras are used, and since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain of custody and authentication activities. This seal reader is unique in its ability to image various types of seals, including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.

  17. Development of a Scalable Testbed for Mobile Olfaction Verification

    Directory of Open Access Journals (Sweden)

    Syed Muhammad Mamduh Syed Zakaria

    2015-12-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when the sensors are exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments.

  18. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

    This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J., Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: a computational verification study can be run directly in a PDF, thereby allowing a technical author to report numerical subtleties that have traditionally been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  19. Approach for estimating microbial growth and the biodegradation of hydrocarbon contaminants in subsoil based on field measurements: 1. Model development and verification.

    Science.gov (United States)

    Song, Dejun; Katayama, Arata

    2010-01-15

    An approach was developed to represent microbial growth and the corresponding biodegradation of hydrocarbons (HCs) during the natural attenuation process, based on field measurements of in situ microbial biomass and residual HC concentrations in unsaturated subsurface soil. A kinetic model combining Monod and logistic kinetics represents microbial growth under the limitation of HCs as substrates and of environmental factors at actual contaminated sites, through the introduction of two new kinetic parameters: the effective rate and the self-limiting coefficient of microbial growth. The correspondence between microbial growth and the biodegradation of HCs in the soil is obtained by dividing the HCs and the corresponding degrading microbial groups into two classes: saturated HCs as inert components and aromatic HCs, which form a contamination plume, as dissolved components. Respiratory quinones were used as indicators of microbial biomass. The biodegradation capacity of contaminated sites was evaluated from the maximum microbial biomass obtained by field measurements, which is considered to integrate the measurements of HCs, degradation kinetics, and environmental factors at the site. The feasibility of the proposed approach was verified at two hypothetical contaminated sites. The results suggested that the proposed approach is feasible for application at actual HC-contaminated sites.
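
    The abstract names the combined Monod-logistic form but gives no equations; one plausible reading, with invented parameter values, couples growth dX/dt = mu_eff * S/(K_S + S) * X * (1 - X/X_max) to substrate depletion dS/dt = -(dX/dt)/Y, where mu_eff is the effective growth rate and the (1 - X/X_max) factor is the self-limiting term.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters (not taken from the paper).
        mu_eff = 0.5    # effective growth rate (1/d)
        Ks     = 10.0   # half-saturation constant (mg HC per kg soil)
        Xmax   = 5.0    # self-limiting maximum biomass (arbitrary units)
        Y      = 0.1    # yield: biomass produced per unit HC degraded

        def rhs(t, y):
            X, S = y
            growth = mu_eff * S / (Ks + S) * X * (1.0 - X / Xmax)  # Monod x logistic
            return [growth, -growth / Y]

        sol = solve_ivp(rhs, (0.0, 100.0), [0.05, 200.0], dense_output=True)
        for ti in np.linspace(0.0, 100.0, 6):
            X, S = sol.sol(ti)
            print(f"t={ti:5.1f} d  biomass={X:5.2f}  residual HC={S:7.2f}")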

  20. Analysis of thin-walled cylindrical composite shell structures subject to axial and bending loads: Concept development, analytical modeling and experimental verification

    Science.gov (United States)

    Mahadev, Sthanu

    Continued research and development efforts in recent years have generated novel avenues for the advancement of efficient, slender laminated fiber-reinforced composite members. Numerous studies have focused on the modeling and response characterization of composite structures, with particular relevance to thin-walled cylindrical composite shells. This class of shell configurations is being actively explored to fully determine its mechanical efficacy for primary aerospace structural members. The proposed research formulates a composite-shell-theory-based prognosis methodology that entails a detailed analysis of thin-walled cylindrical laminated composite configurations, which are highly desirable in an increasing number of mechanical and aerospace applications. The prime motivation for adopting this theory is its ability to generate simple yet viable closed-form analytical solution procedures for numerous geometrically intense, inherently curved composite structures. This analytical evaluative routine offers first-hand insight into the primary mechanical characteristics that govern the behavior of slender composite shells under typical static loading conditions. The current work demonstrates the robustness of this mathematical framework through the prediction of structural properties such as axial stiffness and bending stiffness. Longitudinal ply-stress computations are investigated upon deriving the global stiffness matrix model for composite cylindrical tubes with circular cross-sections. Additionally, this work employs a finite-element-based numerical technique to substantiate the analytical results reported for cylindrically shaped circular composite tubes. Furthermore, this concept development is extended to the study of thin-walled, open cross-sectioned, curved laminated shells that are geometrically

  1. Development and preliminary verification of the 3D core neutronic code: COCO

    Energy Technology Data Exchange (ETDEWEB)

    Lu, H.; Mo, K.; Li, W.; Bai, N.; Li, J. [Reactor Design and Fuel Management Research Center, China Nuclear Power Technology Research Inst., 47F/A Jiangsu Bldg., Yitian Road, Futian District, Shenzhen (China)

    2012-07-01

    With its recent booming economic growth and the environmental concerns that follow, China is proactively pushing forward nuclear power development and encouraging the tapping of clean energy. Under this situation, CGNPC, as one of the largest energy enterprises in China, is planning to develop its own nuclear technology in order to support the increasing number of nuclear plants either under construction or in operation. This paper introduces the recent progress in software development at CGNPC. The focus is placed on the physical models and preliminary verification results from the recent development of the 3D core neutronic code COCO. In the COCO code, the non-linear Green's function method is employed to calculate the neutron flux. In order to use the discontinuity factor, the Neumann (second kind) boundary condition is utilized in the Green's function nodal method. Additionally, the COCO code includes the necessary physical models, e.g. a single-channel thermal-hydraulic module, a burnup module, a pin power reconstruction module and a cross-section interpolation module. The preliminary verification results show that the COCO code is sufficient for reactor core design and analysis for pressurized water reactors (PWR). (authors)

  2. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.
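
    As one concrete instance of the borrowed verification measures, the sketch below computes a rank histogram, a standard distribution check for ensembles: if observations are statistically indistinguishable from the ensemble members, the histogram is flat, while a U shape indicates under-dispersion and a dome over-dispersion. The data are synthetic, not the SAC-SMA ensembles.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic ensemble of simulations: 50 members x 1000 time steps.
        n_members, n_times = 50, 1000
        ens = rng.normal(loc=10.0, scale=2.0, size=(n_members, n_times))
        obs = rng.normal(loc=10.0, scale=2.0, size=n_times)  # same law -> flat

        # Rank of each observation within the sorted ensemble (0 .. n_members).
        ranks = (ens < obs).sum(axis=0)
        hist = np.bincount(ranks, minlength=n_members + 1) / n_times
        print(np.round(hist, 4))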

  3. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components: a deformation model involving elastic and plastic deformations, a damage model, and a failure model. The model is driven by tabular data that are generated either using laboratory tests or via virtual testing. A unidirectional composite, T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is still under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  4. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data. The proposed method is based on a combination of formal methods and domain-specific approaches. While formal methods offer mathematically rigorous specification, verification and validation, domain-specific approaches encapsulate the use of formal methods with familiar concepts and notions of the domain, hence making the method easy for railway engineers to use. Furthermore, the method features a 4-step verification and validation approach that can be integrated naturally into different phases of the software development process. This 4-step approach identifies possible errors in generic applications

  5. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple, highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where the threat of toxic contamination with the hydraulic fluid must be avoided, for example the food or pharmaceutical industries. A solution here can be a Pneumatic Adaptive Absorber (PAA), which is characterized by high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for the mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double-chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be treated as polytropic with thermodynamic coefficients that are constant in time; instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.

  6. Control and verification of industrial hybrid systems using models specified with the formalism $\chi$

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor-belt and of a biochemical plant for the production of ethanol are specified in the formalism $\chi$. A verification of the closed-loop systems for those examples,

  7. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of the aerospace and defense industry, the medical industry and the nuclear industry. Summarized the status of tools for software design and requirements management. Analyzed the software design methodology for the safety software of NPPs. Developed the design requirements for the requirements tracking and verification system. Developed the background technology to design the prototype tool for requirements tracking and verification.

  8. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    Science.gov (United States)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (SNOW17) and rainfall-runoff model (SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system, from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (RFC and DREAM (Differential Evolution Adaptive Metropolis)) as well as for a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB), located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes a comparison of the performance of the different optimized parameters and the DA framework, as well as an assessment of the impact of the initial conditions used for streamflow forecasts for the NFARB.
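
    Two of the deterministic hindcast statistics listed, the Nash-Sutcliffe efficiency and the root mean square error, are simple enough to state directly; a small sketch with placeholder flow series (not NFARB data):

        import numpy as np

        def rmse(sim, obs):
            """Root mean square error of simulated vs observed flows."""
            return float(np.sqrt(np.mean((sim - obs) ** 2)))

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the obs mean."""
            return float(1.0 - np.sum((sim - obs) ** 2)
                         / np.sum((obs - obs.mean()) ** 2))

        # Placeholder daily streamflow series (m^3/s).
        obs = np.array([12.0, 15.0, 30.0, 55.0, 42.0, 28.0, 20.0])
        sim = np.array([11.0, 16.0, 27.0, 60.0, 45.0, 25.0, 19.0])
        print(f"RMSE = {rmse(sim, obs):.2f} m^3/s, NSE = {nse(sim, obs):.3f}")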

  9. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculation in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties, found in an analytic mathematical manner, can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
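
    For the Hebbian learning example mentioned at the end, a commonly used temporal-causal form (assumed here, since the record gives no equations) is d(omega)/dt = eta*X*Y*(1 - omega) - zeta*omega. Its equilibrium omega* = eta*X*Y / (eta*X*Y + zeta) can be derived by hand and then checked against simulation, which is exactly the kind of verification the paper advocates.

        # Assumed Hebbian model: d(omega)/dt = eta*X*Y*(1 - omega) - zeta*omega
        eta, zeta = 0.3, 0.05
        X, Y = 0.8, 0.9                  # constant node activations for the check

        # Analytic stationary point: set d(omega)/dt = 0 and solve for omega.
        omega_eq = eta * X * Y / (eta * X * Y + zeta)

        # Numerical simulation (forward Euler) should converge to the same value.
        omega, dt = 0.0, 0.01
        for _ in range(20000):
            omega += dt * (eta * X * Y * (1.0 - omega) - zeta * omega)

        print(f"analytic equilibrium = {omega_eq:.6f}")
        print(f"simulated value      = {omega:.6f}")
        assert abs(omega - omega_eq) < 1e-4  # a mismatch would signal a coding error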

  10. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a distributed control system was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  11. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, a verification of the earlier developed physical and mathematical model of the processes in a hybrid solid-propellant rocket engine was performed for the quasi-steady-state flow regime. Comparative analysis of the calculated and analytical data indicated satisfactory agreement between the simulation results.

  12. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    OpenAIRE

    Zhukov Ilya S.; Borisov Boris V.; Bondarchuk Sergey S.; Zhukov Alexander S.

    2016-01-01

    On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, a verification of the earlier developed physical and mathematical model of the processes in a hybrid solid-propellant rocket engine was performed for the quasi-steady-state flow regime. Comparative analysis of the calculated and analytical data indicated satisfactory agreement between the simulation results.

  13. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used more and more actively to evaluate possible damage, identify potential flood zones and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed by means of the domestic software complex «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of hydrodynamic models based on a comparison of the actual flood zone area, determined by automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the North Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for the optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower boundaries of the model, respectively, it is possible to calculate the flooded area by means of the program STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
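
    The record names the classification algorithms but not their implementation; a minimal sketch of the optical-image step, with K-means standing in for the ISODATA/K-means classification and flooded area computed as water-pixel count times cell area, is given below. The image and the 10 m cell size are synthetic assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)

        # Synthetic single-band image: dark water pixels inside brighter land.
        image = rng.normal(120.0, 15.0, size=(200, 200))
        image[60:140, 30:170] = rng.normal(40.0, 10.0, size=(80, 140))

        # Unsupervised two-class clustering of pixel brightness.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0) \
            .fit_predict(image.reshape(-1, 1)).reshape(image.shape)

        # Water is the cluster with the lower mean brightness.
        water = min((0, 1), key=lambda k: image[labels == k].mean())
        cell_area = 10.0 * 10.0   # assumed 10 m pixels -> area in m^2
        area_km2 = (labels == water).sum() * cell_area / 1e6
        print(f"flooded area = {area_km2:.2f} km^2")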

  14. Quantum position verification in bounded-attack-frequency model

    Science.gov (United States)

    Gao, Fei; Liu, Bin; Wen, QiaoYan

    2016-11-01

    In 2011, Buhrman et al. proved that it is impossible to design an unconditionally secure quantum position verification (QPV) protocol if the adversaries are allowed to share unlimited entanglement beforehand. Afterwards, people started to design QPV protocols that are secure in practical settings, e.g. the bounded-storage model, where the adversaries' pre-shared entangled resources are assumed to be limited. Here we focus on another practical constraint: it is very difficult for the adversaries to perform attack operations with unlimitedly high frequency. Concretely, we present a new kind of QPV protocol, called non-simultaneous QPV, and we prove the security of a specific non-simultaneous QPV protocol under the assumption that the frequency of the adversaries' attack operations is bounded, but with no assumptions on their pre-shared entanglement or quantum storage. In our non-simultaneous protocol, the information of whether a signal arrives at the present time is itself a piece of the command. This renders the adversaries "blind": they have to execute attack operations with unlimitedly high frequency regardless of whether a signal arrives, which implies that the non-simultaneous QPV is also secure in the bounded-storage model.

  15. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    OpenAIRE

    Iraj Jabbari; Shahram Monadi

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital image communications in medicine-radiation ...

  16. CFD Modeling & Verification in an Aircraft Paint Hangar

    Science.gov (United States)

    2011-05-01

    Collaboration: Navy Bureau of Medicine and Surgery (BUMED), IH Division, which assists the CNO with the health and safety of Navy aircraft artisans, including quarterly monitoring of ... levels and the handling of paint particulates and vapors. Verification pitfalls: artisans change the process in the weeks between the baseline and verification measurements; here a fabric blanket was added in front of the filter to keep the filter bank from blocking exhaust airflow during sanding; and learn how to go without sleep.

  17. Continuous Verification of Large Embedded Software using SMT-Based Bounded Model Checking

    CERN Document Server

    Cordeiro, Lucas; Marques-Silva, Joao

    2009-01-01

    The complexity of software in embedded systems has increased significantly over the last years so that software verification now plays an important role in ensuring the overall product quality. In this context, SAT-based bounded model checking has been successfully applied to discover subtle errors, but for larger applications, it often suffers from the state space explosion problem. This paper describes a new approach called continuous verification to detect design errors as quickly as possible by looking at the Software Configuration Management (SCM) system and by combining dynamic and static verification to reduce the state space to be explored. We also give a set of encodings that provide accurate support for program verification and use different background theories in order to improve scalability and precision in a completely automatic way. A case study from the telecommunications domain shows that the proposed approach improves the error-detection capability and reduces the overall verification time by...
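
    The authors' encodings are not reproduced in the record; as a generic illustration of SMT-based bounded model checking, the sketch below unrolls a toy transition system for K steps with the z3 SMT solver (pip package z3-solver) and searches for a property violation within the bound. The program, property and bound are invented.

        from z3 import BitVec, Or, Solver, sat

        K = 10                                     # unrolling bound
        s = Solver()

        # One copy of the state per step, as in an SSA-style unrolling.
        x = [BitVec(f"x_{i}", 8) for i in range(K + 1)]
        s.add(x[0] == 0)                           # initial state
        for i in range(K):
            s.add(x[i + 1] == x[i] + 3)            # transition relation: x += 3

        # Safety property "x never equals 21": assert its negation and search.
        s.add(Or([x[i] == 21 for i in range(K + 1)]))

        if s.check() == sat:
            m = s.model()
            print("counterexample:", [m.evaluate(x[i]) for i in range(K + 1)])
        else:
            print(f"property holds up to bound K={K}")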

  18. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid-lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest and the grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data for both summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with the measured data. Thus, the model can be used as an effective predictive tool for future sites.

  19. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevent model checking from being widely used for industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs using NuSMV is presented in this paper.

  20. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method; rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small-satellite software and to mention the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing have by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small-satellite software verification and validation. These methods need to be further advanced to deal with the state-explosion problem and to make them usable enough for small-satellite software engineers to apply them regularly to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process, or that are produced after launch through the effects of ionizing radiation.

  1. Verification of Quantum Cryptography Protocols by Model Checking

    Directory of Open Access Journals (Sweden)

    Mohamed Elboukhari

    2010-10-01

    Unlike classical cryptography, which is based on mathematical functions, Quantum Cryptography or Quantum Key Distribution (QKD) exploits the laws of quantum physics to offer unconditionally secure communication. The progress of research in this field allows the anticipation that QKD will be available outside of laboratories within the next few years, and efforts are being made to improve the performance and reliability of the implemented technologies. But despite this big progress, several challenges remain. For example, the task of how to test the devices of QKD has not yet received enough attention. These apparatuses become heterogeneous and complex, and so demand a big verification effort. In this paper we propose to study quantum cryptography protocols by applying the technique of probabilistic model checking. Using the PRISM tool, we analyze the security of the BB84 protocol, focusing on the specific security property of the eavesdropper's information gain on the key derived from the implementation of this protocol. We show that this property is affected by the parameters of the eavesdropper's power and the quantum channel.
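
    The PRISM model itself is not part of the record; to make the analyzed property concrete, the plain Monte Carlo sketch below (a simulation, not probabilistic model checking) runs BB84 with an intercept-resend eavesdropper who attacks a fraction of the qubits, and estimates both the error rate that Alice and Bob observe and the fraction of sifted bits the eavesdropper learns.

        import numpy as np

        rng = np.random.default_rng(4)

        def bb84(n=200000, eve_fraction=1.0):
            a_bits = rng.integers(0, 2, n)          # Alice's bits and bases
            a_bases = rng.integers(0, 2, n)
            b_bases = rng.integers(0, 2, n)         # Bob's measurement bases

            attacked = rng.random(n) < eve_fraction
            e_bases = rng.integers(0, 2, n)
            # Eve's result: Alice's bit in the right basis, random otherwise.
            e_bits = np.where(e_bases == a_bases, a_bits, rng.integers(0, 2, n))
            # Bob receives Alice's qubit, or Eve's resent state if attacked.
            sent_bits = np.where(attacked, e_bits, a_bits)
            sent_bases = np.where(attacked, e_bases, a_bases)
            b_bits = np.where(b_bases == sent_bases, sent_bits,
                              rng.integers(0, 2, n))

            sift = a_bases == b_bases               # keep matching-basis rounds
            qber = np.mean(a_bits[sift] != b_bits[sift])
            eve_gain = np.mean((attacked & (e_bases == a_bases))[sift])
            return qber, eve_gain

        for frac in (0.0, 0.5, 1.0):
            qber, gain = bb84(eve_fraction=frac)
            print(f"attack fraction {frac:.1f}: "
                  f"QBER = {qber:.3f}, Eve knows {gain:.3f} of sifted bits")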

  2. Developing a Verification and Training Phantom for Gynecological Brachytherapy System

    Directory of Open Access Journals (Sweden)

    Mahbobeh Nazarnejad

    2012-03-01

    Introduction: Dosimetric accuracy is a major issue in the quality assurance (QA) program for treatment planning systems (TPS). An important contribution to this process is a proper dosimetry method to guarantee the accuracy of the dose delivered to the tumor. In brachytherapy (BT) of gynecological (Gyn) cancer it is usual to insert a combination of tandem and ovoid applicators with a complicated geometry, which makes their dosimetric verification difficult and important. Therefore, evaluation and verification of the dose distribution is necessary for accurate dose delivery to the patients. Materials and Methods: A solid phantom was made from Perspex slabs as a tool for intracavitary brachytherapy dosimetric QA. Film dosimetry (EDR2) was performed for a combination of ovoid and tandem applicators introduced by the Flexitron brachytherapy system. Treatment planning was also done with the Flexiplan 3D-TPS to irradiate films sandwiched between the phantom slabs. Isodose curves obtained from the treatment planning system and from the films were compared with each other in 2D and 3D. Results: The brachytherapy solid phantom was constructed from slabs into which tandems and ovoids loaded with the radioactive source Ir-192 could subsequently be inserted. The relative error between the film and TPS isodose curves was 3-8.6%, with an average of 5.08%. Conclusion: Our results showed that the difference between the TPS and the measurements is well within acceptable boundaries and below the action level according to AAPM TG.45. Our findings showed that this phantom, after minor corrections, can be used as a method of choice for inter-comparison analysis of TPS and can fill the existing gap in accurate QA programs for intracavitary brachytherapy. The constructed phantom also proved to be a valuable tool for verification of accurate dose delivery to the patients, as well as for training brachytherapy residents and physics students.

  3. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for the development of best-practice guidelines.

  4. Development of Genetic Markers for Triploid Verification of the Pacific Oyster,

    Directory of Open Access Journals (Sweden)

    Jung-Ha Kang

    2013-07-01

    The triploid Pacific oyster, which is produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid oyster production is not feasible in all oysters; the development of tetraploid oysters is ongoing in some oyster species. Thus, a method for ploidy verification is necessary for this endeavor, in addition to ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for molecular microsatellite markers. Two microsatellite multiplex PCR panels consisting of three markers each were developed using previously developed microsatellite markers that were optimized for performance. Both panels were able to verify the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters.

  5. Effective Development and Verification of Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This document presents a method for the effective development of software for a product line of similar railway control systems. The software is constructed in three steps: first a specification in a domain-specific language is created, then a formal behavioural controller model is automatically

  6. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-03-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from different parameter uncertainty estimation methods. The Generalized Likelihood Uncertainty Estimation (GLUE), a modified version of GLUE, and the Shuffled Complex Evolution Metropolis (SCEM) are used to generate model ensembles for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of parameter uncertainty, one that is commensurate with the dimension of the ensembles themselves. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.

  7. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

    Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the Finite Volume Method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to validate that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and the different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside a top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the moving top wall of the driven cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: the results were compared with the numerical results of Ghia et al. (1982), and good agreement was found. Next, the CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection; good agreement was found with the experimental results of Khalili et al. (2008). After the ability of the CFD solver had been proved through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
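
    The abstract is cut off just as the Grid Convergence Index is introduced; the standard Roache-style formulation computes an observed order of accuracy from three systematically refined grids and a banded relative-error estimate on the finest one. The sketch below uses placeholder values, not the study's results.

        import math

        def gci_fine(f1, f2, f3, r, Fs=1.25):
            """Grid Convergence Index on the finest grid.

            f1, f2, f3: solutions on fine, medium and coarse grids;
            r: constant grid refinement ratio; Fs: safety factor.
            """
            p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)  # observed order
            e21 = abs((f2 - f1) / f1)                               # relative error
            return p, Fs * e21 / (r ** p - 1.0)

        # Placeholder bed-shear-stress results (Pa) on three consecutive meshes.
        p, gci = gci_fine(f1=0.412, f2=0.405, f3=0.390, r=2.0)
        print(f"observed order p = {p:.2f}, GCI_fine = {100 * gci:.2f}%")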

  8. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

    The article presents the verification of the forward kinematics of the Fanuc AM100iB robot. The developed kinematic model of the machine was verified using tests on the actual robot. The tests consisted of positioning the robot, operated in the mode of controlling the values of the natural angles, at selected points of its workspace and reading the coordinate values of the TCP (tool center point) in the robot's global coordinate system from the operator panel. Validation of the model consisted of entering the same values of the natural angles that were used for positioning the robot into the inputs of the machine's CAE model, calculating the coordinate values of the TCP, and then comparing the calculated results with the values read. These results are the introduction to a partial verification of the dynamic model of the analysed device.
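
    Neither the kinematic parameters nor the code are given in the record; forward kinematics for a six-axis arm is conventionally assembled from Denavit-Hartenberg (DH) parameters by chaining one homogeneous transform per joint, as sketched below. The DH table here is invented for illustration and is not the AM100iB's.

        import numpy as np

        def dh(theta, d, a, alpha):
            """Homogeneous transform for one joint, standard DH convention."""
            ct, st = np.cos(theta), np.sin(theta)
            ca, sa = np.cos(alpha), np.sin(alpha)
            return np.array([[ct, -st * ca,  st * sa, a * ct],
                             [st,  ct * ca, -ct * sa, a * st],
                             [ 0,       sa,       ca,      d],
                             [ 0,        0,        0,      1]])

        def forward_kinematics(q, dh_table):
            """Chain the joint transforms; returns the 4x4 base-to-TCP transform."""
            T = np.eye(4)
            for qi, (d, a, alpha) in zip(q, dh_table):
                T = T @ dh(qi, d, a, alpha)
            return T

        # Invented 6-axis DH table: rows of (d, a, alpha) in metres and radians.
        DH_TABLE = [(0.45, 0.15, -np.pi / 2), (0.0, 0.60, 0.0),
                    (0.0, 0.10, -np.pi / 2), (0.64, 0.0, np.pi / 2),
                    (0.0, 0.0, -np.pi / 2), (0.10, 0.0, 0.0)]

        q = np.deg2rad([10.0, -30.0, 25.0, 0.0, 40.0, 0.0])  # natural angles
        T = forward_kinematics(q, DH_TABLE)
        print("TCP position in the global frame [m]:", np.round(T[:3, 3], 4))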

  9. Real-time Performance Verification of Core Protection and Monitoring System with Integrated Model for SMART Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Bon-Seung; Kim, Sung-Jin; Hwang, Dae-Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In keeping with these purposes, a real-time model of the digital core protection and monitoring systems for simulator implementation was developed on the basis of the SCOPS and SCOMS algorithms. Important features of the software models are explained with respect to their application to the SMART simulator, and the real-time performance of the models linked via DLL was examined for various simulation scenarios. In this paper, a real-time performance verification of the core protection and monitoring software for the SMART simulator is performed with the integrated simulator model. Various DLL connection tests were done for software algorithm changes. In addition, typical accident scenarios of SMART were simulated with 3KEYMASTER, and the simulated results were compared with those of the DLL-linked core protection and monitoring software. Each calculated result showed good agreement.

  10. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  11. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

    In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters, and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to the appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five methods presented, one, the Gaussian mixture reduction proposed by Runnalls, easily outperformed the others. This method iteratively reduces the size of a GMM by successively merging pairs of component densities, with pairs selected for merger using a Kullback-Leibler based metric. Using Runnalls' method of reduction, we
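
    As a rough sketch of the shortlist scoring described above (with invented sizes, random stand-in parameters and diagonal covariances), the code below scores a feature vector against a small hash GMM, keeps the top clusters, and evaluates only the UBM components mapped to those clusters.

        import numpy as np

        rng = np.random.default_rng(5)
        D, M_UBM, M_HASH, TOP_C = 20, 512, 32, 4      # invented sizes

        # Random stand-ins for a trained UBM and its hash GMM.
        ubm_mu = rng.normal(size=(M_UBM, D))
        ubm_var = rng.uniform(0.5, 2.0, (M_UBM, D))
        hash_mu = rng.normal(size=(M_HASH, D))
        hash_var = rng.uniform(0.5, 2.0, (M_HASH, D))
        shortlist = rng.integers(0, M_HASH, M_UBM)    # UBM component -> cluster

        def log_gauss(x, mu, var):
            """Diagonal-covariance log densities of x under each component."""
            return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var,
                                 axis=1)

        x = rng.normal(size=D)                        # one test feature vector

        # Score the cheap hash GMM first and keep the TOP_C best clusters...
        best = np.argsort(log_gauss(x, hash_mu, hash_var))[-TOP_C:]
        # ...then evaluate only UBM components whose cluster made the shortlist.
        active = np.isin(shortlist, best)
        scores = log_gauss(x, ubm_mu[active], ubm_var[active])
        print(f"evaluated {active.sum()} of {M_UBM} UBM components; "
              f"best log-density = {scores.max():.2f}")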

  12. Feature-Aware Verification

    CERN Document Server

    Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk

    2011-01-01

    A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions, situations in which the combination of features leads to emergent and possibly critical behavior, are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...

  13. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear-pulse-motor-type and ballscrew-type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electromagnetic design was performed for sizing the motors and electromagnets. Prototypes for the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability.

  14. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. The classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers are presented at the beginning of the paper. The most important mathematical models, features, algorithms and real problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and the measurement accuracy are demonstrated both by computer simulation and by experimental results available in the literature.
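
    The paper's non-iterative algorithm is not reproduced in the record; one standard non-iterative scheme, assuming the emission time is known (for example from the electrical PD pulse) so that each acoustic sensor yields a propagation range, linearizes the sphere equations pairwise and solves a small least-squares system. The sensor layout and sound speed below are illustrative assumptions.

        import numpy as np

        def locate(sensors, ranges):
            """Non-iterative least-squares source location from ranges.

            Subtracting sensor 0's sphere equation from each of the others
            gives the linear system
            2*(s_i - s_0) . x = |s_i|^2 - |s_0|^2 - r_i^2 + r_0^2.
            """
            s0, r0 = sensors[0], ranges[0]
            A = 2.0 * (sensors[1:] - s0)
            b = (np.sum(sensors[1:] ** 2, axis=1) - np.sum(s0 ** 2)
                 - ranges[1:] ** 2 + r0 ** 2)
            x, *_ = np.linalg.lstsq(A, b, rcond=None)
            return x

        # Hypothetical tank-wall sensor positions (m) and a true PD site.
        sensors = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.5],
                            [0.0, 1.5, 1.0], [2.0, 1.5, 0.0], [1.0, 0.0, 1.2]])
        true_pd = np.array([1.2, 0.8, 0.6])
        c = 1400.0                       # assumed sound speed in oil (m/s)

        delays = np.linalg.norm(sensors - true_pd, axis=1) / c
        print("estimated PD location:", np.round(locate(sensors, delays * c), 4))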

  15. Vibratory response modeling and verification of a high precision optical positioning system.

    Energy Technology Data Exchange (ETDEWEB)

    Barraza, J.; Kuzay, T.; Royston, T. J.; Shu, D.

    1999-06-18

    A generic vibratory-response modeling program has been developed as a tool for designing high-precision optical positioning systems. Based on multibody dynamics theory, the system is modeled as rigid-body structures connected by linear elastic elements, such as complex actuators and bearings. The full dynamic properties of each element are determined experimentally or theoretically, then integrated into the program as inertial and stiffness matrices. Utilizing this program, the theoretical and experimental verification of the vibratory behavior of a double-multilayer monochromator support and positioning system is presented. Results of parametric design studies that investigate the influence of support floor dynamics and highlight important design issues are also presented. Overall, good matches between theory and experiment demonstrate the effectiveness of the program as a dynamic modeling tool.
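
    As a minimal illustration of the computation such a program performs (a generic sketch, not the authors' code): once the inertial and stiffness matrices are assembled, the undamped natural frequencies and mode shapes follow from the generalized eigenvalue problem K·φ = ω²·M·φ.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # toy 2-DOF stand-in: two rigid bodies on linear elastic supports
    M = np.diag([10.0, 2.0])                     # inertial matrix (kg)
    K = np.array([[3.0e6, -1.0e6],
                  [-1.0e6,  1.0e6]])             # stiffness matrix (N/m)

    w2, modes = eigh(K, M)                       # solves K @ phi = w^2 * M @ phi
    freqs_hz = np.sqrt(w2) / (2 * np.pi)
    print(freqs_hz)                              # undamped natural frequencies
    ```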

  16. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a d...... could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements....
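
    A toy sketch of the modelling idea (illustrative only: the FIFO dispatch rule, the acyclic-graph assumption, and all names are ours, not the article's): tasks with precedence edges are dispatched onto the earliest-idle processing element.

    ```python
    def schedule(tasks, deps, wcet, n_pe):
        """Greedy list scheduler: tasks is an ordered list of ids, deps maps a
        task to its predecessors (graph assumed acyclic), wcet gives execution
        times, n_pe is the number of processing elements."""
        done_at, pe_free = {}, [0.0] * n_pe
        pending = list(tasks)
        while pending:
            # dispatch the first task whose predecessors have all finished
            t = next(t for t in pending
                     if all(p in done_at for p in deps.get(t, ())))
            pe = pe_free.index(min(pe_free))      # earliest-idle element
            start = max([pe_free[pe]] + [done_at[p] for p in deps.get(t, ())])
            done_at[t] = start + wcet[t]
            pe_free[pe] = done_at[t]
            pending.remove(t)
        return done_at                            # finish time per task

    # two parallel tasks feeding a join, on 2 processing elements
    print(schedule(["a", "b", "c"], {"c": ("a", "b")},
                   {"a": 2, "b": 3, "c": 1}, 2))
    ```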

  17. Second order closure integrated puff (SCIPUFF) model verification and evaluation study. Technical memo

    Energy Technology Data Exchange (ETDEWEB)

    Nappo, C.J.; Eckman, R.M.; Rao, K.S.; Herwehe, J.A.; Gunter, L.

    1998-06-01

    This report summarizes a verification of the SCIPUFF model as described in the draft report PC-SCIPUFF Version 0.2 Technical Documentation by Sykes et al. The verification included a scientific review of the model physics and parameterizations described in the report, and checks of their internal usage and consistency with current practices in atmospheric dispersion modeling. This work is intended to examine the scientific basis and defensibility of the model for the intended application. A related task is an assessment of the model's general capabilities and limitations. A line-by-line verification of the computer source code was not possible; however, the code was checked with a commercial code analyzer. About 47 potential coding inconsistencies were identified.

  18. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    We collected and analyzed domestic and international codes, standards, and guidelines in order to develop a highly reliable software verification and validation methodology suited to our situation. Three major parts of the work were performed: construction of a framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled to each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of hardware and software was partly performed using the requirements developed in the first stage for the development of the I and C test facility. On the hardware side, an expanded interface using the VXI bus and its driving software were completed. The main program for mathematics and modelling and the supervisor program for instructions were developed. 27 figs, 22 tabs, 69 refs. (Author).

  19. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs, and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book discusses the entire life cycle process of IP cores, from specification to production; introduces Verilog from both the implementation and the verification points of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; and describes a new ver...

  20. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available An analysis of existing problems in the reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator having the properties of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  1. Formal Modeling and Verification of Context-Aware Systems using Event-B

    Directory of Open Access Journals (Sweden)

    Hong Anh Le

    2014-12-01

    Full Text Available Context awareness is a computing paradigm that makes applications responsive and adaptive to their environment. Formal modeling and verification of context-aware systems are challenging issues in their development, as such systems are complex and uncertain. In this paper, we propose an approach that uses the formal method Event-B to model and verify such systems. First, we specify a context-aware system's components, such as context data entities, context rules, and context relations, with Event-B notions. In the next step, we use the Rodin platform to verify the system's desired properties, such as context constraint preservation. The approach benefits from the natural representation of context-awareness concepts in Event-B and from the proof obligations generated by the refinement mechanism to ensure the correctness of systems. We illustrate the use of our approach on a scenario of an Adaptive Cruise Control system.

  2. Develop a Model Component

    Science.gov (United States)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits, and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are integrated into one model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose. The component I was assigned, specifically, was a

  3. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2009-09-01

    A tritium permeation analysis code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility for system configurations and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behavior in a very high temperature reactor/high temperature steam electrolysis system has been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of tritium released from the core is transferred to the product hydrogen
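
    To make the mass-balance structure concrete, here is a deliberately minimal single-volume sketch (the rate constants are invented placeholders, not TPAC parameters): the tritium inventory grows by a birth-rate source and is depleted by purification, leakage, and permeation sinks.

    ```python
    from scipy.integrate import solve_ivp

    S = 1.0e-6       # tritium birth rate (ternary fission etc.), units/s
    k_purif = 1e-4   # purification removal rate constant, 1/s
    k_leak = 1e-6    # leakage rate constant, 1/s
    k_perm = 5e-5    # permeation rate constant through pipe/HX walls, 1/s

    def dndt(t, n):
        # dN/dt = source - (purification + leakage + permeation) * N
        return [S - (k_purif + k_leak + k_perm) * n[0]]

    sol = solve_ivp(dndt, (0.0, 1.0e6), [0.0])
    print(sol.y[0, -1])   # tends to S / (k_purif + k_leak + k_perm)
    ```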

  4. Development of an advanced real time simulation tool, ARTIST and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Cheol; Moon, S. K.; Yoon, B. J.; Sim, S. K.; Lee, W. J. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    1999-10-01

    A real-time reactor system analysis code, ARTIST, based on the drift flux model, has been developed to investigate transient system behavior under low pressure, low flow, and low power conditions with noncondensable gas present in the system. The governing equations of the ARTIST code consist of three mass continuity equations (steam, liquid, and noncondensables), two energy equations (steam and mixture), and one mixture momentum equation closed with the drift flux model. The drift flux model of ARTIST has been validated against the THETIS experimental data by comparing the void distribution in the system. In particular, the void fraction calculated by the Chexal-Lellouche correlation at low pressure and low flow is better than the results of both the homogeneous model of the TASS code and the two-fluid model of the RELAP5/MOD3 code. For conditions where noncondensable gas exists, a thermal-hydraulic state solution scheme and methods for calculating the partial derivatives were developed. Numerical consistency and convergence were tested with one-volume problems, and the manometric oscillation was assessed to examine the calculation methods for the partial derivatives. The calculated thermal-hydraulic state for each test shows consistent and expected behaviour. In order to evaluate the ARTIST code's capability in predicting the two-phase thermal-hydraulic phenomena of a loss of RHR accident during midloop operation, BETHSY test 6.9d was simulated. From the results, it is judged that a reflux condensation model and a critical flow model for the noncondensable gas are necessary to correctly predict the thermal-hydraulic behaviour. Finally, verification runs were performed without the drift flux model and the noncondensable gas model for postulated accidents of real plants. The ARTIST code well reproduces the parametric trends calculated by the TASS code. Therefore, the integrity of the ARTIST code was verified. 35 refs., 70 figs., 3 tabs. (Author)

  5. Development and Implementation of Cgcre Accreditation Program for Greenhouse Gas Verification Bodies

    Science.gov (United States)

    Kropf Santos Fermam, Ricardo; Barroso Melo Monteiro de Queiroz, Andrea

    2016-07-01

    An organizational innovation is defined as the implementation of a new organizational method in a firm's business practices, the organization of its workplace, or its external relations. This work illustrates a Cgcre innovation by presenting the development process for greenhouse gas verification bodies in Brazil according to the Brazilian accreditation body, the General Coordination for Accreditation (Cgcre).

  6. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    NARCIS (Netherlands)

    Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 R

  7. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded......, which is suitable for the study of piston ring tribology....

  8. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View Texas A& M; DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    This project was successfully executed to provide valuable adsorption data and improve a comprehensive model developed in previous work by the authors. The data obtained were used in an integrated computer program to predict the behavior of adsorption columns. The model is supported by experimental data and has been shown to predict the capture of off-gases similar to those evolving during the reprocessing of nuclear waste. The computer program structure contains (a) equilibrium models of off-gases with the adsorbate; (b) mass-transfer models to describe off-gas mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle; and (c) incorporation of these models into fixed-bed adsorption modeling, which includes advection through the bed. These models are being connected with the MOOSE (Multiphysics Object-Oriented Simulation Environment) software developed at the Idaho National Laboratory through the DGOSPREY (Discontinuous Galerkin Off-gas SeParation and REcoverY) computer codes developed in this project. Experiments for iodine and water adsorption have been conducted on reduced silver mordenite (Ag0Z) for single-layered particles. Adsorption apparatuses have been constructed to execute these experiments over a useful range of conditions, with temperatures ranging from ambient to 250°C and water dew points ranging from -69 to 19°C. Experimental results were analyzed to determine mass transfer and diffusion of these gases into the particles and to determine which models best describe the single and binary component mass transfer and diffusion processes. The experimental results were also used to demonstrate the capabilities of the comprehensive models developed to predict single-particle adsorption and transients of the adsorption-desorption processes in fixed beds. Models for adsorption and mass transfer have been developed to mathematically describe adsorption kinetics and transport via diffusion and advection
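
    A hedged sketch of the model structure described, i.e. advection along the bed coupled to linear-driving-force uptake into the pellets (a common simplification; the parameter values are illustrative stand-ins, not the project's fitted constants):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    N = 50                    # axial cells
    v, L = 0.1, 0.5           # interstitial velocity (m/s), bed length (m)
    k_ldf = 0.05              # linear-driving-force coefficient (1/s)
    K_eq = 2.0                # linear equilibrium constant, q* = K_eq * c
    rho_ratio = 100.0         # sorbent capacity / gas holdup scaling
    c_in = 1.0                # feed concentration (normalized)
    dz = L / N

    def rhs(t, y):
        c, q = y[:N], y[N:]
        dq = k_ldf * (K_eq * c - q)                 # uptake on active sites
        c_up = np.concatenate(([c_in], c[:-1]))     # upwind advection stencil
        dc = -v * (c - c_up) / dz - rho_ratio * dq
        return np.concatenate([dc, dq])

    sol = solve_ivp(rhs, (0, 3000), np.zeros(2 * N), method="BDF")
    breakthrough = sol.y[N - 1]                     # outlet concentration history
    ```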

  9. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture, and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.

  10. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve this accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  11. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence L. [Syracuse Univ., NY (United States); Lin, Ronghong [Syracuse Univ., NY (United States); Nan, Yue [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Sharma, Ketki [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-04-29

    The project has made progress toward developing a comprehensive modeling capability for the capture of target species in off-gas evolved during the reprocessing of nuclear fuel. The effort has integrated experimentation, model development, and computer code development for adsorption and absorption processes. For adsorption, a modeling library has been initiated to include (a) equilibrium models for the uptake of off-gas components by adsorbents, (b) mass transfer models to describe mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle, and (c) interconnection of these models with fixed-bed adsorption modeling, which includes advection through the bed. For single-component equilibria, a Generalized Statistical Thermodynamic Adsorption (GSTA) code was developed to represent experimental data from a broad range of isotherm types; this is equivalent to a Langmuir isotherm in the two-parameter case, and was demonstrated for Kr on the INL-engineered sorbent HZ PAN, water sorption on molecular sieve A sorbent material (MS3A), and Kr and Xe capture on metal-organic framework (MOF) materials. The GSTA isotherm was extended to multicomponent systems through application of a modified spreading pressure surface activity model and generalized predictive adsorbed solution theory; the result is the capability to estimate multicomponent adsorption equilibria from single-component isotherms. This advance, which enhances the capability to simulate systems related to off-gas treatment, has been demonstrated for a range of real-gas systems in the literature and is ready for testing with data currently being collected for multicomponent systems of interest, including iodine and water on MS3A. A diffusion kinetic model for sorbent pellets involving pore and surface diffusion as well as external mass transfer has been established, and a methodology was developed for determining unknown diffusivity parameters from transient
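
    For reference, the GSTA isotherm has the closed form sketched below (as given in the open literature on GSTA); it reduces to a Langmuir isotherm for m = 1, consistent with the two-parameter case noted above, and the parameter values are placeholders, not the project's fitted constants.

    ```python
    import numpy as np

    def gsta(p, q_max, K, p0=100.0):
        """q(p) = q_max * sum_n n*K_n*x**n / (m * (1 + sum_n K_n*x**n)), x = p/p0."""
        x = p / p0
        n = np.arange(1, len(K) + 1)
        powers = x ** n
        return q_max * np.sum(n * K * powers) / (len(K) * (1.0 + np.sum(K * powers)))

    # m = 1 (two parameters) is exactly a Langmuir isotherm:
    print(gsta(10.0, 5.0, np.array([2.0])))   # 5 * 2*0.1 / (1 + 2*0.1)
    ```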

  12. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse, small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range of the cMUT cell's response can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under a uniform excitation. An extension of this model and corresponding results under more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, and this holds for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell reasonably well. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  13. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw attention to the need for verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed, the paper presents models for internal freeze/thaw damage (internal cracking including...

  14. Verification, Validation & Accreditation of Legacy Simulations using the Business Process Modeling Notation

    NARCIS (Netherlands)

    Gianoulis, C.; Roza, M.; Kabilan, V.

    2008-01-01

    Verification, Validation and Accreditation is an important part of the Modeling and Simulation domain. This paper focuses on legacy simulations and examines two VV&A approaches coming from different communities within the defense domain. We use the Business Process Modeling Notation (BPMN) to describe both app

  15. Development of PIRT and Assessment Matrix for Verification and Validation of Sodium Fire Analysis Codes

    Science.gov (United States)

    Ohno, Shuji; Ohshima, Hiroyuki; Tajima, Yuji; Ohki, Hiroshi

    The thermodynamic consequences of liquid sodium leak and fire accidents are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V&V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V&V activity is in progress, with the main focus on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents, followed by the thermodynamic behaviors postulated in a plant building. The present paper describes, first, how a 'Phenomena Identification and Ranking Table (PIRT)' is developed to clarify the important validation points in the sodium fire analysis codes, and, second, how an 'Assessment Matrix' is proposed that summarizes both separate effect tests and integral effect tests for validating the computational models or the whole code for important phenomena. Furthermore, the paper shows a practical validation with a separate effect test, in which the spray droplet combustion model of SPHINCS and AQUA-SF predicts the burned amount of a falling sodium droplet with an error mostly less than 30%.

  16. Kinetic model of ductile iron solidification with experimental verification

    Directory of Open Access Journals (Sweden)

    W. Kapturkiewicz

    2009-10-01

    Full Text Available A solidification model for ductile iron, including a Weibull formula for nodule count, has been presented. From this model, the following can be determined: cooling curves, the kinetics of austenite and eutectic nucleation, austenite and eutectic growth velocities, volume fractions, and the distribution of Si and P both in austenite and in eutectic grains, together with their distribution over the casting section. In the developed model of nodular graphite iron casting solidification, the correctness of the mathematical model has been experimentally verified over the range of the most significant factors, which include the temperature field, the value of maximum undercooling, and the graphite nodule count interrelated with the casting cross-section. The literature offers practically no data on process models and simulation programs confronted with experiment in this way.

  17. Development and optimization of FJP tools and their practical verification

    Science.gov (United States)

    Messelink, Wilhelmus A. C. M.; Waeger, Reto; Meeder, Mark; Looser, Herbert; Wons, Torsten; Heiniger, Kurt C.; Faehnle, Oliver W.

    2005-09-01

    This article presents recent achievements with Jules Verne (JV), a sub-aperture polishing technique closely related to Fluid Jet Polishing [1]. Whereas FJP typically applies a nozzle stand-off distance of millimeters to centimeters, JV uses a stand-off distance down to 50 μm. The objective is to generate a non-directional fluid flow parallel to the surface, which is specifically suited to reducing surface roughness [2, 3]. Different characteristic Jules Verne nozzle geometries have been designed and numerically simulated using Computational Fluid Dynamics (CFD). To verify these simulations, the flow of fluid and particles through these nozzles has been visualized in a measurement setup developed specifically for this purpose: a simplified JV nozzle geometry is positioned in the setup, and the gap between tool and surface is observed by an ICCD camera. In order to visualize the motion of the abrasives, the particles have been given a fluorescent coating. Furthermore, these nozzles have been manufactured and tested in a practical environment using a modified polishing machine. The results of these laboratory and practical tests are presented and discussed, demonstrating that the CFD simulations are in good agreement with the experiments. It was possible to qualitatively predict the material removal on the processed glass surface, thanks to the implementation of appropriate erosion models [4, 5] in the CFD software.

  18. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, a streamline curvature method, is tested against measurements on a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. The detailed blading design is then carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity, and stagger angle, a number of correlation equations are developed from the experimental database, and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under non-cavitating conditions to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
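
    The iterative solution strategy described, i.e. coupled correlation equations solved with an under-relaxation factor, follows the generic pattern sketched below; the toy two-equation system stands in for the actual blading correlations, which are not reproduced here.

    ```python
    import numpy as np

    def solve_relaxed(g, x0, alpha=0.5, tol=1e-8, max_iter=500):
        """Solve x = g(x) with the update x <- (1-alpha)*x + alpha*g(x)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            x_new = (1 - alpha) * x + alpha * np.asarray(g(x))
            if np.max(np.abs(x_new - x)) < tol:
                return x_new
            x = x_new
        raise RuntimeError("under-relaxed iteration did not converge")

    # toy coupled pair: x = cos(y), y = 0.5*x
    print(solve_relaxed(lambda v: np.array([np.cos(v[1]), 0.5 * v[0]]),
                        [1.0, 1.0]))
    ```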

  19. A solenoid-based active hydraulic engine mount: modelling, analysis, and verification

    OpenAIRE

    Hosseini, Ali

    2010-01-01

    The focus of this thesis is on the design, modelling, identification, simulation, and experimental verification of a low-cost solenoid-based active hydraulic engine mount. To build an active engine mount, a commercial on-off solenoid is modified to be used as an actuator and is embedded inside a hydraulic engine mount. The hydraulic engine mount is modelled and tested, the solenoid actuator is modelled and identified, and finally the models are integrated to obtain the analytical model of the...

  20. Development and verification of a dynamic underbalanced drilling simulator

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Vefring, E.H.; Rommetveit, R. [RF-Rogaland Research, Bergen (Norway); Bieseman, T. [Shell RTS, Rijswijk (Netherlands); Maglione, R. [Agip Spa, Milano (Italy); Lage, A.C.; Nakagawa, E. [Petrobras/CENPES, Rio de Janeiro (Brazil)

    1997-07-01

    A dynamic underbalanced drilling (UBD) simulator has been developed in a joint industry project. The simulator incorporates models for multiphase flow, well-reservoir interaction, gas/oil solubility, and gas injection systems. The fluid components in the system include injected gases, mud, produced gas, produced oil and water, and drilled cuttings. Both coiled tubing and conventional jointed pipe can be simulated. The primary use of the simulator is in the planning phase of a UBD operation. A UBD operation is very dynamic due to the changes in flow conditions and other operations. The importance of the dynamic effects is illustrated by a field example. The dynamic simulator allows for the analysis of various operations that cannot be analyzed with a steady-state simulator. Some of these operations include starting/stopping circulation; various gas injection techniques, e.g. parasitic string, parasitic casing, through-completion, and drill string injection; and drilling operations: drilling, tripping, pipe connections, and BHA deployment. To verify the simulator, two-phase flow tests in a near-horizontal annulus were performed in order to provide data for validation. Field data are actively collected for this purpose. In this paper, two field cases are presented. One is a coiled tubing drilling operation in the Dalen field in the Netherlands, where a nitrogen lift test was performed in a through-completion configuration. The second case is a UBD operation in the Candeias field in Brazil. In this case, drillstring gas injection tests were performed in a cemented 9-5/8-in. casing at 1,800 m.

  1. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  2. CFD modeling of pharmaceutical isolators with experimental verification of airflow.

    Science.gov (United States)

    Nayan, N; Akay, H U; Walsh, M R; Bell, W V; Troyer, G L; Dukes, R E; Mohan, P

    2007-01-01

    Computational fluid dynamics (CFD) models have been developed to predict the airflow in a transfer isolator using a commercial CFD code. In order to assess the ability of the CFD approach in predicting the flow inside an isolator, hot wire anemometry measurements and a novel experimental flow visualization technique consisting of helium-filled glycerin bubbles were used. The results obtained have been shown to agree well with the experiments and show that CFD can be used to model barrier systems and isolators with practical fidelity. This indicates that CFD can and should be used to support the design, testing, and operation of barrier systems and isolators.

  3. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
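
    The linear model described takes only a few lines to state; the sketch below uses invented data and column names purely to show its form (visibility score regressed on marker material, phantom material, dose, and scan time), not the study's data.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # made-up illustration of the study's covariate structure
    df = pd.DataFrame({
        "score":    [1, 2, 3, 4, 2, 3, 4, 5],
        "marker":   ["Cu", "Cu", "Zn68", "Zn68"] * 2,
        "phantom":  ["balsa"] * 4 + ["beef"] * 4,
        "dose_gy":  [1, 3, 1, 3] * 2,
        "scan_min": [20, 40, 20, 40] * 2,
    })
    fit = smf.ols("score ~ C(marker) + C(phantom) + dose_gy + scan_min", df).fit()
    print(fit.params)   # estimated covariate effects on visibility
    ```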

  4. Characterizing proton-activated materials to develop PET-mediated proton range verification markers.

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S; Kerr, Matthew D; Amos, Richard A; Stingo, Francesco C; Marom, Edith M; Truong, Mylene T; Palacio, Diana M; Betancourt, Sonia L; Erasmus, Jeremy J; DeGroot, Patricia M; Carter, Brett W; Gladish, Gregory W; Sabloff, Bradley S; Benveniste, Marcelo F; Godoy, Myrna C; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R

    2016-06-07

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils' PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.

  5. Range verification methods in particle therapy: underlying physics and Monte Carlo modelling

    Directory of Open Access Journals (Sweden)

    Aafke Christine Kraan

    2015-07-01

    Full Text Available Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in-vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including beta+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modelling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modelling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  6. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  7. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of a life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of nuclear FPGA-based safety systems, disciplined life cycle processes for specification and design implementation, as well as verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified, and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and to provide input data to the under-test FPGA-based CHR protection system and a verified C-code CHR function module. The evaluation results are used to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would serve as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  8. A New Speaker Verification Method with GlobalSpeaker Model and Likelihood Score Normalization

    Institute of Scientific and Technical Information of China (English)

    张怡颖; 朱小燕; 张钹

    2000-01-01

    In this paper a new text-independent speaker verification method, GSMSV, based on likelihood score normalization is proposed. In this novel method a global speaker model is established to represent the universal features of speech and to normalize the likelihood score. Statistical analysis demonstrates that this normalization method can remove factors common to all speech and bring the differences between speakers into prominence. As a result, the equal error rate is decreased significantly, the verification procedure is accelerated, and the system's adaptability to speaking speed is improved.
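
    The normalization at the heart of the method is a log-likelihood ratio against the global speaker model; a minimal sketch follows (the scores and threshold are illustrative, not the paper's values).

    ```python
    import numpy as np

    def normalized_score(ll_speaker, ll_global):
        """Mean per-frame log-likelihood difference (log-likelihood ratio)."""
        return np.mean(np.asarray(ll_speaker) - np.asarray(ll_global))

    # accept the claimed identity if the normalized score clears a tuned threshold
    accept = normalized_score([-41.2, -39.8], [-44.0, -43.1]) > 0.0
    print(accept)
    ```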

  9. Local model for magnet-superconductor mechanical interaction: Experimental verification

    Science.gov (United States)

    Diez-Jimenez, Efren; Perez-Diaz, Jose-Luis; Garcia-Prada, Juan Carlos

    2011-03-01

    Several models exist for calculating superconducting repulsion forces in the Meissner state that are based on the method of images. The method of images, however, is limited to a small number of geometrical configurations that can be solved exactly, and the physical interpretation of the method is under discussion. A general local model based on the London equations and Maxwell's equations has been developed to describe the mechanics of the superconductor-permanent magnet system. Due to its differential form, this expression can easily be implemented in a finite element analysis and, consequently, is applicable to any shape of superconductor in the Meissner state. It can yield both forces and torques. This paper reports different experiments undertaken to test the model's validity. The vertical forces and the angle of equilibrium between a magnet and a superconductor were measured, and good agreement between the experiments and the theoretical calculations was found.

  10. Design, development and verification of the HIFI Alignment Camera System

    NARCIS (Netherlands)

    Boslooper, E.C.; Zwan, B.A. van der; Kruizinga, B.; Lansbergen, R.

    2005-01-01

    This paper presents the TNO share of the development of the HIFI Alignment Camera System (HACS), covering the opto-mechanical and thermal design. The HACS is an Optical Ground Support Equipment (OGSE) that is specifically developed to verify proper alignment of different modules of the HIFI instrume

  11. The Parametric Model for PLC Reference Chanells and its Verification in Real PLC Environment

    OpenAIRE

    2008-01-01

    For the expansion of PLC systems, it is necessary to have detailed knowledge of the properties of the PLC transmission channel. This contribution briefly discusses the characteristics of the PLC environment and a classification of PLC transmission channels. The main part focuses on the parametric model for PLC reference channels and its verification in the real PLC environment using experimental measurements.

  12. Towards a Generic Information Data Model for Verification, Validation & Accreditation VV&A

    NARCIS (Netherlands)

    Roza, Z.C.; Voogd, J.M.; Giannoulis, C.

    2008-01-01

    The Generic Methodology for Verification, Validation and Acceptance (GM-VV) is intended to provide a common generic framework for making formal and well-balanced acceptance decisions on a specific usage of models, simulations, and data. GM-VV will offer the international M&S community a Verifica

  13. A Verification and Analysis of the USAF/DoD Fatigue Model and Fatigue Management Technology

    Science.gov (United States)

    2005-11-01

    A Windows® software application of the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) applied model. The application, the Fatigue Avoidance Scheduling Tool (FAST™), was re-engineered as a clone from the SAFTE specification. The verification considered nine sleep/wake schedules that were

  14. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  15. Dynamic grey model of verification cycle and lifecycle of measuring instrument and its application

    Institute of Scientific and Technical Information of China (English)

    SU Hai-tao; YANG Shi-yuan; DONG Hua; SHEN Mao-hu

    2005-01-01

    Two dynamic grey models, DGM(1,1), for the verification cycle and the lifecycle of a measuring instrument, based on a time sequence and a frequency sequence respectively, were set up according to the statistical features of examination data and a weighting method. Through a specific case, i.e. the vernier caliper, it is shown that the fit precision and forecast precision of the models are high, that the cycles differ clearly under different working conditions, and that the forecast result of the frequency sequence model is better than that of the time sequence model. Combining the dynamic grey models with an auto-manufacturing case, the controlling and information subsystems for the verification cycle and the lifecycle, based on information integration, multi-sensor controlling, and management controlling, were given. The models can be used in the production process to help enterprises reduce errors, costs, and flaws.
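
    For readers unfamiliar with grey models, the textbook GM(1,1) algebra that DGM(1,1) builds on is sketched below (a standard formulation; the paper's weighting of examination data is not reproduced, and the series is made up).

    ```python
    import numpy as np

    def gm11(x0):
        """Fit a GM(1,1) grey model to a positive series x0."""
        x1 = np.cumsum(x0)                                 # accumulated series
        z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
        B = np.column_stack([-z1, np.ones_like(z1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # develop / grey-input coeffs
        def predict(k):
            # restored series value at 0-based index k (valid for k >= 1)
            return (x0[0] - b / a) * (1 - np.exp(a)) * np.exp(-a * k)
        return a, b, predict

    # e.g. successive verification intervals of an instrument (made-up data)
    x0 = np.array([3.2, 3.4, 3.3, 3.6, 3.8])
    a, b, predict = gm11(x0)
    print(predict(len(x0)))                                # next-cycle forecast
    ```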

  16. Ecological dynamic model of grassland and its practical verification

    Institute of Scientific and Technical Information of China (English)

    ZENG; Xiaodong

    2005-01-01

    Based on physical and biophysical considerations, mathematical analysis, and some approximate formulations generally adopted in meteorology and ecology, an ecological dynamic model of grassland is developed. The model consists of three interactive variables, i.e. the biomass of living grass, the biomass of wilted grass, and the soil wetness. The major biophysical processes are represented by parameterization formulas, and the model parameters can be determined inversely from observational climatological and ecological data. Some major parameters are adjusted by this method to fit the (albeit incomplete) data from the Inner Mongolia grassland, and other secondary parameters are estimated through sensitivity studies. The model results agree well with reality, e.g., (i) the maintenance of grassland requires a minimum amount of annual precipitation (approximately 300 mm); (ii) there is a significant relationship between the annual precipitation and the biomass of living grass; and (iii) overgrazing will eventually result in desertification. A specific emphasis is put on the shading effect of the wilted grass accumulated on the soil surface: it effectively reduces the soil surface temperature and the evaporation, and hence benefits the maintenance of the grassland and the reduction of water loss from the soil.
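
    The three-variable structure can be caricatured in a few lines; the functional forms and constants below are invented stand-ins meant only to show the coupling (including the shading term), not the paper's parameterizations.

    ```python
    from scipy.integrate import solve_ivp

    def rhs(t, y, rain=0.3, graze=0.05):
        G, D, W = y                              # living grass, wilted grass, soil wetness
        growth = 2.0 * G * W * (1 - G)           # water-limited logistic growth
        wilt = 0.5 * G                           # living grass becomes wilted grass
        decay = 0.3 * D                          # wilted grass decomposes
        evap = 0.8 * W * (1 - 0.5 * D)           # shading by wilted grass cuts evaporation
        return [growth - wilt - graze * G,
                wilt - decay,
                rain - evap - 0.6 * G * W]       # transpiration by living grass

    sol = solve_ivp(rhs, (0, 50), [0.1, 0.0, 0.3])
    print(sol.y[:, -1])   # state reached under the assumed rainfall and grazing
    ```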

  17. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

    Based on conventional mass transfer models developed for oxygen, the non-linear ASCE method, the 2-point method, and a one-parameter linear-regression method were evaluated for carbon dioxide stripping data. For values of KLaCO2 < approximately 1.5/h, the 2-point and ASCE methods fit the experimental data well, but the fit breaks down at higher values of KLaCO2. How to correct KLaCO2 for gas-phase enrichment remains to be determined. The one-parameter linear-regression model was used to vary C*CO2 over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed-C*CO2 assumptions.
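
    For context, the 2-point method reduces to a one-line estimate under a first-order transfer model; the concentrations below are hypothetical test readings, not the paper's data.

```python
import math

def kla_two_point(c1, c2, t1, t2, c_star):
    """Two-point estimate of the mass-transfer coefficient KLa (1/h) from
    concentrations c1, c2 (mg/L) at times t1, t2 (h), assuming a first-order
    approach to the equilibrium concentration c_star."""
    return math.log((c1 - c_star) / (c2 - c_star)) / (t2 - t1)

print(round(kla_two_point(c1=20.0, c2=8.0, t1=0.0, t2=1.0, c_star=2.0), 3))  # ~1.099/h
```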

  18. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    Institute of Scientific and Technical Information of China (English)

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for activity-diagram-based business process modeling, a formal definition of activity diagrams is introduced, and the basic requirements for activity-diagram-based business process models are proposed. Furthermore, a standardized transformation technique between business process models and basic Petri nets is presented, and an analysis method for the soundness and well-structuredness properties of business processes is introduced.

  19. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and to include 7009 tools in reporting the VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of

  20. Efficient Development and Verification of Safe Railway Control Software

    OpenAIRE

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    In this book, the authors present current research on the types, design and safety issues of railways. Topics discussed include the acoustic characteristics of noise in train stations; monitoring railway structure conditions and opportunities to use wireless sensor networks as tools to improve the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; ef...

  1. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    Science.gov (United States)

    Jairala, Juniper C.; Durkin, Robert; Marak, Ralph J.; Sipila, Stephanie A.; Ney, Zane A.; Parazynski, Scott E.; Thomason, Arthur H.

    2012-01-01

    As an early step in the preparation for future Extravehicular Activities (EVAs), astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. Neutral buoyancy demonstrations at NASA Johnson Space Center's Sonny Carter Training Facility to date have primarily evaluated assembly and maintenance tasks associated with several elements of the International Space Station (ISS). With the retirement of the Shuttle, completion of ISS assembly, and introduction of commercial players for human transportation to space, evaluations at the Neutral Buoyancy Laboratory (NBL) will take on a new focus. Test objectives are selected for their criticality, lack of previous testing, or design changes that justify retesting. Assembly tasks investigated are performed using procedures developed by the flight hardware providers and the Mission Operations Directorate (MOD). Orbital Replacement Unit (ORU) maintenance tasks are performed using a more systematic set of procedures, EVA Concept of Operations for the International Space Station (JSC-33408), also developed by the MOD. This paper describes the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated.

  2. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  3. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
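
    As an illustration of the manufactured-solutions approach recommended above, the sketch below derives the source term that makes a chosen function an exact solution of a 1-D heat equation; the PDE and the manufactured solution are illustrative choices, not from the paper.

```python
import sympy as sp

# Method of manufactured solutions for u_t = k*u_xx: pick a smooth u_m,
# derive the source term q that makes it exact, feed q to the code under
# verification, then measure the discrete error against u_m.
x, t, k = sp.symbols('x t k')
u_m = sp.sin(sp.pi * x) * sp.exp(-t)             # manufactured solution
q = sp.diff(u_m, t) - k * sp.diff(u_m, x, 2)     # required source term
print(sp.simplify(q))                            # (pi**2*k - 1)*exp(-t)*sin(pi*x)
```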

  4. Performance verification of Surface Mapping Instrument developed at CGM

    DEFF Research Database (Denmark)

    Bariani, Paolo

    ...covering applications in micro-technology and in surface metrology. The paper addresses the description of the stitching procedure, its validation, and a more comprehensive metrological evaluation of the AFM-CMM instrument performance. Experimental validation of the method was performed by the use of... ...of the instrument was the development of stitching software. Successful stitching of AFM scans is demonstrated in this report. Single data files in the millimetre range can be obtained, which are entirely based on AFM probing. High definition of nanostructures can therefore be combined with a measuring range...

  5. ENSO Forecasts in the North American Multi-Model Ensemble: Composite Analysis and Verification

    Science.gov (United States)

    Chen, L. C.

    2015-12-01

    In this study, we examine precipitation and temperature forecasts during El Niño/Southern Oscillation (ENSO) events in six models in the North American Multi-Model Ensemble (NMME): the CFSv2, CanCM3, CanCM4, FLOR, GEOS5, and CCSM4 models, by comparing the model-based ENSO composites to the observed. The composite analysis is conducted using the 1982-2010 hindcasts for each of the six models, with ENSO episodes selected on the basis of the seasonal Oceanic Niño Index (ONI) just prior to the date the forecasts were initiated. Two sets of composites are constructed over the North American continent: one based on precipitation and temperature anomalies, the other based on their probability of occurrence in a tercile-based system. The composites apply to monthly mean conditions in November, December, January, February, and March, respectively, as well as to the five-month aggregates representing the winter conditions. For the anomaly composites, we use the anomaly correlation coefficient and root-mean-square error against the observed composites for evaluation. For the probability composites, unlike conventional probabilistic forecast verification assuming binary outcomes of the observations, both model and observed composites are expressed in probability terms. Performance metrics for such validation are limited; therefore, we develop a probability anomaly correlation measure and a probability score for assessment, so the results are comparable to the anomaly composite evaluation. We found that all NMME models predict ENSO precipitation patterns well during wintertime; however, some models show large discrepancies between the model temperature composites and the observed. The skill is higher for the multi-model ensemble, as well as for the five-month aggregates. Compared to the anomaly composites, the probability composites have superior skill in predicting ENSO temperature patterns and are less sensitive to the sample used to construct the composites, suggesting that
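
    A minimal sketch of the two anomaly-composite scores mentioned above; the probability anomaly correlation developed by the author is analogous, with probabilities in place of anomalies. The composite fields below are synthetic, purely for illustration.

```python
import numpy as np

def anomaly_correlation(model_comp, obs_comp):
    """Centered anomaly correlation between a model-based composite and the observed."""
    m = model_comp - model_comp.mean()
    o = obs_comp - obs_comp.mean()
    return float((m * o).sum() / np.sqrt((m**2).sum() * (o**2).sum()))

def rmse(model_comp, obs_comp):
    """Root-mean-square error between the two composite maps."""
    return float(np.sqrt(((model_comp - obs_comp) ** 2).mean()))

# synthetic 2-D winter precipitation anomaly composites
rng = np.random.default_rng(0)
obs = rng.normal(size=(20, 30))
mod = obs + 0.5 * rng.normal(size=(20, 30))
print(round(anomaly_correlation(mod, obs), 3), round(rmse(mod, obs), 3))
```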

  6. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  7. Development and Implementation of Radiation-Hydrodynamics Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Marcath, Matthew J. [Los Alamos National Laboratory; Wang, Matthew Y. [Los Alamos National Laboratory; Ramsey, Scott D. [Los Alamos National Laboratory

    2012-08-22

    Analytic solutions to the radiation-hydrodynamic equations are useful for verifying any large-scale numerical simulation software that solves the same set of equations. The one-dimensional, spherically symmetric Coggeshall No. 9 and No. 11 analytic solutions, cell-averaged over a uniform grid, have been developed to analyze the corresponding solutions from the Los Alamos National Laboratory Eulerian Applications Project radiation-hydrodynamics code xRAGE. These Coggeshall solutions have been shown to be independent of heat conduction, providing a unique opportunity for comparison with xRAGE solutions with and without the heat conduction module. Solution convergence was analyzed as a function of radial step size. Since no shocks are involved in either problem and the solutions are smooth, second-order convergence was expected for both cases. The global L1 errors were used to estimate the convergence rates with and without the heat conduction module implemented.
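
    The convergence-rate estimate described above follows from the ratio of global L1 errors on two grids; the error and step-size values below are hypothetical.

```python
import math

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    """Observed convergence rate from global L1 errors on two grid resolutions;
    smooth, shock-free solutions should give a value near 2 for a 2nd-order code."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# hypothetical L1 errors from one radial refinement step
print(round(observed_order(e_coarse=4.0e-3, e_fine=1.0e-3,
                           h_coarse=0.02, h_fine=0.01), 2))  # 2.0
```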

  8. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

    Automation is the field of engineering that deals with the development of control systems for operating industrial processes, railways, machinery or aircraft without human intervention. In most cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. For that reason, providing safe, reliable and robust control systems is a first-priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge on which industry and academia have been working and making progress in the last decades. This thesis focuses on one particular type of control system that operates industrial processes, the PLC (Programmable Logic Controller)-based control system. Moreover, it targets one of the main challenges for these systems: guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  9. DATA VERIFICATION IN ISSUE SUPERVISING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. S. Katerinenko

    2013-01-01

    The paper proposes a method of data verification in issue tracking systems by means of production rules. This model makes it possible to declaratively formulate conditions that the information content should comply with, and to apply reasoning procedures. Practical application of the proposed verification system in a real software development project is described.

  10. The IXV guidance, navigation and control subsystem: Development, verification and performances

    Science.gov (United States)

    Marco, Victor; Contreras, Rafael; Sanchez, Raul; Rodriguez, Guillermo; Serrano, Daniel; Kerr, Murray; Fernandez, Vicente; Haya-Ramos, Rodrigo; Peñin, Luis F.; Ospina, Jose A.; De Zaiacomo, Gabriale; Bejar-Romero, Juan Antonio; Yague, Ricardo; Zaccagnino, Elio; Preaud, Jean-Philippe

    2016-07-01

    The Intermediate eXperimental Vehicle (IXV) [1] is an ESA re-entry lifting body demonstrator built to verify in-flight the performance of critical re-entry technologies. The IXV was launched on February 11th, 2015, aboard Europe's Vega launcher. The IXV's flight and successful recovery represent a major step forward with respect to previous European re-entry experience with the Atmospheric Re-entry Demonstrator (ARD) [2], flown in October 1998. The increased in-flight manoeuvrability achieved with the lifting body solution permitted the verification of technologies over a wider re-entry corridor. Among other objectives, which included the characterisation of the re-entry environment through a variety of sensors, special attention was paid to Guidance, Navigation and Control (GNC) aspects, including the guidance algorithms for the lifting body, the use of inertial measurement unit measurements with GPS updates for navigation, and the flight control by means of aerodynamic flaps and reaction control thrusters. This paper presents the overall Design, Development and Verification logic that has been successfully followed by the GNC and Flight Management (FM) subsystem of the IXV. It also focuses on the interactions between the GNC and the System, Avionics and OBSW development lifecycles, and on how an integrated and incremental verification process has been implemented by ensuring maximum representativeness and reuse through all stages.

  11. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both the double-penalty and the resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the three post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the three deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they perform only similarly to or somewhat better than the precipitation forecasts from the two lower-resolution models, at least in the Netherlands.
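
    A minimal sketch of the post-processing idea, on synthetic data and with illustrative predictors; note that the paper's full ELR additionally includes the precipitation threshold itself as a predictor, yielding one coherent probability distribution across thresholds, whereas this sketch fits a single threshold.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic "spatial" predictors around a station: areal mean forecast
# precipitation and the wet-area fraction (names and threshold are
# illustrative assumptions, not the paper's predictor set).
rng = np.random.default_rng(1)
area_mean = rng.gamma(2.0, 1.0, 500)                              # mm
coverage = np.clip(area_mean / 5 + rng.normal(0, 0.1, 500), 0, 1) # wet fraction
X = np.column_stack([np.sqrt(area_mean), coverage])               # sqrt link, common for precip
y = (area_mean + rng.normal(0, 1.0, 500) > 3.0).astype(int)       # synthetic exceedance

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X[:3])[:, 1])  # calibrated exceedance probabilities
```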

  12. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    Science.gov (United States)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  13. Modelling and Verification of Multiple UAV Mission Using SMV

    CERN Document Server

    Sirigineedi, Gopinadh; White, Brian A; Zbikowski, Rafal

    2010-01-01

    Model checking has been used to verify the correctness of digital circuits, security protocols and communication protocols, as these can be modelled by means of finite-state transition models. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.
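
    To make the Kripke-model idea concrete, the toy below checks the CTL property EF goal by a least-fixpoint computation over an explicit state graph. SMV does this symbolically over much larger state spaces; the mission modes here are invented for illustration.

```python
# Tiny explicit-state Kripke structure: hypothetical UAV mission modes.
transitions = {"search": {"search", "track"},
               "track": {"track", "engage"},
               "engage": {"engage"}}
labels = {"engage": {"goal"}}

def check_EF(prop):
    """States satisfying EF prop: least fixpoint of backward reachability."""
    sat = {s for s, props in labels.items() if prop in props}
    while True:
        new = sat | {s for s, succ in transitions.items() if succ & sat}
        if new == sat:
            return sat
        sat = new

print(sorted(check_EF("goal")))  # ['engage', 'search', 'track']
```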

  15. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.

  16. Development of experimental verification techniques for non-linear deformation and fracture on the nanometer scale.

    Energy Technology Data Exchange (ETDEWEB)

    Moody, Neville Reid; Bahr, David F.

    2005-11-01

    This work covers three distinct aspects of deformation and fracture during indentation. In particular, we develop an approach to verification of nanoindentation-induced film fracture in hard-film/soft-substrate systems; we examine the ability to perform these experiments in harsh environments; and we investigate the methods by which the resulting deformation from indentation can be quantified and correlated to computational simulations, including the onset of plasticity during indentation testing. First, nanoindentation was utilized to induce fracture of brittle thin oxide films on compliant substrates. During the indentation, a load is applied and the penetration depth is continuously measured. A sudden discontinuity, indicative of film fracture, was observed on the loading portion of the load-depth curve. The mechanical properties of thermally grown oxide films on various substrates were calculated using two different numerical methods. The first method utilized a plate-bending approach, modeling the thin film as an axisymmetric circular plate on a compliant foundation. The second method measured the applied energy for fracture; the crack extension force and applied stress intensity at fracture were then determined from the energy measurements. Secondly, slip steps form on the free surface around indentations in most crystalline materials when dislocations reach the free surface. Analysis of these slip steps provides information about the deformation taking place in the material. Techniques have now been developed to allow for accurate and consistent measurement of slip steps, and the effects of crystal orientation and tip geometry are characterized. These techniques are described and compared to results from dislocation dynamics simulations.

  17. Ice classification algorithm development and verification for the Alaska SAR Facility using aircraft imagery

    Science.gov (United States)

    Holt, Benjamin; Kwok, Ronald; Rignot, Eric

    1989-01-01

    The Alaska SAR Facility (ASF) at the University of Alaska, Fairbanks is a NASA program designed to receive, process, and archive SAR data from ERS-1 and to support investigations that will use this regional data. As part of ASF, specialized subsystems and algorithms to produce certain geophysical products from the SAR data are under development. Of particular interest are ice motion, ice classification, and ice concentration. This work focuses on the algorithm under development for ice classification, and the verification of the algorithm using C-band aircraft SAR imagery recently acquired over the Alaskan arctic.

  18. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    Science.gov (United States)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  19. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results obtained in COMSOL Multiphysics using the "Finer" grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm⁻¹) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of a geometrical parameter (load thickness) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in load thickness within the range from 0 to 95 mm led to a decrease in the calculated natural resonance frequency of longitudinal oscillations from 26.58 to 26.35 kHz and in the electrical admittance from 0.86 to 0.44 mS.

  20. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    A new analytical model for the prediction of concrete response under uniaxial compression, and its experimental verification, is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach, the key continuum parameters such as the creep coefficient, Poisson's ratio and the damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of the load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Project of the Ministry of Science of the Republic of Serbia, no. ON 174027: Computational Mechanics in Structural Engineering, and no. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: investigation and environmental assessment of possible applications]

  1. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS® finite element model, a verified MSC.Nastran™ model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran™ model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran™ single-bay model to Abaqus™ is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  2. Verification of five pharmacogenomics-based warfarin administration models

    Directory of Open Access Journals (Sweden)

    Meiqin Lin

    2016-01-01

    Conclusions: Since none of the models ranked high for all three criteria considered, the impact of various factors should be thoroughly considered before selecting the most appropriate model for the region's population.

  3. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance... ...of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions, and software performance-tuning optimisation through automated...

  4. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique in its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the entire business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services; formal verification methods must be relied upon to ensure the correctness of the composed services. A few research works in the literature have addressed the verification of web services for deterministic systems; moreover, the existing models did not address verification properties like dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties like dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into the ESAM (a combination of a Muller Automaton (MA) and a Push-Down Automaton (PDA)) and then transformed into Promela, the input language of the Simple Promela Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the

  5. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time, which is quite understandable given the size and complexity of such systems. Three important aspects of earthquake-soil-structure-interaction (ESSI) modeling are the consistent tracking of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator, including work on an extensive verification and validation suite for it. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used for applying the forces as a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns, applied to the layer of elements outside the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results for different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models that are highly efficient in terms of computational time is developed and implemented in the ESSI Simulator

  6. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  7. On the verification of PGD reduced-order models

    OpenAIRE

    Pled, Florent; Chamoin, Ludovic; Ladevèze, Pierre

    2014-01-01

    In current computational mechanics practice, multidimensional as well as multiscale or parametric models encountered in a wide variety of scientific and engineering fields often require either the resolution of problems of very large complexity or the direct calculation of very numerous solutions of such complex models. In this framework, the use of model order reduction makes it possible to dramatically reduce the computational requirements engendered by the increasing mod...

  8. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define the Saltstone Special Analysis base cases.
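
    Such independent checks are often consistent with the first-order diffusion picture in which penetration depth grows with the square root of time; a sketch under assumed values follows (the effective diffusivity and the scaling prefactor are illustrative assumptions, not the CBP simulation model).

```python
import math

def sulfate_depth_cm(d_eff_cm2_s, years):
    """Rough diffusion estimate of penetration depth, x ~ sqrt(4*D_eff*t);
    both the prefactor and D_eff are assumptions for illustration."""
    t_seconds = years * 365.25 * 24 * 3600.0
    return math.sqrt(4.0 * d_eff_cm2_s * t_seconds)

print(round(sulfate_depth_cm(d_eff_cm2_s=1.0e-8, years=100.0), 1))  # ~11.2 cm
```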

  9. Verification of a fully coupled FE model for tunneling under compressed air

    Energy Technology Data Exchange (ETDEWEB)

    Oettl, G.; Stark, R.F.; Hofstetter, G. [Innsbruck Univ. (Austria). Inst. for Structural Analysis and Strength of Materials

    2001-07-01

    This paper deals with the verification of a fully coupled finite element model for tunneling under compressed air. The formulation is based on mixture theory, treating the soil as a three-phase medium with the constituents: deformable porous soil skeleton, water and air. Starting with a brief outline of the governing equations, results of numerical simulations of different laboratory tests and of a large-scale in-situ test are presented and compared with experimental data. (orig.)

  10. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2016-01-01

    ...checking (BMC) and inductive reasoning, it is verified that the generated model instance satisfies the generated safety properties. Using this method, we are able to verify the safety properties for model instances corresponding to railway networks of industrial size. Experiments show that BMC is also...

  11. Mask synthesis and verification based on geometric model for surface micro-machined MEMS

    Institute of Scientific and Technical Information of China (English)

    LI Jian-hua; LIU Yu-sheng; GAO Shu-ming

    2005-01-01

    Traditional MEMS (microelectromechanical system) design methodology is not structured and has become an obstacle to creative MEMS design. In this paper, a novel method of mask synthesis and verification for surface micro-machined MEMS is proposed, based on the geometric model of a MEMS device. The emphasis is on synthesizing the masks on the basis of the layer model generated from the geometric model of the MEMS device. The method comprises several steps: the correction of the layer model, the generation of initial masks and final masks including multi-layer etch masks, and mask simulation. Finally, some test results are given.

  12. 3D MODELING FOR UNDERWATER ARCHAEOLOGICAL DOCUMENTATION: METRIC VERIFICATIONS

    Directory of Open Access Journals (Sweden)

    S. D’Amelio

    2015-04-01

    Surveying in an underwater environment has always presented considerable difficulties, both operational and technical, which have sometimes made it difficult to apply the survey techniques commonly used for documenting Cultural Heritage in a dry environment. This study evaluates the capability and accuracy of the Autodesk 123D Catch software for the reconstruction of a three-dimensional model of an object in an underwater context. The subjects of the study are models generated from sets of photographs and from sets of frames extracted from video sequences. The study is based on a comparative method, using a reference model obtained with a laser scanner.

  13. Multiple verification in computational modeling of bone pathologies

    CERN Document Server

    Liò, Pietro; Paoletti, Nicola; 10.4204/EPTCS.67.8

    2011-01-01

    We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as a defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows us to derive three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months-scale) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alert the patient and monitor him/her more closely to detect the onset of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could address the...

  14. Autonomic networking-on-chip bio-inspired specification, development, and verification

    CERN Document Server

    Cong-Vinh, Phan

    2011-01-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The first book to assess research results, opportunities, and trends in "BioChipNets", this third book in the Embedded Multi-Core Systems series from CRC Press is an advanced technical guide and reference composed of contributions from prominent re...

  15. Metal Fuel Development and Verification for Prototype Generation IV Sodium-Cooled Fast Reactor

    OpenAIRE

    Chan Bock Lee; Jin Sik Cheon; Sung Ho Kim; Jeong-Yong Park; Hyung-Kook Joo

    2016-01-01

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U–Zr fuel is a driver for the initial core of the PGSFR, and U–transuranics (TRU)–Zr fuel will gradually replace U–Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U–Zr fuel, work on U–Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U–TRU–Zr fuel uses TRU recovered through pyroelectrochem...

  16. Image Smearing Modeling and Verification for Strapdown Star Sensor

    Institute of Scientific and Technical Information of China (English)

    WANG Haiyong; ZHOU Wenrui; CHENG Xuan; LIN Haoyu

    2012-01-01

    To further extend the study of celestial attitude determination with a strapdown star sensor from the static into the dynamic field, one prerequisite is to generate precise dynamic simulated star maps. First, a neat analytical solution of the smearing trajectory caused by spacecraft attitude maneuver is deduced, whose parameters cover the geometric size of the optics, the three-axis angular velocities and the CCD integration time. Then, for the first time, the mathematical law and method are established for synthesizing the two formulae of the smearing trajectory and the static Gaussian distribution function (GDF) model, the key of which is a line integral of the static GDF, attenuated by a factor 1/Ls (Ls is the arc length of the smearing trajectory), along the smearing trajectory. The dynamic smearing model is thus obtained, also in analytical form. After that, three sets of typical simulated maps and data are generated from this dynamic model, manifesting the expected smearing effects and remaining compatible with the linear model as the special case of no boresight rotation. Finally, model validity tests on a rate turntable are carried out, resulting in a mean correlation coefficient of 0.9200 between the camera images and the corresponding model-simulated ones with the same parameters. This sufficient similarity verifies the validity of the dynamic smearing model. The model, after parameter calibration, can serve as a front-end loop of a ground semi-physical simulation system for celestial attitude determination with a strapdown star sensor.
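
    The line-integral construction described above can be reproduced numerically: move a static Gaussian PSF along the trajectory, weight each sample by ds/Ls, and sum. A straight-line trajectory and all parameter values below are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def smeared_psf(xg, yg, x0, y0, x1, y1, sigma=0.7, n=200):
    """Dynamic star image as a line integral of the static Gaussian PSF
    along the smearing trajectory, attenuated by 1/Ls (sketch)."""
    img = np.zeros_like(xg)
    for t in np.linspace(0.0, 1.0, n):
        cx = x0 + t * (x1 - x0)                  # PSF centre moving along the path
        cy = y0 + t * (y1 - y0)
        img += np.exp(-((xg - cx)**2 + (yg - cy)**2) / (2.0 * sigma**2))
    # each sample contributes ds = Ls/n and the integrand carries 1/Ls,
    # so the weights collapse to 1/n
    return img / n

x = np.linspace(-5.0, 5.0, 64)
xg, yg = np.meshgrid(x, x)
frame = smeared_psf(xg, yg, -2.0, 0.0, 2.0, 0.5)  # simulated smeared star image
print(round(float(frame.max()), 3))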

  17. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    ...accuracy of model forecasts of currents in coastal areas. The MVV module is implemented as part of the Geospatial Analysis and Model Evaluation Software...

  18. Verification modeling study for the influential factors of secondary clarifier

    OpenAIRE

    Gao, Haiwen

    2016-01-01

    A numerical quasi-3-D model of a secondary clarifier is applied to verify data obtained from the literature and to analyze the influential factors for secondary clarifiers. The data from the papers provide the input parameters for the model. During this study, several influential factors (density waterfall; surface overflow rate; solids loading rate; solids-settling characteristics; mixed liquor suspended solids; clarifier geometry) are tested. The results show that there are some difference...

  19. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  20. A New Integrated Weighted Model in SNOW-V10: Verification of Continuous Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results of nowcasts of four continuous variables generated from an integrated weighted model and underlying Numerical Weather Prediction (NWP) models. Real-time monitoring of fast-changing weather conditions and the provision of short-term forecasts, or nowcasts, in complex terrain within coastal regions is challenging to do with sufficient accuracy. A recently developed weighting, evaluation, bias correction and integration system was used in the Science of Nowcasting Olympic Weather for Vancouver 2010 project to generate integrated weighted forecasts (INTW) out to 6 h. INTW forecasts were generated with in situ observation data and background gridded forecasting data from the Canadian high-resolution deterministic NWP system with three nested grids at 15-, 2.5- and 1-km horizontal grid-spacing configurations. In this paper, the four variables of temperature, relative humidity, wind speed and wind gust are treated as continuous variables for verifying the INTW forecasts. Fifteen sites were selected for the comparison of the model performances. The results of the study show that integrating surface observation data with the NWP forecasts produces better statistical scores than using either the NWP forecasts or an objective analysis of observed data alone. Overall, integrated observation and NWP forecasts improved forecast accuracy for the four continuous variables. The mean absolute errors from the INTW forecasts for the entire test period (12 February to 21 March 2010) are smaller than those from the NWP forecasts with all three configurations. The INTW is the best and most consistent performer among all models regardless of location and variable analyzed.

  1. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.

  2. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink, and I completed training for UNIX and Simulink. The dryer is a Catch-All replaceable-core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system; it also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
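
    A minimal sketch of the kind of Bernoulli-based component calculation described, treating the dryer as an area change plus an empirical loss coefficient. All values and the loss model are assumptions for illustration, not the actual UCTS model.

```python
# Incompressible Bernoulli balance across a filter-dryer (sketch).
def dryer_outlet(p_in_pa, v_in_ms, a_in_m2, a_out_m2, k_loss=2.5, rho=1050.0):
    """Return (outlet pressure in Pa, outlet velocity in m/s); k_loss and the
    coolant density rho are assumed placeholder values."""
    v_out = v_in_ms * a_in_m2 / a_out_m2             # continuity: A1*v1 = A2*v2
    dp_bernoulli = 0.5 * rho * (v_out**2 - v_in_ms**2)  # velocity-head change
    dp_loss = 0.5 * rho * k_loss * v_in_ms**2        # empirical core loss
    return p_in_pa - dp_bernoulli - dp_loss, v_out

print(dryer_outlet(p_in_pa=3.0e5, v_in_ms=1.2, a_in_m2=4.0e-4, a_out_m2=3.0e-4))
```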

  3. Case study of verification, validation, and testing in the Automated Data Processing (ADP) system development life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, C.A.

    1990-05-01

    Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by the organizational participants in the Department of Veterans Affairs (VA) that conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, and these were then evaluated against VV&T standards.

  4. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently of each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models...
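
    One plausible reading of that mapping can be sketched compactly: evidence about the road state counts only insofar as the underlying road model is applicable, and the remaining belief mass falls on the whole frame ("unknown"). The sketch below is an assumption-laden illustration of that reading, not the authors' exact combination rule.

        # Hedged sketch: map the two per-method distributions to the three
        # output states. Mass on the road state is discounted by the model's
        # applicability; the rest goes to the frame of discernment ("unknown").
        def combine(p_correct: float, p_applicable: float) -> dict:
            return {
                "correct":   p_correct * p_applicable,
                "incorrect": (1.0 - p_correct) * p_applicable,
                "unknown":   1.0 - p_applicable,   # mass on the whole frame
            }

        print(combine(p_correct=0.8, p_applicable=0.9))
        # -> {'correct': 0.72, 'incorrect': 0.18, 'unknown': 0.1}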

  5. Verification of the Naval Oceanic Vertical Aerosol Model During Fire

    NARCIS (Netherlands)

    Davidson, K.L.; Leeuw, G. de; Gathman, S.G.; Jensen, D.R.

    1990-01-01

    The Naval Oceanic Vertical Aerosol Model (NOVAM) has been formulated to estimate the vertical structure of the optical and infrared extinction coefficients in the marine atmospheric boundary layer (MABL), for wavelengths between 0.2 and 40 μm. NOVAM was designed to predict, utilizing a set of routin...

  6. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  7. Methods for the Update and Verification of Forest Surface Model

    Science.gov (United States)

    Rybansky, M.; Brenova, M.; Zerzan, P.; Simon, J.; Mikita, T.

    2016-06-01

    The digital terrain model (DTM) represents the bare-ground earth's surface without any objects like vegetation and buildings. In contrast to a DTM, a digital surface model (DSM) represents the earth's surface including all objects on it. The DTM mostly does not change as frequently as the DSM. The most important changes of the DSM occur in forest areas due to vegetation growth. Using LIDAR technology, the canopy height model (CHM) is obtained by subtracting the DTM from the corresponding DSM. The DSM is calculated from the first-pulse echo and the DTM from the last-pulse echo data. The main problem in using DSM and CHM data is the currency of the airborne laser scanning. This paper describes a method of calculating changes in the CHM and DSM data using the relations between canopy height and tree age. To obtain a current basic reference data model of the canopy height, photogrammetric and trigonometric measurements of single trees were used. By comparing the heights of corresponding trees on aerial photographs of various ages, statistical sets of the tree growth rate were obtained. These statistical data and the LIDAR data were compared with the growth curve of a spruce forest corresponding to a similar natural environment (soil quality, climate characteristics, geographic location, etc.) to derive the updating characteristics.
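
    The CHM construction itself is a one-line raster operation; a minimal numpy sketch with toy elevation values (the grids and numbers below are made up for illustration):

        # Minimal sketch: canopy height model from LIDAR first/last-pulse surfaces.
        import numpy as np

        dsm = np.array([[212.4, 213.1], [215.0, 214.2]])  # first-pulse surface, m (toy)
        dtm = np.array([[201.0, 201.2], [201.5, 201.4]])  # last-pulse bare earth, m (toy)

        chm = dsm - dtm                  # canopy height model
        chm[chm < 0] = 0.0               # clamp noise below the terrain
        print(chm)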

  8. Verification-Driven Slicing of UML/OCL Models

    DEFF Research Database (Denmark)

    Shaikh, Asadullah; Clarisó Viladrosa, Robert; Wiil, Uffe Kock;

    2010-01-01

    computational complexity can limit their scalability. In this paper, we consider a specific static model (UML class diagrams annotated with unrestricted OCL constraints) and a specific property to verify (satisfiability, i.e., “is it possible to create objects without violating any constraint?”). Current...

  9. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

    The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under the working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of formation of temperature fields in premises with gas infrared heater systems.

  10. Numerical Verification of the Weak Turbulent Model for Swell Evolution

    CERN Document Server

    Korotkevich, A O; Resio, D; Zakharov, V E

    2007-01-01

    We performed numerical simulation of an ensemble of nonlinearly interacting free gravity waves (swell) by two different methods: solution of the primordial dynamical equations describing potential flow of an ideal fluid with a free surface, and solution of the kinetic Hasselmann equation describing the wave ensemble in the framework of the theory of weak turbulence. Comparison of the results demonstrates the applicability of the weak turbulent approach. In both cases we observed effects predicted by this theory: frequency downshift, angular spreading and formation of the Zakharov-Filonenko spectrum $I_{\omega} \sim \omega^{-4}$. One of the results of our article is that physical processes in finite-size laboratory wave tanks and in the ocean are quite different, and the results of such laboratory experiments should be applied to modeling of ocean phenomena with extra care. We also present an estimate of the minimum size of a laboratory installation allowing one to model open ocean surface wave dynami...

  11. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model's validity, both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern's entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As a hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.

  12. Computational reverse shoulder prosthesis model: Experimental data and verification.

    Science.gov (United States)

    Martins, A; Quental, C; Folgado, J; Ambrósio, J; Monteiro, J; Sarmento, M

    2015-09-18

    The reverse shoulder prosthesis aims to restore the stability and function of pathological shoulders, but the biomechanical aspects of the geometrical changes induced by the implant are yet to be fully understood. Considering a large-scale musculoskeletal model of the upper limb, the aim of this study is to evaluate how the Delta reverse shoulder prosthesis influences the biomechanical behavior of the shoulder joint. In this study, the kinematic data of an unloaded abduction in the frontal plane and an unloaded forward flexion in the sagittal plane were experimentally acquired through video-imaging for a control group, composed of 10 healthy shoulders, and a reverse shoulder group, composed of 3 reverse shoulders. Synchronously, the EMG data of 7 superficial muscles were also collected. The muscle force sharing problem was solved through the minimization of the metabolic energy consumption. The evaluation of the shoulder kinematics shows an increase in the lateral rotation of the scapula in the reverse shoulder group, and an increase in the contribution of the scapulothoracic joint to the shoulder joint. Regarding the muscle force sharing problem, the musculoskeletal model estimates increased activity of the deltoid, teres minor, clavicular fibers of the pectoralis major, and coracobrachialis muscles in the reverse shoulder group. The comparison between the predicted muscle forces and the acquired EMG data revealed a good correlation, which provides further confidence in the model. Overall, the shoulder joint reaction force was lower in the reverse shoulder group than in the control group.

  13. Model-Based Verification and Validation of Spacecraft Avionics

    Science.gov (United States)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  14. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still was built in ANSYS CFD for the evaporation as well as the condensation process. Simulation results such as water temperature, vapor temperature, and distillate output were compared with actual experimental results of the hemispherical solar still under the climate conditions of Mehsana (latitude 23° 59′, longitude 72° 38′). Water temperature and distillate output were in good agreement with the actual experimental results. The study shows that ANSYS CFD is a very powerful and efficient tool for the design and comparison of hemispherical solar stills.

  15. Verification and validation plan for the SFR system analysis module

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  16. Numerical Modelling of Wind Waves. Problems, Solutions, Verifications, and Applications

    CERN Document Server

    Polnikov, Vladislav

    2011-01-01

    The time-space evolution of the field is described by the transport equation for the 2-dimensional wave energy spectrum density, S(x,t), spread in space, x, and time, t. This equation has a forcing termed the source function, F, depending on both the wave spectrum, S, and the external wave-making factors: local wind, W(x,t), and local current, U(x,t). The source function contains the physical mechanisms responsible for wave spectrum evolution. It is customary to distinguish three terms in the function F: the wind-wave energy exchange mechanism, In; the energy-conservative mechanism of nonlinear wave-wave interactions, Nl; and the wave energy loss mechanism, Dis. Differences in the mathematical representation of the source function terms determine the general differences between wave models. The problem is to derive analytical representations for the source function terms from the fundamental wave equations. Based on publications of numerous authors and on the last two decades' studies of the author, th...
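
    For orientation, the transport equation described in this abstract takes the standard energy-balance form below. This is a hedged textbook reconstruction (deep-water form, advection by the group velocity c_g, currents' refraction terms omitted), not necessarily the author's exact notation:

        \[
          \frac{\partial S(\mathbf{x},t)}{\partial t}
            + \mathbf{c}_g \cdot \nabla_{\mathbf{x}}\, S(\mathbf{x},t)
            = F = In + Nl + Dis,
        \]
        % where In = In(S, W) is the wind input, Nl = Nl(S) the conservative
        % nonlinear four-wave interaction term, and Dis = Dis(S, W, U) the
        % dissipation (loss) term.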

  17. One-Quarter-Car Active Suspension Model Verification

    Directory of Open Access Journals (Sweden)

    Hyniova Katerina

    2017-01-01

    The suspension system influences both the comfort and safety of the passengers. The paper presents energy recuperation and management in automotive suspension systems with linear electric motors that are controlled by a designed H∞ controller to generate a variable mechanical force for a car damper. Vehicle shock absorbers in which forces are generated in response to feedback signals by active elements obviously offer increased design flexibility compared to conventional suspensions with passive elements (springs and dampers). The main advantage of the proposed solution, which uses a linear AC motor, is the possibility to generate the desired forces acting between the unsprung (wheel) and sprung (one-quarter of the car body) masses of the car, providing good insulation of the sprung mass from road surface roughness and load disturbances. As shown in the paper, under certain circumstances linear motors used as actuators make it possible to transform the mechanical energy of the vertical car vibrations into electrical energy, accumulate it, and use it when needed. Energy flow control makes it possible to reduce or even eliminate the demands on the external power source. In particular, the paper focuses on experiments with an active shock absorber mounted on the designed test bed, on the way an appropriate input signal was developed that acts upon the vibration absorber as a real road disturbance, and on the evaluation of the obtained results. Another important point the active suspension design should satisfy is energy supply control, which is made via a standard controller modification and allows changing the amount of energy required by the system. The functionality of the designed controller modification was verified in various experiments on the experimental stand, as described in the paper.

  18. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sarkar, Avik [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sundaresan, Sankaran [Princeton Univ., NJ (United States)

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.

  19. Development and validation of MCNPX-based Monte Carlo treatment plan verification system.

    Science.gov (United States)

    Jabbari, Iraj; Monadi, Shahram

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in Digital Imaging and Communications in Medicine - Radiation Therapy (DICOM-RT) format. In MCTPV, several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.

  20. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    Directory of Open Access Journals (Sweden)

    Iraj Jabbari

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in Digital Imaging and Communications in Medicine - Radiation Therapy (DICOM-RT) format. In MCTPV, several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.
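
    The gamma passing rate quoted in both records can be sketched directly. The following is a hedged global-gamma (3%/3 mm) implementation on toy dose grids; it assumes both doses share one grid and is far coarser than clinical tools, which interpolate and search more finely.

        # Hedged sketch of a global gamma-index pass-rate computation (3%/3 mm).
        # Grids, spacing, and dose values are toy inputs, not the study's data.
        import numpy as np

        def gamma_pass_rate(ref, ev, spacing_mm, dd=0.03, dta_mm=3.0, search_mm=6.0):
            ny, nx = ref.shape
            norm = ref.max()                   # global dose normalisation
            r = int(round(search_mm / spacing_mm))
            passed, total = 0, 0
            for j in range(ny):
                for i in range(nx):
                    if ref[j, i] < 0.1 * norm:     # skip low-dose region
                        continue
                    best = np.inf
                    for dj in range(-r, r + 1):    # brute-force neighbourhood search
                        for di in range(-r, r + 1):
                            jj, ii = j + dj, i + di
                            if not (0 <= jj < ny and 0 <= ii < nx):
                                continue
                            dist2 = ((dj**2 + di**2) * spacing_mm**2) / dta_mm**2
                            dose2 = ((ev[jj, ii] - ref[j, i]) / (dd * norm))**2
                            best = min(best, dist2 + dose2)
                    passed += best <= 1.0          # gamma <= 1 counts as a pass
                    total += 1
            return 100.0 * passed / total

        ref = np.outer(np.hanning(21), np.hanning(21))   # toy reference dose
        ev = ref * 1.02                                  # 2% global offset
        print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.1f}%")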

  1. Model-based mask verification on critical 45nm logic masks

    Science.gov (United States)

    Sundermann, F.; Foussadier, F.; Takigawa, T.; Wiley, J.; Vacca, A.; Depre, L.; Chen, G.; Bai, S.; Wang, J.-S.; Howell, R.; Arnoux, V.; Hayano, K.; Narukawa, S.; Kawashima, S.; Mohri, H.; Hayashi, N.; Miyashita, H.; Trouiller, Y.; Robert, F.; Vautrin, F.; Kerrien, G.; Planchot, J.; Martinelli, C.; Di-Maria, J. L.; Farys, V.; Vandewalle, B.; Perraud, L.; Le Denmat, J. C.; Villaret, A.; Gardin, C.; Yesilada, E.; Saied, M.

    2008-05-01

    In the continuous battle to improve critical dimension (CD) uniformity, especially for 45-nanometer (nm) logic advanced products, one important recent advance is the ability to accurately predict the mask CD uniformity contribution to the overall global wafer CD error budget. In most wafer process simulation models, mask error contribution is embedded in the optical and/or resist models. We have separated the mask effects, however, by creating a short-range mask process model (MPM) for each unique mask process and a long-range CD uniformity mask bias map (MBM) for each individual mask. By establishing a mask bias map, we are able to incorporate the mask CD uniformity signature into our modelling simulations and measure the effects on global wafer CD uniformity and hotspots. We also have examined several ways of proving the efficiency of this approach, including the analysis of OPC hot spot signatures with and without the mask bias map (see Figure 1) and by comparing the precision of the model contour prediction to wafer SEM images. In this paper we will show the different steps of mask bias map generation and use for advanced 45nm logic node layers, along with the current results of this new dynamic application to improve hot spot verification through Brion Technologies' model-based mask verification loop.

  2. An empirical model for independent dose verification of the Gamma Knife treatment planning.

    Science.gov (United States)

    Phaisangittisakul, Nakorn; Ma, Lijun

    2002-09-01

    A formalism for independent dose verification of Gamma Knife treatment planning is developed. It is based on the approximation that the isodose distribution for a single shot has the shape of an ellipsoid in three-dimensional space. The dose profiles for a phantom along each of the three major axes are fitted to a function containing terms that represent the contributions from a point source, extrafocal scattering, and a flat background. The fitting parameters are extracted for all four helmet collimators, at various shot locations, and with different skull shapes. The 33 parameters of a patient's skull shape obtained from the Skull Scaling Instrument measurements are modeled for individual patients. The relative doses for a treatment volume in the form of a 31 x 31 x 31 matrix of points are extracted from the treatment planning system, the Leksell GammaPlan (LGP). Our model evaluates the relative doses using the same input parameters as the LGP: skull measurement data, shot location, weight, gamma-angle of the head frame, and helmet collimator size. For 29 single-shot cases, the discrepancy of the dose at the focus point between the calculation and the LGP is found to be within -1% to 2%. For multi-shot cases, the value and the coordinate of the maximum dose point from the calculation agree within +/-7% and +/-3 mm with the LGP results. In general, the calculated doses agree with the LGP calculations within +/-10% for off-center locations. Results of calculation with this method for the dimension and location of the 50% isodose line are in good agreement with results from the Leksell GammaPlan. Therefore, this method can serve as a useful tool for secondary quality assurance of Gamma Knife treatment plans.
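
    As a hedged illustration of the profile-fitting step described above, the sketch below fits a synthetic single-axis profile to a three-term form (point source + extrafocal scatter + flat background). The Gaussian shapes and all numbers are assumptions for illustration, not the paper's parameterization.

        # Hedged sketch: fit a single-axis dose profile to the three-term form
        # the abstract describes. Synthetic data stand in for measurements.
        import numpy as np
        from scipy.optimize import curve_fit

        def profile(x, a, s1, b, s2, c):
            # point source + extrafocal scatter (both assumed Gaussian) + background
            return a * np.exp(-(x / s1) ** 2) + b * np.exp(-(x / s2) ** 2) + c

        x = np.linspace(-30, 30, 121)                # mm along one major axis
        y = profile(x, 1.0, 7.0, 0.08, 20.0, 0.01)   # synthetic "measurement"
        y += np.random.default_rng(0).normal(0, 0.003, x.size)

        popt, _ = curve_fit(profile, x, y, p0=[1, 5, 0.1, 15, 0])
        print("fitted [a, s1, b, s2, c]:", np.round(popt, 3))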

  3. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

    This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into an equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  4. Verification of nuclear fuel plates by a developed non-destructive assay method

    Science.gov (United States)

    El-Gammal, W.; El-Nagdy, M.; Rizk, M.; Shawky, S.; Samei, M. A.

    2005-11-01

    Nuclear material (NM) verification is a main target for NM accounting and control. In this work a new relative non-destructive assay technique has been developed to verify the uranium mass content in nuclear fuel. The technique uses a planar high-resolution germanium gamma ray spectrometer in combination with the MCNP-4B Monte Carlo transport code. A standard NM sample was used to simulate the assayed NM and to determine the average intrinsic full energy peak efficiency of the detector for assayed configuration. The developed technique was found to be capable of verifying the operator declarations with an average accuracy of about 2.8% within a precision of better than 4%.

  5. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others]

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, which uses recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  6. Book Titled Autonomic Networking-on-Chip: Bio-Inspired Specification, Development, and Verification: An Introduction

    Directory of Open Access Journals (Sweden)

    Phan Cong Vinh

    2015-03-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The third book in the Embedded Multi-Core Systems series from CRC Press, it is an advanced technical guide and reference composed of contributions from prominent researchers in industry and academia around the world. A response to the critical need for a global information exchange and dialogue, it is written for engineers, scientists, practitioners, and other researchers who have a basic understanding of NoC and are now ready to learn how to specify, develop, and verify ANoC using rigorous approaches.

  7. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  8. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although FPGAs give higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces PLC software engineers to give up the experience, knowledge and practices accumulated over decades and to start a new FPGA-based hardware development from scratch. We have researched solutions to this problem that reduce the risk and preserve that experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve their accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly since it is used in nuclear power plants, which are among the most safety-critical systems. While the designer develops the FPGA software with the FBD program translated by the translator, there are other translation tools involved, such as the synthesis tool and the place-and-route tool. This paper also aims to verify them rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and run time. Instead, this paper uses an indirect verification technique to demonstrate the correctness of the translator based on co-simulation. We intend to prove correctness only against specific inputs which are under development for a target I&C system, not against all possible input cases.

  9. Verification of precipitation forecasts by the DWD limited area model LME over Cyprus

    Directory of Open Access Journals (Sweden)

    K. Savvidou

    2007-01-01

    A comparison is made between the precipitation forecasts of the non-hydrostatic limited area model LME of the German Weather Service (DWD) and observations from a network of rain gauges in Cyprus. This is a first attempt to carry out a preliminary verification and evaluation of the LME precipitation forecasts over the area of Cyprus. For the verification, model forecasts and observations covering an eleven-month period, from 1/2/2005 till 31/12/2005, were used. The observations were made by three Automatic Weather Observing Systems (AWOS) located at Larnaka and Paphos airports and at Athalassa synoptic station, as well as at 6, 6 and 8 rain gauges within a radius of about 30 km around these stations, respectively. The observations were compared with the model outputs, separately for each of the three forecast days. The "probability of detection" (POD) of a precipitation event and the "false alarm rate" (FAR) were calculated. For the selected cases of forecast precipitation events, the average forecast precipitation amounts in the area around the three stations were compared with the measured ones. An attempt was also made to evaluate the model's skill in predicting the spatial distribution of precipitation and, in this respect, the geographical position of the maximum forecast precipitation amount was contrasted with the position of the corresponding observed maximum. Maps with monthly precipitation totals observed by a local network of 150 rain gauges were compared with the corresponding forecast precipitation maps.

  10. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
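
    The Hotelling observer itself is compact enough to sketch. Below is a hedged numpy illustration with toy Gaussian data standing in for the binned detector measurements; all dimensions and class means are assumptions, not the paper's simulated data.

        # Minimal sketch of the Hotelling observer described above: an optimal
        # linear template applied to binned detector data, thresholded to decide.
        import numpy as np

        rng = np.random.default_rng(1)
        n, dim = 500, 16                       # training samples per class, bins (toy)
        g0 = rng.normal(0.0, 1.0, (n, dim))    # class 0: not treaty accountable
        g1 = rng.normal(0.3, 1.0, (n, dim))    # class 1: treaty accountable

        mu0, mu1 = g0.mean(axis=0), g1.mean(axis=0)
        cov = 0.5 * (np.cov(g0.T) + np.cov(g1.T))   # pooled covariance estimate

        w = np.linalg.solve(cov, mu1 - mu0)    # Hotelling template: K^-1 (mu1 - mu0)
        t1, t0 = g1 @ w, g0 @ w                # test statistics for each class
        print("mean statistic, class 1 vs class 0:", t1.mean(), t0.mean())
        # A decision is made by thresholding t; sweeping the threshold traces
        # out the ROC curve whose area the paper uses as the performance metric.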

  11. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenizing the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  12. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    Science.gov (United States)

    1979-01-01

    Structural analysis and certification of the collector system are presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  13. Multi-dimensional boron transport modeling in subchannel approach: Part I. Model selection, implementation and verification of COBRA-TF boron tracking model

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, Ozkan Emre, E-mail: ozdemir@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Avramova, Maria N., E-mail: mna109@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Sato, Kenya, E-mail: kenya_sato@mhi.co.jp [Mitsubishi Heavy Industries (MHI), Kobe (Japan)

    2014-10-15

    Highlights: ► Implementation of a multidimensional boron transport model in a subchannel approach. ► Studies on the cross-flow mechanism, heat transfer and lateral pressure drop effects. ► Verification of the implemented model via code-to-code comparison with a CFD code. - Abstract: The risk of reflux condensation, especially during a Small Break Loss Of Coolant Accident (SB-LOCA), and the complications of tracking the boron concentration experimentally inside the primary coolant system have made boron tracking simulations in nuclear reactors the focus of many computational studies. This paper presents the development and implementation of a multidimensional boron transport model with a Modified Godunov Scheme within a thermal-hydraulic code based on a subchannel approach. The cross-flow mechanism in multiple-subchannel rod bundle geometry as well as heat transfer and lateral pressure drop effects are considered in the performed studies on simulations of deboration and boration cases. The Pennsylvania State University (PSU) version of the COBRA-TF (CTF) code was chosen for the implementation of three different boron tracking models: a First Order Accurate Upwind Difference Scheme, a Second Order Accurate Godunov Scheme, and a Modified Godunov Scheme. Based on the performed nodalization sensitivity studies, the Modified Godunov Scheme approach with a physical diffusion term was determined to provide the best solution in terms of precision and accuracy. As part of the verification and validation activities, a code-to-code comparison was carried out with the STAR-CD computational fluid dynamics (CFD) code and is presented here. The objective of this study was two-fold: (1) to verify the accuracy of the newly developed CTF boron tracking model against CFD calculations; and (2) to investigate its numerical advantages as compared to other thermal-hydraulics codes.
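
    As a hedged illustration of the simplest of the three schemes named here, the sketch below applies first-order upwind differencing to 1-D scalar boron transport at constant velocity. The grid, time step, and velocity are toy assumptions; the paper's preferred Modified Godunov Scheme additionally carries a physical diffusion term.

        # Hedged sketch: first-order accurate upwind differencing for 1-D boron
        # transport at constant axial velocity u > 0 (toy parameters).
        import numpy as np

        nx, dx, dt, u = 100, 0.01, 0.0005, 1.0   # cells, m, s, m/s (assumed)
        assert u * dt / dx <= 1.0                # CFL condition for stability

        c = np.zeros(nx)
        c[:10] = 1000.0                          # ppm: boron slug at the inlet

        for _ in range(100):                     # march in time
            c[1:] -= u * dt / dx * (c[1:] - c[:-1])   # upwind difference
        print("front location ~", np.argmax(c < 500.0) * dx, "m")
        # The scheme is robust but numerically diffusive, which is why the
        # paper compares it against Godunov-type alternatives.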

  14. Formal Verification of a Secure Model for Building E-Learning Systems

    Directory of Open Access Journals (Sweden)

    Farhan M Al Obisat

    2016-06-01

    The Internet is considered a common medium for e-learning, connecting several parties (instructors and students) who are supposed to be far away from each other. Both wired and wireless networks are used in this learning environment to facilitate mobile access to educational systems. This learning environment requires a secure connection and data exchange. An e-learning model was implemented and evaluated by conducting student experiments. Before the approach is deployed in the real world, a formal verification of the model was completed, showing that no unreachability case exists. The model in this paper, which concentrates on the security of e-content, was successfully validated using the SPIN Model Checker, where no errors were found.

  15. Truth in Complex Adaptive Systems Models Should BE Based on Proof by Constructive Verification

    Science.gov (United States)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. 'Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  16. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    Directory of Open Access Journals (Sweden)

    L. Foresti

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the system STEPS for Belgium (STEPS-BE). STEPS-BE provides 20-member ensemble precipitation nowcasts in real time at 1 km and 5 min resolution up to 2 h lead time, using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input to sewer-system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for the future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h⁻¹ are reliable up to 60–90 min lead time, while those of exceeding 5.0 mm h⁻¹ are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80–90% of the forecast errors.
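
    A toy sketch of the ensemble-generation idea follows: autocorrelated stochastic perturbations are imposed on a deterministic extrapolation nowcast. Real STEPS uses a scale-dependent multiplicative cascade; this single-scale AR(1) version, with assumed parameters and a flat toy nowcast, only illustrates the mechanism.

        # Hedged toy sketch: build an ensemble by perturbing a deterministic
        # extrapolation nowcast with autocorrelated (AR(1)) noise.
        import numpy as np

        rng = np.random.default_rng(42)
        extrap = np.full((24, 64), 2.0)          # lead times x pixels, mm/h (toy)
        n_members, phi, sigma = 20, 0.9, 0.3     # AR(1) persistence, noise level

        ensemble = []
        for _ in range(n_members):
            eps = np.zeros_like(extrap)
            eps[0] = rng.normal(0, sigma, extrap.shape[1])
            for t in range(1, extrap.shape[0]):  # correlated growth/decay noise
                eps[t] = phi * eps[t - 1] + rng.normal(0, sigma, extrap.shape[1])
            ensemble.append(extrap * np.exp(eps))   # multiplicative perturbation

        prob = np.mean([m > 0.5 for m in ensemble], axis=0)  # P(rain > 0.5 mm/h)
        print("mean exceedance probability:", prob.mean())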

  17. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, Saad [Ludwig-Maximilians-Universitaet Muenchen (Germany); King Saud University, Riyadh (Saudi Arabia); Lang, Christian; Lutter, Rudolf; Bortfeldt, Jonathan; Parodi, Katia; Thirolf, Peter G. [Ludwig-Maximilians-Universitaet Muenchen (Germany); Kolff, Hugh van der [Ludwig-Maximilians-Universitaet Muenchen (Germany); Delft University of Technology (Netherlands); Maier, Ludwig [Technische Universitaet Muenchen (Germany)

    2014-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron therapy in cancer treatment. Our aim is to develop an imaging system based on a Compton camera designed to detect prompt γ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of double-sided Si-strip detectors (DSSSD) acting as scatterers, while the absorber is formed by a LaBr₃ scintillator crystal read out by a position-sensitive multi-anode photomultiplier. The LaBr₃ detector was characterized with both absorptive and reflective side-face wrapping materials. Comparative studies of energy and time resolution, photopeak detection efficiency and spatial resolution are presented, together with first tests of the complete camera system.

  18. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der [LMU Munich, Garching (Germany); TU Delft (Netherlands); Castelhano, I. [LMU Munich, Garching (Germany); University of Lisbon, Lisbon (Portugal); Schaart, D.R. [TU Delft (Netherlands)

    2015-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron therapy in cancer treatment. An imaging system is being developed in Garching aiming to detect prompt γ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm², 0.5 mm thick, 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr₃:Ce scintillator crystal (50 x 50 x 30 mm³) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). The ongoing characterization of the Compton camera properties and its individual components, both offline in the laboratory and online using a proton beam, is presented.

  19. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS® for the probabilistic analysis, and NASGRO® for the fracture...

  20. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The paper is illustrated by an automatic assembly system.

  1. Forecast Verification for North American Mesoscale (NAM) Operational Model over Karst/Non-Karst regions

    Science.gov (United States)

    Sullivan, Z.; Fan, X.

    2014-12-01

    Karst is defined as a landscape that contains especially soluble rocks such as limestone, gypsum, and marble, in which caves, underground water systems, sinkholes, vertical shafts, and subterranean river systems form over time. The cavities and voids within a karst system affect the hydrology of the region and, consequently, can affect the moisture and energy budget at the surface, planetary boundary layer development, convection, and precipitation. Carbonate karst landscapes comprise about 40% of land areas over the continental U.S. east of Tulsa, Oklahoma. Currently, due to the lack of knowledge of the effects karst has on the atmosphere, no existing weather model has the capability to represent karst landscapes and to simulate their impact. One way to assess the impact of a karst region on the atmosphere is to check the performance of existing weather models over karst and non-karst regions. The North American Mesoscale (NAM) operational forecast is the best example, of which historical forecasts were archived. Variables such as precipitation, maximum/minimum temperature, dew point, evapotranspiration, and surface winds were taken into account when checking the model performance over karst versus non-karst regions. The forecast verification focused on a five-year period from 2007-2011. Surface station observations, a gridded observational dataset, and the North American Regional Reanalysis (for certain variables with insufficient observations) were used. Thirteen regions of differing climate, size, and landscape composition were chosen across the contiguous United States (CONUS) for the investigation. Equitable threat score (ETS), frequency bias (fBias), and root-mean-square error (RMSE) scores were calculated and analyzed for precipitation; RMSE and mean bias (Bias) were analyzed for the other variables. The ETS, fBias, and RMSE scores generally show a pattern of lower forecast skill, a greater magnitude of error, and a greater underprediction of precipitation over karst than...
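
    For reference, the categorical precipitation scores named above are simple functions of a 2x2 contingency table of forecast/observed events. The sketch below uses toy counts, not the study's data; the FAR here is the standard false-alarm ratio b/(a+b).

        # Hedged sketch: categorical verification scores from a 2x2 contingency
        # table (hits a, false alarms b, misses c, correct negatives d).
        def scores(a, b, c, d):
            n = a + b + c + d
            pod  = a / (a + c)                    # probability of detection
            far  = b / (a + b)                    # false alarm ratio
            bias = (a + b) / (a + c)              # frequency bias (fBias)
            a_r  = (a + b) * (a + c) / n          # hits expected by chance
            ets  = (a - a_r) / (a + b + c - a_r)  # equitable threat score
            return pod, far, bias, ets

        print([round(s, 3) for s in scores(a=50, b=20, c=30, d=400)])
        # -> [0.625, 0.286, 0.875, 0.437] for these toy counts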

  2. Development and verification of a one-dimensional steady-state heat transfer model for an anaerobic fermentation reactor

    Institute of Scientific and Technical Information of China (English)

    刘建禹; 陈泽兴; 李文涛

    2012-01-01

    In order to ensure the normal operation of biogas plants in winter in the cold regions of northern China, maintaining the stable temperature required by anaerobic fermentation through proper heating and insulation measures is the key factor. The heat consumption of the anaerobic fermentation reactor is the most basic datum for the design of the fermentation liquid heating system; it directly affects the selection of the heating scheme and the sizing of the major equipment, such as the heating system, the diameter of the heating pipes, and the heaters. Based on steady-state heat transfer theory, a one-dimensional steady-state heat transfer model was developed through a theoretical analysis of the heat transfer process of a fully above-ground reactor that integrates anaerobic fermentation and biogas collection. The model was then modified and validated experimentally. The results show that there is no statistically significant difference between the simulated and measured values, so the heat transfer model can be used to calculate the heat consumption of the reactor. This provides a basis for calculating the heat load of anaerobic fermentation reactors and predicting reactor energy consumption in future large-scale biogas projects.
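
    A minimal sketch of this class of one-dimensional steady-state model: the reactor wall is treated as thermal resistances in series, giving an overall U-value and heat loss. All layer data, film coefficients, and temperatures below are illustrative assumptions, not values from the paper.

        # Minimal sketch: 1-D steady-state heat loss through a layered reactor
        # wall, modeled as series thermal resistances (toy values throughout).
        h_in, h_out = 300.0, 12.0          # inner/outer film coefficients, W/(m^2*K)
        layers = [(0.005, 50.0),           # (thickness m, conductivity W/(m*K)): steel
                  (0.10, 0.04)]            # insulation

        r_total = 1.0 / h_in + sum(t / k for t, k in layers) + 1.0 / h_out
        u_value = 1.0 / r_total            # overall heat transfer coefficient

        area, t_liquid, t_ambient = 25.0, 35.0, -20.0   # m^2, degC (toy winter case)
        q = u_value * area * (t_liquid - t_ambient)     # steady heat loss, W
        print(f"U = {u_value:.3f} W/(m^2 K), Q = {q:.0f} W")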

  3. Formal verification technique for grid service chain model and its application

    Institute of Scientific and Technical Information of China (English)

    XU Ke; WANG YueXuan; WU Cheng

    2007-01-01

    Ensuring the correctness and reliability of large-scale resource sharing and complex job processing is an important task for grid applications. From a formal methods perspective, a grid service chain model based on state Pi calculus is proposed in this work as the theoretical foundation for service composition and collaboration in grids. Following the idea of the Web Service Resource Framework (WSRF), state Pi calculus enables the life-cycle management of system states by associating the actions in the original Pi calculus with system states. Moreover, model checking techniques are exploited for the design-time and run-time logical verification of grid service chain models. A grid application scenario of the dynamic analysis of material deformation structures is also provided to show the effectiveness of the proposed work.

  4. Development of the Verification and Validation Matrix for Safety Analysis Code SPACE

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Han; Ha, Sang Jun; Yang, Chang Keun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    Korea Electric Power Research Institute (KEPRI) has developed the safety analysis code SPACE (Safety and Performance Analysis CodE for Nuclear Power Plant) for typical pressurized water reactors (PWRs). The safety analysis codes in current use were obtained from foreign vendors, such as Westinghouse Electric Corp., ABB Combustion Engineering Inc., Kraftwerk Union, etc. Considering the conservatism and inflexibility of the foreign code systems, it is difficult to expand their application areas and analysis scopes. To overcome these problems, KEPRI launched a project to develop a native safety analysis code with Korea Power Engineering Co. (KOPEC), Korea Atomic Energy Research Institute (KAERI), Korea Nuclear Fuel (KNF), and Korea Hydro and Nuclear Power Co. (KHNP) under the funding of the Ministry of Knowledge Economy (MKE). As a result of the project, the demo version of SPACE was released in July 2009. In preparation for the next step, KEPRI and colleagues have developed the verification and validation (V&V) matrix for SPACE. To develop the matrix, preceding studies and experiments were reviewed. After careful consideration, the V&V matrix was developed and experiment plans were designed for the next step to compensate for the lack of data.

  5. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of)]; Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)]

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which automatically performs the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing whose purpose is to verify that, after a given software application (in this case, the code) is updated, previously working software functions have not been compromised. The goal is to prevent software regression, whereby adding new features introduces bugs. Because the NRT is performed repeatedly, it consumes considerable time and human resources during the development period of a code and can delay the development schedule. To reduce cost and human resources and to prevent wasted time, non-regression tests need to be automated. As the tool for developing the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
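
    The core of such an automated non-regression test is mechanical: re-run the code on a fixed input suite and compare each new output against a stored reference within a tolerance. The authors implemented this in VBA on top of Excel; the sketch below shows the same comparison logic in Python, with the file names and tolerance chosen arbitrarily.

```python
import csv

def compare_runs(reference_csv, new_csv, rel_tol=1e-6):
    """Compare two purely numeric CSV result files cell-by-cell;
    return a list of (row, col, reference, new) mismatches."""
    mismatches = []
    with open(reference_csv) as ref, open(new_csv) as new:
        for i, (r_row, n_row) in enumerate(zip(csv.reader(ref), csv.reader(new))):
            for j, (r, n) in enumerate(zip(r_row, n_row)):
                r_val, n_val = float(r), float(n)
                denom = max(abs(r_val), 1e-30)   # guard against division by zero
                if abs(n_val - r_val) / denom > rel_tol:
                    mismatches.append((i, j, r_val, n_val))
    return mismatches

# Hypothetical usage: flag any regression against the stored reference run.
diffs = compare_runs("reference_output.csv", "new_output.csv")
print("PASS" if not diffs else f"FAIL: {len(diffs)} values differ")
```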

  6. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly verify the design and evaluate the performance of the final products. HTL for hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform bending and vibration tests. The developed facilities were verified by comparison against reference data for the fuel assembly obtained at the Westinghouse Co.; the compared data agreed well within uncertainties. FRETONUS, a simulator for high-temperature, high-pressure fretting wear and performance tests, was also developed. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Computational technology for turbulent flow analysis and finite element analysis was developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  7. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
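
    The Richardson extrapolation procedure the abstract refers to can be stated compactly: given solutions f1, f2, f3 on systematically refined grids with a constant refinement ratio r, the observed order of accuracy follows from the ratio of successive differences, and an extrapolated "exact" value provides the error estimate. A minimal sketch under those assumptions (uniform refinement, solutions in the asymptotic range); the sample values are made up:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from solutions on fine (f1), medium (f2),
    and coarse (f3) grids with constant refinement ratio r."""
    return math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)

def richardson_estimate(f1, f2, p, r):
    """Extrapolated 'exact' value and error estimate of the fine-grid solution."""
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)
    return f_exact, f1 - f_exact

# Hypothetical grid-convergence data for some integral quantity:
p = observed_order(f1=0.97010, f2=0.96854, f3=0.96246, r=2.0)
f_exact, err = richardson_estimate(0.97010, 0.96854, p, r=2.0)
print(f"observed order = {p:.2f}, extrapolated = {f_exact:.5f}, error = {err:.2e}")
```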

  8. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    Science.gov (United States)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Programme short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through these shared experiments. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spread grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts when verified against the analysis fields. However, each system exhibited individual characteristics according to its MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that careful attention must be paid to physical perturbations.
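
    Two of the verification quantities discussed, the ensemble spread and the mean error (bias) of the ensemble mean against the analysis, are straightforward to compute once all systems are interpolated to the common grid. A minimal sketch with synthetic data standing in for real forecasts:

```python
import numpy as np

def ensemble_stats(forecasts, analysis):
    """forecasts: (n_members, ny, nx) array; analysis: (ny, nx) array.
    Returns ensemble spread, mean error (bias), and RMSE of the ensemble mean."""
    ens_mean = forecasts.mean(axis=0)
    spread = forecasts.std(axis=0, ddof=1).mean()    # average member spread
    mean_error = (ens_mean - analysis).mean()        # bias of the ensemble mean
    rmse = np.sqrt(((ens_mean - analysis) ** 2).mean())
    return spread, mean_error, rmse

# Hypothetical 10-member ensemble on a 15 km common verification grid:
rng = np.random.default_rng(0)
truth = rng.normal(280.0, 5.0, size=(100, 120))          # e.g., 2 m temperature
members = truth + rng.normal(0.5, 1.5, size=(10, 100, 120))
print("spread=%.2f  ME=%.2f  RMSE=%.2f" % ensemble_stats(members, truth))
```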

  9. Wave dispersion in the hybrid-Vlasov model: verification of Vlasiator

    CERN Document Server

    Kempf, Yann; von Alfthan, Sebastian; Vaivads, Andris; Palmroth, Minna; Koskinen, Hannu E J

    2013-01-01

    Vlasiator is a new hybrid-Vlasov plasma simulation code aimed at simulating the entire magnetosphere of the Earth. The code treats ions (protons) kinetically through Vlasov's equation in the six-dimensional phase space while electrons are a massless charge-neutralizing fluid [M. Palmroth et al., Journal of Atmospheric and Solar-Terrestrial Physics 99, 41 (2013); A. Sandroos et al., Parallel Computing 39, 306 (2013)]. For first global simulations of the magnetosphere, it is critical to verify and validate the model by established methods. Here, as part of the verification of Vlasiator, we characterize the low-β plasma wave modes described by this model and compare with the solution computed by the Waves in Homogeneous, Anisotropic Multicomponent Plasmas (WHAMP) code [K. Rönnmark, Kiruna Geophysical Institute Reports 179 (1982)], using dispersion curves and surfaces produced with both programs. The match between the two fundamentally different approaches is excellent in the low-frequency, long wavelength...

  10. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Directory of Open Access Journals (Sweden)

    Paweł Drapikowski

    2016-06-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
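
    Given the triangle mesh produced by such a scan, the surface area is the sum of half cross-product magnitudes and the volume of a closed mesh follows from the divergence theorem, which immediately yields the S/V ratio mentioned above. A minimal sketch; the unit tetrahedron is toy data standing in for a scanned plant mesh:

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh.
    vertices: (n, 3) float array; faces: (m, 3) int array of vertex indices."""
    v0, v1, v2 = (vertices[faces[:, k]] for k in range(3))
    cross = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    # Signed tetrahedron volumes against the origin (divergence theorem):
    volume = abs(np.einsum("ij,ij->i", v0, cross).sum()) / 6.0
    return area, volume

# Toy closed mesh: a unit tetrahedron.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
tris = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
a, v = mesh_area_volume(verts, tris)
print(f"area={a:.4f}, volume={v:.4f}, S/V={a / v:.2f}")  # volume = 1/6
```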

  11. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Science.gov (United States)

    Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.

    2013-09-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.

  12. Metal fuel development and verification for prototype generation- IV Sodium- Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bock; Cheon, Jin Sik; Kim, Sung Ho; Park, Jeong Yong; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U-Zr fuel is a driver for the initial core of the PGSFR, and U-transuranics (TRU)-Zr fuel will gradually replace U-Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U-Zr fuel, work on U-Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U-TRU-Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic-martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  13. Numerical verification of similar Cam-clay model based on generalized potential theory

    Institute of Scientific and Technical Information of China (English)

    钟志辉; 杨光华; 傅旭东; 温勇; 张玉成

    2014-01-01

    From mathematical principles, the generalized potential theory can be employed to create constitutive models of geomaterials directly. The similar Cam-clay model, which is created based on the generalized potential theory, has fewer assumptions, a clearer mathematical basis, and better computational accuracy. Theoretically, it is more scientific than the traditional Cam-clay models. The particle flow code PFC3D was used to conduct numerical tests to verify the rationality and practicality of the similar Cam-clay model. The verification process was as follows: 1) creating the soil sample for the numerical test in PFC3D, and then simulating the conventional triaxial compression test, isotropic compression test, and isotropic unloading test in PFC3D; 2) determining the parameters of the similar Cam-clay model from the results of the above tests; 3) predicting the sample's behavior in triaxial tests under different stress paths with the similar Cam-clay model, and comparing the predictions with those of the Cam-clay model and the modified Cam-clay model. The analysis results show that the similar Cam-clay model has relatively high prediction accuracy, as well as good practical value.

  14. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  15. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited to cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter is suited to cases where surface wave heights are small compared to depth, because surface elevation is removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  16. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2004-06-01

    develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performance under simulated space conditions. The first tool is an IR phase-shifting interferometer with high spatial resolution, to be used specifically for the verification of high-precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal-vacuum conditions. The second, presented hereafter, is a holographic method for relative shape measurement. The proposed holographic solution makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  17. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1] operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports the JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, long-term trend monitoring, and satellite product retrieval [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements as time series and as functions of viewing angle. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is also a very useful tool for cross-sensor verification. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between the measurements of two instruments. The CRTM is particularly useful for reducing differences between instruments in climate studies [4]. In this study, we carry out an assessment of Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive band data. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high-quality radiosondes were launched when Suomi NPP flew over the NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data include air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q
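
    The double-difference method mentioned here compares two instruments through the model rather than directly: the observation-minus-simulation bias is formed for each instrument, and differencing the two biases cancels effects of their differing spectral responses and viewing geometry. A minimal sketch with synthetic numbers (not AEROSE data):

```python
import numpy as np

def double_difference(obs_a, sim_a, obs_b, sim_b):
    """Inter-instrument double difference: (O-B of A) minus (O-B of B).
    The model simulations absorb spectral/geometry differences between A and B."""
    return (obs_a - sim_a).mean() - (obs_b - sim_b).mean()

# Synthetic brightness temperatures (K) for two sounders viewing similar scenes:
rng = np.random.default_rng(1)
truth_a, truth_b = 250.0, 251.2        # scenes differ; the simulations track that
obs_a = truth_a + 0.30 + rng.normal(0, 0.1, 1000)  # A: +0.30 K calibration bias
obs_b = truth_b - 0.10 + rng.normal(0, 0.1, 1000)  # B: -0.10 K calibration bias
sim_a = np.full(1000, truth_a)         # CRTM-like simulations of each scene
sim_b = np.full(1000, truth_b)
dd = double_difference(obs_a, sim_a, obs_b, sim_b)
print(f"relative calibration offset ~ {dd:+.2f} K")   # expected ~ +0.40 K
```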

  18. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is a part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of the SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, and construction and verification of an airport simulation model using the Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  19. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    Customer: NASA's Launch Services Program (LSP), Ground Systems Development and Operations (GSDO), and Space Launch System (SLS) programs. NASA's LSP, GSDO, SLS and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER), for example to determine whether they need to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as guidance for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67 kilometer Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the AMU high-resolution WRF Environmental Modeling System (EMS) model (Watson 2013) in real time. The tasking group agreed to this proposal; therefore the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The model was set up with a triple-nested grid configuration over KSC/CCAFS based on previous AMU work (Watson 2013). The outer domain (D01) has 12-kilometer grid spacing, the middle domain (D02) has 4-kilometer grid spacing, and the inner domain (D03) has 1.33-kilometer grid spacing. The model runs a 12-hour forecast every hour; D01 and D02 outputs are available once an hour, and D03 output every 15 minutes during the forecast period. The AMU assessed the WRF-EMS 1

  20. Verification of a laboratory-based dilation model for in situ conditions using continuum models

    Institute of Scientific and Technical Information of China (English)

    G. Walton; M.S. Diederichs; L.R. Alejano; J. Arzúa

    2014-01-01

    With respect to constitutive models for continuum modeling applications, the post-yield domain remains the area of greatest uncertainty. Recent studies based on laboratory testing have led to the development of a number of models for brittle rock dilation, which account for both the plastic shear strain and confining stress dependencies of this phenomenon. Although these models are useful in providing an improved understanding of how dilatancy evolves during a compression test, there has been relatively little work performed examining their validity for modeling brittle rock yield in situ. In this study, different constitutive models for rock dilation are reviewed and then tested, in the context of a number of case studies, using a continuum finite-difference approach (FLAC). The uncertainty associated with the modeling of brittle fracture localization is addressed, and the overall ability of mobilized dilation models to replicate in situ deformation measurements and yield patterns is evaluated.

  1. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  2. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with the essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life; to solve different sensor networking related issues, researchers have put a great deal of effort into coming up with innovative ideas. The book introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  3. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model checking are discussed and described. The pros and cons of each method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), sizes of heap areas, string lengths, and the numbers of initialized array elements in code verified using static methods. The article also pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods. Based on this work, a conclusion is drawn describing the most relevant problems of analysis techniques and methods of their solution and

  4. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
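
    Operationally, swarm verification launches many small, diversified verifier instances, each with a different search strategy or random seed, and the swarm succeeds as soon as any instance reports a counterexample. The sketch below illustrates only that orchestration pattern in Python; the toy search function stands in for a real model-checker run such as a SPIN instance:

```python
import multiprocessing as mp
import random

def search(seed):
    """Toy stand-in for one model-checker instance with a randomized strategy.
    Returns a 'counterexample' description or None if its budget is exhausted."""
    rng = random.Random(seed)
    for step in range(100_000):            # bounded search budget per instance
        if rng.random() < 1e-5:            # pretend: a violating state is reached
            return f"seed {seed}: violation after {step} states"
    return None

if __name__ == "__main__":
    with mp.Pool(processes=8) as pool:
        # Launch many diversified instances; take the first positive result.
        for result in pool.imap_unordered(search, range(64)):
            if result is not None:
                print("counterexample found:", result)
                pool.terminate()           # stop the rest of the swarm
                break
        else:
            print("no violation found by any swarm member")
```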

  5. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    CERN Document Server

    Andova, Suzana; Engelen, Luc

    2011-01-01

    A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre or post processing.

  6. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    Directory of Open Access Journals (Sweden)

    Suzana Andova

    2011-06-01

    A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre or post processing.

  7. Modeling the Magnetospheric X-ray Emission from Solar Wind Charge Exchange with Verification from XMM-Newton Observations

    Science.gov (United States)

    2016-08-26

    An MHD-based model of terrestrial magnetospheric X-ray emission from solar wind charge exchange is verified against XMM-Newton observations; newly calculated model X-ray count rates are compared with the observed ones. (University of Leicester, Leicester, UK; Finnish Meteorological Institute, Helsinki, Finland.)

  8. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply tropospheric atmospheric radon (Rn-222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from remote sources, for example via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, freedom from chemical production or loss, and freedom from removal by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.

  9. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    Domain-specific formal languages are an essential part of computer science, combining theory and practice. Such languages are characterized by being tailor-made for specific application domains, thereby providing expressiveness on high abstraction levels and allowing specialized analysis and verification techniques. This dissertation describes two projects, each exploring one particular instance of such languages: monadic second-order logic and its application to program verification, and programming languages for construction of interactive Web services. Both program verification and Web service... ...that only valid HTML documents are ever shown to the clients at runtime and that the documents are constructed consistently. In addition, the language design provides support for declarative form-field validation, caching of dynamic documents, and concurrency control based on temporal-logic specifications...

  10. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    Science.gov (United States)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using

  11. Manufactured solutions and the verification of three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2013-01-01

    The manufactured solution technique is used for the verification of computational models in many fields. In this paper, we construct manufactured solutions for the three-dimensional, isothermal, nonlinear Stokes model for flows in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet, basal sliding parameters, and ice softness. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests. The upper surface is altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Simulation results from the computational model show good convergence to the manufactured analytic solution.
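
    The manufactured-solution workflow described above, choose an analytic field, substitute it into the governing equations to obtain a compensating source term, then confirm the discrete solver converges to the chosen field at the expected rate, can be illustrated on a much simpler problem than the Stokes system. A minimal sketch for a 1-D Poisson problem, with the forcing derived symbolically:

```python
import sympy as sp
import numpy as np

# Manufactured solution for -u''(x) = f(x) on [0, 1], u(0) = u(1) = 0.
x = sp.symbols("x")
u_exact = sp.sin(sp.pi * x) * (1 + x**2)       # chosen analytic solution
f_expr = sp.simplify(-sp.diff(u_exact, x, 2))  # forcing that makes it exact
f = sp.lambdify(x, f_expr, "numpy")
u = sp.lambdify(x, u_exact, "numpy")

def solve_and_error(n):
    """Second-order finite differences on n interior points; returns L2 error."""
    h = 1.0 / (n + 1)
    xs = np.linspace(h, 1 - h, n)
    A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    uh = np.linalg.solve(A, f(xs))
    return np.sqrt(h * np.sum((uh - u(xs)) ** 2))   # discrete L2 error

e1, e2 = solve_and_error(32), solve_and_error(64)
print(f"observed order = {np.log2(e1 / e2):.2f}")   # ~2 for a correct solver
```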

  12. Verification of a 1-dimensional model for predicting shallow infiltration at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Hevesi, J.A.; Flint, A.L. [Geological Survey, Mercury, NV (United States)]; Flint, L.E. [Foothill Engineering Consultants, Mercury, Nevada (United States)]

    1994-12-31

    A characterization of net infiltration rates is needed for site-scale evaluation of groundwater flow at Yucca Mountain, Nevada. Shallow infiltration caused by precipitation may be a potential source of net infiltration. A 1-dimensional finite difference model of shallow infiltration with a moisture-dependent evapotranspiration function and a hypothetical root-zone was calibrated and verified using measured water content profiles, measured precipitation, and estimated potential evapotranspiration. Monthly water content profiles obtained from January 1990 through October 1993 were measured by geophysical logging of 3 boreholes located in the alluvium channel of Pagany Wash on Yucca Mountain. The profiles indicated seasonal wetting and drying of the alluvium in response to winter season precipitation and summer season evapotranspiration above a depth of 2.5 meters. A gradual drying trend below a depth of 2.5 meters was interpreted as long-term redistribution and/or evapotranspiration following a deep infiltration event caused by runoff in Pagany Wash during 1984. An initial model, calibrated using the 1990 to 1992 record, did not provide a satisfactory prediction of water content profiles measured in 1993 following a relatively wet winter season. A re-calibrated model using a modified, seasonally-dependent evapotranspiration function provided an improved fit to the total record. The new model provided a satisfactory verification using water content changes measured at a distance of 6 meters from the calibration site, but was less satisfactory in predicting changes at a distance of 18 meters.

  13. Verification of a 1-dimensional model for predicting shallow infiltration at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Hevesi, J.; Flint, A.L. [Geological Survey, Mercury, NV (United States)]; Flint, L.E. [Foothill Eng. Consultants, Mercury, NV (United States)]

    1994-12-31

    A characterization of net infiltration rates is needed for site-scale evaluation of groundwater flow at Yucca Mountain, Nevada. Shallow infiltration caused by precipitation may be a potential source of net infiltration. A 1-dimensional finite difference model of shallow infiltration with a moisture-dependent evapotranspiration function and a hypothetical root-zone was calibrated and verified using measured water content profiles, measured precipitation, and estimated potential evapotranspiration. Monthly water content profiles obtained from January 1990 through October 1993 were measured by geophysical logging of 3 boreholes located in the alluvium channel of Pagany Wash on Yucca Mountain. The profiles indicated seasonal wetting and drying of the alluvium in response to winter season precipitation and summer season evapotranspiration above a depth of 2.5 meters. A gradual drying trend below a depth of 2.5 meters was interpreted as long-term redistribution and/or evapotranspiration following a deep infiltration event caused by runoff in Pagany Wash during 1984. An initial model, calibrated using the 1990 to 1992 record, did not provide a satisfactory prediction of water content profiles measured in 1993 following a relatively wet winter season. A re-calibrated model using a modified, seasonally-dependent evapotranspiration function provided an improved fit to the total record. The new model provided a satisfactory verification using water content changes measured at a distance of 6 meters from the calibration site, but was less satisfactory in predicting changes at a distance of 18 meters.

  14. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty... and to incorporate that in the design, operation and control of urban drainage structures.

  15. Development and Verification of a Fully Coupled Simulator for Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J. M.; Buhl, M. L. Jr.

    2007-01-01

    This report outlines the development of an analysis tool capable of analyzing a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested by model-to-model comparisons to ensure its correctness.

  16. Model based correction of placement error in EBL and its verification

    Science.gov (United States)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of placement error is charging. DISPLACE software corrects the placement error for any layout based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for this purpose. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, a sophisticated layout very different from the calibration mask was used for the verification. The placement correction results were predicted by DISPLACE, and a good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  17. Development and Experimental Verification of Key Techniques to Validate Remote Sensing Products

    Science.gov (United States)

    Li, X.; Wang, S. G.; Ge, Y.; Jin, R.; Liu, S. M.; Ma, M. G.; Shi, W. Z.; Li, R. X.; Liu, Q. H.

    2013-05-01

    Validation of remote sensing land products is a fundamental issue for Earth observation. The Ministry of Science and Technology of the People's Republic of China (MOST) launched a high-tech R&D program named 'Development and experimental verification of key techniques to validate remote sensing products' in 2011. This paper introduces the background, scientific objectives, and research contents of this project, along with the research results already achieved. The objectives of this project are (1) to build a technical specification for the validation of remote sensing products; (2) to carry out a comprehensive satellite-aircraft-ground remote sensing experiment to investigate performance, and to refine Step 1 until the predefined requirements are met; and (3) to establish a validation network of China for remote sensing products. In summer 2012, with the support of the Heihe Watershed Allied Telemetry Experimental Research (HiWATER), field observations were successfully conducted in the central stream of the Heihe River Basin, a typical inland river basin in northwest China. A flux observation matrix composed of eddy covariance (EC) systems and large aperture scintillometers (LAS), in addition to a densely distributed eco-hydrological wireless sensor network, was established to capture multi-scale heterogeneities of evapotranspiration (ET), leaf area index (LAI), and soil moisture and temperature. Airborne missions were flown with payloads of an imaging spectrometer, light detection and ranging (LiDAR), an infrared thermal imager and a microwave radiometer, providing aerial remote sensing observations at various scales. High-resolution satellite images, e.g. PROBA-CHRIS and TerraSAR-X, were collected and pre-processed. Simultaneously, ground measurements were conducted over specific sampling plots and transects to obtain validation data sets. With this setup, complex problems are addressed, e.g. heterogeneity, scaling, uncertainty, and eventually to

  18. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation)]; Luther, W. [GRS Garching (Germany)]; Spolitak, S. [RNC-KI (Russian Federation)]

    1995-12-31

    Currently the ATHLET code is widely applied for modelling several WWER-type power plants with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents the nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons to experimental data. The consideration of circulation in the water inventory of the secondary side proves to be necessary.

  19. Performance and Probabilistic Verification of Regional Parameter Estimates for Conceptual Rainfall-runoff Models

    Science.gov (United States)

    Franz, K.; Hogue, T.; Barco, J.

    2007-12-01

    Identification of appropriate parameter sets for simulation of streamflow in ungauged basins has become a significant challenge for both operational and research hydrologists. This is especially difficult in the case of conceptual models, whose parameters typically must be "calibrated", or adjusted to match streamflow conditions in specific systems (i.e. some of the parameters are not directly observable). This paper addresses the performance and uncertainty associated with transferring conceptual rainfall-runoff model parameters between basins within large-scale ecoregions. We use the National Weather Service's (NWS) operational hydrologic model, the SACramento Soil Moisture Accounting (SAC-SMA) model. A Multi-Step Automatic Calibration Scheme (MACS), using the Shuffled Complex Evolution (SCE) algorithm, is used to optimize SAC-SMA parameters for a group of watersheds with extensive hydrologic records from the Model Parameter Estimation Experiment (MOPEX) database. We then explore "hydroclimatic" relationships between basins to facilitate regionalization of parameters for an established ecoregion in the southeastern United States. The impact of regionalized parameters is evaluated via standard model performance statistics as well as through generation of hindcasts and probabilistic verification procedures to evaluate streamflow forecast skill. Preliminary results show climatology ("climate neighbor") to be a better indicator of transferability than physical similarity or proximity ("nearest neighbor"). The mean and median of all the parameters within the ecoregion are the poorest choice for the ungauged basin. The choice of regionalized parameter set affected the skill of the ensemble streamflow hindcasts; however, all parameter sets showed little skill in forecasts after five weeks (i.e. climatology is as good an indicator of future streamflows). In addition, the optimum parameter set changed seasonally, with the "nearest neighbor" showing the highest skill in the

  20. Toward automated parasitic extraction of silicon photonics using layout physical verifications

    Science.gov (United States)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2016-08-01

    A physical verification flow for the layout of silicon photonic circuits is suggested. Simple empirical models are developed to estimate the bend power loss and coupled power in photonic integrated circuits fabricated on standard SOI wafers. These models are utilized in a physical verification flow of the circuit layout to verify reliable fabrication using any electronic design automation tool. The models are accurate compared with electromagnetic solvers, are closed form, and circumvent the need for any EM solver in the verification process, which dramatically reduces verification time.
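
    Closed-form empirical models of the kind described are typically simple parametric fits, for example bend loss decaying exponentially with bend radius, calibrated once against an electromagnetic solver and then evaluated cheaply during layout checking. A minimal sketch; the functional form and coefficients are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

# Hypothetical closed-form bend-loss model: loss per 90-degree bend decays
# exponentially with bend radius. Coefficients would be fitted offline
# against EM-solver results; the values here are illustrative only.
A_DB, B_PER_UM = 30.0, 1.2

def bend_loss_db(radius_um):
    """Estimated loss (dB) of one 90-degree SOI waveguide bend."""
    return A_DB * np.exp(-B_PER_UM * radius_um)

def check_bends(bend_radii_um, budget_db=0.01):
    """Layout-check pass: flag bends whose estimated loss exceeds the budget."""
    return [(i, bend_loss_db(r)) for i, r in enumerate(bend_radii_um)
            if bend_loss_db(r) > budget_db]

for idx, loss in check_bends([2.0, 5.0, 10.0]):
    print(f"bend {idx}: estimated loss {loss:.4f} dB exceeds budget")
```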

  1. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code, we use a propagating underdriven detonation wave in 1-D. This is about the only test case for which an accurate solution can be determined from the theoretical structure of the solution. The solution consists of a steady ZND reaction zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
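
    The accuracy metric used here, the discrete L2 norm of the difference between the numerical pressure and the exact solution, is the standard scalar summary for verification against exact solutions. A minimal sketch with placeholder profiles standing in for the xRage output and the exact ZND/Taylor solution:

```python
import numpy as np

def l2_error(p_numerical, p_exact, dx):
    """Discrete L2 norm of the pressure error on a uniform grid:
    sqrt( sum (p_h - p)^2 * dx ). On an adaptive grid, dx becomes per-cell."""
    return np.sqrt(np.sum((p_numerical - p_exact) ** 2) * dx)

# Placeholder profiles; in practice these come from the code and the theory.
xs = np.linspace(0.0, 0.08, 801)                  # 80 mm domain, dx = 0.1 mm
p_exact = np.where(xs < 0.05, 30e9 * (1 - xs / 0.08), 1e5)  # toy profile (Pa)
p_num = p_exact + np.random.default_rng(2).normal(0, 1e7, xs.size)
print(f"L2 error = {l2_error(p_num, p_exact, dx=xs[1] - xs[0]):.3e}")
```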

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  3. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    Science.gov (United States)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

    The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure of surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system are the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols, and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  4. Parameter estimation and verification of DSSAT-CERES-Wheat model for simulation of growth and development of winter wheat under water stresses at different growth stages

    Institute of Scientific and Technical Information of China (English)

    姚宁; 周元刚; 宋利兵; 刘健; 李毅; 吴淑芳; 冯浩; 何建强

    2015-01-01

    Crop growth simulation models are useful tools to help us understand and regulate the agro-ecological systems in arid areas. In this study, the CERES-Wheat model in the DSSAT (decision support system for agrotechnology transfer) software was investigated for its ability to simulate the growth and yield of winter wheat (Triticum aestivum L.) in arid areas and to find the optimal plan for estimating genetic parameters and verifying the model. Field experiments were conducted under a rainout shelter for winter wheat growing under water stresses at different growth stages in two growth seasons (October 2012 to June 2013 and October 2013 to June 2014). The whole growth season of wheat was divided into five growing stages (wintering, greening, jointing, heading and grain filling); water stress was imposed over two consecutive stages while irrigation was applied at the other stages, which resulted in four different levels of water stress period (D1... The average relative error (ARE) and relative root mean squared error (RRMSE) were 4.89% and 5.18%, respectively. When water stress occurred at the heading and grain-filling stages, the DSSAT-CERES-Wheat model simulated the growth and development of winter wheat and the dynamics of soil moisture well; but when water stress occurred during the wintering and greening stages, the simulations were relatively poor, and the earlier the stress period and the more severe the stress, the lower the simulation accuracy. In addition, the model could not simulate the differences in winter wheat phenology caused by different water stresses, and corresponding improvements to the model are needed. Cross-validation indicated that the overall error of the DSSAT-CERES-Wheat model in simulating the growth and yield of winter wheat under the different water stress conditions of this study was about 15%-18%. In summary, the DSSAT-CERES-Wheat model has certain limitations in simulating the growing process of winter wheat in arid regions; to apply the model more widely to winter wheat production management and research in the arid and semi-arid regions of China, further study of the water stress response mechanism and its simulation during the early vegetative growth stages is necessary.

  5. Formal Verification of UML Profile

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, capturing system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notation, the structural view is modeled by class, component and object diagrams, and the behavioral view is modeled by activity, use case, state and sequence diagrams. However, UML does not provide a formal syntax, so its semantics is not formally definable; to assure correctness, we therefore need to incorporate semantic reasoning through verification, specification...

  6. Program Verification of Numerical Computation

    OpenAIRE

    Pantelis, Garry

    2014-01-01

    These notes outline a formal method for program verification of numerical computation. It forms the basis of the software package VPC in its initial phase of development. Much of the style of presentation is in the form of notes that outline the definitions and rules upon which VPC is based. The initial motivation of this project was to address some practical issues of computation, especially of numerically intensive programs that are commonplace in computer models. The project evolved into a...

  7. PREDICTION AND VERIFICATION OF THE 1997-1999 EL NINO AND LA NINA BY USING AN INTERMEDIATE OCEAN-ATMOSPHERE COUPLED MODEL

    Institute of Scientific and Technical Information of China (English)

    李清泉; 赵宗慈; 丁一汇

    2001-01-01

    Numerical simulations, hindcasts and verifications of the tropical Pacific sea surface temperature anomaly (SSTA) have been conducted using a dynamical tropical Pacific ocean-atmosphere coupled model named NCCo. The results showed that the model performed reasonable simulations of the major historical El Nino episodes, and that its forecast skill in the 1990s was significantly improved. The NCCo model has been used to predict the tropical Pacific SSTA since January 1997. Comparisons between predictions and observations indicated that the occurrence, evolution and ending of the 1997/1998 El Nino episode were predicted fairly well by the model. The La Nina episode that began in the autumn of 1998 and the developing tendency of the tropical Pacific SSTA during 1999 were also predicted successfully. The forecast skills of the NCCo model during the 1997-1999 El Nino and La Nina events are above 0.5 at 0-14 lead months.
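
    The record quotes forecast skills above 0.5 out to 14 lead months; SSTA forecast skill is conventionally measured by the anomaly correlation between predicted and observed anomalies at each lead time. The sketch below assumes that convention (the paper's exact metric is not specified here) and uses toy hindcast data.

        import numpy as np

        def anomaly_correlation(pred, obs):
            """Pearson correlation between predicted and observed anomalies."""
            p = np.asarray(pred, float) - np.mean(pred)
            o = np.asarray(obs, float) - np.mean(obs)
            return float(np.sum(p * o) / np.sqrt(np.sum(p**2) * np.sum(o**2)))

        # Toy hindcasts: skill degrades as synthetic noise grows with lead time.
        rng = np.random.default_rng(1)
        obs = rng.normal(0.0, 1.0, 36)  # stand-in for observed Nino-3.4 SSTA (degC)
        for lead in (1, 6, 14):
            pred = obs + rng.normal(0.0, 0.3 * lead**0.5, obs.size)
            print(f"lead {lead:2d} months: ACC = {anomaly_correlation(pred, obs):.2f}")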

  8. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset and generate the AAM images; we then compute the gradient orientation representation on a hierarchical model, which yields the GOP features. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.
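
    As a rough illustration of the gradient-orientation-pyramid idea (the paper's exact construction, scale count and smoothing kernel are not given here, so these choices are hypothetical), the sketch below stacks per-pixel unit gradient-orientation vectors computed at successively downsampled versions of a face image into one feature vector.

        import numpy as np

        def gradient_orientation(img):
            """Per-pixel unit gradient vectors (cosine/sine of the orientation)."""
            gy, gx = np.gradient(img.astype(float))
            mag = np.hypot(gx, gy) + 1e-8
            return np.stack([gx / mag, gy / mag], axis=-1)

        def gop_feature(img, levels=3):
            """Gradient orientation pyramid: orientations at several scales, each
            level obtained by 2x2 block averaging (a crude smooth-and-decimate)."""
            feats, level = [], img.astype(float)
            for _ in range(levels):
                feats.append(gradient_orientation(level).ravel())
                h, w = (level.shape[0] // 2) * 2, (level.shape[1] // 2) * 2
                level = level[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
            return np.concatenate(feats)

        face = np.random.default_rng(2).random((64, 64))  # stand-in for an AAM-normalized face
        print(gop_feature(face).shape)  # vector that would be fed to an SVM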

  9. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    Science.gov (United States)

    Hillen, F.; Höfle, B.; Ehlers, M.; Reinartz, P.

    2014-02-01

    In this paper the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories.

  10. DEVELOPMENT AND VERIFICATION OF NEW SOLID DENTAL FILLING TEMPORARY MATERIALS CONTAINING ZINC. FORMULA DEVELOPMENT STAGE.

    Science.gov (United States)

    Pytko-Polończyk, Jolanta; Antosik, Agata; Zajac, Magdalena; Szlósarczyk, Marek; Krywult, Agnieszka; Jachowicz, Renata; Opoka, Włodzimierz

    2016-01-01

    Caries is the most common problem affecting teeth, and this is the reason why so many temporary dental filling materials are being developed. Examples of such fillings are zinc oxide paste mixed with eugenol, Thymodentin, and Coltosol F®. Zinc oxide-eugenol is used in dentistry because of its multiple benefits: it improves healing of the pulp by dentine bridge formation, has antiseptic properties, and is hygroscopic. Because of these advantages, compounds of zinc oxide are used as temporary fillings, especially in deep caries lesions when treatment is oriented toward supporting the vital pulp. Temporary dental fillings based on zinc oxide are prepared ex tempore by simply mixing powder (Thymodentin) and eugenol liquid together, or supplied as a ready-to-use paste, Coltosol F®. The quantitative composition depends mainly on the experience of the person preparing it; therefore, the exact composition of such fillings is not reproducible. The main goal of the study was to develop appropriate dental fillings in solid form containing a set amount of zinc oxide. Within the study, the influence of the preparation method on the properties of solid dental fillings, such as mechanical properties and zinc ion release, was examined.

  11. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.

  12. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Science.gov (United States)

    Mittermaier, M. P.

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar-composite were used. The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
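
    Both records above rely on the Equitable Threat Score and the log-odds ratio, which are standard functions of the 2x2 forecast/observation contingency table (hits, false alarms, misses, correct negatives). A sketch using the textbook definitions (the counts below are invented for illustration):

        import math

        def ets(hits, false_alarms, misses, correct_negatives):
            """Equitable Threat Score (Gilbert skill score) from a 2x2 table."""
            n = hits + false_alarms + misses + correct_negatives
            hits_random = (hits + false_alarms) * (hits + misses) / n
            return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

        def log_odds_ratio(hits, false_alarms, misses, correct_negatives):
            """Natural log of the odds ratio; positive values indicate skill."""
            return math.log((hits * correct_negatives) / (false_alarms * misses))

        # Example: daily accumulations exceeding a threshold, forecast vs. radar "truth".
        print(f"ETS = {ets(52, 23, 17, 408):.3f}")
        print(f"log-odds = {log_odds_ratio(52, 23, 17, 408):.2f}")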

  13. Verification of three-dimensional neutron kinetics model of TRAP-KS code regarding reactivity variations

    Energy Technology Data Exchange (ETDEWEB)

    Uvakin, Maxim A.; Alekhin, Grigory V.; Bykov, Mikhail A.; Zaitsev, Sergei I. [EDO 'GIDROPRESS', Moscow Region, Podolsk (Russian Federation)

    2016-09-15

    This work deals with TRAP-KS code verification. TRAP-KS is used for coupled neutron-kinetics and thermal-hydraulics calculations of VVER reactors. The three-dimensional neutron kinetics model enables consideration of spatial effects produced by variations of the energy field and feedback parameters. This feature has to be investigated especially for asymmetric variations of the core's multiplying properties, power fluctuations and strong local perturbation insertion. The presented work consists of three test definitions. First, an asymmetric control rod (CR) ejection during power operation is defined. This process leads to fast reactivity insertion with a short-time power spike. As a second task, xenon oscillations are considered. Here, a small negative reactivity insertion leads to decreasing power and induces spatial oscillations of the xenon concentration. In the late phase, these oscillations are suppressed by external actions. As a last test, an international code comparison for a hypothetical main steam line break (V1000CT-2, task 2) was performed. This scenario is interesting for asymmetric positive reactivity insertion caused by decreasing coolant temperature in the affected loop.

  14. FEM modeling for 3D dynamic analysis of deep-ocean mining pipeline and its experimental verification

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    3D dynamic analysis models of a 1000 m deep-ocean mining pipeline, including the steel lift pipe, pump, buffer and flexible hose, were established by the finite element method (FEM). The coupling effect of the steel lift pipe and flexible hose and the main external loads on the pipeline were considered in the models, such as gravity, buoyancy, hydrodynamic forces, internal and external fluid pressures, concentrated suspension buoyancy on the flexible hose, and the torsional moment and axial force induced by pump operation. Relevant FEM models and solution techniques were developed according to the various 3D transient behaviors of the integrated deep-ocean mining pipeline, including the towing motions of track-keeping operation and the launch process of the pipeline. Meanwhile, an experimental verification system in a towing water tank, with characteristics similar to the designed mining pipeline, was developed to verify the accuracy of the FEM models and dynamic simulation. The experimental results show that the measured and simulated pipe stresses agree well. Based on further simulations of the 1000 m deep-ocean mining pipeline, the results show that, to form a saddle-shaped configuration, the total concentrated suspension buoyancy of the flexible hose should be 95%-105% of the weight of the flexible hose in water, with the first suspension point carrying 1/3 of the total buoyancy and the second suspension point carrying 2/3 of the total buoyancy. When the towing velocity of the mining system is less than 0.5 m/s, the towing track of the buffer generally coincides with the planned route of the ship and the configuration of the flexible hose is also well maintained.

  15. Ethylene Decomposition Initiated by Ultraviolet Radiation from Low Pressure Mercury Lamps: Kinetics Model Prediction and Experimental Verification.

    Science.gov (United States)

    Jozwiak, Zbigniew Boguslaw

    1995-01-01

    Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to the formation of ozone. The objective of the present study was to determine the set of physical and chemical actions leading to the disappearance of ethylene from an artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time-dependent concentrations of the chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however, the model predicted a lower rate of ethylene disappearance than was measured. Some reasons for the model-experiment disagreement are radiation intensity averaging, the experimental

  16. The MODUS approach to formal verification

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert

    2014-01-01

    in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution...... Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model...... verification engines, model verification produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development

  17. A Formal Verification Methodology for Checking Data Integrity

    CERN Document Server

    Umezawa, Yasushi

    2011-01-01

    Formal verification techniques have been playing an important role in pre-silicon validation processes. One of the most important points considered in performing formal verification is to define good verification scopes; we should define clearly what is to be verified formally on the designs under test. We considered the following three practical requirements when we defined the scope of formal verification: properties should be (a) hard to verify, (b) small to handle, and (c) easy to understand. Our novel approach is to break down generic system-level properties into stereotype properties at block level and to define requirements for Verifiable RTL. Consequently, each designer, rather than verification experts, can describe properties of the design easily, and formal model checking can be applied systematically and thoroughly to all the leaf modules. During the development of a component chip for server platforms, we focused on RAS (Reliability, Availability, and Serviceability) features and described more than 2000 properties in...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    Science.gov (United States)

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  19. Establishment and experimental verification of the photoresist model considering interface slip between photoresist and concave spherical substrate

    Directory of Open Access Journals (Sweden)

    S. Yang

    2015-07-01

    A thickness distribution model of photoresist spin-coating on a concave spherical substrate (CSS) has been developed via both theoretical studies and experimental verification. The stress of photoresist on a rotating CSS is analyzed, and the boundary conditions of the hydrodynamic equation are presented under the non-lubricating condition. Moreover, a multivariable polynomial equation of the photoresist-layer thickness distribution is derived by analyzing the flow equation, where the evaporation rate, substrate topography, interface slip between liquid and CSS, and the variation of rotational speed and photoresist parameters are considered in detail. Importantly, the photoresist-layer thickness at various CSS rotational speeds and liquid concentrations can be obtained from the theoretical equation, and the photoresist viscosity and concentration required for a given coating thickness at a given coating speed can also be solved from it. The calculated theoretical values agree well with the experimental results, which were measured at various CSS rotational speeds and liquid concentrations at steady state. Therefore, both our experimental results and theoretical analysis provide guidance for photoresist dilution and pave the way for potential improvements and microfabrication applications in the future.

  20. SU-E-T-203: Development of a QA Software Tool for Automatic Verification of Plan Data Transfer and Delivery.

    Science.gov (United States)

    Chen, G; Li, X

    2012-06-01

    Consistency verification between data from the treatment planning system (TPS), the record and verification system (R&V), and the delivery records by visual inspection is time consuming and subject to human error. The purpose of this work is to develop a software tool to automatically perform such verifications. Using Microsoft Visual C++, a quality assurance (QA) tool was developed to (1) read plan data including gantry/collimator/couch parameters, multi-leaf collimator leaf positions, and monitor unit (MU) numbers from a TPS (Xio, CMS/Elekta, or RealART, Prowess) via RTP link or DICOM transfer, (2) retrieve imported (prior to delivery) and recorded (after delivery) data from an R&V system (Mosaiq, Elekta) with open database connectivity, (3) calculate MU independently based on the DICOM plan data using a modified Clarkson integration algorithm, and (4) compare all the extracted data to identify possible discrepancies between TPS and R&V, and between R&V and delivery. The tool was tested for 20 patients with 3DCRT and IMRT plans from regular and online adaptive radiotherapy treatments. It was capable of automatically detecting, within a few seconds, any inconsistency between the beam data from the TPS and the data stored in the R&V system, with an independent MU check, and any significant treatment delivery deviation from the plan. With this tool used prior to and after delivery as an essential QA step, our clinical online adaptive re-planning process can be sped up by a few minutes by eliminating the tedious visual inspection. A QA software tool has been developed to automatically verify treatment data consistency from delivery back to plan and to identify discrepancies in MU calculations between the TPS and the secondary MU check. This tool speeds up the clinical QA process and eliminates human errors from visual inspection, thus improving safety. © 2012 American Association of Physicists in Medicine.
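
    The tool's source is not shown in the record; as a minimal illustration of its data-consistency step, the sketch below compares a few beam parameters exported from a TPS against the corresponding R&V record within tolerances. All field names and tolerance values are hypothetical.

        # Hypothetical tolerances; the actual checks of the Mayo tool are not public.
        TOLERANCES = {"gantry_deg": 0.1, "collimator_deg": 0.1, "couch_deg": 0.1, "mu": 0.5}

        def compare_beams(tps_beam, rv_beam):
            """Return (parameter, tps_value, rv_value) for every out-of-tolerance field."""
            return [(key, tps_beam[key], rv_beam[key])
                    for key, tol in TOLERANCES.items()
                    if abs(tps_beam[key] - rv_beam[key]) > tol]

        tps = {"gantry_deg": 180.0, "collimator_deg": 0.0, "couch_deg": 0.0, "mu": 125.3}
        rv = {"gantry_deg": 180.0, "collimator_deg": 0.0, "couch_deg": 0.0, "mu": 124.1}
        for param, a, b in compare_beams(tps, rv):
            print(f"MISMATCH {param}: TPS={a} vs R&V={b}")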

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE UV DISINFECTION OF SECONDARY EFFLUENTS, SUNTEC, INC. MODEL LPX200 DISINFECTION SYSTEM - 03/09/WQPC-SWP

    Science.gov (United States)

    Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...

  2. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  3. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model of the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed-loop systems, for which dynamic stability and function over long durations are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial-and-error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch, the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand, water plays a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is a reactant and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a systems engineering approach, the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those

  4. An Ontology Model Verification Approach Based on OCL

    Institute of Scientific and Technical Information of China (English)

    钱鹏飞; 王英林; 张申生

    2015-01-01

    In this paper, by combining set and relation theory with the ontology model and introducing and expanding the Object Constraint Language (OCL) from object-oriented technology, we present an OCL-based ontology verification method. The method extracts an ontology definition meta-model (ODM), based on set and relation theory, from a large number of ontology models. The ontology model is divided into 'entity-related elements' and 'constraint-rule-related elements', and a series of OCL expansion functions complete the formalised expression of these two kinds of ontology model elements, so as to fulfil OCL-based formalised ontology model verification. In the end, the issue of realising ontology model conflict inspection and reconciliation using this model verification approach is further discussed through a verification example from the 'vehicle management ontology slice of the Baosteel information sharing platform'.
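
    OCL itself is not reproduced here; as a loose illustration of invariant-style model checking in the spirit of the paper, the sketch below evaluates constraint rules (Python predicates standing in for OCL expressions) over a toy "vehicle management" model. Every name in it is invented for the example.

        # Toy stand-in: OCL-like invariants evaluated over ontology model elements.
        model = {
            "Vehicle": [{"plate": "A123", "owner": "dept1"},
                        {"plate": None, "owner": "dept2"}],
        }

        invariants = {
            # "context Vehicle inv: self.plate <> null" expressed as a predicate
            "Vehicle": [("plate_defined", lambda v: v["plate"] is not None)],
        }

        def verify(model, invariants):
            """Collect every (class, rule, instance) triple that violates a rule."""
            return [(cls, name, inst)
                    for cls, instances in model.items()
                    for name, pred in invariants.get(cls, [])
                    for inst in instances if not pred(inst)]

        for cls, rule, inst in verify(model, invariants):
            print(f"{cls} violates {rule}: {inst}")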

  5. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose. However, this technique is not widely used in industry yet, due to some obstacles. The main obstacles encountered when trying to apply formal verification techniques at industrial installations are the difficulty of creating models out of PLC programs and defining formally the specification requirements. In addition, models produced out of real-life programs have a huge state space, thus preventing the verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool generating automatically formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...

  6. CPAchecker: A Tool for Configurable Software Verification

    CERN Document Server

    Beyer, Dirk

    2009-01-01

    Configurable software verification is a recent concept for expressing different program analysis and model checking approaches in one single formalism. This paper presents CPAchecker, a tool and framework that aims at easy integration of new verification components. Every abstract domain, together with the corresponding operations, is required to implement the interface of configurable program analysis (CPA). The main algorithm is configurable to perform a reachability analysis on arbitrary combinations of existing CPAs. The major design goal during the development was to provide a framework for developers that is flexible and easy to extend. We hope that researchers find it convenient and productive to implement new verification ideas and algorithms using this platform and that it advances the field by making it easier to perform practical experiments. The tool is implemented in Java and runs as command-line tool or as Eclipse plug-in. We evaluate the efficiency of our tool on benchmarks from the software mo...

  7. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  8. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    -dependent. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types: target-wrong and impostor-wrong while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  10. Electric Machine Analysis, Control and Verification for Mechatronics Motion Control Applications, Using New MATLAB Built-in Function and Simulink Model

    Directory of Open Access Journals (Sweden)

    Farhan A. Salem

    2014-05-01

    This paper proposes a new, simple and user-friendly MATLAB built-in function, together with mathematical and Simulink models, to be used to identify system-level problems early, to ensure that all design requirements are met and, generally, to simplify the Mechatronics motion control design process, including performance analysis and verification of a given electric DC machine, and proper controller selection and verification for a desired output speed or angle.
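
    The MATLAB function itself is not reproduced in the record; below is a minimal sketch of the standard armature-controlled DC machine equations such a tool builds on (parameter values are illustrative only), integrated with forward Euler.

        # Illustrative parameters: R (ohm), L (H), J (kg m^2), b (N m s), K (motor constant).
        R, L, J, b, K = 1.0, 0.5, 0.01, 0.1, 0.01

        def dc_motor_step(v_in=12.0, t_end=3.0, dt=1e-4):
            """Angular speed response to a voltage step for the standard model:
            L di/dt = v - R i - K w,   J dw/dt = K i - b w."""
            i = w = 0.0
            for _ in range(int(t_end / dt)):
                di = (v_in - R * i - K * w) / L
                dw = (K * i - b * w) / J
                i += dt * di
                w += dt * dw
            return w

        print(f"speed after a 12 V step: {dc_motor_step():.2f} rad/s")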

  11. Operative temperature and thermal comfort in the sun - Implementation and verification of a model for IDA ICE

    DEFF Research Database (Denmark)

    Karlsen, Line; Grozman, Grigori; Heiselberg, Per Kvols;

    2015-01-01

    (MRT) model for IDA Indoor Climate and Energy (IDA ICE). The new feature of the model is that it includes the effect of shortwave radiation in the room and contributes to a more comprehensive prediction of operative temperature, e.g. of a person exposed to direct sunlight. The verification...... comfort of persons affected by direct solar radiation. This may further have implications for the predicted energy use and the design of the façade, since e.g. an enlarged need for local cooling or use of dynamic solar shading might be discovered....

  12. Experimental Verification of the Physical Model for Droplet-Particles Cleaning in Pulsed Bias Arc Ion Plating

    Institute of Scientific and Technical Information of China (English)

    Yanhui ZHAO; Guoqiang LIN; Chuang DONG; Lishi WEN

    2005-01-01

    It has been reported that the application of pulsed biases in arc ion plating can effectively eliminate droplet particles. The present paper aims at experimental verification of a physical model proposed previously by us, which is based on particle charging and repulsion in the pulsed plasma sheath. An orthogonal experiment was designed for this purpose, using the electrical parameters of the pulsed bias for the deposition of TiN films on stainless steel substrates. The effect of these parameters on the amount and the size distribution of the particles was analyzed, and the results provided sufficient evidence for the physical model.

  13. Synergy between Emissions Verification for Climate and Air Quality: Results from Modeling Analysis over the Contiguous US using CMAQ

    Science.gov (United States)

    Liu, Z.; Bambha, R.; Pinto, J. P.; Zeng, T.; Michelsen, H. A.

    2013-12-01

    The synergy between emissions-verification exercises for fossil-fuel CO2 and traditional air pollutants (TAPs, e.g., NOx, SO2, CO, and PM) stems from the common physical processes underlying the generation, transport, and perturbations of their emissions. Better understanding and characterizing such a synergetic relationship are of great interest and benefit for science and policy. To this end, we have been developing a modeling framework that allows for studying CO2 along with TAPs on regional-through-urban scales. The framework is based on the EPA Community Multi-Scale Air Quality (CMAQ) modeling system and has been implemented on a domain over the contiguous US, where abundant observational data and complete emissions information is available. In this presentation, we will show results from a comprehensive analysis of atmospheric CO2 and an array of TAPs observed from multiple networks and platforms (in situ and satellite observations) and those simulated by CMAQ over the contiguous US for a full year of 2007. We will first present the model configurations and input data used for CMAQ CO2 simulations and the results from model evaluations [1]. In light of the unique properties of CO2 compared to TAPs, we tested the sensitivity of model-simulated CO2 to different initial and boundary conditions, biosphere-atmosphere bidirectional fluxes and fossil-fuel emissions. We then examined the variability of CO2 and TAPs simulated by CMAQ and observed from the NOAA ESRL tall-tower network, the EPA AQS network, and satellites (e.g., SCIAMACHY and OMI) at various spatial and temporal scales. Finally, we diagnosed in CMAQ the roles of fluxes and transport in regulating the covariance between CO2 and TAPs manifested in both surface concentrations and column-integrated densities. We will discuss the implications from these results on how to understand trends and characteristics fossil-fuel emissions by exploiting and combining currently available observational and modeling

  14. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to Spin format, and counterexamples expressed in terms of automata. Interactive verification makes it possible to decrease verification time and to increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is considered a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run another state machine in a new thread or have a nested state machine. This method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.
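
    Spin and Promela specifics aside, the core of such verification is exhaustive exploration of the product state space of the interacting machines. Below is a minimal explicit-state reachability sketch (not Stater's algorithm; the two toy automata and the mutual-exclusion invariant are invented for illustration).

        from collections import deque

        # Two toy automata sharing a lock; the global state is (s1, s2, lock).
        def successors(state):
            s1, s2, lock = state
            moves = []
            for idx, s in ((0, s1), (1, s2)):
                if s == "idle":
                    moves.append((idx, "want", lock))
                elif s == "want" and not lock:
                    moves.append((idx, "crit", True))   # acquire the lock
                elif s == "crit":
                    moves.append((idx, "idle", False))  # release the lock
            for idx, nxt, nlock in moves:
                yield (nxt, s2, nlock) if idx == 0 else (s1, nxt, nlock)

        def check_mutex(init):
            """Breadth-first search for a state violating mutual exclusion."""
            seen, frontier = {init}, deque([init])
            while frontier:
                state = frontier.popleft()
                if state[0] == "crit" and state[1] == "crit":
                    return f"violation: {state}"
                for nxt in successors(state):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return f"invariant holds over {len(seen)} reachable states"

        print(check_mutex(("idle", "idle", False)))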

  15. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The programs in question are Odeon 3.0 Industrial and Odeon 3.0 Combined, which allow the modelling of point sources, surface sources and line...... sources. Combining these three source types, it is possible to model huge machinery in an easy and visually clear way. Traditionally, room acoustic simulations have been aimed at auditorium acoustics. The aim of the simulations has been to model the room acoustic measuring setup consisting

  16. DEVELOPMENT OF AN INNOVATIVE LASER SCANNER FOR GEOMETRICAL VERIFICATION OF METALLIC AND PLASTIC PARTS

    DEFF Research Database (Denmark)

    Carmignato, Simone; De Chiffre, Leonardo; Fisker, Rune

    2008-01-01

    and plastic parts. A first prototype of the novel measuring system has been developed, using laser triangulation. The system, besides ensuring the automatic reconstruction of complete surface models, has been designed to guarantee user-friendliness, versatility, reliability and speed. The paper focuses mainly...... dimensional measurements with adequate accuracy for most industrial requirements....

  17. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  18. Model Oriented Approach for Industrial Software Development

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2015-01-01

    The article considers the specifics of a model-oriented approach to software development based on the use of Model Driven Architecture (MDA), Model Driven Software Development (MDSD) and Model Driven Development (MDD) technologies. The benefits of using this approach in the software development industry are described. The main emphasis is put on system design, automated code generation for large systems, verification, proof of system properties and reduction of bug density. Drawbacks of the approach are also considered. The approach proposed in the article is specific to industrial software systems development. These systems are characterized by the different levels of abstraction used in the modeling and code development phases. The approach allows detailing the model to the level of the system code while preserving the verified model semantics, and provides checking of the whole detailed model. Steps for translating abstract data structures (including transactions, signals and their parameters) into the data structures used in the detailed system implementation are presented. The grammar of a language for specifying rules that transform abstract model data structures into the real system's detailed data structures is also described. The results of applying the proposed method in an industrial technology are shown. The article is published in the authors' wording.

  19. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    Science.gov (United States)

    Tippett, Christine Diane

    Scientific knowledge is constructed and communicated through a range of forms in addition to verbal language. Maps, graphs, charts, diagrams, formulae, models, and drawings are just some of the ways in which science concepts can be represented. Representational competence---an aspect of visual literacy that focuses on the ability to interpret, transform, and produce visual representations---is a key component of science literacy and an essential part of science reading and writing. To date, however, most research has examined learning from representations rather than learning with representations. This dissertation consisted of three distinct projects that were related by a common focus on learning from visual representations as an important aspect of scientific literacy. The first project was the development of an exploratory framework that is proposed for use in investigations of students constructing and interpreting multimedia texts. The exploratory framework, which integrates cognition, metacognition, semiotics, and systemic functional linguistics, could eventually result in a model that might be used to guide classroom practice, leading to improved visual literacy, better comprehension of science concepts, and enhanced science literacy because it emphasizes distinct aspects of learning with representations that can be addressed though explicit instruction. The second project was a metasynthesis of the research that was previously conducted as part of the Explicit Literacy Instruction Embedded in Middle School Science project (Pacific CRYSTAL, http://www.educ.uvic.ca/pacificcrystal). Five overarching themes emerged from this case-to-case synthesis: the engaging and effective nature of multimedia genres, opportunities for differentiated instruction using multimodal strategies, opportunities for assessment, an emphasis on visual representations, and the robustness of some multimodal literacy strategies across content areas. The third project was a mixed

  20. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  1. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. An important impact of this factor on the pre-volitional stages of aggressive behavior was also identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not negative emotions. These results correspond with the theory of self-regulation, where behavior focused on goal attainment is accompanied by positive emotions (see, for example, Cacioppo, Gardner & Berntson, 1999; Carver, 2004). The results confirmed not only a sufficient model fit but also explained 53% of the variance of behavioral desire, 68% of intention and 37% of behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample; some results could also be affected by the lower sample size. For the future, we recommend verifying the EMGB on other types of aggressive behavior as well, and applying a more complex incorporation of inhibition into the model. Finally, this study is correlational in character; therefore, further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  2. Systems, methods and apparatus for pattern matching in procedure development and verification

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  3. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    Science.gov (United States)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts in the future to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  4. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    Science.gov (United States)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real to life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  5. Verification and Validation of the Spalart-Allmaras Turbulence Model for Strand Grids

    Science.gov (United States)

    2013-01-01

    Turbulence is famously unpredictable, making its prediction and simulation difficult; Nobel Laureate Richard Feynman described it as "the most important unsolved problem of classical physics" (The Feynman Lectures on Physics, Basic Books, New York, NY, 1977).

  6. A Survey of Workflow Modeling Approaches and Model Verification%工作流过程建模方法及模型的形式化验证

    Institute of Scientific and Technical Information of China (English)

    杨东; 王英林; 张申生; 傅谦

    2003-01-01

    Workflow technology is widely used in business process modeling, software process modeling, as well as enterprise information integration. At present, there exist a variety of workflow modeling approaches, which differ in ease of modeling, expressiveness and formalism. In this paper, the modeling approaches most used in research projects and workflow products are compared, and the verification of workflow models is also dealt with. We argue that an ideal workflow modeling approach is a hybrid one, i.e. the integration of the above approaches.

  7. Manufactured solutions and the numerical verification of isothermal, nonlinear, three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2012-07-01

    The technique of manufactured solutions is used for verification of computational models in many fields. In this paper we construct manufactured solutions for models of three-dimensional, isothermal, nonlinear Stokes flow in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet and other model parameters. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests and altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Results from the computational model show excellent agreement with the manufactured analytic solutions.
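
    The full Stokes construction is involved; the sketch below illustrates the same verification pattern on a one-dimensional Poisson problem instead (not the paper's equations): choose an analytic solution, manufacture the matching source term, and confirm that the discrete solver converges at the expected rate.

        import numpy as np

        def max_error(n):
            """Solve -u'' = f on (0,1) with u(0)=u(1)=0 by second-order finite
            differences, where f is manufactured from the chosen u(x) = sin(pi x)."""
            x = np.linspace(0.0, 1.0, n + 1)
            h = 1.0 / n
            f = np.pi**2 * np.sin(np.pi * x[1:-1])  # manufactured source term
            A = (np.diag(2.0 * np.ones(n - 1))
                 - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            u = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

        # Halving h should cut the max error by ~4x, confirming second-order accuracy.
        for n in (16, 32, 64):
            print(f"n={n:3d}  max error = {max_error(n):.2e}")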

  8. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  9. Formal Verification of UML Profile

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notations, the structural view is modeled by the class, component...... and object diagrams, and the behavioral view is modeled by the activity, use case, state, and sequence diagrams. However, UML does not provide formal syntax, therefore its semantics is not formally definable; so, to assure correctness, we need to incorporate semantic reasoning through verification, specification......, refinement, and incorporation into the development process. The motivation of our research is to make the structural view easy and to suggest formal techniques/methods that can best be applied to UML-based system development. We investigate the tools and methods that are broadly used for the formal...

  10. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification, validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that are continuing to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling, so that questions about continuum mechanical theory and mathematical analysis of multiphase flow still need to be addressed. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we will present a publicly available verification and validation test set for air-water-structure interaction problems as well as computational and physical model results, including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats will also be presented in order to facilitate future comparative studies and reproduction of scientific results.

  11. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  12. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, which makes a comprehensive approach to talent selection a rather difficult process. Attempts to identify predispositions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology); the latest studies missed a structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predispositions in the short triathlon for talent assessment of young male athletes aged 17-20 years. METHODS: The research sample consisted of 55 top-level triathletes - men who were included in the government-supported sports talent programme in the Czech Republic at the age of 17-20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allow us to explain the mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in the triathlon for men aged 17-20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index (GFI) 0.91, Root Mean Square Residual (RMSR) 0.13). Tests of predispositions in the triathlon were grouped into five factors: three motor predispositions (swimming, cycling and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0). Running predispositions were a very significant factor (-0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (-0.61; 0.63) and cycling (0.53; 0

  13. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is then verified by feeding the HP to KeYmaera. The advantage of the approach is that it models a CPS intuitively and verifies its safety rigorously, avoiding state-space explosion.
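
    As an illustration only (not the paper's EHYSDEL model or its exact notation), the kind of hybrid-program safety assertion that a tool such as KeYmaera discharges can be written in differential dynamic logic; here a toy water-level controller whose evolution domain constraint keeps the level x within [0, m]:

        \[
        0 \le x \le m \;\rightarrow\;
        \big[\big( (f := 1 \,\cup\, f := -1)\,;\;
        \{\, x' = f \;\&\; 0 \le x \le m \,\}\big)^{*}\big]\;
        0 \le x \le m
        \]

    The controller nondeterministically fills (f := 1) or drains (f := -1), and the box modality [.] asserts that the level bound holds after every run of the loop.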

  14. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V&V guideline development methodology for safety-critical software in NPP safety systems. It presents a V&V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example, the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V&V guideline; the guideline framework as part of the acceptance criteria; V&V activities with task entrance and exit criteria; review and audit; testing and QA records of V&V material and configuration management; production of the software verification and validation plan; and safety-critical software V&V methodology. (author). 11 refs.

  15. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC (North American Electric Reliability Corporation) MOD reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops to disseminate the knowledge and information gained and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  16. Development of high-efficiency passive counters (HEPC) for the verification of large LEU samples

    Energy Technology Data Exchange (ETDEWEB)

    Peerani, P. [European Commission, DG-JRC, IPSC, Ispra (Italy)], E-mail: paolo.peerani@jrc.it; Canadell, V.; Garijo, J.; Jackson, K. [European Commission, DG-TREN/I, Nuclear Inspections (Luxembourg); Jaime, R.; Looman, M.; Ravazzani, A. [European Commission, DG-JRC, IPSC, Ispra (Italy); Schwalbach, P. [European Commission, DG-TREN/I, Nuclear Inspections (Luxembourg); Swinhoe, M. [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2009-04-01

    A paper describing the conceptual idea of using passive neutron assay for the verification of large-size uranium samples in fuel fabrication plants was first presented at the 2001 ESARDA conference. The advantages of this technique as a replacement for active interrogation using the PHOto-Neutron Interrogation Device (PHONID) were evident, provided that a suitable detector with higher efficiency than those commercially available could be realised. The previous paper also included a feasibility study based on experimental data. To implement this technique, a high-efficiency passive counter (HEPC) has been designed by the JRC, Ispra. The JRC has also built a first smaller-scale prototype. This paper describes the tests made in the PERLA laboratory and reports the performance of the prototype. In parallel, the design of the large HEPC has been finalised for Euratom safeguards. Two units for the fuel fabrication plants in Dessel (B) and Juzbado (E) have been produced by a commercial manufacturer to JRC specifications. The two detectors were installed at the two sites in summer 2004 after an extensive test campaign in PERLA. They have been in use since then, and some feedback on the experience gained is reported at the end of this paper.

  17. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  18. Comparison between a Computational Seated Human Model and Experimental Verification Data

    Directory of Open Access Journals (Sweden)

    Christian G. Olesen

    2014-01-01

    Sitting-acquired deep tissue injuries (SADTI) are the most serious type of pressure ulcers. In order to investigate the aetiology of SADTI a new approach is under development: a musculo-skeletal model which can predict forces between the chair and the human body at different seated postures. This study focuses on comparing results from a model developed in the AnyBody Modeling System with data collected from an experimental setup. A chair with force-measuring equipment was developed, an experiment was conducted with three subjects, and the experimental results were compared with the predictions of the computational model. The results show that the model predicted the reaction forces for different chair postures well. The correlation coefficients between experiment and model for the seat angle, backrest angle and footrest height were 0.93, 0.96, and 0.95, respectively. The study shows good agreement between experimental data and model predictions of the forces between a human body and a chair. The model can in the future be used in designing wheelchairs or automotive seats.

  19. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
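
    To make the idea concrete, the following minimal Python sketch renders a toy gate-level netlist as HOL-style relational definitions over time-indexed signals. The netlist format, naming scheme and output syntax are illustrative assumptions, not the authors' translator:

        # Hypothetical netlist: component name -> (gate kind, inputs, output).
        NETLIST = {
            "AND2": ("and", ["a", "b"], "y"),
            "OR2": ("or", ["p", "q"], "r"),
        }

        def gate_to_hol(name, kind, inputs, output):
            """Render one gate as a HOL-style predicate over signal histories."""
            ops = {"and": "/\\", "or": "\\/"}
            body = (" " + ops[kind] + " ").join(i + " t" for i in inputs)
            args = " ".join(inputs + [output])
            return "|- " + name + "_spec " + args + " = (!t. " + output + " t = (" + body + "))"

        for name, (kind, ins, out) in NETLIST.items():
            print(gate_to_hol(name, kind, ins, out))

    A real translator must of course also handle hierarchy, vectors, and timing, which is where most of the engineering effort lies.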

  1. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
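
    As a sketch of the per-requirement record described above (field names follow the abstract; the identifier and requirement text below are hypothetical, and this is not the LSST SysML profile itself):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class VerificationPlan:
            requirement_id: str
            verification_requirement: str
            success_criteria: str
            methods: List[str]              # e.g. inspection, analysis, demonstration, test
            level: str                      # e.g. "System", "Subsystem"
            owner: str
            activities: List[str] = field(default_factory=list)  # grouped into Verification Events

        plan = VerificationPlan(
            requirement_id="LSST-REQ-0042",          # hypothetical identifier
            verification_requirement="Verify delivered image quality at zenith.",
            success_criteria="Measured PSF FWHM within budget on three consecutive nights",
            methods=["Test"],
            level="System",
            owner="Systems Engineering",
        )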

  2. Quantitative Verification in Practice

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.

    2010-01-01

    Soon after the birth of model checking, the first theoretical achievements have been reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities

  3. Development and Verification of Unstructured Adaptive Mesh Technique with Edge Compatibility

    Science.gov (United States)

    Ito, Kei; Kunugi, Tomoaki; Ohshima, Hiroyuki

    In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is the suppression of gas entrainment (GE) phenomena at the gas-liquid interface. Therefore, the authors have developed a high-precision CFD algorithm to evaluate GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to enable accurate modeling of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of GE phenomena in the JSFR: an unstructured adaptive mesh technique which can dynamically apply fine cells to the region where GE occurs. In this paper, as a part of that development, a two-dimensional unstructured adaptive mesh technique is discussed. In this technique, each cell is refined isotropically to reduce distortions of the mesh, and connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells; a minimal sketch of the refinement step is given below. The technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the two-dimensional unstructured adaptive mesh technique succeeds in providing a high-precision solution, even though a poor-quality, distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much smaller than the error on a structured mesh with a larger number of cells.
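
    The isotropic refinement step mentioned above can be sketched in a few lines of Python for a single triangular cell: each edge is split at its midpoint, producing four geometrically similar children. The connection cells that restore edge compatibility, and all mesh bookkeeping, are omitted; this is an illustration, not the authors' code:

        def midpoint(p, q):
            """Midpoint of two 2-D vertices."""
            return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

        def refine_triangle(a, b, c):
            """Split one triangle isotropically into four similar children."""
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

        children = refine_triangle((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))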

  4. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic

  5. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model have different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the application of the proposed method.

  6. Development of NASA's Models and Simulations Standard

    Science.gov (United States)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    Several NASA-wide actions were initiated as a result of the Space Shuttle Columbia Accident Investigation. One of these actions was to develop a standard for the development, documentation, and operation of models and simulations. Over the course of two-and-a-half years, a team of NASA engineers representing nine of the ten NASA Centers developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  7. Verification of a Monte-Carlo planetary surface radiation environment model using gamma-ray data from Lunar Prospector and 2001 Mars Odyssey

    Energy Technology Data Exchange (ETDEWEB)

    Skidmore, M.S., E-mail: mss16@star.le.ac.u [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom); Ambrosi, R.M. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom)

    2010-01-01

    Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the gamma-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include gamma-ray spectroscopy, gamma-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled gamma-ray data is in good agreement with gamma-ray data obtained by the gamma-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.

  8. Development and Verification of MAAP5.0.3 Parameter file for APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro; Kim, Hyeong Taek [KHNP-CRI, Daejeon (Korea, Republic of)

    2015-05-15

    After the Fukushima accident, EPRI has continuously upgraded MAAP5 (Modular Accident Analysis Program version 5), which is expected to overcome the limitations of MAAP4. As a result of those efforts, MAAP 5.0.2 (Build 5020000) was released officially in December 2013, and in August 2014 the newest version, MAAP 5.0.3 (Build 5030000), was officially released. Parameter file development is essential for severe accident analysis using the MAAP code for a specific plant. In 2014, KHNP developed the first draft version of the MAAP 5.0.2 parameter file for the APR1400 type NPP and tested it on some basic severe accident sequences, and has since continuously complemented the first draft parameter file for MAAP 5.0.2 and 5.0.3. In this study, we analyze the MCCI phenomena using MAAP 5.0.3 with the 2nd draft version of the APR1400 parameter file developed by KHNP. The purpose of this study is to compare the major differences between the MAAP 5.0.2 and 5.0.3 MCCI models and to verify the appropriateness of the 2nd draft parameter file. The MCCI phenomena have been a controversial issue in severe accident progression, and there have been great efforts to resolve them. As part of these efforts, EPRI published MAAP 5.0.3, in which the 'lower head plenum model' and the 'MCCI model' are known to have been upgraded. KHNP plans to upgrade the old parameter files based on MAAP4 to ones based on MAAP 5.0.2 or higher for all domestic nuclear power plants, and has therefore continuously developed the MAAP 5.0.2 and 5.0.3 parameter files for the APR1400 type NPP. In this study, we analyzed the MCCI phenomena using MAAP 5.0.3 and the 2nd draft parameter file, and found some insights, as follows: (1) The Melt Eruption Model can greatly affect the MCCI progression only in the case of limestone concrete in the wet cavity

  9. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  10. VERIFICATION OF MATHEMATICAL MODEL FOR SEDIMENT TRANSPORT BY UNSTEADY FLOW IN THE LOWER YELLOW RIVER

    Institute of Scientific and Technical Information of China (English)

    Jianjun ZHOU; Bingnan LIN

    2004-01-01

    Field data from the Lower Yellow River (LYR) covering a period of ten consecutive years are used to test a mathematical model for one-dimensional sediment transport by unsteady flow developed previously by the writers. Data from the first year of the period, i.e., 1976, are used to calibrate the model and those of the remaining years to verify it. Items investigated include discharge, water stage, rate of transport of suspended sediment, and riverbed erosion/deposition. Comparisons between computed and observed data indicate that the proposed model can simulate sediment transport in the LYR under conditions of unsteady flow with sufficient accuracy.

  11. Modeling of bubble detachment in reduced gravity under the influence of electric fields and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Herman, Cila [Johns Hopkins University, Department of Mechanical Engineering, Baltimore, MD 21218 (United States); Iacona, Estelle [Johns Hopkins University, Department of Mechanical Engineering, Baltimore, MD 21218 (United States); Ecole Centrale, Laboratoire EM2C, Paris UPR 288 (France)

    2004-10-01

    A simple model for predicting bubble volume and shape at detachment in reduced gravity under the influence of electric fields is described in the paper. The model is based on relatively simple thermodynamic arguments and relies on and combines several models described in the literature. It accounts for the level of gravity and the magnitude of the electric field. For certain conditions of bubble development the properties of the bubble source are also considered. Computations were carried out for a uniform unperturbed electric field for a range of model parameters, and the significance of model assumptions and simplifications is discussed for the particular method of bubble formation. Experiments were conducted in terrestrial conditions and reduced gravity (during parabolic flights in NASA's KC-135 aircraft) by injecting air bubbles through an orifice into the electrically insulating working fluid, PF5052. Bubble shapes visualized experimentally were compared with model predictions. Measured data and model predictions show good agreement. The results suggest that the model can provide quick engineering estimates concerning bubble formation for a range of conditions (both for formation at an orifice and boiling) and such a model reduces the need for complex and expensive numerical simulations for certain applications. (orig.)
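
    The paper's detailed thermodynamic formulation is not reproduced in the abstract; as a hedged orientation, the simplest quasi-static view balances buoyancy plus the resultant electric force against capillary retention at an orifice of diameter d_0:

        \[
        (\rho_l - \rho_g)\, g\, V_d + F_E \;\approx\; \pi\, d_0\, \sigma,
        \]

    where V_d is the bubble volume at detachment and σ the surface tension; setting F_E = 0 and reducing g recovers the classical Tate-type estimate and shows why detached bubbles grow large in reduced gravity.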

  12. The Development and Verification of a Novel ECMS of Hybrid Electric Bus

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2014-01-01

    This paper presents the system modeling, control strategy design, and hardware-in-the-loop test of a series-parallel hybrid electric bus. First, the powertrain mathematical models and the system architecture are proposed. Then an adaptive ECMS is developed for the real-time control of a hybrid electric bus, and is investigated and verified in a hardware-in-the-loop simulation system. The ECMS uses driving-cycle recognition to update the equivalent charge and discharge coefficients and to extract optimized rules for real-time control. This method not only reduces frequent mode transitions and improves fuel economy, but also simplifies control strategy design and provides new design ideas for the energy management strategy and gear-shifting rules. Finally, the simulation results show that the proposed real-time A-ECMS can coordinate the overall hybrid electric powertrain to optimize fuel economy and sustain the battery SOC level.
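
    The core ECMS idea can be sketched as minimizing, at each control step, fuel power plus an equivalence factor s times battery power. The Python below is a hedged illustration with an assumed affine (Willans-line style) fuel map and a coarse candidate grid, not the paper's adaptive implementation:

        def ecms_split(p_demand, s, candidates, fuel_power):
            """Pick the battery power minimizing equivalent power consumption.

            fuel_power(p_engine) -> chemical power drawn from fuel [W].
            Returns (cost, p_batt, p_engine) for the best candidate.
            """
            best = None
            for p_batt in candidates:
                p_engine = p_demand - p_batt
                if p_engine < 0.0:
                    continue  # engine cannot absorb power in this toy model
                cost = fuel_power(p_engine) + s * p_batt
                if best is None or cost < best[0]:
                    best = (cost, p_batt, p_engine)
            return best

        # Toy affine fuel map and a 40 kW demand; s and the candidates are illustrative.
        best = ecms_split(40e3, 2.5, [-10e3, 0.0, 10e3], lambda p: 1.2e3 + 2.4 * p)

    In an adaptive ECMS such as the one above, s is updated online, e.g. from driving-cycle recognition and battery SOC feedback.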

  13. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  14. Design and verification of a simple 3D dynamic model of speed skating which mimics observed forces and motions.

    Science.gov (United States)

    van der Kruk, E; Veeger, H E J; van der Helm, F C T; Schwab, A L

    2017-09-14

    Advice about the optimal coordination pattern for an individual speed skater could be obtained by simulation and optimization of a biomechanical speed skating model. But before getting to this optimization approach one needs a model that can reasonably match observed behaviour. Therefore, the objective of this study is to present a verified three-dimensional inverse skater model of minimal complexity, which models the speed skating motion on the straights. The model simulates the transverse translation of the skater's upper body together with the forces exerted by the skates on the ice. The input of the model is the changing distance between the upper body and the skate, referred to as the leg extension (Euclidean distance in 3D space). Verification shows that the model mimics the observed forces and motions well. The model is most accurate for the position and velocity estimation (maximum residuals of 1.2% and 2.9%, respectively) and least accurate for the force estimations (underestimation of 4.5-10%). The model can be used to further investigate variables in the skating motion. For this, the input of the model, the leg extension, can be optimized to obtain a maximal forward velocity of the upper body. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
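
    Since the model input is just the 3-D Euclidean distance between the upper body and the skate, it is directly computable from motion-capture coordinates; the values below are illustrative:

        import math

        def leg_extension(upper_body, skate):
            """Leg extension: Euclidean distance between two 3-D points [m]."""
            return math.dist(upper_body, skate)  # Python 3.8+

        ext = leg_extension((0.0, 0.0, 1.10), (0.30, 0.40, 0.05))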

  15. EXPERIMENTAL VERIFICATION OF THE THREE-DIMENSIONAL THERMAL-HYDRAULIC MODELS IN THE BEST-ESTIMATE CODE BAGIRA.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinichenko, S.D.; Kroshilin, A.E.; Kroshilin, V.E.; Smirnov, A.V.; Kohut, P.

    2004-03-15

    In this paper we present verification results for the BAGIRA code, obtained using data from integral thermal-hydraulic experimental test facilities as well as data from operating nuclear power plants. BAGIRA is a three-dimensional numerical best-estimate code that includes non-homogeneous modeling. Special consideration was given to the recently completed experimental data from the PSB-VVER integral test facility (EREC, Electrogorsk, Russia)--a new Russian large-scale four-loop unit designed to model the primary circuits of VVER-1000 type reactors. It is demonstrated that the BAGIRA code can be used to analyze nuclear reactor behavior under normal and accident conditions.

  16. Verification of GRAPES unified global and regional numerical weather prediction model dynamic core

    Institute of Scientific and Technical Information of China (English)

    YANG XueSheng; HU JiangLin; CHEN DeHui; ZHANG HongLiang; SHEN XueShun; CHEN JiaBin; JI LiRen

    2008-01-01

    During the past few years, most newly developed numerical weather prediction models have adopted a multi-scale strategy. The China Meteorological Administration has therefore been devoted to developing a new generation of global and regional multi-scale models since 2003. In order to validate the performance of the GRAPES (Global and Regional Assimilation and PrEdiction System) model, both for its scientific design and its program coding, a suite of idealized tests has been proposed and conducted, including a density flow test, a three-dimensional mountain wave test and a cross-polar flow test. The density flow experiment indicates that the dynamic core has the ability to simulate fine-scale nonlinear flow structures and their transient features. The three-dimensional mountain wave test shows that the model can reproduce the horizontal and vertical propagation of internal gravity waves quite well. The cross-polar flow test demonstrates the soundness of both the semi-Lagrangian departure-point calculation and the discretization of the model near the poles. Real-case forecasts reveal that the model has the ability to predict large-scale summer weather regimes, such as the subtropical high, and to capture the major synoptic patterns in the mid and high latitudes.

  17. Model Driven Development of m-Health Systems (with a Touch of Formality)

    NARCIS (Netherlands)

    Jones, Val

    2006-01-01

    We propose a model driven design and development methodology augmented with formal validation and verification (V&V) for the development of mobile health systems. Systems which deliver healthcare services remotely should be developed using robust and trusted engineering technologies. The methodology

  18. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    Energy Technology Data Exchange (ETDEWEB)

    Janek, S [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Svensson, R [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Jonsson, C [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Brahme, A [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden)

    2006-11-21

    verification by means of PET imaging seems to be applicable provided that biological transport processes such as capillary blood flow containing mobile ¹⁵O and ¹¹C in the activated tissue volume can be accounted for.
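
    For orientation, the textbook activation estimate underlying such PET-based verification relates the induced activity of radionuclide k to the photonuclear cross section σ_k folded with the photon fluence-rate spectrum (a generic form, not the authors' exact calculation):

        \[
        A_k \;\propto\; n_k \left( \int \sigma_k(E)\, \dot{\Phi}(E)\, \mathrm{d}E \right)
        \left( 1 - e^{-\lambda_k t_{\mathrm{irr}}} \right) e^{-\lambda_k t_{\mathrm{decay}}},
        \]

    where n_k is the number of target nuclei, λ_k the decay constant, t_irr the irradiation time and t_decay the delay before imaging; the differing λ_k of ¹¹C, ¹³N and ¹⁵O are what allow the signals to be separated in time.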

  19. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    Science.gov (United States)

    Janek, S.; Svensson, R.; Jonsson, C.; Brahme, A.

    2006-11-01

    A method for dose delivery monitoring after high energy photon therapy has been investigated based on positron emission tomography (PET). The technique is based on the activation of body tissues by high energy bremsstrahlung beams, preferably with energies well above 20 MeV, resulting primarily in ¹¹C and ¹⁵O but also ¹³N, all positron-emitting radionuclides produced by photoneutron reactions in the nuclei of ¹²C, ¹⁶O and ¹⁴N. A PMMA phantom and animal tissue, a frozen hind leg of a pig, were irradiated to 10 Gy and the induced positron activity distributions were measured off-line in a PET camera a couple of minutes after irradiation. The accelerator used was a Racetrack Microtron at the Karolinska University Hospital using 50 MV scanned photon beams. From photonuclear cross-section data integrated over the 50 MV photon fluence spectrum the predicted PET signal was calculated and compared with experimental measurements. Since measured PET images change with time post irradiation, as a result of the different decay times of the radionuclides, the signals from activated ¹²C, ¹⁶O and ¹⁴N within the irradiated volume could be separated from each other. Most information is obtained from the carbon and oxygen radionuclides, which are the most abundant elements in soft tissue. The predicted and measured overall positron activities are almost equal (-3%) while the predicted activity originating from nitrogen is overestimated by almost a factor of two, possibly due to experimental noise. Based on the results obtained in this first feasibility study the great value of a combined radiotherapy-PET-CT unit is indicated in order to fully exploit the high activity signal from oxygen immediately after treatment and to avoid patient repositioning. With an RT-PET-CT unit a high signal could be collected even at a dose level of 2 Gy and the acquisition time for the PET could be reduced considerably. Real patient dose delivery verification by means of PET imaging seems to be

  20. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Through the developed risk analysis model it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  1. Linear Modeling, Simulation and Experimental Verification of a Pressure Regulator for CNG Injection Systems

    Directory of Open Access Journals (Sweden)

    Dirk Hübner

    2008-08-01

    The number of motor vehicles powered by internal combustion engines keeps growing despite shrinking oil reserves. As a result, compressed natural gas (CNG) is gaining currency as an emerging combustion engine fuel. To this day, CNG systems – e.g., in passenger cars – are not fully integrated into the development process as conducted by vehicle or engine manufacturers. Instead, they are usually "adapted in" at a downstream stage by small, specialized companies. The present paper initially outlines the state of the art in advanced gas injection technologies, in particular the development towards sequential injection systems. A pressure regulator for CNG-driven combustion engines is examined in detail, given its role as a highly sensitive and critical system component. Based on a precise theoretical analysis, a linear model of this pressure regulator is derived and subjected to dynamic simulation. The analytical approach is accompanied by an experimental investigation of the device. On a test rig developed at the Trier University of Applied Sciences, the static and dynamic features of the pressure regulator can be measured with the requisite precision. The comparison of measured and simulated data yields a validation of the dynamic simulation model. With the approaches developed it is now possible for the first time to model, simulate and optimize single- or multi-stage pressure regulators for CNG-driven engines with less effort and higher accuracy.
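
    The abstract does not reproduce the linear model itself; as a hedged generic form (all symbols are illustrative assumptions, not the paper's notation), a single-stage regulator is commonly linearized about its operating point as a mass-spring-damper diaphragm/valve assembly coupled to the filling dynamics of the outlet volume:

        \[
        m\,\ddot{x} + c\,\dot{x} + k\,x = -A_d\,\Delta p, \qquad
        C\,\Delta\dot{p} = k_x\,x - \Delta \dot{m}_{\mathrm{load}},
        \]

    where x is the valve lift measured from the operating point, A_d the effective diaphragm area, C a pneumatic capacitance of the outlet volume and k_x the linearized valve flow gain; the feedback loop formed by the two equations explains the regulator's tendency to oscillate when the damping c is small.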

  2. Modelling river dune development

    NARCIS (Netherlands)

    Paarlberg, Andries; Weerts, H.J.T.; Dohmen-Janssen, Catarine M.; Ritsema, I.L; Hulscher, Suzanne J.M.H.; van Os, A.G.; Termes, A.P.P.

    2005-01-01

    Since river dunes influence flow resistance, predictions of dune dimensions are required to make accurate water level predictions. A model approach to simulate developing river dunes is presented. The model is set-up to be appropriate, i.e. as simple as possible, but with sufficient accuracy for

  3. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was used partially, together with an air layer. In order to verify material equivalence, photon attenuation characteristics were compared with those of a super-flex bolus. In the case of the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and experimentally verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to mature, 3D printer based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies.
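
    The Hounsfield-unit comparison rests on the standard CT definition (a textbook fact independent of this study):

        \[
        \mathrm{HU} = 1000 \times \frac{\mu - \mu_{\mathrm{water}}}{\mu_{\mathrm{water}} - \mu_{\mathrm{air}}},
        \]

    so matching HU between the phantom and the live mouse amounts to matching the linear attenuation coefficient μ at the scanner's effective energy.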

  4. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    In a world that increasingly relies on the Internet to function, application developers rely on the implementations of protocols to guarantee the security of data transferred. Whether a chosen protocol gives the required guarantees, and whether the implementation does the same, is usually unclear....... The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from...

  5. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip-theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
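
    For readers unfamiliar with the reduction named above, the classical Craig-Bampton form (generic textbook notation, not necessarily SubDyn's) partitions the substructure DOFs into boundary u_b and interior u_i and approximates

        \[
        \begin{pmatrix} u_b \\ u_i \end{pmatrix} \approx
        \begin{pmatrix} I & 0 \\ \Psi_{ib} & \Phi_{iq} \end{pmatrix}
        \begin{pmatrix} u_b \\ q \end{pmatrix} = T \begin{pmatrix} u_b \\ q \end{pmatrix},
        \qquad \hat{M} = T^{\mathsf{T}} M T, \quad \hat{K} = T^{\mathsf{T}} K T,
        \]

    where Ψ_ib are static constraint modes, Φ_iq a truncated set of fixed-interface normal modes, and q their modal coordinates; the reduced matrices retain the boundary DOFs exactly, which is what allows the substructure to be coupled to the tower and hydrodynamic loads.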

  6. Experimental verification and comparison of the rubber V- belt continuously variable transmission models

    Science.gov (United States)

    Grzegożek, W.; Dobaj, K.; Kot, A.

    2016-09-01

    The paper analyses the cooperation of the rubber V-belt with the CVT transmission pulleys. The analysis of the forces and torques acting in the CVT transmission was conducted based on the calculated characteristics of the centrifugal regulator and the torque regulator. Accurate estimation of the regulator surface curvature allowed for calculation of the relation between the driving wheel axial force, the engine rotational speed and the gear ratio of the CVT transmission. Simplified analytical models of the rubber V-belt-pulley cooperation are based on three basic approaches. The Dittrich model assumes two contact regions on the driven and driving wheels. The Kim-Kim model considers, in addition to the previous model, the radial friction; the radial friction results in the lack of a developed friction area on the driving pulley. The third approach, formulated in the Cammalleri model, assumes a variable sliding angle along the wrap arc and describes it as a result of the belt's longitudinal and cross flexibility. Theoretical torque on the driven and driving wheels was calculated on the basis of the known regulator characteristics. The calculated torque was compared to the measured loading torque. The best agreement, within the centrifugal regulator's range of work, was obtained for the Kim-Kim model.
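
    Common to such fully developed friction models is the Euler-Eytelwein relation with the V-groove wedge factor (a classical limit case, cited here for orientation rather than as any one of the three models above):

        \[
        \frac{F_1}{F_2} = \exp\!\left( \frac{\mu\, \alpha}{\sin(\beta/2)} \right),
        \]

    where F_1 and F_2 are the tight- and slack-side tensions, μ the friction coefficient, α the wrap angle and β the groove angle; radial friction (Kim-Kim) and a variable sliding angle (Cammalleri) modify this limit.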

  7. Contact Modelling in Resistance Welding, Part I: Algorithms and Numerical Verification

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Finite element analysis of resistance welding involves contact problems between different parts. The contact problem in resistance welding includes not only mechanical contact but also thermal and electrical contact. In this paper a contact model based on the penalty method is developed for the simulation of resistance spot and projection welding. After a description of the algorithms, several numerical examples are presented to validate the mechanical contact algorithm.
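
    In a penalty formulation of the kind named above, the normal contact traction is made proportional to the penetration g_N, with thermal and electrical contact treated by analogous interface laws (generic forms for orientation, not the paper's calibrated model):

        \[
        t_N = \varepsilon_N \, \langle g_N \rangle, \qquad
        q_c = h_c\,(T_1 - T_2), \qquad
        j_c = \sigma_c\,(V_1 - V_2),
        \]

    with penalty stiffness ε_N, contact heat-transfer coefficient h_c and electrical contact conductance σ_c; ⟨·⟩ denotes the positive part, so separated surfaces carry no traction, heat flux or current.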

  8. Fiscal 1997 report of the verification research on geothermal prospecting technology. Theme 5-2. Development of a reservoir change prospecting method (reservoir change prediction technique (modeling support technique)); 1997 nendo chinetsu tansa gijutsu nado kensho chosa. 5-2. Choryuso hendo tansaho kaihatsu (choryuso hendo yosoku gijutsu (modeling shien gijutsu)) hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    To evaluate geothermal reservoirs in the initial stage of development, to maintain stable output in service operation, and to develop a technology effective for extraction from peripheral reservoirs, a reservoir variation prediction technique, in particular a modeling support technique, was studied. This paper describes the results for fiscal 1997. An underground temperature estimation technique using homogenization temperatures of fluid inclusions, one of the core fault system measurement techniques, was applied to the Wasabizawa field. The effect of stretching is important when estimating reservoir temperatures, and use of the minimum homogenization temperature of fluid inclusions in quartz was found suitable. Even in cases with no quartz in the hydrothermal veins, measured data for quartz (secondary fluid inclusions) in parent rocks adjacent to the hydrothermal veins agreed well with measured temperature data. The development potential of a new modeling support technique was sufficiently confirmed through the collection of documents and information. Based on these results, measurement equipment suitable for R and D was selected, and a measurement system was established through preliminary experiments. 39 refs., 35 figs., 6 tabs.

  9. Regression Verification Using Impact Summaries

    Science.gov (United States)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program

  10. Coupled groundwater flow and transport: 1. Verification of variable density flow and transport models

    Science.gov (United States)

    Kolditz, Olaf; Ratke, Rainer; Diersch, Hans-Jörg G.; Zielke, Werner

    This work examines variable density flow and corresponding solute transport in groundwater systems. Fluid dynamics of salty solutions with significant density variations are of increasing interest in many problems of subsurface hydrology. The mathematical model comprises a set of non-linear, coupled partial differential equations to be solved for pressure/hydraulic head and mass fraction/concentration of the solute component. The governing equations and underlying assumptions are developed and discussed. The equation of solute mass conservation is formulated in terms of mass fraction and mass concentration. Different levels of approximation of the density variations in the mass balance equations are used for convection problems (e.g. the Boussinesq approximation and its extension, the full density approximation). The impact of these simplifications is studied by use of numerical modelling. Numerical models for nonlinear problems, such as density-driven convection, must be carefully verified in a particular series of tests. Standard benchmarks for proving variable density flow models are the Henry, Elder, and salt dome (HYDROCOIN level 1 case 5) problems. We studied these benchmarks using two finite element simulators - ROCKFLOW, which was developed at the Institute of Fluid Mechanics and Computer Applications in Civil Engineering, and FEFLOW, which was developed at the Institute for Water Resources Planning and Systems Research Ltd. Although both simulators are based on the Galerkin finite element method, they differ in many approximation details such as temporal discretization (Crank-Nicolson vs predictor-corrector schemes), spatial discretization (triangular and quadrilateral elements), finite element basis functions (linear, bilinear, biquadratic), iteration schemes (Newton, Picard) and solvers (direct, iterative). The numerical analysis illustrates discretization effects and defects arising from the different levels of the density approximation. We contribute
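
    The coupled system referred to above can be summarized in generic form (notation illustrative, not the simulators' exact formulation) as Darcy flow with a mass-fraction-dependent density plus solute mass conservation:

        \[
        \mathbf{q} = -\frac{\mathbf{k}}{\mu}\left( \nabla p - \rho(\omega)\,\mathbf{g} \right), \qquad
        \frac{\partial (\phi\, \rho\, \omega)}{\partial t}
        + \nabla \cdot \left( \rho\, \mathbf{q}\, \omega - \rho\, \mathbf{D} \nabla \omega \right) = 0,
        \]

    where ω is the solute mass fraction; the Boussinesq approximation retains the density variation ρ(ω) only in the buoyancy term of Darcy's law, while the full density approximation keeps it in the storage and flux terms as well.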

  11. Improvement, Verification, and Refinement of Spatially-Explicit Exposure Models in Risk Assessment - SEEM

    Science.gov (United States)

    2015-06-01

    on the NPL in 1990, and is located east of the demolition debris site. The shot fall zone is to the east of Gun Club Creek marshes (GPC, 2009...vegetation that comprised the rodents' diets. Vegetation concentrations come from plant material that was collected at each trap site where a small...through either incidental soil ingestion with the diet, or by direct contact with the soil while foraging. The model was developed for APG Gun Club

  12. Verification of the multi-layer SNOWPACK model with different water transport schemes

    Science.gov (United States)

    Wever, N.; Schmid, L.; Heilig, A.; Eisen, O.; Fierz, C.; Lehning, M.

    2015-12-01

    The widely used detailed SNOWPACK model has undergone constant development over the years. A notable recent extension is the introduction of a Richards equation (RE) solver as an alternative to the bucket-type approach for describing water transport in the snow and soil layers. In addition, continuous updates of the snow settling and new-snow density parameterizations have changed model behavior. This study presents a detailed evaluation of model performance against a comprehensive multiyear data set from Weissfluhjoch near Davos, Switzerland. The data set is collected by automatic meteorological and snowpack measurements and manual snow profiles. During the main winter season, snow height (RMSE: manually observed snow profiles do not support this conclusion. This discrepancy suggests that the implementation of RE partly mimics preferential flow effects.
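
    For orientation, the 1-D vertical form of the Richards equation solved by the new scheme is (standard textbook form; the bucket scheme instead routes water downward once a layer exceeds a holding-capacity threshold):

        \[
        \frac{\partial \theta}{\partial t} =
        \frac{\partial}{\partial z}\left[ K(h) \left( \frac{\partial h}{\partial z} + 1 \right) \right],
        \]

    with volumetric liquid water content θ, capillary pressure head h and unsaturated hydraulic conductivity K(h).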

  13. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    A novel geometry ICPC solar collector was developed at the University of Chicago and Colorado State University. A ray tracing model has been designed to investigate the optical performance of both the horizontal and vertical fin versions of this collector. Solar radiation is modeled as discrete...... passing through transparent media, the size of the gap between the glass tube and fin, reflectivity of the reflective surface, absorptivity of the fin, and blocking and displacement of the rays by adjacent tubes. Presentation of the progressive animation of individual rays and associated summary graphics...... to the desired incident angle of the sun's rays, performance of the novel ICPC solar collector at various specified angles along the transverse and longitudinal evacuated tube directions was experimentally determined. To validate the ray tracing model, transverse and longitudinal performance predictions

  14. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model and experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al are compared. All data predicted are inside the ±10% error band, with the mean error being below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental database obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  15. Verification Test of the SURF and SURFplus Models in xRage: Part III Affect of Mesh Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-15

    The previous studies used an underdriven detonation wave in 1-dimension (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin are compared to a highly resolved 1-D simulation in cylindrical geometry.

  16. Development of nuclear thermal hydraulic verification test and evaluation technology - Development of fundamental technique for experiment of natural circulation phenomena in PWR systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Lee, Tae Ho; Kim, Moon Oh; Kim, Hak Joon [Seoul National University, Seoul (Korea)

    2000-04-01

    Analyses applying the two-fluid model of CFX-4.2 were performed. For verification of the analysis results, experimental measurement data for two-phase flow parameters in subcooled boiling flow were produced for vertical (0 deg) and inclined (60 deg) orientations. Through comparison of analyses and experiments, the applicability of various two-phase flow models and the analysis capability of the code were evaluated. A measurement technique for bubble velocity in two-phase flow using a standard backscattering LDV was investigated from the slug to the bubbly flow regime. The range of velocities measured is from 0.2 to 1.5 m/s and that of bubble sizes from 2 to 20 mm. For measurement of local temperatures in boiling flow, microthermocouples were manufactured, and local liquid and vapor temperatures were measured in pool boiling and boiling flow. 66 refs., 74 figs., 4 tabs. (Author)

  17. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To help the flow-through from modeling with CPN to mathematical proof with PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  18. A Runtime Verification System for Developing, Analyzing and Controlling Complex Safety-Critical Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A comprehensive commercial-grade system for the development of safe parallel and serial programs is developed. The system has the ability to perform efficient...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS: TRI-DIM FILTER CORP. PREDATOR II MODEL 8VADTP123C23

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Predator II, Model 8VADTP123C23CC000 air filter for dust and bioaerosol filtration manufactured by Tri-Dim Filter Corporation. The pressure drop across the filter was 138 Pa clean and...

  20. Nanofibre distribution in composites manufactured with epoxy reinforced with nanofibrillated cellulose: model prediction and verification

    Science.gov (United States)

    Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina

    2016-07-01

    In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.
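
    The following is a minimal Monte Carlo sketch in the spirit of the model described above: nanofibre lengths and diameters are drawn from assumed lognormal probability density functions, and each fibre advances stepwise through the fabric until a capture event occurs. All distributions, capture probabilities and geometry parameters are illustrative assumptions, not values from the paper.

    ```python
    import random
    import math

    def simulate_penetration(n_fibres=10000, step_um=1.0, depth_max_um=200.0,
                             vf_bundle=0.6, vf_channel=0.05, channel_fraction=0.2):
        """Toy Monte Carlo of nanofibre transport through a fibre fabric.

        Each nanofibre advances in fixed steps; at every step it is captured
        with a probability that grows with the local fibre volume fraction
        and the nanofibre's own size (all parameters are illustrative).
        """
        depths = []
        for _ in range(n_fibres):
            length_nm = random.lognormvariate(math.log(500.0), 0.5)
            diameter_nm = random.lognormvariate(math.log(20.0), 0.4)
            # A fibre entering an inter-bundle channel sees far less material.
            vf = vf_channel if random.random() < channel_fraction else vf_bundle
            p_capture = min(1.0, 0.1 * vf * (diameter_nm / 20.0) * (length_nm / 500.0))
            depth = 0.0
            while depth < depth_max_um and random.random() >= p_capture:
                depth += step_um
            depths.append(depth)
        return depths

    depths = sorted(simulate_penetration())
    print(f"median penetration depth: {depths[len(depths) // 2]:.1f} um")
    ```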

  1. Experimental verification of a model describing the intensity distribution from a single mode optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Moro, Erik A [Los Alamos National Laboratory; Puckett, Anthony D [Los Alamos National Laboratory; Todd, Michael D [UCSD

    2011-01-24

    The intensity distribution of a transmission from a single mode optical fiber is often approximated using a Gaussian-shaped curve. While this approximation is useful for some applications such as fiber alignment, it does not accurately describe transmission behavior off the axis of propagation. In this paper, another model is presented, which describes the intensity distribution of the transmission from a single mode optical fiber. A simple experimental setup is used to verify the model's accuracy, and agreement between model and experiment is established both on and off the axis of propagation. Displacement sensor designs based on the extrinsic optical lever architecture are presented. The behavior of the transmission off the axis of propagation dictates the performance of sensor architectures where large lateral offsets (25-1500 µm) exist between transmitting and receiving fibers. The practical implications of modeling accuracy over this lateral offset region are discussed as they relate to the development of high-performance intensity modulated optical displacement sensors. In particular, the sensitivity, linearity, resolution, and displacement range of a sensor are functions of the relative positioning of the sensor's transmitting and receiving fibers. Sensor architectures with high combinations of sensitivity and displacement range are discussed. It is concluded that the utility of the accurate model is in its predictive capability and that this research could lead to an improved methodology for high-performance sensor design.
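
    For context, the sketch below evaluates the textbook Gaussian-beam irradiance that the paper identifies as the baseline approximation, showing how it rolls off over the 25-1500 µm lateral-offset band. The mode-field radius and wavelength are assumed typical single-mode-fibre values, not the authors' parameters, and the paper's improved model is not reproduced here.

    ```python
    import math

    def gaussian_intensity(r_um, z_um, w0_um=5.2, wavelength_um=1.55):
        """Normalized irradiance of a Gaussian beam at axial distance z and
        lateral offset r (the baseline model the paper argues breaks down
        off-axis). w0 is an assumed mode-field radius for a standard
        single-mode fibre near 1550 nm."""
        z_r = math.pi * w0_um ** 2 / wavelength_um      # Rayleigh range
        w = w0_um * math.sqrt(1.0 + (z_um / z_r) ** 2)  # beam radius at z
        return (w0_um / w) ** 2 * math.exp(-2.0 * r_um ** 2 / w ** 2)

    # Roll-off across the lateral-offset band discussed above, at z = 500 um.
    for r in (0, 25, 100, 500, 1500):
        print(f"r = {r:5d} um -> I/I0 = {gaussian_intensity(r, 500.0):.3e}")
    ```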

  2. Formal Verification, Engineering and Business Value

    Directory of Open Access Journals (Sweden)

    Ralf Huuck

    2012-12-01

    Full Text Available How to apply automated verification technology such as model checking and static program analysis to millions of lines of embedded C/C++ code? How to package this technology in a way that it can be used by software developers and engineers who might have no background in formal verification? And how to convince business managers to actually pay for such software? This work addresses a number of those questions. Based on our own experience of developing and distributing the Goanna source code analyzer for detecting software bugs and security vulnerabilities in C/C++ code, we explain the underlying technology of model checking, static analysis and SMT solving, and the steps involved in creating industrial-strength tools.

  3. An approach for rapid development of nasal delivery of analgesics--identification of relevant features, in vitro screening and in vivo verification.

    Science.gov (United States)

    Wang, Shu; Chow, Moses S S; Zuo, Zhong

    2011-11-25

    Drug delivery via the nasal route has gained increasing interest over the last two decades as an alternative to oral or parenteral drug administration. In the current study, an approach for rapid identification of relevant features, screening and in vivo verification of potential therapeutic agents for nasal delivery was carried out using "analgesic agents" as an example. Four such drug candidates (rizatriptan, meloxicam, lornoxicam and nebivolol) were initially identified as potentially viable agents based on their therapeutic use and physicochemical characteristics. An in vitro screening was then carried out using the Calu-3 cell line model. Based on the in vitro screening results and the reported pharmacokinetic and stability data, meloxicam was predicted to be the most promising drug candidate and was subsequently verified using an in vivo animal model. The in vivo results showed that nasal administration of meloxicam was comparable to its intravenous administration with respect to plasma drug concentration and AUC(0-2h). In addition, nasal absorption of meloxicam was much more rapid, with higher plasma drug concentration and AUC(0-2h), than oral administration. The current approach appears to be capable of identifying "analgesic agents" suitable for nasal delivery. Further studies are needed to prove the clinical advantage of the specific selected agent, meloxicam, by nasal administration in patients.

  4. Design of the front end electronics for the infrared camera of JEM-EUSO, and manufacturing and verification of the prototype model

    Science.gov (United States)

    Maroto, Oscar; Diez-Merino, Laura; Carbonell, Jordi; Tomàs, Albert; Reyes, Marcos; Joven-Alvarez, Enrique; Martín, Yolanda; Morales de los Ríos, J. A.; del Peral, Luis; Rodríguez-Frías, M. D.

    2014-07-01

    The Japanese Experiment Module (JEM) Extreme Universe Space Observatory (EUSO) will be launched and attached to the Japanese module of the International Space Station (ISS). Its aim is to observe UV photon tracks produced by ultra-high energy cosmic rays developing in the atmosphere and producing extensive air showers. The key element of the instrument is a very wide-field, very fast, large-lens telescope that can detect extreme energy particles with energy above 10¹⁹ eV. The Atmospheric Monitoring System (AMS), comprising, among others, the Infrared Camera (IRCAM), which is the Spanish contribution, plays a fundamental role in the understanding of the atmospheric conditions in the Field of View (FoV) of the telescope. It is used to detect the temperature of clouds and to obtain the cloud coverage and cloud top altitude during the observation period of the JEM-EUSO main instrument. SENER is responsible for the preliminary design of the Front End Electronics (FEE) of the Infrared Camera, based on an uncooled microbolometer, and the manufacturing and verification of the prototype model. This paper describes the flight design drivers and key factors to achieve the target features, namely, detector biasing with electrical noise better than 100 µV from 1 Hz to 10 MHz, temperature control of the microbolometer from 10°C to 40°C with stability better than 10 mK over 4.8 hours, low noise high bandwidth amplifier adaptation of the microbolometer output to differential input before analog to digital conversion, housekeeping generation, microbolometer control, and image accumulation for noise reduction. It also shows the modifications implemented in the FEE prototype design to perform a trade-off of different technologies, such as the convenience of using linear or switched regulation for the temperature control, the possibility to check the camera performances when both microbolometer and analog electronics are moved further away from the power and digital electronics, and

  5. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  6. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  7. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective of siting a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  8. Verification of the predictive capabilities of the 4C code cryogenic circuit model

    Science.gov (United States)

    Zanino, R.; Bonifetto, R.; Hoa, C.; Richard, L. Savoldi

    2014-01-01

    The 4C code was developed to model thermal-hydraulics in superconducting magnet systems and related cryogenic circuits. It consists of three coupled modules: a quasi-3D thermal-hydraulic model of the winding; a quasi-3D model of heat conduction in the magnet structures; and an object-oriented acausal model of the cryogenic circuit. In the last couple of years the code and its different modules have undergone a series of validation exercises against experimental data, including data from the supercritical He loop HELIOS at CEA Grenoble. However, all this analysis work was done each time after the experiments had been performed. In this paper a first demonstration is given of the predictive capabilities of the 4C code cryogenic circuit module. To do that, a set of ad hoc experimental scenarios was designed, including different heating and control strategies. Simulations with the cryogenic circuit module of 4C were then performed before the experiment. The comparison presented here between the code predictions and the results of the HELIOS measurements gives the first proof of the excellent predictive capability of the 4C code cryogenic circuit module.

  9. Experimental verification of a precooled mixed gas Joule-Thomson cryoprobe model

    Science.gov (United States)

    Passow, Kendra Lynn; Skye, Harrison; Nellis, Gregory; Klein, Sanford

    2012-06-01

    Cryosurgery is a medical technique that uses a cryoprobe to apply extreme cold to undesirable tissue such as cancers. Precooled Mixed Gas Joule-Thomson (pMGJT) cycles with Hampson-style recuperators are integrated with the latest generation of cryoprobes to create more powerful and compact instruments. Selection of gas mixtures for these cycles is not a trivial process; the focus of this research is the development of a detailed model that can be integrated with an optimization algorithm to select optimal gas mixtures. A test facility has been constructed to experimentally tune and verify this model. The facility uses a commercially available cryoprobe system that was modified to integrate measurement instrumentation sufficient to determine the performance of the system and its component parts. Spatially resolved temperature measurements allow detailed measurements of the heat transfer within the recuperator and therefore computation of the spatially resolved conductance. These data can be used to study the multiphase, multicomponent heat transfer process in the complicated recuperator geometry. The optimization model has been expanded to model the pressure drop associated with the flow to more accurately predict the performance of the system. The test facility has been used to evaluate the accuracy and usefulness of this improvement.

  10. Verification and Diagnosis Infrastructure of SoC HDL-model

    CERN Document Server

    Hahanov, Vladimir; Litvinova, Eugenia; Chumachenko, Svetlana

    2012-01-01

    This article describes a technology for diagnosing SoC HDL-models based on a transactional graph. The diagnosis method is focused on considerably decreasing the fault detection time and the memory needed for storing the diagnosis matrix, by forming ternary relations in the form of test, monitor, and functional component. The following problems are solved: creation of a digital system model in the form of a transaction graph and a multi-tree of fault detection tables, as well as ternary matrices for activating functional components in tests, relative to the selected set of monitors; and development of a method for analyzing the activation matrix to detect faults with a given depth and to synthesize logic functions for subsequent embedded hardware fault diagnosis.

  11. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    Science.gov (United States)

    Amna, S.; Samreen, N.; Khalid, B.; Shamim, A.

    2013-06-01

    Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events having significant impacts on humans, including all socioeconomic activities and health issues, which change according to the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) for modeling seasonal weather hindcasts of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was carried out to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data incorporated constituted statistical temperature records of 32 years for the months of June, July and August. The study used the EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts using the standard probabilistic measures of Brier Score, Brier Skill Score, cross validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models gave significant results when particular initial conditions were omitted.
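
    For reference, the Brier score and Brier skill score used in the verification above reduce to a few lines of code; a minimal sketch with hypothetical heat-wave forecasts follows.

    ```python
    import numpy as np

    def brier_score(p, o):
        """Brier score of probabilistic forecasts p against binary outcomes o."""
        p, o = np.asarray(p, float), np.asarray(o, float)
        return float(np.mean((p - o) ** 2))

    def brier_skill_score(p, o):
        """Skill relative to a climatological (base-rate) reference forecast."""
        o = np.asarray(o, float)
        reference = np.full(len(o), o.mean())
        return 1.0 - brier_score(p, o) / brier_score(reference, o)

    # Hypothetical forecast probabilities and observed events (1 = heat wave).
    p = [0.9, 0.7, 0.2, 0.8, 0.1, 0.6]
    o = [1, 1, 0, 1, 0, 0]
    print(f"BS  = {brier_score(p, o):.3f}")       # lower is better
    print(f"BSS = {brier_skill_score(p, o):.3f}") # > 0 beats climatology
    ```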

  12. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  13. Experimental verification of bridge seismic damage states quantified by calibrating analytical models with empirical field data

    Institute of Scientific and Technical Information of China (English)

    Swagata Banerjee; Masanobu Shinozuka

    2008-01-01

    Bridges are one of the most vulnerable components of a highway transportation network system subjected to earthquake ground motions. Prediction of the resilience and sustainability of bridge performance in a probabilistic manner provides valuable information for pre-event system upgrading and post-event functional recovery of the network. The current study integrates bridge seismic damageability information obtained through empirical, analytical and experimental procedures and quantifies threshold limits of bridge damage states consistent with the physical damage description given in HAZUS. Experimental data from a large-scale shaking table test are utilized for this purpose. This experiment was conducted at the University of Nevada, Reno, where a research team from the University of California, Irvine, participated. Observed experimental damage data are processed to identify and quantify bridge damage states in terms of rotational ductility at bridge column ends. In parallel, a mechanistic model for fragility curves is developed in such a way that the model can be calibrated against empirical fragility curves that have been constructed from damage data obtained during the 1994 Northridge earthquake. This calibration quantifies threshold values of bridge damage states and makes the analytical study consistent with damage data observed in past earthquakes. The mechanistic model is transportable and applicable to most types and sizes of bridges. Finally, calibrated damage state definitions are compared with those obtained using experimental findings. Comparison shows excellent consistency among results from analytical, empirical and experimental observations.
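
    Fragility curves of the kind discussed above are commonly expressed as lognormal cumulative distributions in ground-motion intensity; the sketch below evaluates such a curve for placeholder parameters (the paper calibrates thresholds in terms of rotational ductility, so the numbers here are purely illustrative).

    ```python
    import math

    def fragility(pga_g, median_g, beta):
        """P(damage state reached or exceeded | PGA), as a lognormal CDF.
        median_g and beta are the parameters a calibration study would pin
        down; the values used below are placeholders."""
        z = math.log(pga_g / median_g) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

    # Placeholder damage states: (median PGA in g, lognormal std dev).
    states = {"slight": (0.3, 0.6), "moderate": (0.5, 0.6), "extensive": (0.8, 0.6)}
    for name, (m, b) in states.items():
        print(f"P({name} | PGA = 0.6 g) = {fragility(0.6, m, b):.2f}")
    ```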

  14. System-level modeling and verification of a micro pitch-tunable grating

    Science.gov (United States)

    Lv, Xianglian; Xu, Jinghui; Yu, Yiting; He, Yang; Yuan, Weizheng

    2010-10-01

    Micro pitch-tunable gratings based on microelectromechanical systems (MEMS) technology can modulate the grating period dynamically by controlling the drive voltage. The device is so complex that it is impossible to model and simulate by the FEA method or by an analytical macromodel alone. In this paper, a new hybrid system-level modeling method is presented. First, the grating was decomposed into functional components such as the grating beams, supporting beams and electrostatic comb-driver. The block Arnoldi algorithm was used to obtain numerical macromodels of the grating beams and supporting beams; analytical macromodels, called multi-port elements (MPEs), of the comb-driver and other parts were also established, and the elements were connected into a hybrid network representing the system-level model of the grating in MEMS Garden, a MEMS CAD tool developed by the Micro and Nano Electromechanical Systems Laboratory, Northwestern Polytechnical University. Both frequency- and time-domain simulations were implemented. The grating was fabricated using a silicon-on-glass (SOG) process. The measured working displacement is 16.5 µm at a driving voltage of 40 V. The simulation result is 17.6 µm, which shows acceptable agreement with the measurement within an error tolerance of 6.7%. The method proposed in this paper can solve the voltage-displacement simulation problem of this kind of complex grating. It can also be adapted to simulations of similar MEMS/MOEMS devices.

  15. The Woodward Effect: Math Modeling and Continued Experimental Verifications at 2 to 4 MHz

    Science.gov (United States)

    March, Paul; Palfreyman, Andrew

    2006-01-01

    The Woodward Effect (W-E), the supposition that energy-storing ions experience a transient mass fluctuation near their rest mass when accelerated, has been tentatively verified using linear electrical thrusters based on the Heaviside-Lorentz force transformation. This type of electromagnetic field thruster, or Mach-Lorentz Thruster (MLT), purports to create a transient mass differential that is expressed in a working medium to produce a net thrust in the dielectric material contained in several capacitors. These mass differentials are hypothesized to result from gravity/inertia-based Wheeler-Feynman radiation reactions with the rest of the mass in the universe (per Mach's Principle) in order to conserve momentum. Thus if a net unidirectional force is produced in such a device, then mass fluctuations in the working media should be present. A net unidirectional and reversible force on the order of +/- 3.14 milli-Newton or 0.069% of the suspended test article mass was recorded by us in our first high frequency 2.2 MHz test article. The authors also developed a W-E model that integrates the various engineering parameters affecting the design, construction, and performance of W-E based MLTs for the next generation of systems. When Woodward's (2004a, 2004b, 2005) and our test results were compared with the model's predictions, the test results exceeded predictions by one to two orders of magnitude. Efforts are underway to understand the discrepancies and update the model. The test results imply that these devices, when fully developed, could be competitive with ion engines intended for use on satellite station keeping and/or orbital transfers.

  16. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides input validation to check user input. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
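
    As a toy illustration of the immediate input checking described above (NJOY21 itself is written in C++ and its real input system is far richer; the function below is invented for illustration), a range check that names the offending input might look like this:

    ```python
    def validate_card(name, value, lo, hi):
        """Reject an out-of-range input value with a message that names the
        offending card, mimicking the rapid, helpful feedback described above.
        (Hypothetical example; not NJOY21's actual API.)"""
        if not lo <= value <= hi:
            raise ValueError(f"card '{name}': {value} outside allowed range [{lo}, {hi}]")
        return value

    validate_card("temperature", 293.6, 0.0, 1.0e7)  # passes silently
    try:
        validate_card("ngrid", -5, 1, 10000)
    except ValueError as err:
        print(err)  # card 'ngrid': -5 outside allowed range [1, 10000]
    ```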

  17. Constraining millennial scale dynamics of a Greenland tidewater glacier for the verification of a calving criterion based numerical model

    Science.gov (United States)

    Lea, J.; Mair, D.; Rea, B.; Nick, F.; Schofield, E.

    2012-04-01

    The ability to successfully model the behaviour of Greenland tidewater glaciers is pivotal to understanding the controls on their dynamics and potential impact on global sea level. However, to have confidence in the results of numerical models in this setting, the evidence required for robust verification must extend well beyond the existing instrumental record. Perhaps uniquely for a major Greenland outlet glacier, both the advance and retreat dynamics of Kangiata Nunata Sermia (KNS), Nuuk Fjord, SW Greenland over the last ~1000 years can be reasonably constrained through a combination of geomorphological, sedimentological and archaeological evidence. It is therefore an ideal location to test the ability of the latest generation of calving criterion based tidewater models to explain millennial scale dynamics. This poster presents geomorphological evidence recording the post-Little Ice Age maximum dynamics of KNS, derived from high-resolution satellite imagery. This includes evidence of annual retreat moraine complexes suggesting controlled rather than catastrophic retreat between pinning points, in addition to a series of ice dammed lake shorelines, allowing detailed interpretation of the dynamics of the glacier as it thinned and retreated. Pending ground truthing, this evidence will contribute towards the calibration of results obtained from a calving criterion numerical model (Nick et al., 2010), driven by an air temperature reconstruction for the KNS region determined from ice core data.

  18. Development, implementation, and verification of multicycle depletion perturbation theory for reactor burnup analysis

    Energy Technology Data Exchange (ETDEWEB)

    White, J.R.

    1980-08-01

    A generalized depletion perturbation formulation based on the quasi-static method for solving realistic multicycle reactor depletion problems is developed and implemented within the VENTURE/BURNER modular code system. The present development extends the original formulation derived by M.L. Williams to include nuclide discontinuities such as fuel shuffling and discharge. This theory is first described in detail with particular emphasis given to the similarity of the forward and adjoint quasi-static burnup equations. The specific algorithm and computational methods utilized to solve the adjoint problem within the newly developed DEPTH (Depletion Perturbation Theory) module are then briefly discussed. Finally, the main features and computational accuracy of this new method are illustrated through its application to several representative reactor depletion problems.

  20. Documenting Differences between Early Stone Age Flake Production Systems: An Experimental Model and Archaeological Verification.

    Science.gov (United States)

    Presnyakova, Darya; Archer, Will; Braun, David R; Flear, Wesley

    2015-01-01

    This study investigates morphological differences between flakes produced via "core and flake" technologies and those resulting from bifacial shaping strategies. We investigate systematic variation between two technological groups of flakes using experimentally produced assemblages, and then apply the experimental model to the Cutting 10 Mid-Pleistocene archaeological collection from Elandsfontein, South Africa. We argue that a specific set of independent variables--and their interactions--including external platform angle, platform depth, measures of thickness variance and flake curvature should distinguish between these two technological groups. The role of these variables in technological group separation was further investigated using the Generalized Linear Model as well as Linear Discriminant Analysis. The Discriminant model was used to classify archaeological flakes from the Cutting 10 locality in terms of their probability of association with either experimentally developed technological group. The results indicate that the selected independent variables play a central role in separating core and flake from bifacial technologies. Thickness evenness and curvature had the greatest effect sizes in both the Generalized Linear and Discriminant models. Interestingly, the interaction between thickness evenness and platform depth was significant and played an important role in influencing technological group membership. The identified interaction emphasizes the complexity in attempting to distinguish flake production strategies based on flake morphological attributes. The results of the discriminant function analysis demonstrate that the majority of flakes at the Cutting 10 locality were not associated with the production of the numerous Large Cutting Tools found at the site, which corresponds with previous suggestions regarding technological behaviors reflected in this assemblage.
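
    A minimal sketch of the discriminant-analysis step follows, with synthetic stand-ins for the measured flake attributes; the class structure and value ranges are invented for illustration and do not reproduce the experimental assemblages.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n = 200

    # Synthetic attribute vectors: external platform angle, platform depth,
    # thickness variance, curvature (arbitrary units; invented distributions).
    core_flake = np.column_stack([rng.normal(75, 8, n), rng.normal(12, 3, n),
                                  rng.normal(2.0, 0.5, n), rng.normal(0.10, 0.03, n)])
    bifacial = np.column_stack([rng.normal(55, 8, n), rng.normal(6, 2, n),
                                rng.normal(0.8, 0.3, n), rng.normal(0.25, 0.05, n)])
    X = np.vstack([core_flake, bifacial])
    y = np.array([0] * n + [1] * n)  # 0 = core-and-flake, 1 = bifacial shaping

    lda = LinearDiscriminantAnalysis().fit(X, y)

    # An "archaeological" flake is assigned a posterior probability of
    # belonging to either experimentally defined technological group.
    unknown = np.array([[60.0, 7.5, 1.0, 0.20]])
    print(lda.predict_proba(unknown))  # e.g. [[0.02 0.98]] -> likely bifacial
    ```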

  1. Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification

    Science.gov (United States)

    Dennehy, Cornelius J.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for the Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as an orbiter development test objective on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.

  2. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  3. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build a multi-view base process model.

  4. The development of verification and validation technology for instrumentation and control in NPPs - A study on the software development methodology of a highly reliable software

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yong Rae; Cha, Sung Deok; Lee, Woo Jin; Chae, Hong Seok; Yoon, Kwang Sik; Jeong, Ki Suk [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    Nuclear industries have tried to use digital I and C technology in developing advanced nuclear power plants. However, because the industries did not establish highly reliable software development methodologies and standards for developing highly reliable and safe software for digital I and C systems, they were confronted with difficulties in avoiding software common-mode failures. To mitigate these difficulties, highly reliable software development environments and methodologies and validation and verification techniques should be the cornerstone of all digital implementation in nuclear power plants. The objective of this project is to establish a highly reliable software development methodology to support developing digital instrumentation and control systems in nuclear power plants. In this project, we have investigated business-oriented and real-time software development methods and techniques for ensuring the safety and reliability of the software. We have also studied standards related to licensing the software for digital I and C systems. 50 refs., 51 figs. (author)

  5. Analytical design model for a piezo-composite unimorph actuator and its verification using lightweight piezo-composite curved actuators

    Science.gov (United States)

    Yoon, K. J.; Park, K. H.; Lee, S. K.; Goo, N. S.; Park, H. C.

    2004-06-01

    This paper describes an analytical design model for a layered piezo-composite unimorph actuator and its numerical and experimental verification using a LIPCA (lightweight piezo-composite curved actuator) that is lighter than other conventional piezo-composite type actuators. The LIPCA is composed of top fiber composite layers with high modulus and low CTE (coefficient of thermal expansion), a middle PZT ceramic wafer, and base layers with low modulus and high CTE. The advantages of the LIPCA design are to replace the heavy metal layer of THUNDER by lightweight fiber-reinforced plastic layers without compromising the generation of high force and large displacement and to have design flexibility by selecting the fiber direction and the number of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use a resin prepreg system. A piezo-actuation model for a laminate with piezo-electric material layers and fiber composite layers is proposed to predict the curvature and residual stress of the LIPCA. To predict the actuation displacement of the LIPCA with curvature, a finite element analysis method using the proposed piezo-actuation model is introduced. The predicted deformations are in good agreement with the experimental ones.

  6. Dynamic modeling of double-helical gear with Timoshenko beam theory and experiment verification

    Directory of Open Access Journals (Sweden)

    Jincheng Dong

    2016-05-01

    Full Text Available In the dynamic study of double-helical gear transmission, the coupling shaft in the middle of the two helical gears is difficult to handle accurately. In this article, the coupling shaft is treated as Timoshenko beam elements and is synthesized with the lumped-mass method of the two helical gear pairs. Then, a numerical integration method is used to solve the amplitude–frequency responses and dynamic factors under diverse operating conditions. A gear vibration test rig with a closed power circuit is developed for in-depth experimental measurements and model validation. After comparing the theoretical data with the practical results, the following conclusions are drawn: (1) the dynamic model with the Timoshenko beam element is quite appropriate and reliable in the dynamic analysis of double-helical gear transmission and is of great theoretical value in accurate dynamic research of the double-helical gear transmission; (2) in both theoretical analysis and experimental measurements, the dynamic factors of the gear pair diminish with increasing input torque and grow with increasing input speed; (3) the deviation ratio between the theoretical data and the experimental results decreases with increasing input torque, reaching its minimum at the highest input speed.

  7. Optics Design for the U.S. SKA Technology Development Project Design Verification Antenna

    Science.gov (United States)

    Imbriale, W. A.; Baker, L.; Cortes-Medellin, G.

    2012-01-01

    The U.S. design concept for the Square Kilometer Array (SKA) program is based on utilizing a large number of 15 meter dish antennas. The Technology Development Project (TDP) is planning to design and build the first of these antennas to provide a demonstration of the technology and a solid base on which to estimate costs. This paper describes the performance of the selected optics design. It is a dual-shaped offset Gregorian design with a feed indexer that can accommodate corrugated horns, wide band single pixel feeds or phased array feeds.

  9. Adaptive multi-rate interface: development and experimental verification for real-time hybrid simulation

    DEFF Research Database (Denmark)

    Maghareh, Amin; Waldbjørn, Jacob Paamand; Dyke, Shirley J.;

    2016-01-01

    Real-time hybrid simulation (RTHS) is a powerful cyber-physical technique that is a relatively cost-effective method to perform global/local system evaluation of structural systems. A major factor that determines the ability of an RTHS to represent true system-level behavior is the fidelity...... it employs different time steps in the numerical and the physical substructures while including rate-transitioning to link the components appropriately. Typically, a higher-order numerical substructure model is solved at larger time intervals, and is coupled with a physical substructure that is driven...... frequency between the numerical and physical substructures and for input signals with high-frequency content. Further, it does not induce signal chattering at the coupling frequency. The effectiveness of AMRI is also verified experimentally....

  10. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, MODUS performs model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  11. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

    Full Text Available The simulation of robot systems is becoming very popular, especially with the lowering cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to find the solution to the inverse kinematics problem. This angle could be optimized with respect to a given specified key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle in the task space). Modeling and simulation of robots could be achieved using any of the following models: the geometrical model (positions, postures), the kinematic model and the dynamic model. To do so, the modeling of a 2-R robot type is implemented. Our main tasks are comparing two robot postures with the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and the robot motion simulation. This could easily be generalized to a 3-R robot and therefore possibly to any serial robot (SCARA, PUMA, etc.). The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the validity of the chosen model and draw the right conclusions. The results of the simulations are discussed and an agreement between the two software packages is certainly obtained.
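
    For a planar 2-R arm, the inverse kinematics mentioned above has a standard closed-form solution; the sketch below implements it with illustrative link lengths, the two elbow branches corresponding to the two postures being compared.

    ```python
    import math

    def ik_2r(x, y, l1=1.0, l2=0.8, elbow_up=True):
        """Closed-form inverse kinematics of a planar 2-R manipulator.
        Returns joint angles (theta1, theta2) placing the end-effector at
        (x, y); elbow_up selects between the two possible postures."""
        c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
        if abs(c2) > 1.0:
            raise ValueError("target outside the workspace")
        s2 = math.sqrt(1.0 - c2 * c2) * (1.0 if elbow_up else -1.0)
        theta2 = math.atan2(s2, c2)
        theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
        return theta1, theta2

    t1, t2 = ik_2r(1.2, 0.6)
    # Forward-kinematics check: the end-effector should land on the target.
    xe = 1.0 * math.cos(t1) + 0.8 * math.cos(t1 + t2)
    ye = 1.0 * math.sin(t1) + 0.8 * math.sin(t1 + t2)
    print(f"({xe:.3f}, {ye:.3f})")  # -> (1.200, 0.600)
    ```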

  12. Integrated Computational Model Development

    Science.gov (United States)

    2014-03-01

    68.5%, 9.6% and 21.9%, respectively. The alloy density and Vickers microhardness were ρ = 8.23 ± 0.01 g/cm3 and Hv = 5288 ± 1 MPa. [3...and 3-D. Techniques to mechanically test materials at smaller scales were developed to better inform the deformation models. Also methods were...situ microscale tension testing technique was adapted to enable microscale fatigue testing on tensile dog-bone specimens. Microscale tensile fatigue

  13. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  14. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    development are areas of programming language research that have received increased attention during the last years. We first show how the logic Weak monadic Second-order Logic on Strings and Trees can be implemented efficiently despite an intractable theoretical worst-case complexity. Among several other......, such as maintaining session state and dynamically producing HTML or XML documents. By introducing explicit language-based mechanisms for those issues, we liberate the Web service programmer from the tedious and error-prone alternatives. Specialized program analyses aid the programmer by verifying at compile time......Domain-specific formal languages are an essential part of computer science, combining theory and practice. Such languages are characterized by being tailor-made for specific application domains and thereby providing expressiveness on high abstraction levels and allowing specialized analysis...

  15. The CoreGram project: theoretical linguistics, theory development and verification

    Directory of Open Access Journals (Sweden)

    Stefan Müller

    2015-06-01

    Full Text Available This paper describes the CoreGram project, a multilingual grammar engineering project that develops HPSG grammars for several typologically diverse languages that share a common core. The paper provides a general motivation for doing theoretical linguistics the way it is done in the CoreGram project and therefore is not targeted at computational linguists exclusively. I argue for a constraint-based approach to language rather than a generative-enumerative one and discuss issues of formalization. Recent advances in language acquisition research are mentioned and conclusions on how theories should be constructed are drawn. The paper discusses some of the highlights of the implemented grammars, gives a brief overview of central theoretical concepts and their implementation in TRALE, and compares the CoreGram project with other multilingual grammar engineering projects.

  16. Recent Developments In Fast Neutron Detection And Multiplicity Counting With Verification With Liquid Scintillator

    Energy Technology Data Exchange (ETDEWEB)

    Nakae, L; Chapline, G; Glenn, A; Kerr, P; Kim, K; Ouedraogo, S; Prasad, M; Sheets, S; Snyderman, N; Verbeke, J; Wurtz, R

    2011-09-30

    For many years at LLNL, we have been developing time-correlated neutron detection techniques and algorithms for applications such as Arms Control, Threat Detection and Nuclear Material Assay. Many of our techniques have been developed specifically for the relatively low efficiency (a few percent) attainable by detector systems limited to man-portability. Historically, we used thermal neutron detectors (mainly ³He), taking advantage of the high thermal neutron interaction cross-sections. More recently, we have been investigating the use of fast neutron detection with liquid scintillators, inorganic crystals, and in the near future, pulse-shape discriminating plastics which respond over 1000 times faster (nanoseconds versus tens of microseconds) than thermal neutron detectors. Fast neutron detection offers considerable advantages, since the inherent nanosecond production time-scales of spontaneous fission and neutron-induced fission are preserved and measured instead of being lost by the thermalization required for thermal neutron detectors. We are now applying fast neutron technology to the safeguards regime in the form of fast portable digital electronics as well as faster and less hazardous scintillator formulations. Faster detector response times and sensitivity to neutron momentum show promise for measuring, differentiating, and assaying samples that have modest to very high count rates, as well as mixed fission sources like Cm and Pu. We report on measured results with our existing liquid scintillator array, and progress on the design of a nuclear material assay system that incorporates fast neutron detection, including the surprising result that fast liquid scintillator detectors become competitive with and even surpass the precision of ³He-based counters measuring correlated pairs in modest (kg) samples of plutonium.

  17. Development and verification of child observation sheet for 5-year-old children.

    Science.gov (United States)

    Fujimoto, Keiko; Nagai, Toshisaburo; Okazaki, Shin; Kawajiri, Mie; Tomiwa, Kiyotaka

    2014-02-01

    The aim of the study was to develop a newly devised child observation sheet (COS-5) as a scoring sheet, based on the Childhood Autism Rating Scale (CARS), for use in the developmental evaluation of 5-year-old children, especially focusing on children with autistic features, and to verify its validity. Seventy-six children were studied. The children were recruited among participants of the Japan Children's Cohort Study, a research program implemented by the Research Institute of Science and Technology for Society (RISTEX) from 2004 to 2009. The developmental evaluation procedure was performed by doctors, clinical psychologists, and public health nurses. The COS-5 was also partly based on the Kyoto Scale of Psychological Development 2001 (Kyoto Scale 2001). Further, the Developmental Disorders Screening Questionnaire for 5-Year-Olds, the PDD-Autism Society Japan Rating Scale (PARS), doctor interview questions and neurological examination for 5-year-old children, and the Draw-a-Man Test (DAM) were used as evaluation scales. Eighteen (25.4%) children were rated as Suspected, including Suspected PDD, Suspected ADHD and Suspected MR. The COS-5 was suggested to be valid, with favorable reliability (α=0.89) and correlation with other evaluation scales. The COS-5 may be useful, with the following advantages: it can be performed within a shorter time frame; it facilitates the maintenance of observation quality; it facilitates sharing information with other professions; and it is reliable in identifying the autistic features of 5-year-old children. In order to verify its wider applications, including the screening of infants (18 months to 3 years old) by adjusting the items for younger ages, additional study is needed.

  18. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
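
    One small ingredient of such an analysis can be illustrated directly: computing observed orders of convergence from successive mesh refinements and summarizing them with a median rather than a mean, so a single anomalous level does not dominate. The sketch below uses made-up solution values and is far simpler than the constrained-optimization framework described above.

    ```python
    import math
    from statistics import median

    def observed_orders(values, r=2.0):
        """Observed order of convergence from each consecutive solution
        triplet on meshes refined by a constant ratio r (coarsest first):
            p = log(|f3 - f2| / |f2 - f1|) / log(1/r)
        """
        return [math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(1.0 / r)
                for f1, f2, f3 in zip(values, values[1:], values[2:])]

    # Made-up sequence: roughly 2nd-order convergence with one noisy level.
    solutions = [1.1000, 1.0240, 1.0065, 1.0021, 1.0004]
    orders = observed_orders(solutions)
    print([round(p, 2) for p in orders])
    print("robust (median) order:", round(median(orders), 2))
    ```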

  19. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    Science.gov (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

    The planned orbit of the AXAF-I spacecraft will subject it to both short eclipses (less than 30 minutes for solar eclipses and less than 2 hours for lunar eclipses) and long Earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to eclipse may cause loss of mission. To avoid this problem, for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The DVT panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing will be presented.

  20. Automatic Verification of Biochemical Network Using Model Checking Method%基于模型校核的生化网络自动辨别方法

    Institute of Scientific and Technical Information of China (English)

    Jinkyung Kim; Younghee Lee; Il Moon

    2008-01-01

    This study focuses on automatic searching and verifying methods for the reachability, transition logics and hierarchical structure in all possible paths of biological processes using model checking. The automatic search and verification of alternative paths within complex and large networks in biological processes can provide a considerable number of solutions, which are difficult to handle manually. Model checking is an automatic method for verifying whether a circuit or a condition, expressed as a concurrent transition system, satisfies a set of properties expressed in a temporal logic, such as computational tree logic (CTL). This article demonstrates that model checking is feasible in biochemical network verification and shows certain advantages over simulation for querying and searching of special behavioral properties in biochemical processes.
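
    As a toy illustration of the technique (not the authors' tool), the sketch below checks the CTL reachability property EF goal, "some execution path eventually reaches goal", by explicit-state search over a small invented signalling network.

```python
from collections import deque

# Hypothetical states: which species are active in a small signalling network.
transitions = {
    "A":       {"A+B"},          # A activates B
    "A+B":     {"A+B+C", "A"},   # B activates C, or B decays again
    "A+B+C":   {"product"},
    "product": set(),
}

def ef(start, goal):
    """EF goal from start: breadth-first reachability search."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if s == goal:
            return True
        for t in transitions[s]:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False

print(ef("A", "product"))  # True: the pathway can produce the product
```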

  1. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
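
    The paper's clustering is distance- and density-based; as an illustration of the general idea rather than the authors' algorithm, the sketch below soft-clusters synthetic matcher feature vectors with standard fuzzy c-means. All data and parameters are invented.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))      # soft memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = W.T @ X / W.sum(axis=0)[:, None]  # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)    # standard FCM update
    return centers, U

# Synthetic "matcher feature" vectors forming two loose groups.
X = np.vstack([np.random.default_rng(1).normal(0.3, 0.05, (20, 2)),
               np.random.default_rng(2).normal(0.8, 0.05, (20, 2))])
centers, U = fuzzy_c_means(X)
print(centers)  # approximate group centers near 0.3 and 0.8
```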

  2. Development of Tensile Softening Model for Plain Concrete

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.K.; Song, Y.C. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    Large-scale direct tensile softening tests on notched plain concrete specimens (4,000 and 5,000 psi) were performed under uniaxial stress. The basic physical properties and the complete load-CMOD (Crack Mouth Opening Displacement) curves are presented. The fracture energy was then evaluated from the complete load-CMOD curves, and an optimal tensile softening model, obtained by slightly revising an existing one, is proposed. The tests thereby provide real verification data for developing other nonlinear concrete finite element models. (author). 32 refs., 38 figs., 4 tabs.
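
    A minimal sketch of the fracture-energy evaluation described above, under the common assumption that the fracture energy is the work of fracture (the area under the measured softening curve) divided by the ligament area; the curve and specimen dimensions below are invented, not the paper's data.

```python
import numpy as np

# Hypothetical load-CMOD softening curve for a notched specimen.
cmod = np.array([0.0, 0.02, 0.05, 0.10, 0.20, 0.40, 0.80]) * 1e-3  # m
load = np.array([0.0, 8.0, 12.0, 9.0, 5.0, 2.0, 0.0]) * 1e3        # N

# Trapezoidal area under the curve = work of fracture [J].
work = np.sum(0.5 * (load[1:] + load[:-1]) * np.diff(cmod))

ligament_area = 0.10 * 0.075   # m^2: width x (depth - notch), hypothetical
G_F = work / ligament_area     # fracture energy [N/m]
print(f"G_F ~ {G_F:.0f} N/m")
```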

  3. Kinematic Modeling and Simulation of a 2-R Robot by Using Solid Works and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

    Simulation of robot systems, which is becoming very popular with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Modelling a staged robotic task involves, whether for the object or the robot, the following models: the geometric one, the kinematic one and the dynamic one. To this end, a 2-R robot type is modelled. The main tasks are comparing two robot postures following the same trajectory (path) over the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. Verifying the results obtained by both packages allows us to qualitatively evaluate and underline the correctness of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two packages is obtained.
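
    For reference, here are the analytic forward kinematics and Jacobian of a planar 2-R arm that such a SolidWorks/Simulink comparison would verify; the link lengths and joint angles below are hypothetical.

```python
import numpy as np

def fk_2r(q1, q2, L1=1.0, L2=0.8):
    """Forward kinematics of a planar 2-R arm (lengths are hypothetical)."""
    x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
    return x, y

def jacobian_2r(q1, q2, L1=1.0, L2=0.8):
    """Jacobian mapping joint rates to end-effector velocity."""
    return np.array([
        [-L1*np.sin(q1) - L2*np.sin(q1+q2), -L2*np.sin(q1+q2)],
        [ L1*np.cos(q1) + L2*np.cos(q1+q2),  L2*np.cos(q1+q2)],
    ])

q1, q2 = np.deg2rad(30), np.deg2rad(45)
print(fk_2r(q1, q2))        # position to compare against the CAD model
print(jacobian_2r(q1, q2))  # used for velocity-level comparisons
```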

  4. Informational model verification of ZVS Buck quasi-resonant DC-DC converter

    Science.gov (United States)

    Vakovsky, Dimiter; Hinov, Nikolay

    2016-12-01

    The aim of the paper is to create a polymorphic informational model of a ZVS Buck quasi-resonant DC-DC converter for modeling purposes. The model is created by applying flexible open standards for setting, storing, publishing and exchanging data in a distributed information environment. The resulting model is useful for creating many variants of different types, with different configurations of the composing elements and different inner models of the examined object.
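
    A purely illustrative sketch of such an exchangeable informational model: the converter's topology and element values serialized with an open standard (JSON here); all field names and values are invented, not taken from the paper.

```python
import json

# Hypothetical informational model of the converter.
zvs_buck_model = {
    "topology": "ZVS-buck-quasi-resonant",
    "elements": {
        "Lr": {"type": "inductor",  "value_H": 22e-6},
        "Cr": {"type": "capacitor", "value_F": 47e-9},
        "S1": {"type": "switch",    "model": "ideal-with-body-diode"},
    },
    "operating_point": {"Vin_V": 48.0, "Vout_V": 12.0, "fsw_Hz": 250e3},
}

serialized = json.dumps(zvs_buck_model, indent=2)
print(serialized)                 # store/publish in a distributed environment

variant = json.loads(serialized)  # rebuild a variant with different elements
variant["elements"]["Lr"]["value_H"] = 33e-6
```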

  5. Verification and Validation of the Coastal Modeling System. Report 2: CMS-Wave

    Science.gov (United States)

    2011-12-01

    wave models in this category are ideal for generation, growth and transformation of wind-waves over large distances (fetches) in regional-scale ... quantitative model-to-data intercomparison or model-to-model intercomparison. Both evaluations involve assessment of the methods and data required for ... combined wind and wave modeling capabilities of CMS-Wave in a large tidally-dominated inlet environment with an energetic wave climate. Extensive field ...

  6. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  7. Hysteresis modelling and experimental verification of a Fe–Ga alloy magnetostrictive actuator

    Science.gov (United States)

    Wei, Zhu; Lei Xiang, Bian; Gangli, Chen; Shuxin, Liu; Qinbo, Zhou; Xiaoting, Rui

    2017-03-01

    To accurately describe the asymmetric rate- and bias-dependent hysteresis of a Fe–Ga alloy magnetostrictive actuator, a comprehensive model is put forward, composed of a phenomenological model, describing hysteresis with a modified Bouc–Wen hysteresis operator, and a theoretical model representing the dynamic characteristics. An experimental system is set up to verify the performance of the comprehensive model. Results show that the modified Bouc–Wen model can effectively describe the dynamic and hysteresis characteristics of the Fe–Ga alloy magnetostrictive actuator, highlighting significantly improved accuracy in the modelling of the actuator.
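
    For reference, the classical Bouc–Wen operator underlying the paper's modified version can be integrated directly; the shape parameters and drive signal below are hypothetical, and the paper's asymmetric rate/bias-dependent terms are not included.

```python
import numpy as np
from scipy.integrate import solve_ivp

A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0       # hypothetical shape parameters
x  = lambda t: np.sin(2*np.pi*50*t)          # normalized 50 Hz displacement drive
dx = lambda t: 2*np.pi*50 * np.cos(2*np.pi*50*t)

def bouc_wen(t, z):
    """Classical Bouc-Wen internal-state equation."""
    v = dx(t)
    return A*v - beta*abs(v)*abs(z[0])**(n-1)*z[0] - gamma*v*abs(z[0])**n

sol = solve_ivp(bouc_wen, (0, 0.1), [0.0], max_step=1e-5)
# Plotting sol.y[0] against x(sol.t) traces the hysteresis loop; a hysteretic
# force is typically formed as F = a*k*x + (1-a)*k*z for stiffness k, ratio a.
```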

  8. IMPACT fragmentation model developments

    Science.gov (United States)

    Sorge, Marlon E.; Mains, Deanna L.

    2016-09-01

    The IMPACT fragmentation model has been used by The Aerospace Corporation for more than 25 years to analyze orbital altitude explosions and hypervelocity collisions. The model is semi-empirical, combining mass, energy and momentum conservation laws with empirically derived relationships for fragment characteristics such as number, mass, area-to-mass ratio, and spreading velocity, as well as event energy distribution. Model results are used for several types of analysis, including assessment of short-term risks to satellites from orbital altitude fragmentations, prediction of the long-term evolution of the orbital debris environment, and forensic assessments of breakup events. A new version of IMPACT, version 6, has been completed and incorporates a number of advancements enabled by a multi-year effort to characterize more than 11,000 debris fragments from more than three dozen historical on-orbit breakup events. These events involved a wide range of causes, energies, and fragmenting objects. Special focus was placed on the explosion model, as the majority of events examined were explosions. Revisions were made to the mass distribution used for explosion events, increasing the number of smaller fragments generated. The algorithm for modeling upper stage large fragment generation was updated. A momentum-conserving asymmetric spreading velocity distribution algorithm was implemented to better represent sub-catastrophic events. An approach was developed for modeling sub-catastrophic explosions, those where the majority of the parent object remains intact, based on estimated event energy. Finally, significant modifications were made to the area-to-mass ratio distribution to incorporate the tendencies of different materials to fragment into different shapes. This ability enabled better matches between the observed area-to-mass ratios and those generated by the model. It also opened up additional possibilities for post-event analysis of breakups. The paper will discuss ...
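
    Fragment populations in such models are typically described by power laws. As a generic illustration (not IMPACT's algorithm), the sketch below samples fragment characteristic lengths from the NASA standard breakup model's explosion form, N(>Lc) = 0.1 M^0.75 Lc^-1.71, by inverse-CDF sampling; the parent mass and size range are hypothetical.

```python
import numpy as np

M = 1000.0                  # kg, mass of the fragmenting object (hypothetical)
Lc_min, Lc_max = 0.01, 1.0  # m, size range of interest

N = lambda Lc: 0.1 * M**0.75 * Lc**-1.71   # cumulative count larger than Lc
n_frag = int(N(Lc_min) - N(Lc_max))        # fragments inside the size range

# Inverse-CDF sampling of sizes between Lc_min and Lc_max.
u = np.random.default_rng(0).uniform(size=n_frag)
Lc = (Lc_min**-1.71 - u * (Lc_min**-1.71 - Lc_max**-1.71)) ** (-1 / 1.71)
print(n_frag, "fragments; median size", np.median(Lc), "m")
```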

  9. Verification, validation, and predictive capability in computational engineering and physics.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Hirsch, Charles (Vrije Universiteit Brussel, Brussels, Belgium); Trucano, Timothy Guy

    2003-02-01

    Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, i.e., experimental data, is the issue.

  10. Modelling and Simulation of Variable Speed Thruster Drives with Full-Scale Verification

    Directory of Open Access Journals (Sweden)

    Jan F. Hansen

    2001-10-01

    In this paper, considerations about modelling and simulation of variable speed thruster drives are presented and compared with full-scale measurements from the Varg FPSO. For special-purpose vessels with electric propulsion operating in dynamic positioning (DP) mode, the thruster drives are essential for vessel operation. Different modelling strategies for thruster drives are discussed. An advanced thruster drive model with a dynamic motor model and the field vector control principle is shown. Simulations are performed with both the advanced model and a simplified model, and compared with full-scale measurements from the Varg FPSO. The simulation results correspond well with the measurements for both the simplified model and the advanced model.
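
    As a deliberately simplified illustration of what the simplest end of such a model hierarchy looks like, the sketch below integrates shaft dynamics with a quadratic propeller load torque; it omits the motor electrical dynamics and field vector control entirely, and all parameters are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

J, Kq = 50.0, 2.0   # shaft inertia [kg m^2], propeller load coefficient
T_cmd = lambda t: 4000.0 if t > 1.0 else 0.0   # step torque command [N m]

def shaft(t, w):
    """J*dw/dt = T_motor - Kq*w*|w| (propeller torque ~ w|w|)."""
    return [(T_cmd(t) - Kq * w[0] * abs(w[0])) / J]

sol = solve_ivp(shaft, (0, 20), [0.0], max_step=0.01)
print("steady-state speed ~", sol.y[0, -1], "rad/s")  # ~ sqrt(T/Kq)
```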

  11. Experimental verification of optical models of graphene with multimode slab waveguides.

    Science.gov (United States)

    Chang, Zeshan; Chiang, Kin Seng

    2016-05-01

    We compare three optical models of graphene, namely, the interface model, the isotropic model, and the anisotropic model, and verify them experimentally with two multimode slab waveguide samples operating at the wavelengths of 632.8 and 1536 nm. By comparing the calculated graphene-induced losses and the measurement data, we confirm that the interface model and the anisotropic model give correct results for both the transverse electric (TE) and transverse magnetic modes, while the isotropic model gives correct results only for the TE modes. With the experimental data, we also quantitatively verify the widely used expression for the surface conductivity of graphene in the optical regime. Our findings clarify the issue of modeling graphene in the analysis of graphene-incorporated waveguides and offer deeper insight into the optical properties of graphene for waveguide applications.
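
    For context, here is a sketch of how such volumetric graphene models are commonly parameterized from the sheet conductivity: the isotropic model uses the in-plane value for all tensor components, while the anisotropic model keeps the out-of-plane permittivity at a constant. The universal optical conductivity e^2/4ħ, the 0.34 nm thickness, and the constant out-of-plane value are textbook assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.constants import e, hbar, epsilon_0, c

wl = 1536e-9                 # wavelength [m], one of the paper's two
omega = 2 * np.pi * c / wl
sigma0 = e**2 / (4 * hbar)   # universal optical sheet conductivity of graphene
d = 0.34e-9                  # assumed graphene layer thickness [m]

# In-plane permittivity shared by the isotropic and anisotropic models.
eps_inplane = 1 + 1j * sigma0 / (epsilon_0 * omega * d)
eps_outofplane_aniso = 1.0   # common assumption; varies in the literature

print("in-plane permittivity:", eps_inplane)
```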

  12. Analysis of the State of the Art Contingency Analysis Model (SOTACA), Air Module Verification

    Science.gov (United States)

    1990-03-01

    General information, model uses, proponent and users, system requirements, history, and reasons to use SOTACA are covered, along with model operation (preprocessor, area files, the Air Definition File (ADF), theater data, munitions and target effects data, air data, and the Decision Threshold File (DTF)), the scenario, and test cases including a null case. This section also covers the four phases of ...

  13. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that characterizes secure information flow for multi-threaded programs.

  14. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help ...
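
    For the event forecasts assessed with 2x2 contingency tables, such as CME arrivals, the standard scores can be computed directly; the sketch below uses invented counts, not MOSWOC data.

```python
# 2x2 contingency table for a yes/no event forecast (counts are invented).
hits, misses, false_alarms, correct_negatives = 18, 7, 9, 66

pod = hits / (hits + misses)                    # probability of detection
far = false_alarms / (hits + false_alarms)      # false alarm ratio
bias = (hits + false_alarms) / (hits + misses)  # frequency bias

# Heidke skill score: accuracy relative to random chance.
n = hits + misses + false_alarms + correct_negatives
exp_correct = ((hits + misses) * (hits + false_alarms)
               + (correct_negatives + misses)
               * (correct_negatives + false_alarms)) / n
hss = (hits + correct_negatives - exp_correct) / (n - exp_correct)

print(f"POD={pod:.2f} FAR={far:.2f} bias={bias:.2f} HSS={hss:.2f}")
```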

  15. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  16. Derivation, calibration and verification of macroscopic model for urban traffic flow. Part 1

    CERN Document Server

    Kholodov, Yaroslav A; Kholodov, Aleksandr S; Vasiliev, Mikhail O; Kurzhanskiy, Alexander A

    2016-01-01

    In this paper we present a second-order hydrodynamic traffic model that generalizes the existing second-order models of Payne-Whitham, Zhang and Aw-Rascle. In the proposed model, we introduce the pressure equation describing the dependence of "traffic pressure" on traffic density. The pressure equation is constructed for each road segment from the fundamental diagram that is estimated using measurements from traffic detectors. We show that the properties of any phenomenological model are fully defined by the pressure equation. We verify the proposed model through simulations of the Interstate 580 freeway segment in California, USA, with traffic measurements from the Performance Measurement System (PeMS).
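
    Here is a sketch of the data-driven step described above: estimate a fundamental diagram from detector measurements and derive a pressure-like term from it. The Greenshields form and the ARZ-style pressure P(rho) = v_f - V_eq(rho) are illustrative assumptions, not necessarily the paper's exact construction; the detector data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical loop-detector data: density [veh/km] and flow [veh/h].
rho = np.array([10, 20, 40, 60, 80, 100, 120])
q = np.array([950, 1750, 2900, 3300, 3100, 2400, 1300])

# Fit a Greenshields fundamental diagram Q(rho) = rho*vf*(1 - rho/rmax).
greenshields = lambda r, vf, rmax: r * vf * (1 - r / rmax)
(vf, rmax), _ = curve_fit(greenshields, rho, q, p0=(100, 150))

V_eq = lambda r: vf * (1 - r / rmax)   # equilibrium speed [km/h]
P = lambda r: vf - V_eq(r)             # ARZ-style "traffic pressure" term
print(f"v_f={vf:.0f} km/h, rho_max={rmax:.0f} veh/km, P(60)={P(60):.1f}")
```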

  17. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin;

    2012-01-01

    In this study, we proposed a Continuous Time Markov Chain model of the availability of n-node clusters of the Distributed Cluster Rendering System. The model is an infinite one; we formalized it and, based on the model, implemented a software tool which builds models automatically in the PRISM language. With the tool, whenever the number of nodes n and related parameters vary, we can create the PRISM model file rapidly and then use the PRISM model checker to verify related system properties. At the end of this study, we analyzed and verified the availability distributions of the Distributed Cluster Rendering System.
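
    As an illustration of generating PRISM models programmatically as n varies, here is a minimal sketch that emits a hypothetical CTMC availability model; the module structure and rates are invented, not the authors' rendering-cluster model. PRISM's CSL query S=? [ up > 0 ] would then give the long-run availability.

```python
def prism_cluster_model(n, fail_rate=0.01, repair_rate=0.5):
    """Render a hypothetical n-node availability CTMC in the PRISM language."""
    return f"""ctmc

const int N = {n};
const double lambda = {fail_rate};  // per-node failure rate
const double mu = {repair_rate};    // repair rate

module cluster
  up : [0..N] init N;
  [fail]   up > 0 -> up * lambda : (up' = up - 1);
  [repair] up < N -> mu : (up' = up + 1);
endmodule
"""

# Regenerate the model file for any n and check it with the PRISM tool.
with open("cluster4.sm", "w") as fh:
    fh.write(prism_cluster_model(4))
```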

  18. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Valeriy Vyatkin

    2008-03-01

    This paper deals with further development of a graphical specification language resembling timing diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model checking. The graphical specifications are translated into a model which is connected with the original model under study.

  20. Fingerprint verification based on wavelet subbands

    Science.gov (United States)

    Huang, Ke; Aviyente, Selin

    2004-08-01

    Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae detection based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial improvement of image quality and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature for each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm in this paper is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
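
    Here is a sketch of the feature-extraction step described above, assuming PyWavelets for the decomposition and the standard moment-matching estimator for the GGD parameters; the random image is a stand-in for a real fingerprint, and the subband choice is illustrative.

```python
import numpy as np
import pywt
from scipy.special import gamma
from scipy.optimize import brentq

def fit_ggd(coeffs):
    """Moment-matching GGD fit: (alpha, beta) for p(x) ~ exp(-|x/alpha|^beta)."""
    c = coeffs.ravel()
    m1, m2 = np.mean(np.abs(c)), np.mean(c**2)
    ratio = m1**2 / m2
    # Solve the generalized Gaussian ratio function for the shape beta.
    R = lambda b: gamma(2/b)**2 / (gamma(1/b) * gamma(3/b)) - ratio
    beta = brentq(R, 0.05, 10.0)
    alpha = np.sqrt(m2 * gamma(1/beta) / gamma(3/beta))  # scale from variance
    return alpha, beta

image = np.random.default_rng(0).normal(size=(128, 128))  # stand-in for a print
_, (LH, HL, HH) = pywt.dwt2(image, "db4")                 # one-level 2-D DWT
features = [fit_ggd(sb) for sb in (LH, HL, HH)]           # per-subband (alpha, beta)
print(features)  # feature vector entries to compare between two fingerprints
```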