WorldWideScience

Sample records for model verification process

  1. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model-Driven Engineering (MDE) practitioners already benefit from many well-established verification tools, for the Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to MDE and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation, centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.
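
    As a rough illustration of the kind of instance-level check such tools automate, the following Python sketch evaluates an OCL-style invariant against a small model instance. The Department/Employee metamodel and the invariant are invented for illustration; real MDE toolchains evaluate OCL directly against UML class diagrams.

```python
# Checking a declarative, OCL-style invariant against a model instance.
# The Department/Employee metamodel and the invariant are hypothetical;
# real MDE tools evaluate OCL directly against UML class diagrams.
from dataclasses import dataclass, field

@dataclass
class Employee:
    name: str
    salary: float

@dataclass
class Department:
    name: str
    employees: list = field(default_factory=list)

def check_invariants(dept: Department) -> list:
    """OCL analogue: context Department inv:
       self.employees->notEmpty() and self.employees->forAll(e | e.salary > 0)"""
    violations = []
    if not dept.employees:
        violations.append(f"{dept.name}: employees->notEmpty() violated")
    violations += [f"{dept.name}/{e.name}: salary > 0 violated"
                   for e in dept.employees if e.salary <= 0]
    return violations

dept = Department("R&D", [Employee("Ada", 5000.0), Employee("Bob", -1.0)])
print(check_invariants(dept))   # one violation reported for Bob
```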

  2. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    Institute of Scientific and Technical Information of China (English)

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for business process modeling based on activity diagrams, a formal definition of activity diagrams is introduced, and the basic requirements for activity-diagram-based business process models are proposed. Furthermore, the standardized transformation technique between business process models and basic Petri nets is presented, and the analysis method for the soundness and well-structuredness properties of business processes is introduced.
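
    As a minimal illustration of the token-game semantics that such soundness analyses rest on, here is a Python sketch of a two-transition workflow net standing in for a translated activity diagram. The net and the check are invented for illustration, and the reachability search only tests a necessary condition for soundness.

```python
# Token-game semantics for a tiny workflow net (start -> t1 -> p1 -> t2 -> end),
# a stand-in for a Petri net derived from an activity diagram. The reachability
# search then tests a necessary condition for soundness: the final marking
# {end: 1} is reachable.
from collections import deque

TRANSITIONS = {               # name: (input places, output places)
    "t1": ({"start": 1}, {"p1": 1}),
    "t2": ({"p1": 1}, {"end": 1}),
}

def enabled(marking, t):
    pre, _ = TRANSITIONS[t]
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, t):
    pre, post = TRANSITIONS[t]
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m[p] - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    seen, queue = set(), deque([initial])
    while queue:
        m = queue.popleft()
        key = frozenset((p, n) for p, n in m.items() if n > 0)
        if key in seen:
            continue
        seen.add(key)
        queue.extend(fire(m, t) for t in TRANSITIONS if enabled(m, t))
    return seen

markings = reachable({"start": 1})
print(any(dict(m).get("end", 0) == 1 for m in markings))   # True
```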

  3. Verification, Validation & Accreditation of Legacy Simulations using the Business Process Modeling Notation

    NARCIS (Netherlands)

    Gianoulis, C.; Roza, M.; Kabilan, V.

    2008-01-01

    Verification, Validation and Accreditation (VV&A) is an important part of the Modeling and Simulation domain. This paper focuses on legacy simulations and examines two VV&A approaches coming from different communities within the defense domain. We use the Business Process Modeling Notation (BPMN) to describe both approaches...

  4. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  5. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

    With the semiconductor industry's trend of "smaller the better", taking an idea to a final product while broadening the product portfolio and remaining competitive and profitable creates pressure, and a need for continual innovation, in CAD flows, process management, and the project execution cycle. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment, with wider test vectors, without waiting for RTL to become available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow by addressing four major activities of verification: (1) early creation of an executable specification; (2) early creation of the verification environment; (3) early development of test vectors; and (4) better and increased reuse of blocks. Although this paper focuses on early development of a UVM-based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image-signal-processing designs.
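
    The central idea, driving the same stimulus through an executable TLM reference model and the eventual RTL and comparing the two in a scoreboard, can be sketched language-neutrally. The Python stand-in below is illustrative only; the actual flow uses SystemVerilog/UVM, and the "pipeline" operation here is invented.

```python
# Scoreboard idea behind an executable TLM reference model: drive the same
# stimulus to the reference model and to the DUT (here a stand-in function;
# in practice a monitor captures RTL simulation output) and flag mismatches.
# The pipeline operation (gain + 8-bit clamp) is invented for illustration.
def reference_model(pixel: int) -> int:
    return min(255, pixel * 2)          # executable specification

def dut_response(pixel: int) -> int:
    return min(255, pixel * 2)          # stand-in for monitored RTL output

def scoreboard(stimulus):
    return [(s, dut_response(s), reference_model(s))
            for s in stimulus if dut_response(s) != reference_model(s)]

print(scoreboard(range(200)))           # [] -> DUT matches reference model
```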

  6. Vacuum-assisted resin transfer molding (VARTM) model development, verification, and process analysis

    Science.gov (United States)

    Sayre, Jay Randall

    2000-12-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool for the manufacturing of composites by this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model through the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process...
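
    For a one-dimensional preform infiltrated at constant pressure, Darcy's law gives a fill time that grows with the square of the flow length, and adding the capillary pressure to the driving pressure shortens it, consistent with the roughly 6% reduction reported above. A sketch under these simplifying assumptions (all material values are illustrative, not the study's):

```python
# 1-D Darcy fill time for resin infusion: t = phi * mu * L^2 / (2 * K * dP).
# Adding capillary pressure P_c to the vacuum-driven dP reduces the fill
# time slightly, as the abstract reports. All values are illustrative.
phi = 0.5          # preform porosity
mu = 0.2           # resin viscosity, Pa*s
L = 0.5            # flow length, m
K = 2e-10          # preform permeability, m^2
dP = 1.0e5         # vacuum-driven pressure difference, Pa
P_c = 5.0e3        # capillary pressure, Pa (illustrative)

def fill_time(dp):
    return phi * mu * L**2 / (2 * K * dp)

t0, t1 = fill_time(dP), fill_time(dP + P_c)
print(f"{t0:.0f} s without capillary, {t1:.0f} s with "
      f"({100 * (t0 - t1) / t0:.1f}% faster)")
```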

  7. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    This thesis develops a unified framework wherein to specify, verify and optimise stochastic business processes. This framework provides for the modelling of business processes via a mathematical structure which captures business processes as a series of connected activities. This structure builds on the Business Process Model and Notation (BPMN). The automated analysis of business processes is done by means of quantitative probabilistic model checking, which allows verification of validation and performance properties through use of an algorithm for the translation of business process models into a format amenable to model checking. This allows for a rich set of both qualitative and quantitative properties of a business process to be precisely determined in an automated fashion directly from the model of the business process. A number of advanced applications of this framework are presented which allow for automated...

  8. Vacuum-Assisted Resin Transfer Molding (VARTM) Model Development, Verification, and Process Analysis

    OpenAIRE

    Sayre, Jay Randall

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique...

  9. Using formal concept analysis for the verification of process-data matrices in conceptual domain models

    NARCIS (Netherlands)

    Poelmans, J.; Dedene, G.; Snoeck, M.; Viaene, S.; Fox, R.; Golubski, W.

    2010-01-01

    One of the first steps in a software engineering process is the elaboration of the conceptual domain model. In this paper, we investigate how Formal Concept Analysis can be used to formally underpin the construction of a conceptual domain model. In particular, we demonstrate that intuitive verification...

  10. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  11. ENROLMENT MODEL STABILITY IN STATIC SIGNATURE VERIFICATION

    NARCIS (Netherlands)

    Allgrove, C.; Fairhurst, M.C.

    2004-01-01

    The stability of enrolment models used in a static verification system is assessed, in order to provide an enhanced characterisation of signatures through the validation of the enrolment process. A number of static features are used to illustrate the effect of the variation in enrolment model size on...

  12. Verification of Stochastic Process Calculi

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya

    Stochastic process calculi represent widely accepted formalisms within Computer Science for modelling nondeterministic stochastic systems in a compositional way. Similar to process calculi in general, they are suited for modelling systems in a hierarchical manner, by explicitly specifying subsystems as well as their interdependences and communication channels. Stochastic process calculi incorporate both the quantified uncertainty on probabilities or durations of events and nondeterministic choices between several possible continuations of the system behaviour. Modelling of a system is often...

  13. Generic SystemVerilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPs/SoCs

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    In this paper, we present a generic SystemVerilog Universal Verification Methodology based reusable verification environment for efficient verification of image signal processing IPs/SoCs. With the tight schedules on all projects, it is important to have a strong verification methodology which contributes to first-silicon success. Methodologies which enforce full functional coverage and verification of corner cases through pseudo-random test scenarios must be deployed, and the verification flow needs to be standardized. Previously, inside the imaging group of ST, a Specman (e)/Verilog based verification environment for IP/subsystem level verification and a C/C++/Verilog based directed verification environment for SoC level verification were used for functional verification. Different verification environments were used at the IP level and the SoC level, and different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs earlier in the design cycle. Thus, a generic SystemVerilog Universal Verification Methodology (UVM) based reusable verification environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  14. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. Firstly, the testing and reference samples have to be normalized, re-sampled and smoothed in a pre-processing stage. In the verification stage, the difference between reference and testing signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
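
    The reported FRR and FAR trade off against each other through the decision threshold applied to the dissimilarity score. A minimal sketch of how the two rates are computed (the score lists are synthetic, not SUSig results):

```python
# FRR = fraction of genuine signatures rejected; FAR = fraction of forgeries
# accepted, both as functions of the decision threshold on a dissimilarity
# score. The score lists are synthetic stand-ins.
genuine = [0.10, 0.15, 0.22, 0.30, 0.45, 0.12, 0.28]
forgery = [0.35, 0.55, 0.62, 0.70, 0.48, 0.80, 0.90]

def rates(threshold):
    frr = sum(s > threshold for s in genuine) / len(genuine)
    far = sum(s <= threshold for s in forgery) / len(forgery)
    return frr, far

for th in (0.25, 0.40, 0.50):
    frr, far = rates(th)
    print(f"threshold={th:.2f}  FRR={frr:.2%}  FAR={far:.2%}")
```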

  15. Modeling of numerical simulation and experimental verification for carburizing-nitriding quenching process

    Institute of Scientific and Technical Information of China (English)

    R. MUKAI; T. MATSUMOTO; JU Dong-ying; T. SUZUKI; H. SAITO; Y. ITO

    2006-01-01

    A model considering the quantitative effects of diffused carbon and nitrogen gradients and the kinetics of phase transformation is presented to examine metallo-thermo-mechanical behavior during carburizing-nitriding quenching. Coupled simulation of diffusion, phase transformation and stress/strain provides the final distribution of carbon and nitrogen contents as well as residual stress and distortion. The effects of both transformation and lattice expansion induced by carbon and nitrogen absorption were introduced into calculating the evolution of the internal stress and strain. In order to verify the method and the results, the simulated distributions of carbon and nitrogen content and residual stress/strain of a ring model during carburizing-nitriding quenching were compared with measured data.
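
    In the simplest constant-surface-concentration case, the carbon gradient that feeds such a model is the error-function profile from Fick's second law: C(x,t) = Cs - (Cs - C0) erf(x / (2*sqrt(D*t))). A sketch with typical textbook values (not the paper's parameters):

```python
# Fick's-second-law carburizing profile with constant surface carbon:
# C(x,t) = Cs - (Cs - C0) * erf(x / (2*sqrt(D*t))). Values are illustrative.
import math

C0 = 0.20      # core carbon, wt%
Cs = 1.00      # surface carbon, wt%
D = 2.0e-11    # carbon diffusivity in austenite, m^2/s (illustrative)
t = 4 * 3600   # 4 h carburizing time, s

for x_mm in (0.0, 0.2, 0.5, 1.0):
    x = x_mm * 1e-3
    C = Cs - (Cs - C0) * math.erf(x / (2 * math.sqrt(D * t)))
    print(f"depth {x_mm:.1f} mm: {C:.2f} wt% C")
```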

  16. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification...

  17. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on...

  18. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Rudeen, David Keith [Gram, Inc., Albuquerque, NM (United States)]

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
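
    The core of any hydrostatic column model is summing rho*g*h over the stacked fluid columns between wellhead and cavern. A much-simplified sketch of that relation (fluid densities, column heights, and pressures are illustrative, not SPR values):

```python
# Wellhead pressure from cavern pressure minus the hydrostatic head of the
# stacked fluid columns in the well: P_wh = P_cavern - g * sum(rho_i * h_i).
# Column order and all numbers are illustrative.
g = 9.81                      # m/s^2
P_cavern = 12.0e6             # cavern pressure at depth, Pa
columns = [                   # (fluid, density kg/m^3, column height m)
    ("nitrogen", 250.0, 300.0),
    ("crude oil", 850.0, 400.0),
    ("brine", 1200.0, 100.0),
]

head = g * sum(rho * h for _, rho, h in columns)
P_wellhead = P_cavern - head
print(f"hydrostatic head: {head / 1e6:.2f} MPa; "
      f"wellhead pressure: {P_wellhead / 1e6:.2f} MPa")
```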

  19. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  1. Verification strategies for fluid-based plasma simulation models

    Science.gov (United States)

    Mahadevan, Shankar

    2012-10-01

    Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by the authors of these programs and not openly discussed. Several professional research bodies, including the IEEE, AIAA, ASME and others, have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. The steps in the verification process (consistency checks; examination of iterative, spatial and temporal convergence; and comparison with exact solutions) are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced. An example of the application of MMS to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
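
    To make the MMS step concrete: one picks an analytic solution, derives the forcing term that makes it satisfy the PDE, and checks that the discretization reproduces its formal order of accuracy. A self-contained sketch for 1-D steady diffusion u'' = f (the manufactured solution is an arbitrary choice, not from the paper):

```python
# Method of Manufactured Solutions for u''(x) = f(x), u(0) = u(1) = 0.
# Manufactured solution u = sin(pi x) => f = -pi^2 sin(pi x). The observed
# convergence order of the 2nd-order central-difference scheme should be ~2.
import math

def solve(n):
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [-math.pi**2 * math.sin(math.pi * xi) for xi in x]
    # Thomas algorithm for (u_{i-1} - 2 u_i + u_{i+1}) / h^2 = f_i
    a, b, c = 1.0, -2.0, 1.0
    d = [f[i] * h * h for i in range(1, n)]
    cp, dp = [0.0] * (n - 1), [0.0] * (n - 1)
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n - 1):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    u = [0.0] * (n + 1)
    for i in range(n - 2, -1, -1):
        u[i + 1] = dp[i] - (cp[i] * u[i + 2] if i < n - 2 else 0.0)
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e1, e2 = solve(32), solve(64)
print(f"observed order: {math.log(e1 / e2) / math.log(2):.2f}")  # ~2.0
```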

  2. Calibration and verification of environmental models

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  3. Towards agent-based modelling and verification of collaborative business processes : An approach centred on interactions and behaviours

    NARCIS (Netherlands)

    Stuit, M.; Szirbik, N.

    2009-01-01

    This paper presents the process-oriented aspects of a formal and visual agent-based business process modeling language. The language is of use for (networks of) organizations that elect or envisage multi-agent systems for the support of collaborative business processes. The paper argues that the...

  4. Characterization of and sensor-model verification and control of the properties of PMR-15 during processing

    Science.gov (United States)

    Kranbuehl, D.; Prettyman, T.; Robillard, K.; Smith, J.; Nicoletti, A.; Hart, S.; Loos, A.; Koury, Jim

    1991-01-01

    This study presents an in situ sensor and a process-simulation model developed for monitoring and controlling the cure process of PMR-15. The time-temperature dependence of the buildup in the molecular network structure, extent of reaction, viscosity, flow, and consolidation during the cure of PMR-15 are discussed. The relationship of the time-temperature cycle used during imidization to the length of the endcapped chains formed is examined. The relationship of the time-temperature cure-processing cycle to the reaction kinetics, viscosity, flow, and consolidation during crosslinking is analyzed using frequency-dependent electromagnetic sensors and the Loos processing model. Application of the FDEMS sensing technique and the process-simulation model for quality assurance processing and automated on-line control of cure is discussed.

  5. VARTM Model Development and Verification

    Science.gov (United States)

    Cano, Roberto J. (Technical Monitor); Dowling, Norman E.

    2004-01-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform.

  6. Modeling of the shrinking process of a bubble induced by laser metal ablation in water and experimental verification

    Science.gov (United States)

    Dabir-Moghaddam, Navid; Liu, Ze; Wu, Benxin

    2017-01-01

    Laser ablation of a solid target immersed in liquid (such as water) has many important applications, such as laser synthesis of nanoparticles, laser micromachining in water, and laser shock peening. Laser ablation of a solid target in water involves complicated physical processes. One important process often involved is the generation and evolution of a bubble in water attached to the target surface, which may have significant effects on the target and the ambient water, and hence may greatly affect the relevant practical applications. Some experimental studies were reported in the literature on bubble evolutions induced by laser ablation of a solid target in water. However, the reported previous relevant physics-based modeling work is not sufficient. A physics-based model may help improve the fundamental understanding of the process and generate valuable information for related applications. In this paper, physics-based modeling work has been performed on the shrinking process of a bubble induced by laser metal ablation in water, together with time-resolved shadowgraph imaging experiments to verify the model. The model-predicted bubble evolution agrees reasonably well with the experimental measurements shown in the paper. Under the studied conditions, it has been found that near the bubble collapse moment (i.e., the moment when the bubble shrinks to a minimum size): (1) the bubble shrinks very fast, and the peak fluid velocity magnitude occurs inside the bubble and can exceed ~550 m/s; (2) the temperature inside the bubble increases very quickly and approaches ~2000 K; and (3) the pressure inside the bubble becomes very high, and can reach a peak magnitude of ~380 MPa at the collapse moment at the bubble center. During the shrinking process, a high-pressure region outside and near the bubble wall is generated near the collapse moment, but the temperature of the region outside the bubble mostly remains low.
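
    The paper's model solves the full compressible-flow problem, but the accelerating collapse described above is already captured by the classical Rayleigh-Plesset equation, which can be integrated in a few lines. In the simplified sketch below, viscosity and surface tension are neglected and the internal gas pressure is held fixed; all values are illustrative:

```python
# Rayleigh-Plesset collapse (viscosity and surface tension neglected):
# R * R'' + 1.5 * R'^2 = (p_b - p_inf) / rho.
# Simple explicit Euler integration; all values illustrative.
rho = 1000.0          # water density, kg/m^3
p_inf = 1.0e5         # ambient pressure, Pa
p_b = 2.0e3           # bubble gas pressure (held fixed for simplicity), Pa
R, V = 1.0e-3, 0.0    # initial radius 1 mm, wall initially at rest
dt = 1.0e-9           # time step, s

t = 0.0
while R > 0.05e-3:    # integrate until the bubble has shrunk ~20x
    acc = ((p_b - p_inf) / rho - 1.5 * V * V) / R
    V += acc * dt
    R += V * dt
    t += dt
print(f"collapse to {R * 1e3:.3f} mm after {t * 1e6:.1f} us, "
      f"wall speed {abs(V):.0f} m/s")
```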

  7. Development and Verification of an MCCI Process Model

    Institute of Scientific and Technical Information of China (English)

    WEI Wei; QI Ke-lin; WAN Shu; CHEN Yan-fang; GUO Fu-de

    2014-01-01

    A mechanistic model of the molten core-concrete interaction (MCCI) process is described, and it is used to calculate and analyze the MCCI process at the Daya Bay Nuclear Power Plant under severe accidents initiated by a station blackout (SBO) or by a large loss-of-coolant accident (LLOCA) with failure of safety injection. The calculation results were compared with those of the large-scale analysis program MELCOR for the same accident sequences to verify the reasonableness and correctness of the model. The results indicate that the model presented in this paper simulates the MCCI process correctly and reasonably under the given severe accidents, and that the calculation is fast enough to meet the application requirements of simulators.

  8. VERIFICATION OF SMOM AND QMOM POPULATION BALANCE MODELING IN CFD CODE USING ANALYTICAL SOLUTIONS FOR BATCH PARTICULATE PROCESSES

    Institute of Scientific and Technical Information of China (English)

    Bin Wan; Terry A. Ring

    2006-01-01

    For many processes of industrial significance, due to the strong coupling between particle interactions and fluid dynamics, the population balance must be solved as part of a computational fluid dynamics (CFD) simulation. In this work, a CFD based population balance model is tested using a batch crystallization reactor. In this CFD model, the population balance is solved by the standard method of moments (SMOM) and the quadrature method of moments (QMOM). The results of these simulations are compared to analytical solutions for the population balance in a batch tank where 1) nucleation, 2) growth, 3) aggregation, and 4) breakage are taking place separately. The results of these comparisons show that the first 6 moments of the population balance are accurately predicted for nucleation, growth, aggregation and breakage at all times.
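
    For size-independent growth, the moment equations close exactly, dm_k/dt = k G m_{k-1}, which is the kind of analytical benchmark such comparisons use. A sketch that integrates the SMOM equations and checks them against the closed-form moments (parameter values are illustrative):

```python
# Standard method of moments (SMOM) for pure size-independent growth:
# dm_k/dt = k * G * m_{k-1}. Forward-Euler integration is compared with the
# closed form m_k(t) = sum_j C(k,j) (G t)^(k-j) m_j(0). Values illustrative.
from math import comb

G = 1.0e-7                # growth rate, m/s (illustrative)
m0 = [1.0e12, 1.0e6, 1.0, 1.0e-6, 1.0e-12, 1.0e-18]  # moments m0..m5 at t=0
m = list(m0)
dt, steps = 0.01, 10_000  # integrate to t = 100 s

for _ in range(steps):
    m = [m[k] + dt * (k * G * m[k - 1] if k > 0 else 0.0)
         for k in range(len(m))]

t = dt * steps
exact = [sum(comb(k, j) * (G * t) ** (k - j) * m0[j] for j in range(k + 1))
         for k in range(len(m0))]
for k, (num, ex) in enumerate(zip(m, exact)):
    print(f"m{k}: numeric {num:.4e}   exact {ex:.4e}")
```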

  9. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, Angelika H.; Wupper, H.; Boon, Mieke

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, we...

  10. FMEF Electrical single line diagram and panel schedule verification process

    Energy Technology Data Exchange (ETDEWEB)

    FONG, S.K.

    1998-11-11

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  11. Formal Specifications and Verification of a Secure Communication Protocol Model

    Institute of Scientific and Technical Information of China (English)

    XIA Yang; LU Yu-liang; JIANG Fan

    2003-01-01

    This paper presents a secure communication protocol model, EABM, by which network security communication can be realized easily and efficiently. First, the paper gives a thorough analysis of the protocol system, systematic construction and state transition of EABM. Then it describes the channels and the process of state transition of EABM in terms of ESTELLE. Finally, it offers a verification of the accuracy of the EABM model.

  12. Verification of hydrological processes using the ACRU agro-hydrological modelling system for simulating potential climate change impacts in an alpine watershed in Alberta, Canada

    Science.gov (United States)

    Nemeth, M. W.; Kienzle, S. W.; Byrne, J. M.

    2009-12-01

    The upper North Saskatchewan River (UNSR) watershed is situated south-west of Edmonton, Alberta, with a watershed area of slightly over 20,000 km2. This ongoing research aims to model the UNSR watershed to help predict future streamflows in the UNSR, based on potential future climate change and land use changes within the watershed, by setting up the ACRU agro-hydrological modelling system (Schulze et al., 2004). The watershed was divided into hydrological response units (HRUs) using soil, land cover, climate, and stream data that were collected and processed in a GIS. Each HRU was set up individually, so that a minimum daily time series of 30 years is available to simulate all elements of the hydrological cycle for each of the HRUs. Initial model runs were completed with simulated output for many hydrological variables, including snowpack development and snowmelt, actual evaporation, transpiration, soil moisture storage, storm flow and groundwater contributions. Simulated temperatures, streamflow, actual evaporation, snow cover, and snow water equivalent (SWE) were used to verify that the hydrological processes simulated by the model are consistent with those of the watershed. Observed temperatures from six high-elevation fire lookout stations were used to verify simulated temperatures. Four years of MODIS images were used to verify that spatial snow cover over the watershed was being adequately simulated. Snow water equivalent was verified using observed data from thirteen snow courses and two snow pillows in the watershed. Observed evaporation (A-pan) data from three meteorological stations just outside the study area were used to determine whether simulated evaporation values were within physically meaningful ranges for this region. Observed naturalized streamflow data from sixteen gauging stations around the watershed were used to verify that streamflows from different areas of the watershed were being properly simulated. Verification analysis is...
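
    Streamflow verification of this kind is commonly summarized with goodness-of-fit statistics such as the Nash-Sutcliffe efficiency, NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2). A sketch with synthetic flow series (not the ACRU results):

```python
# Nash-Sutcliffe efficiency for simulated vs. observed streamflow:
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
# 1 = perfect fit; <= 0 = no better than predicting the observed mean.
# The flow series below are synthetic placeholders, not ACRU output.
obs = [12.0, 15.0, 30.0, 55.0, 42.0, 25.0, 18.0, 14.0]   # m^3/s
sim = [10.0, 16.0, 28.0, 50.0, 45.0, 27.0, 16.0, 13.0]

mean_obs = sum(obs) / len(obs)
err = sum((o - s) ** 2 for o, s in zip(obs, sim))
var = sum((o - mean_obs) ** 2 for o in obs)
print(f"NSE = {1 - err / var:.3f}")
```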

  13. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first for the state of the database object (correct or incorrect) and the second for the state of the underlying road model (applicable or not applicable). In accordance with Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
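
    The fusion step named above, combining per-module evidence over the states correct, incorrect and unknown, follows Dempster's rule of combination. A sketch of that rule for two modules, with mass on the full frame playing the role of "unknown" (the mass values are invented):

```python
# Dempster's rule of combination over the frame {correct, incorrect}.
# Intersections of focal elements accumulate mass; conflicting (empty)
# intersections are renormalized away. Module masses are invented.
FRAME = frozenset({"correct", "incorrect"})   # full frame = "unknown"

def combine(m1, m2):
    out, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                out[inter] = out.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in out.items()}

m_module1 = {frozenset({"correct"}): 0.6, frozenset({"incorrect"}): 0.1, FRAME: 0.3}
m_module2 = {frozenset({"correct"}): 0.5, frozenset({"incorrect"}): 0.2, FRAME: 0.3}
for focal, mass in combine(m_module1, m_module2).items():
    print(set(focal), round(mass, 3))
```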

  14. Verification of pneumatic railway brake models

    Science.gov (United States)

    Piechowiak, Tadeusz

    2010-03-01

    The article presents a survey of diverse methods for validation of pneumatic train brake modelling. Various experimental measurements of railway pneumatic brakes were made chiefly on a test stand at Poznań University of Technology; other test stands and some results have been taken from the literature. The measurements, some of them unconventional, were performed on separate pneumatic elements, brake devices, the brake pipe and fragments thereof. Mechanical devices were also included. The experimental measurement results were used for the verification of numerical models and for the determination of parameters. The latter was partially performed using an optimisation method.

  15. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE JR., ARTHUR C.; TROST, LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
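
    At the most schematic level, a network's detection effectiveness can be composed from per-technology detection probabilities, for example as P = 1 - prod(1 - p_i) when a detection by any single technology suffices. The toy sketch below is in that spirit only; IVSEM's actual models are far more detailed, and the probabilities are invented:

```python
# Toy composition of per-technology detection probabilities, assuming
# independence and that a detection by any single technology suffices:
# P_net = 1 - prod(1 - p_i). Probabilities are invented; IVSEM's real
# models account for event location, yield, and medium interfaces.
from math import prod

p_detect = {"seismic": 0.70, "infrasound": 0.40,
            "radionuclide": 0.55, "hydroacoustic": 0.20}

p_network = 1.0 - prod(1.0 - p for p in p_detect.values())
print(f"network detection probability: {p_network:.3f}")  # ~0.935
```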

  16. A survey of formal business process verification : From soundness to variability

    NARCIS (Netherlands)

    Groefsema, Heerko; Bucur, Doina

    2013-01-01

    Formal verification of business process models is of interest to a number of application areas, including checking for basic process correctness, business compliance, and process variability. A large amount of work on these topics exists, while a comprehensive overview of the field and its directions...

  17. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Multifunction Vehicle Bus (MVB) is a critical component of the Train Communication Network (TCN), which is widely used in most modern train technologies of the transportation system. How to ensure the security of MVB has become an important issue, and traditional testing cannot ensure system correctness. This paper is concerned with MVB system modeling and verification. Petri net and model checking methods are used to verify the MVB system. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.

  18. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the one procedure most needed by decision makers evaluating model predictions, but it is often not adequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
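
    The recommended statistics are simple to compute from paired predictions and observations. A sketch of the bias, RMSE, and standard error of the mean residual (the data are synthetic stand-ins for predicted and observed state variables):

```python
# Residual statistics for model verification: bias (mean error), RMSE, and
# the standard error of the mean residual. The paired values are synthetic
# stand-ins for predicted and observed water-quality state variables.
import math

observed  = [8.1, 7.9, 6.5, 5.2, 4.8, 6.0, 7.2]   # e.g., dissolved O2, mg/L
predicted = [7.8, 8.2, 6.9, 5.0, 4.5, 6.4, 7.0]

res = [p - o for p, o in zip(predicted, observed)]
n = len(res)
bias = sum(res) / n
rmse = math.sqrt(sum(r * r for r in res) / n)
sd = math.sqrt(sum((r - bias) ** 2 for r in res) / (n - 1))
print(f"bias {bias:+.3f}  RMSE {rmse:.3f}  "
      f"SE of mean residual {sd / math.sqrt(n):.3f}")
```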

  19. Statistical process control for IMRT dosimetric verification.

    Science.gov (United States)

    Breen, Stephen L; Moseley, Douglas J; Zhang, Beibei; Sharpe, Michael B

    2008-10-01

    Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H&N) treatment plans. The objectives of the present work are to: (i) review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H&N IMRT cases were planned with the PINNACLE treatment planning system, versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI), and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with the highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149; range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%; range, -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160; range, -3.0% to 5.0%) and -1.0% for ORs (range, -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV data, indicating that almost...
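
    In SPC terms, the plan-versus-measurement differences form an individuals control chart: action levels sit at the mean plus or minus three standard deviations, and the capability index compares the tolerance width to the process spread. A sketch with a synthetic measurement list:

```python
# Individuals control chart and capability index for planned-vs-measured
# dose differences (%). The measurement list is synthetic.
import math

diffs = [0.6, -0.8, 1.2, 0.1, -0.4, 2.0, 0.9, -1.1, 0.3, 0.5]
spec_lo, spec_hi = -3.0, 3.0          # clinical tolerance on difference, %

n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
lcl, ucl = mean - 3 * sd, mean + 3 * sd   # action levels
cp = (spec_hi - spec_lo) / (6 * sd)       # capability index

print(f"mean {mean:+.2f}%, action levels [{lcl:.2f}%, {ucl:.2f}%], Cp {cp:.2f}")
print("out-of-control points:", [d for d in diffs if not lcl <= d <= ucl])
```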

  1. USER CONTEXT MODELS: A FRAMEWORK TO EASE SOFTWARE FORMAL VERIFICATIONS

    OpenAIRE

    2010-01-01

    This article has been accepted to appear in the ICEIS 2010 proceedings. Several works emphasize the difficulties of software verification applied to embedded systems. In past years, formal verification techniques and tools were widely developed and used by the research community. However, the use of formal verification at industrial scale remains difficult, expensive and time-consuming. This is due to the size and the complexity of the manipulated models, but also to the impo...

  2. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    Science.gov (United States)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed in the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  3. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb3Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb3Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb3Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  4. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  5. An ocean-land-atmosphere coupled model for tropical cyclone landfall processes: The multi-layer ocean model and its verification

    Institute of Scientific and Technical Information of China (English)

    DUAN Yihong; YU Runling; LI Yongping

    2006-01-01

    POM (the Princeton Ocean Model), tentatively taken as the ocean part of an ocean-land-atmosphere coupled model, is verified for the ultimate purpose of studying the landfall process of tropical cyclones (TC) in the western North Pacific. The POM is tested with monthly mean wind stress in the summer and given lateral boundary conditions. The results indicate that the equilibrium state of the ocean is in accordance with the climate mean, with the error in sea surface temperature (salinity) less than 0.5 °C (0.5). The simulated ocean currents are reasonable as well. Several numerical experiments are designed to verify the oceanic response to a stationary or moving TC. It is found that the results agree fairly well with previous work, in both the drop magnitude and the distribution of sea temperature. Compared with the simple two-layer ocean model used in some other studies, the response of the ocean to a TC is more realistic here. The model is also verified in a real case with a TC passing the neighborhood of a buoy station. It is shown that the established ocean model can basically reproduce the sea surface temperature change as observed.

  6. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    This six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  7. Implementation and verification of a HELIAS module for the systems code PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, F., E-mail: Felix.Warmer@ipp.mpg.de [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Beidler, C.D.; Dinklage, A.; Egorov, K.; Feng, Y.; Geiger, J. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Kemp, R.; Knight, P. [Culham Centre for Fusion Energy, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Schauer, F.; Turkin, Y. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Ward, D. [Culham Centre for Fusion Energy, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Wolf, R.; Xanthopoulos, P. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany)

    2015-10-15

    Highlights: • The implementation of a HELIAS module in the systems code PROCESS is discussed. • Verification with respect to W7-X and its performance predictions yields very good agreement. • The generality of the HELIAS models allows, with minor adaptations, the modeling of tokamaks. • Verification with respect to a tokamak DEMO test case shows very good agreement. - Abstract: In order to study design points of next-step fusion devices such as DEMO, comprehensive systems codes are commonly employed. The code package PROCESS is such a tool, widely used for tokamak systems studies. In this work, the implementation and verification of a HELIAS module in PROCESS is addressed. The HELIAS models include: a plasma geometry model based on Fourier coefficients, a basic island divertor model, and a coil model which combines scaling aspects based on the Helias 5-B reactor design with analytic inductance and field calculations. The models are verified firstly with respect to W7-X. Secondly, the generality of the models is used to represent a tokamak, which is compared against the original tokamak PROCESS models using a DEMO design as the reference case. Both approaches show very good agreement.

  8. LithoScope: Simulation Based Mask Layout Verification with Physical Resist Model

    Science.gov (United States)

    Qian, Qi-De

    2002-12-01

    Simulation based mask layout verification and optimization is a cost-effective way to ensure high mask performance in wafer lithography. Because mask layout verification serves as a gateway to the expensive manufacturing process, the model used for verification must have superior accuracy compared to models used upstream. In this paper, we demonstrate, for the first time, a software system for mask layout verification and optical proximity correction that employs a physical resist development model. The new system, LithoScope, predicts wafer patterning by solving optical and resist processing equations on a scale that was until recently considered impractical. Leveraging the predictive capability of the physical model, LithoScope can perform mask layout verification and optical proximity correction under a wide range of processing conditions and for any reticle enhancement technology without the need for multiple model development. We show the ability of the physical resist model to change iso-focal bias by optimizing resist parameters, which is critical for matching the experimental process window. We present line width variation statistics and chip-level process window predictions using a practical cell layout. We show that the LithoScope model can accurately describe the resist-intensive poly gate layer patterning. This system can be used to pre-screen mask data problems before manufacturing to reduce the overall cost of the mask and the product.

  9. CFD Modeling & Verification in an Aircraft Paint Hangar

    Science.gov (United States)

    2011-05-01

    Collaboration: the Navy Bureau of Medicine and Surgery (BUMED), IH Division, assists CNO with the health and safety of Navy aircraft artisans through quarterly monitoring of exposure levels and the handling of paint particulates and vapors. Verification pitfalls: artisans changed the process in the weeks between the baseline and verification measurements, adding a fabric blanket in front of the filter to keep the filter bank from blocking exhaust airflow during sanding.

  10. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book discusses the entire life cycle process of IP cores, from specification to production; introduces Verilog in depth, from both the implementation and the verification points of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; and describes a new ver...

  11. Process and quality verification controls for solid propellant manufacturing

    Science.gov (United States)

    Rogers, C. J.

    1983-01-01

    It is pointed out that in-process tests to verify quality and detect discrepant propellant which could compromise motor performance are essential elements of the solid composite propellant manufacturing process. The successful performance of the 260SL-1 and 260SL-2 motors aptly verified the controls used for manufacturing the propellant. The present investigation is concerned with the selected control parameters, and their relationships to composition and final propellant properties. Control performance is evaluated by comparison with processing data experienced in the manufacture of the propellant for the 260SL-1 motor. It is found that the in-process quality verification controls utilized in the propellant manufacturing process for the 260-in. diameter motor contributed significantly to the confidence of successful and predictable motor performance.

  13. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  14. Verification of Embedded Memory Systems using Efficient Memory Modeling

    CERN Document Server

    Ganai, Malay K; Ashar, Pranav

    2011-01-01

    We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port single memory system, to more commonly occurring systems with multiple memories, having multiple read and write ports. More importantly, we augment such EMM to provide correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and b) modeling arbitrary initial memory state precisely and thereby providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...
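
    The core EMM idea, replacing explicit memory bits with constraints over an uninterpreted array, can be illustrated with SMT array theory. A minimal sketch using the z3-solver Python package; the variable names and the 8-bit sizes are illustrative assumptions, not from the record:

        from z3 import Array, BitVec, BitVecSort, Store, Select, Solver

        # Memory modeled as an uninterpreted array (no per-bit variables):
        # 8-bit addresses mapping to 8-bit data.
        mem0 = Array('mem0', BitVecSort(8), BitVecSort(8))

        a = BitVec('a', 8)   # symbolic write address
        d = BitVec('d', 8)   # symbolic write data

        # One unrolled write step of a BMC-style model.
        mem1 = Store(mem0, a, d)

        # Property: a read from the address just written returns the written data.
        s = Solver()
        s.add(Select(mem1, a) != d)   # search for a counterexample
        print(s.check())              # unsat: the property holds at this depth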

  15. Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border

    Science.gov (United States)

    Rijsdijk, M.; van Hinsbergh, W. H. M.; Witteveen, W.; ten Buuren, G. H. M.; Schakelaar, G. A.; Poppinga, G.; van Persie, M.; Ladiges, R.

    2013-08-01

    Quite often in the verification of cadastral borders, owners of the parcels involved are unable to attend at the appointed moment in time. New appointments have to be made in order to complete the verification process, and as a result costs and throughput times often grow beyond what is considered acceptable. To improve the efficiency of the verification process, an experiment was set up that refrains from the conventional terrestrial methods for border verification. The central research question was formulated as "How useful are Unmanned Aerial Systems in the juridical verification process of cadastral borders of ownership at het Kadaster in the Netherlands?". For the experiment, operational evaluations were executed at two different locations. The first operational evaluation took place at the Pyramid of Austerlitz, a flat area with a 30 m high pyramid built by troops of Napoleon, with low civilian attendance. Two subsequent evaluations were situated in a small neighbourhood in the city of Nunspeet, where the cadastral situation had recently changed as a result of twenty new houses that were built. Initially a mini-UAS of the KLPD was used to collect photo datasets with less than 1 cm spatial resolution. In a later stage the commercial service provider Orbit Gis was hired. During the experiment four different software packages were used for processing the photo datasets into accurate geo-referenced ortho-mosaics. In this article more details will be described on the experiments carried out. Attention will be paid to the mini-UAS platforms (AscTec Falcon 8, Microdrone MD-4), the cameras used, the photo collection plan, the usage of ground control markers and the calibration of the cameras. Furthermore the results of and experiences with the different SFM software packages used (Visual SFM/Bundler, PhotoScan, PhotoModeler and the Orbit software) will be shared.

  16. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  17. Formal verification of business process models using Petri nets

    Institute of Scientific and Technical Information of China (English)

    邓建; 陈智; 曾家智

    2011-01-01

    To resolve the formal checking of business collaboration models across enterprises, a special Petri net named Business Process Flow net (BPF-net) was constructed to transform business processes modeled with the Business Process Modeling Notation (BPMN) into Petri nets. Using the XML Process Definition Language (XPDL) version 2.1, a universal tool was also developed to convert and check business process models. An inter-organizational purchase order process in a supply chain was modeled using the BPF-net. Experimental results showed that the BPF-net can completely transform business models that meet the XPDL specification into Petri nets, and that the resulting Petri nets are easy to simplify and analyze.
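
    The transformation target described above, a place/transition net whose firing rule supports reachability analysis, has a compact computational core. A minimal Petri net interpreter in Python; the two-task example net is an illustrative assumption, not taken from the paper:

        # Minimal place/transition net: a transition is enabled when every
        # input place holds a token; firing moves tokens from inputs to outputs.
        transitions = {
            'start_task':  ({'ready'}, {'running'}),
            'finish_task': ({'running'}, {'done'}),
        }

        def enabled(marking, t):
            inputs, _ = transitions[t]
            return all(marking.get(p, 0) > 0 for p in inputs)

        def fire(marking, t):
            inputs, outputs = transitions[t]
            m = dict(marking)
            for p in inputs:
                m[p] -= 1
            for p in outputs:
                m[p] = m.get(p, 0) + 1
            return m

        m = {'ready': 1}
        for t in ['start_task', 'finish_task']:
            assert enabled(m, t), f"deadlock before {t}"
            m = fire(m, t)
        print(m)  # {'ready': 0, 'running': 0, 'done': 1}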

  18. Correction, improvement and model verification of CARE 3, version 3

    Science.gov (United States)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. Thus the document 'Correction, Improvement, and Model Verification of CARE 3, Version 3' was written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122 entitled, 'Correction and Improvement of CARE 3,' version 3, April 1983.

  19. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...

  20. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
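
    As an illustration of the fine-grained testing advocated above, a unit test can pin down a single numerical kernel instead of a full simulation. A minimal sketch using Python's unittest; the Magnus-form saturation vapor pressure kernel is an illustrative assumption, not code from any particular climate model:

        import math
        import unittest

        def saturation_vapor_pressure(t_celsius):
            """Magnus-form approximation, in hPa (illustrative kernel under test)."""
            return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

        class TestSaturationVaporPressure(unittest.TestCase):
            def test_reference_value_at_zero(self):
                # At 0 degrees C the formula reduces to its leading constant.
                self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.112, places=6)

            def test_monotonic_in_temperature(self):
                # Physical sanity check: warmer air holds more water vapor.
                temps = [-20.0, 0.0, 20.0, 40.0]
                values = [saturation_vapor_pressure(t) for t in temps]
                self.assertEqual(values, sorted(values))

        if __name__ == '__main__':
            unittest.main()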

  1. Neural correlates of semantic ambiguity processing during context verification.

    Science.gov (United States)

    Hoenig, Klaus; Scheef, Lukas

    2009-04-15

    Understanding the relevant meaning of a word with different meanings (homonym) in a given context requires activation of the neural representations of the relevant meaning and inhibition of the irrelevant meaning. The cognitive demand of such disambiguation is highest when the dominant, yet contextually irrelevant meaning of a polar homonym must be suppressed. This central process (semantic inhibition) for lexico-semantic ambiguity resolution was monitored with fMRI during semantic context verifications. Twenty-two healthy volunteers decided whether congruent or incongruent target words fitted into the contexts established by preceding sentences. Half of the sentences ended with a homonym, thereby allowing the factors ambiguity and semantic congruency to be crossed. BOLD increases related to the inhibitory attentional control over non-selected meanings during ambiguity processing occurred in a brain network including left dorsolateral prefrontal cortex (DLPFC), bilateral angular gyrus (AG), bilateral anterior superior temporal gyrus (aSTG) as well as right ventromedial temporal lobe. In left DLPFC (BA 46/9) and left AG (BA 39) BOLD activity to target words of the incongruent-ambiguous condition correlated with the individual amount of semantic interference experienced by the subjects. BOLD increases of incongruent versus congruent semantic verifications occurred in bilateral inferior frontal gyrus. The results of the present study suggest a specific role of left DLPFC and AG in the resolution of semantic interference from contextually inappropriate homonym meanings. These fronto-parietal areas might exert inhibitory control over temporal regions in service of attentional selection between relevant and irrelevant homonym meanings, by creating a sufficient activation difference between their respective representations.

  2. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Full Text Available Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple and highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where a threat of toxic contamination with the hydraulic fluid contents must be avoided, for example, the food or pharmaceutical industries. A solution here can be a Pneumatic Adaptive Absorber (PAA, which is characterized by a high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be defined as polytropic, i.e. characterized by constant-in-time thermodynamic coefficients. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.

  3. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  4. Sensor Fusion and Model Verification for a Mobile Robot

    OpenAIRE

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck; Bendtsen, Jan Dimon; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practi...

  5. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  6. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  7. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformation verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.
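
    The test-driven idea above reduces to: generate input models automatically, run the transformation, and check a preserved property. A minimal sketch in Python; the flatten transformation and the element-count invariant are illustrative assumptions, not the authors' tool:

        import random

        def random_model(depth=3):
            """Generate a random tree-shaped input model."""
            if depth == 0 or random.random() < 0.3:
                return {'kind': 'leaf', 'children': []}
            return {'kind': 'node',
                    'children': [random_model(depth - 1)
                                 for _ in range(random.randint(1, 3))]}

        def flatten(model):
            """Transformation under test: tree model -> flat list of elements."""
            out = [model['kind']]
            for child in model['children']:
                out.extend(flatten(child))
            return out

        def count_elements(model):
            return 1 + sum(count_elements(c) for c in model['children'])

        # Property: the transformation preserves the number of model elements.
        for _ in range(1000):
            m = random_model()
            assert len(flatten(m)) == count_elements(m)
        print("invariant held on 1000 generated models")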

  8. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
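
    The bit-for-bit and tolerance-based comparisons such a toolkit automates have a small computational core. A sketch with NumPy; the function name, field shapes and tolerance are illustrative assumptions, not LIVVkit's actual API:

        import numpy as np

        def compare_fields(test, reference, rtol=1e-12):
            """Return a verdict dict for one model variable vs. its reference."""
            return {
                'bit_for_bit': bool(np.array_equal(test, reference)),
                'within_tolerance': bool(np.allclose(test, reference,
                                                     rtol=rtol, atol=0.0)),
                'max_abs_diff': float(np.max(np.abs(test - reference))),
            }

        # Usage: a regression-run field against a trusted baseline
        # (synthetic arrays stand in for model output files here).
        rng = np.random.default_rng(0)
        reference = rng.normal(size=(64, 64))
        test = reference + 1e-14 * rng.normal(size=(64, 64))
        print(compare_fields(test, reference))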

  9. High temperature furnace modeling and performance verifications

    Science.gov (United States)

    Smith, James E., Jr.

    1992-01-01

    Analytical, numerical, and experimental studies were performed on two classes of high temperature materials processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high temperature furnace using a zirconia ceramic tube as the heating element and an Arc Furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct parts needed to successfully perform experiments. The second objective was to evaluate the zirconia furnace performance as a directional solidification furnace element. The third objective was to establish a data base on materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. A 1-D and 2-D spectral radiation heat transfer model was developed for comparison with standard modeling techniques, and was used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the Arc Furnace, which was used to design sample holders and to estimate cooling media temperatures for the steady state operation of the furnace. The fifth objective addressed the initial performance evaluation of the Arc Furnace and associated equipment for directional solidification. Results of these objectives are presented.

  10. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of a dynamic model of a drive system with gears. Tests were carried out on the real object in different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations using the dynamic model.

  11. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by a slow, laminar porous medium and small fissures, and usually fast, turbulent conduits/karst channels. Beyond simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure, which improves our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of heterogeneity knowledge (especially for conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. The special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and plastic drainage pipes (1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as conduit location and geometry. Moreover, the geometry of the conduit perforation is known, which enables analysis of the interaction between matrix and conduits. In addition, pressure and precipitation distributions and discharge flow rates from both phases can be measured very accurately. These possibilities are not present at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  12. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics; thus, pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  13. Gaia challenging performances verification: combination of spacecraft models and test results

    Science.gov (United States)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the test feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Eventually, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  14. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Science.gov (United States)

    2013-05-28

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice of... fitness for service processes. At this workshop, the Pipeline and Hazardous Materials...

  15. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Science.gov (United States)

    2013-09-12

    ... Process'' flowchart. PHMSA is using this notice to announce the revised ``Integrity Verification Process'' flowchart and extend the comment period from September 9, 2013, to October 7, 2013. DATES: The closing...

  16. Dynamic business process modeling and verification for inter-organizational collaboration

    Institute of Scientific and Technical Information of China (English)

    胡庆成; 邢春晓; 杨吉江; 严琪; 李益民

    2007-01-01

    To achieve an on-demand and dynamic composition model of inter-organizational business processes, a new approach for business process modeling and verification is introduced by using the pi-calculus theory. A new business process model which is multi-role, multi-dimensional, integrated and dynamic is proposed, relying on inter-organizational collaboration. Compatible with the traditional linear sequence model, the new model is an M×N multi-dimensional mesh, and provides horizontal and vertical formal descriptions for the collaboration business process model. Finally, the pi-calculus theory is utilized to verify the deadlocks, livelocks and synchronization of the example models. The result shows that the proposed approach is efficient and applicable in inter-organizational business process modeling.
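
    For finite models, the deadlock checks mentioned above amount to searching for reachable states with no outgoing transitions. A minimal sketch over an explicit labeled transition system in Python; the example order process is an illustrative assumption, not the paper's pi-calculus model:

        from collections import deque

        # Explicit transition relation: state -> set of successor states.
        transitions = {
            'init':      {'ordering'},
            'ordering':  {'approving', 'rejected'},
            'approving': {'shipped'},
            'shipped':   {'done'},
            'done':      {'init'},   # cyclic: the process restarts
            'rejected':  set(),      # no successors: a deadlock state
        }

        def find_deadlocks(initial):
            """Breadth-first search for reachable states without successors."""
            seen, queue, deadlocks = {initial}, deque([initial]), []
            while queue:
                s = queue.popleft()
                succs = transitions.get(s, set())
                if not succs:
                    deadlocks.append(s)
                for n in succs - seen:
                    seen.add(n)
                    queue.append(n)
            return deadlocks

        print(find_deadlocks('init'))  # ['rejected']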

  17. Assessment of Galileo modal test results for mathematical model verification

    Science.gov (United States)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  18. Verification of A Numerical Harbour Wave Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A numerical model for wave propagation in a harbour is verified by use of physical models. The extended time-dependent mild slope equation is employed as the governing equation, and the model is solved by use of ADI method containing the relaxation factor. Firstly, the reflection coefficient of waves in front of rubble-mound breakwaters under oblique incident waves is determined through physical model tests, and it is regarded as the basis for simulating partial reflection boundaries of the numerical model. Then model tests on refraction, diffraction and reflection of waves in a harbour are performed to measure wave height distribution. Comparative results between physical and numerical model tests show that the present numerical model can satisfactorily simulate the propagation of regular and irregular waves in a harbour with complex topography and boundary conditions.

  19. Algorithmic Verification of Single-Pass List Processing Programs

    CERN Document Server

    Alur, Rajeev

    2010-01-01

    We identify a class of programs manipulating lists of data items for which checking functional equivalence and pre/post verification conditions is decidable. Lists are modeled as data words, (unbounded) sequences of data values, tagged with symbols from a finite set, over a potentially infinite data domain that supports only the operations of equality and ordering. First, we introduce streaming data-word transducers that map input data words to output data words in a single left-to-right pass in linear time. The transducer uses a finite set of states, a finite set of variables ranging over the data domain, and a finite set of variables ranging over data words. At every step, it can make decisions based on the next input symbol, updating its state, remembering the input data value in its data variables, and updating data-word variables by concatenating data-word variables and new symbols formed from data variables, while avoiding duplication. Second, we establish PSPACE bounds for the problems of checking function...
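
    The machine model described above can be mimicked directly: finitely many data-word variables, updated only by concatenation without duplication, in one left-to-right pass. A minimal Python sketch that reverses a data word; the choice of task is illustrative:

        def reverse_transducer(word):
            """One left-to-right pass; rev is the single data-word variable."""
            rev = []                         # data-word variable
            for tag, value in word:          # each item: (finite tag, data value)
                rev = [(tag, value)] + rev   # prepend by concatenation, no duplication
            return rev

        # A data word: tagged values over an (unbounded) data domain.
        w = [('a', 7), ('b', 42), ('a', 7), ('c', 3)]
        print(reverse_transducer(w))  # [('c', 3), ('a', 7), ('b', 42), ('a', 7)]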

  20. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement.

    Science.gov (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L

    2015-02-01

    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.
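
    One of the recommended robustness checks, a local sensitivity sweep over an uncertain parameter, fits in a few lines. A sketch assuming a Gaussian active force-length curve, a common functional form in musculoskeletal models; the curve, its constants and the ±10% range are illustrative assumptions, not from the paper:

        import math

        def active_force(length_norm, f_max=1000.0):
            """Illustrative Gaussian force-length curve (a common NMS form)."""
            return f_max * math.exp(-((length_norm - 1.0) / 0.45) ** 2)

        baseline = active_force(1.1)
        for scale in (0.9, 1.0, 1.1):        # +/-10% perturbation on fiber length
            out = active_force(1.1 * scale)
            print(f"scale={scale:.1f}  force={out:7.1f} N  "
                  f"change={100 * (out - baseline) / baseline:+.1f}%")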

  1. Dynamic grey model of verification cycle and lifecycle of measuring instrument and its application

    Institute of Scientific and Technical Information of China (English)

    SU Hai-tao; YANG Shi-yuan; DONG Hua; SHEN Mao-hu

    2005-01-01

    Two dynamic grey models DGM (1, 1) for the verification cycle and the lifecycle of a measuring instrument, based on a time sequence and a frequency sequence, were set up according to the statistical features of examination data and a weighting method. Using a specific case, i.e. a vernier caliper, it is shown that the fit precision and forecast precision of the models are much higher, that the cycles are obviously different under different working conditions, and that the forecast result of the frequency sequence model is better than that of the time sequence model. Combining the dynamic grey model with an auto-manufacturing case, the controlling and information subsystems of the verification cycle and the lifecycle, based on information integration, multi-sensor controlling and management controlling, are given. The models can be used in the production process to help enterprises reduce errors, costs and flaws.
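
    The underlying GM(1,1) computation is compact: accumulate the series, fit the two grey parameters by least squares, then forecast. A minimal Python sketch; the synthetic interval series and the function name are illustrative assumptions, not the paper's caliper data:

        import numpy as np

        def gm11_forecast(x0, steps=1):
            """Fit GM(1,1) to series x0 and forecast the next values."""
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                    # accumulated (AGO) series
            z1 = 0.5 * (x1[1:] + x1[:-1])         # background values
            B = np.column_stack([-z1, np.ones_like(z1)])
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
            n = len(x0)
            def x1_hat(k):                        # whitened response at index k
                return (x0[0] - b / a) * np.exp(-a * k) + b / a
            # Restore the forecast by inverse accumulation.
            return [x1_hat(k + 1) - x1_hat(k) for k in range(n - 1, n - 1 + steps)]

        # Synthetic verification-interval series (months between calibrations).
        series = [12.0, 11.4, 10.9, 10.5, 10.0]
        print(gm11_forecast(series, steps=2))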

  2. Spatial Error Metrics for Oceanographic Model Verification

    Science.gov (United States)

    2012-02-01

    quantitatively and qualitatively for this oceanographic data and successfully separates the model error into displacement and intensity components. This...oceanographic models as well, though one would likely need to make special modifications to handle the often-used nonuniform spacing between depth layers

  3. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    Energy Technology Data Exchange (ETDEWEB)

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  4. Superelement Verification in Complex Structural Models

    Directory of Open Access Journals (Sweden)

    B. Dupont

    2008-01-01

    Full Text Available The objective of this article is to propose decision indicators to guide the analyst in the optimal definition of an ensemble of superelements in a complex structural assembly. These indicators are constructed based on comparisons between the unreduced physical model and the approximate solution provided by a nominally reduced superelement model. First, the low contribution substructure slave modes are filtered. Then, the minimum dynamical residual expansion is used to localize the superelements which are the most responsible for the response prediction errors. Moreover, it is shown that static residual vectors, which are a natural result of these calculations, can be included to represent the contribution of important truncated slave modes and consequently correct the deficient superelements. The proposed methodology is illustrated on a subassembly of an aeroengine model.

  5. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
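
    Generating a logical specification from workflow patterns can be sketched as pattern-indexed formula templates. A minimal Python version that emits LTL-style strings for two common patterns; the templates and activity names are illustrative assumptions, not the system's actual rules:

        # Formula templates indexed by workflow pattern; {a} and {b} are
        # placeholders for atomic propositions derived from model activities.
        PATTERNS = {
            # Sequence: whenever a completes, b eventually follows.
            'sequence':  'G({a} -> F({b}))',
            # Exclusive choice: a and b never hold together.
            'exclusive': 'G(!({a} & {b}))',
        }

        def generate_specification(workflow):
            """Map (pattern, activity_a, activity_b) triples to LTL formulas."""
            return [PATTERNS[p].format(a=a, b=b) for p, a, b in workflow]

        model = [('sequence', 'receive_order', 'check_stock'),
                 ('exclusive', 'approve', 'reject')]
        for f in generate_specification(model):
            print(f)
        # G(receive_order -> F(check_stock))
        # G(!(approve & reject))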

  6. Verification of the Chesapeake Bay Model.

    Science.gov (United States)

    1981-12-01

    line of the five cups was about 0.045 ft above the bottom of the meter frame...about 0.1 ft in the model, represented a horizontal width of about 100 ft in the prototype. The height of the meter cups, about 0.04 ft, represented...the entire bay. Although station-to-station wind magnitude comparisons cannot be made due to variations in anemometer height and exposure, wind-field

  7. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

    Full Text Available Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as an UPPAAL model. The model will be used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
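
    For context, the double-spending success probability studied here also has a well-known closed form from the original Bitcoin paper: the attacker's chance of ever catching up from z blocks behind. A short Python sketch of that calculation (Nakamoto's formula, offered for comparison, not the UPPAAL model of this record):

        import math

        def double_spend_probability(q, z):
            """P(attacker with hash share q catches up from z blocks behind)."""
            p = 1.0 - q
            if q >= p:
                return 1.0                  # a majority attacker always succeeds
            lam = z * q / p                 # expected attacker progress
            s = 1.0
            for k in range(z + 1):
                poisson = math.exp(-lam) * lam ** k / math.factorial(k)
                s -= poisson * (1.0 - (q / p) ** (z - k))
            return s

        for z in (1, 2, 6):
            print(f"q=0.10, z={z}: {double_spend_probability(0.10, z):.6f}")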

  8. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impacts on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models. We employ this simple heat model to illustrate verification
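
    Bayesian model calibration of the kind verified here is commonly implemented as random-walk Metropolis sampling. A compact, self-contained sketch that calibrates one parameter of a toy linear model against synthetic data; everything in it is an illustrative assumption, not the dissertation's HIV or heat model:

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic data from a toy model y = theta * x + noise, true theta = 2.5.
        x = np.linspace(0.0, 1.0, 20)
        y = 2.5 * x + rng.normal(0.0, 0.1, size=x.size)

        def log_posterior(theta, sigma=0.1):
            residual = y - theta * x
            return -0.5 * np.sum((residual / sigma) ** 2)   # flat prior assumed

        # Random-walk Metropolis sampling of the posterior.
        theta, chain = 1.0, []
        for _ in range(5000):
            proposal = theta + rng.normal(0.0, 0.05)
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
                theta = proposal
            chain.append(theta)

        burned = np.array(chain[1000:])   # discard burn-in
        print(f"posterior mean {burned.mean():.3f} +/- {burned.std():.3f}")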

  9. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  10. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to transform automatically PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking which is a precise, mathematical-based method to check formalized requirements automatically against the system.

  11. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the entire business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an XML-based specification language for defining and implementing business practice workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services; one has to depend on formal verification methods to ensure the correctness of composed services. A few research works on the verification of web services for deterministic systems have been carried out in the literature. Moreover, the existing models did not address verification properties like dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system have been evaluated based on properties like dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into the ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in terms of finding dead transitions and deadlock in contrast to the

  12. Weather model verification using Sodankylä mast measurements

    Directory of Open Access Journals (Sweden)

    M. Kangas

    2015-12-01

    Full Text Available Sodankylä, in the heart of the Arctic Research Centre of the Finnish Meteorological Institute (FMI ARC) in northern Finland, is an ideal site for atmospheric and environmental research in the boreal and sub-arctic zone. With temperatures ranging from −50 to +30 °C, it provides a challenging testing ground for numerical weather prediction (NWP) models as well as weather forecasting in general. An extensive set of measurements has been carried out in Sodankylä for more than 100 years. In 2000, a 48 m high micrometeorological mast was erected in the area. In this article, the use of Sodankylä mast measurements in NWP model verification is described. Started in 2000 with the NWP model HIRLAM and Sodankylä measurements, the verification system has now been expanded to include comparisons between 12 NWP models and seven measurement masts. A case study, comparing forecasted and observed radiation fluxes, is also presented. It was found that three different radiation schemes, applicable in the NWP model HARMONIE-AROME, produced somewhat different downwelling long-wave radiation fluxes during cloudy days, which however did not change the overall cold bias of the predicted screen-level temperature.

  13. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: 1) the model of the SoS, which includes stochastic aspects; 2) the formalization of the SoS requirements in the form of contracts; 3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language, but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the functional mock-up interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
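
    At its core, statistical model checking estimates the probability that a property holds by Monte Carlo simulation, with the number of runs fixed in advance by the Chernoff-Hoeffding bound N ≥ ln(2/δ)/(2ε²). A minimal Python sketch; the queue model and the property are illustrative assumptions, not from the paper:

        import math
        import random

        def simulate_once(rng, steps=50):
            """One stochastic run: does a toy queue ever exceed 5 jobs?"""
            q = 0
            for _ in range(steps):
                q += 1 if rng.random() < 0.45 else -1   # arrival vs. service
                q = max(q, 0)
                if q > 5:
                    return False                        # property violated
            return True

        eps, delta = 0.01, 0.05                         # precision, confidence
        n = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))
        rng = random.Random(42)
        holds = sum(simulate_once(rng) for _ in range(n))
        print(f"N={n} runs, P(property) ~ {holds / n:.3f} "
              f"(+/-{eps} with probability {1 - delta})")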

  14. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...... sequential release – a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....

  15. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...... sequential release - a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....

  16. Verification and Validation of the Coastal Modeling System. Report 3: CMS-Flow: Hydrodynamics

    Science.gov (United States)

    2011-12-01

    ERDC/CHL TR-11-10, December 2011: Verification and Validation of the Coastal Modeling System, Report 3, CMS-Flow: Hydrodynamics. Alejandro Sánchez, Weiming Wu...of four reports toward the Verification and Validation (V&V) of the Coastal Modeling System (CMS). The details of the V&V study specific to the

  17. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    ...demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning deterministic Markov decision processes from data by actively guiding the selection of input actions. The algorithm is empirically analyzed by learning system models of slot machines, and it is demonstrated that the proposed active learning procedure can significantly reduce the amount of data required...

  18. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01


  19. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly to or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.
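
    The Brier (skill) score used in this comparison is straightforward to compute. A short sketch with synthetic probability forecasts; the data values are illustrative, and the reference forecast is the usual sample climatology:

        import numpy as np

        def brier_score(p, o):
            """Mean squared error of probability forecasts p against outcomes o."""
            return float(np.mean((np.asarray(p) - np.asarray(o)) ** 2))

        # Synthetic example: probability of exceeding a precipitation threshold.
        forecast = np.array([0.9, 0.2, 0.7, 0.1, 0.6])
        observed = np.array([1, 0, 1, 0, 0])           # threshold exceeded or not

        bs = brier_score(forecast, observed)
        climatology = np.full_like(forecast, observed.mean())  # reference forecast
        bs_ref = brier_score(climatology, observed)
        print(f"BS={bs:.3f}  BSS={1.0 - bs / bs_ref:.3f}")     # BSS > 0 beats reference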

  20. Combining mask and OPC process verification for improved wafer patterning and yield

    Science.gov (United States)

    Hamouda, Ayman; Abdelghany, Hesham

    2016-10-01

    As technology advances into deep submicron nodes, the accuracy of the mask manufacturing process becomes more important. Mask Process Correction (MPC) has been transitioning from a rules-based to a model-based mode. MPC is a step subsequent to OPC, where additional perturbation is applied to the mask shapes to correct for the mask manufacturing process. The shift towards fully model-based MPC is driven mainly by the accuracy requirements of advanced technology nodes, for both DUV and EUV processes. In the current state-of-the-art MPC process, MPC is completely decoupled from OPC: each assumes that the other executes perfectly. However, this decoupling is no longer suitable due to the limited tolerance in the mask CDU budget and the tighter wafer CDU requirements imposed on OPC. It is becoming more important to reduce any systematic mask errors, especially where they matter the most. In this work, we present a new combined-verification methodology that allows testing the combined effect of the mask process and the lithography process together and judging the final wafer patterning quality. This has the potential to intercept risks due to the superposition of OPC and MPC correction residual errors, capturing and correcting a previously hidden source of patterning degradation.

  1. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g. estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g. natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  2. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently of each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input for the road verification process, together with initial road database entries. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models
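
    One plausible reading of the Dempster-Shafer step described above is a discounting operation: the road-state distribution is weighted by the probability that its road model is applicable, and the unsupported mass goes to "unknown". The following sketch assumes exactly that simplification; it is not taken from the paper.

        def to_three_states(p_correct, p_applicable):
            """Discount a road-state distribution by the applicability of its road model.
            Mass not supported by an applicable model is assigned to 'unknown'."""
            return {
                "correct":   p_correct * p_applicable,
                "incorrect": (1.0 - p_correct) * p_applicable,
                "unknown":   1.0 - p_applicable,
            }

        # Hypothetical example: a road judged 80% correct by a 60%-applicable model.
        print(to_three_states(p_correct=0.8, p_applicable=0.6))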

  3. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

    In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters, and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of five different methods that are presented, one method, Gaussian mixture reduction as proposed by Runnalls, easily outperformed the other methods. This method of Gaussian reduction iteratively reduces the size of a GMM by successively merging pairs of component densities. Pairs are selected for merger by using a Kullback-Leibler based metric. Using Runnalls' method of reduction, we
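
    A minimal sketch of the kind of Gaussian mixture reduction attributed above to Runnalls: pairs of components are repeatedly merged in a moment-preserving way, choosing the pair with the smallest KL-based cost bound. For brevity the sketch is one-dimensional, whereas a GMM-UBM uses multivariate densities; the example mixture is hypothetical.

        import math

        def merge(c1, c2):
            """Moment-preserving merge of two weighted 1-D Gaussians (w, mean, var)."""
            w1, m1, v1 = c1; w2, m2, v2 = c2
            w = w1 + w2
            m = (w1 * m1 + w2 * m2) / w
            v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
            return (w, m, v)

        def merge_cost(c1, c2):
            """KL-based upper bound on the cost of merging c1 and c2 (1-D case)."""
            w1, _, v1 = c1; w2, _, v2 = c2
            w, _, v = merge(c1, c2)
            return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

        def reduce_gmm(components, target_size):
            """Iteratively merge the cheapest pair until target_size components remain."""
            comps = list(components)
            while len(comps) > target_size:
                i, j = min(((i, j) for i in range(len(comps))
                            for j in range(i + 1, len(comps))),
                           key=lambda ij: merge_cost(comps[ij[0]], comps[ij[1]]))
                comps[i] = merge(comps[i], comps[j])
                del comps[j]
            return comps

        # Hypothetical 4-component GMM reduced to a 2-component "hash GMM".
        print(reduce_gmm([(0.25, 0.0, 1.0), (0.25, 0.2, 1.1),
                          (0.25, 5.0, 0.9), (0.25, 5.3, 1.2)], 2))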

  4. Speaker verification using combined acoustic and EM sensor signal processing

    Energy Technology Data Exchange (ETDEWEB)

    Ng, L C; Gable, T J; Holzrichter, J F

    2000-11-10

    Low-power EM radar-like sensors have made it possible to measure properties of the human speech production system in real time, without acoustic interference. This greatly enhances the quality and quantity of information for many speech-related applications. See Holzrichter, Burnett, Ng, and Lea, J. Acoust. Soc. Am. 103(1), 622 (1998). By combining the Glottal-EM-Sensor (GEMS) with the acoustic signals, we have demonstrated an almost 10-fold reduction in error rates from a speaker verification system experiment under a moderately noisy environment (-10 dB).

  5. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose, such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model is therefore desirable. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the WebSocket protocol, which is currently under development by the Internet Engineering Task Force (IETF).

  6. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that by combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements in data collection and analysis, improved prediction of damage information will be possible, e.g. based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is necessary to improve inundation modelling and economic assessments for urban drainage designs.

  7. Verification of flood damage modelling using insurance data.

    Science.gov (United States)

    Zhou, Q; Panduro, T E; Thorsen, B J; Arnbjerg-Nielsen, K

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that in combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements to data collection and analysis, improved prediction of damage costs will be possible, for example based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is needed to improve inundation modelling and economic assessments for urban drainage designs.
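
    As a sketch of how the relationship between daily rainfall statistics and overall claims described in the two records above might be modelled, the following fits a Poisson regression of daily claim counts on rainfall depth. The data and the single-predictor setup are hypothetical, not the Danish case data.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical daily data: rainfall depth (mm) and number of insurance claims.
        rain = np.array([0, 2, 35, 1, 60, 5, 80, 0, 10, 35], dtype=float)
        claims = np.array([0, 0, 3, 0, 9, 1, 20, 0, 1, 4])

        # Poisson regression of daily claim counts on rainfall depth.
        X = sm.add_constant(rain)
        model = sm.GLM(claims, X, family=sm.families.Poisson()).fit()
        print(model.params)  # intercept and rainfall effect on log expected claims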

  8. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

    Full Text Available The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under the working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of formation of temperature fields in premises with gas infrared heater systems.

  9. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

    This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J., Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that traditionally have been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement: Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  10. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

    In the design and verification of formulated products, the properties of complex chemical mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry.

  11. A scenario-based verification technique to assess the compatibility of collaborative business processes

    NARCIS (Netherlands)

    De Backer, M.; Snoeck, M.; Monsieur, G.; Lemahieu, W.; Dedene, G.

    2009-01-01

    Successful e-business is based on seamless collaborative business processes. Each partner in the collaboration specifies its own rules and interaction preconditions. The verification of the compatibility of collaborative business processes, based on local and global views, is a complex task.

  12. Model-based mask verification on critical 45nm logic masks

    Science.gov (United States)

    Sundermann, F.; Foussadier, F.; Takigawa, T.; Wiley, J.; Vacca, A.; Depre, L.; Chen, G.; Bai, S.; Wang, J.-S.; Howell, R.; Arnoux, V.; Hayano, K.; Narukawa, S.; Kawashima, S.; Mohri, H.; Hayashi, N.; Miyashita, H.; Trouiller, Y.; Robert, F.; Vautrin, F.; Kerrien, G.; Planchot, J.; Martinelli, C.; Di-Maria, J. L.; Farys, V.; Vandewalle, B.; Perraud, L.; Le Denmat, J. C.; Villaret, A.; Gardin, C.; Yesilada, E.; Saied, M.

    2008-05-01

    In the continuous battle to improve critical dimension (CD) uniformity, especially for advanced 45-nanometer (nm) logic products, one important recent advance is the ability to accurately predict the mask CD uniformity contribution to the overall global wafer CD error budget. In most wafer process simulation models, the mask error contribution is embedded in the optical and/or resist models. We have separated the mask effects, however, by creating a short-range mask process model (MPM) for each unique mask process and a long-range CD uniformity mask bias map (MBM) for each individual mask. By establishing a mask bias map, we are able to incorporate the mask CD uniformity signature into our modelling simulations and measure the effects on global wafer CD uniformity and hotspots. We have also examined several ways of proving the efficiency of this approach, including the analysis of OPC hot spot signatures with and without the mask bias map (see Figure 1) and by comparing the precision of the model contour prediction to wafer SEM images. In this paper we will show the different steps of mask bias map generation and use for advanced 45nm logic node layers, along with the current results of this new dynamic application to improve hot spot verification through Brion Technologies' model-based mask verification loop.

  13. Verification of tropical cyclone using the KIAPS Integration Model (KIM)

    Science.gov (United States)

    Lim, S.; Seol, K. H.

    2015-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) is a government-funded non-profit research and development institute located in Seoul, South Korea. KIAPS is developing the Global Model, a backbone for the next-generation operational global numerical weather prediction (NWP) system, with three-phase plans: Establishment and R&D Planning (2011-2013), Test Model Development (2014-2016), and Operational Model Development (2017-2019). In the second phase, we have a beta version of the KIAPS Integration Model (KIM) that can produce reasonable global forecasts. Using the KIM model, we are evaluating the tropical cyclone forecast in the global model. To objectively provide a best estimate of the storm's central position, we use the Geophysical Fluid Dynamics Laboratory (GFDL) vortex tracker, a widely used tracking algorithm. It gives the track and intensity of the storm throughout the duration of the forecast. As a verification tool, we use the Model Evaluation Tool - Tropical Cyclone (MET-TC), which produces statistical evaluations. We expect these results to show the current capability of the KIM model for tropical cyclone forecasting.
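
    Track verification of the kind performed by MET-TC ultimately rests on position errors between forecast and best-track storm centers. A minimal sketch of that great-circle computation, with hypothetical coordinates:

        import math

        def track_error_km(lat_f, lon_f, lat_o, lon_o):
            """Great-circle distance (km) between forecast and best-track storm centers."""
            r = 6371.0
            p1, p2 = math.radians(lat_f), math.radians(lat_o)
            dp = p2 - p1
            dl = math.radians(lon_o - lon_f)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        # Hypothetical 24 h forecast position vs. best-track position.
        print(track_error_km(21.5, 128.0, 22.1, 127.2))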

  14. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to capture and preserve digital evidence securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as recent technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
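
    A minimal sketch of the cryptographic core that a PKIDEV-style scheme builds on: hash the captured evidence, bind the digest to a timestamp, and sign it so integrity and authenticity can later be verified. This uses the Python cryptography package with a locally generated Ed25519 key; a real deployment would involve PKI-issued certificates and a trusted time-stamping authority, neither of which is modelled here.

        import hashlib, time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        # Hypothetical evidence bytes captured at the crime scene.
        evidence = b"...raw disk image bytes..."
        digest = hashlib.sha256(evidence).hexdigest()

        # Sign the digest together with a capture timestamp.
        key = Ed25519PrivateKey.generate()
        record = f"{digest}|{int(time.time())}".encode()
        signature = key.sign(record)

        # Later, integrity and authenticity are checked with the public key
        # (verify raises InvalidSignature if the record was tampered with).
        key.public_key().verify(signature, record)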

  15. Demonstration of Design Verification Model of Rubidium Frequency Standard

    CERN Document Server

    Ghosal, Bikash; Nandanwar, Satish; Banik, Alak; Dasgupta, K S; Saxena, G M

    2011-01-01

    In this paper we report the development of the design verification model (DVM) of a Rb atomic frequency standard. The Rb atomic frequency standard, or clock, has two distinct parts. One is the physics package, where the hyperfine transitions produce the clock signal in the integrated filter cell configuration; the other is the electronic circuits, which comprise the generator of the resonant microwave hyperfine frequency, the phase modulator, and the phase-sensitive detector. In this paper the details of the Rb physics package and the electronic circuits are given. The effect of placing the photodetector inside the microwave cavity on the resonance signal profile is studied and reported. The Rb clock frequency stability measurements are also discussed.

  16. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of an obtained analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of processes in a hybrid solid-propellant rocket engine for the quasi-steady-state flow regime was performed. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  17. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    OpenAIRE

    Zhukov Ilya S.; Borisov Boris V.; Bondarchuk Sergey S.; Zhukov Alexander S.

    2016-01-01

    On the basis of an obtained analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of processes in a hybrid solid-propellant rocket engine for the quasi-steady-state flow regime was performed. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  18. Model Checking of Boolean Process Models

    CERN Document Server

    Schneider, Christoph

    2011-01-01

    In the field of Business Process Management, formal models for the control flow of business processes have been designed for more than 15 years. Which methods are best suited to verify the bulk of these models? The first step is to select a formal language which fixes the semantics of the models. We adopt the language of Boolean systems as the reference language for Boolean process models. Boolean systems form a simple subclass of coloured Petri nets. Their characteristics are low tokens, to model explicitly states with a subsequent skipping of activations, and arbitrary logical rules of type AND, XOR, OR etc. to model the split and join of the control flow. We apply model checking as a verification method for the safeness and liveness of Boolean systems. Model checking of Boolean systems uses the elementary theory of propositional logic; no modal operators are needed. Our verification builds on a finite complete prefix of a certain T-system attached to the Boolean system. It splits the processes of the Boolean system ...

  19. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using Mathworks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using Mathworks Simscape tools, modeling time can be reduced, since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  20. Control and verification of industrial hybrid systems using models specified with the formalism $ chi $

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor belt and of a biochemical plant for the production of ethanol are specified in the formalism $\chi$. A verification of the closed-loop systems for those examples is carried out.

  1. Runtime Verification for Business Processes Utilizing the Bitcoin Blockchain

    OpenAIRE

    Prybila, Christoph; Schulte, Stefan; Hochreiner, Christoph; Weber, Ingo

    2017-01-01

    The usage of process choreographies and decentralized Business Process Management Systems has been named as an alternative to centralized business process orchestration. In choreographies, control over a process instance is shared between independent parties, and no party has full control or knowledge during process runtime. Nevertheless, it is necessary to monitor and verify process instances during runtime for purposes of documentation, accounting, or compensation. To achieve business proce...

  2. 49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 false What are MROs prohibited from doing as part of the... Verification Process § 40.151 What are MROs prohibited from doing as part of the verification process? As an... in a closed car with several people smoking crack. MROs are unlikely to be able to verify the...

  3. Beef-cattle producers benefit from Extension-managed process verification programs

    OpenAIRE

    Sutphin, Michael D.

    2009-01-01

    The most recent data from the U.S. Census of Agriculture shows that Virginia's beef producers face a myriad of challenges. Virginia Cooperative Extension is working with the beef-cattle industry to boost profits and add value to Virginia's beef products by training and educating producers about process verification programs that certify the age and source of beef cattle.

  4. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them for the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models will be described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
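
    To connect the mathematical analysis with simulation, properties such as monotonicity and convergence to an equilibrium can be checked directly on simulated traces. The sketch below does this for a hypothetical one-variable update rule, not for the Hebbian learning model of the paper.

        import numpy as np

        def is_equilibrium(trace, tol=1e-9):
            """True if the last steps of a simulated trace show no further change."""
            return abs(trace[-1] - trace[-2]) < tol

        def is_monotonic(trace):
            diffs = np.diff(trace)
            return bool(np.all(diffs >= 0) or np.all(diffs <= 0))

        # Hypothetical converging update rule x' = x + eta * (1 - x).
        x, eta, trace = 0.0, 0.5, []
        for _ in range(60):
            trace.append(x)
            x = x + eta * (1.0 - x)
        trace = np.array(trace)

        # The analytically predicted equilibrium is x = 1; verify against the trace.
        print(is_monotonic(trace), is_equilibrium(trace), trace[-1])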

  5. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies considered are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a database suitable for a baseline estimate of future performance.
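
    The two extreme-value approaches mentioned above, block maxima (GEV) and peaks over threshold (GPD), can be sketched with scipy as follows; the "weld-defect" data are synthetic stand-ins, and the design limit in the final line is arbitrary.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical defect measurements: 40 "blocks" of 50 values each.
        defects = rng.weibull(1.5, size=(40, 50)) * 0.1

        # Block-maxima approach: fit a generalised extreme value distribution.
        block_maxima = defects.max(axis=1)
        gev_params = stats.genextreme.fit(block_maxima)

        # Peaks-over-threshold approach: fit a generalised Pareto distribution.
        threshold = np.quantile(defects, 0.95)
        excesses = defects[defects > threshold] - threshold
        gpd_params = stats.genpareto.fit(excesses, floc=0.0)

        # Extrapolate: probability that an excess exceeds a (made-up) design limit.
        print(gev_params, stats.genpareto.sf(0.3, *gpd_params))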

  6. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  7. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  8. Quantum position verification in bounded-attack-frequency model

    Science.gov (United States)

    Gao, Fei; Liu, Bin; Wen, QiaoYan

    2016-11-01

    In 2011, Buhrman et al. proved that it is impossible to design an unconditionally secure quantum position verification (QPV) protocol if the adversaries are allowed to previously share unlimited entanglement. Afterwards, people started to design secure QPV protocols in practical settings, e.g. the bounded-storage model, where the adversaries' pre-shared entangled resources are supposed to be limited. Here we focus on another practical factor: it is very difficult for the adversaries to perform attack operations with unlimitedly high frequency. Concretely, we present a new kind of QPV protocol, called non-simultaneous QPV. And we prove the security of a specific non-simultaneous QPV protocol with the assumption that the frequency of the adversaries' attack operations is bounded, but with no assumptions on their pre-shared entanglement or quantum storage. Actually, in our non-simultaneous protocol, the information of whether a signal arrives at the present time is itself a piece of the command. It renders the adversaries "blind"; that is, they have to execute attack operations with unlimitedly high frequency no matter whether a signal arrives, which implies the non-simultaneous QPV is also secure in the bounded-storage model.

  9. Verification and Planning for Stochastic Processes with Asynchronous Events

    Science.gov (United States)

    2005-01-01


  10. Continuous Verification of Large Embedded Software using SMT-Based Bounded Model Checking

    CERN Document Server

    Cordeiro, Lucas; Marques-Silva, Joao

    2009-01-01

    The complexity of software in embedded systems has increased significantly over the last years so that software verification now plays an important role in ensuring the overall product quality. In this context, SAT-based bounded model checking has been successfully applied to discover subtle errors, but for larger applications, it often suffers from the state space explosion problem. This paper describes a new approach called continuous verification to detect design errors as quickly as possible by looking at the Software Configuration Management (SCM) system and by combining dynamic and static verification to reduce the state space to be explored. We also give a set of encodings that provide accurate support for program verification and use different background theories in order to improve scalability and precision in a completely automatic way. A case study from the telecommunications domain shows that the proposed approach improves the error-detection capability and reduces the overall verification time by...
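
    As a small illustration of SMT-based bounded model checking in general (not the authors' tool), the following unrolls a toy transition system to a fixed bound with the Z3 Python bindings and asks whether a bad state is reachable within that bound:

        from z3 import Ints, Solver, Or, sat  # pip install z3-solver

        # Toy program: x0 = 0; x_{i+1} = x_i + 2. Property: x never equals 7.
        K = 10                                      # unrolling bound
        xs = Ints(" ".join(f"x{i}" for i in range(K + 1)))
        s = Solver()
        s.add(xs[0] == 0)
        for i in range(K):
            s.add(xs[i + 1] == xs[i] + 2)           # transition relation, unrolled
        s.add(Or([x == 7 for x in xs]))             # property violation as a disjunction
        print("violation reachable within bound" if s.check() == sat
              else "safe up to bound")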

  11. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a formal verification approach with which we could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements.

  12. A novel simulation and verification approach in an ASIC design process

    CERN Document Server

    Husmann, D; Mahboubi, K; Pfeiffer, U; Schumacher, C

    2000-01-01

    We have built a fast signal-processing and readout ASIC (PPrAsic) for the Pre-Processor system of the ATLAS Level-1 Calorimeter Trigger. Our novel ASIC design environment incorporates algorithm development with digital hardware synthesis and verification. The purely digital ASIC was designed in Verilog HDL (hardware description language) and embedded in a system-wide analog and digital simulation of the implemented algorithms. We present here our design environment and the experience that we gained from the design process. (10 refs).

  13. Verification of Quantum Cryptography Protocols by Model Checking

    Directory of Open Access Journals (Sweden)

    Mohamed Elboukhari

    2010-10-01

    Full Text Available Unlike classical cryptography, which is based on mathematical functions, Quantum Cryptography or Quantum Key Distribution (QKD) exploits the laws of quantum physics to offer unconditionally secure communication. The progress of research in this field allows the anticipation of QKD being available outside of laboratories within the next few years, and efforts are made to improve the performance and reliability of the implemented technologies. But despite this big progress, several challenges remain. For example, the task of how to test the devices of QKD has not yet received enough attention. These apparatuses become heterogeneous and complex, and so demand a big verification effort. In this paper we propose to study quantum cryptography protocols by applying the technique of probabilistic model checking. Using the PRISM tool, we analyze the security of the BB84 protocol, focusing on the specific security property of the eavesdropper's information gain on the key derived from the implementation of this protocol. We show that this property is affected by the parameters of the eavesdropper's power and the quantum channel.

  14. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
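
    A minimal sketch of the Hotelling observer described above: estimate class means and a pooled covariance from training data, form the template w = S^-1 (m1 - m0), and threshold the scalar test statistic w.x. The detector data here are synthetic Gaussians, not the GEANT4 simulations of the paper.

        import numpy as np

        def hotelling_template(data_h0, data_h1):
            """Optimal linear discriminant w = S^-1 (m1 - m0); rows of the inputs
            are vectorized (binned) detector measurements for each class."""
            m0, m1 = data_h0.mean(axis=0), data_h1.mean(axis=0)
            S = 0.5 * (np.cov(data_h0, rowvar=False) + np.cov(data_h1, rowvar=False))
            return np.linalg.solve(S, m1 - m0)

        rng = np.random.default_rng(0)
        # Hypothetical binned detector data for two classes of inspection object.
        h0 = rng.normal(10.0, 1.0, size=(500, 16))
        h1 = rng.normal(10.4, 1.0, size=(500, 16))
        w = hotelling_template(h0, h1)

        # Test statistic t = w.x, thresholded to decide between the two classes.
        print((h1 @ w).mean() > (h0 @ w).mean())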

  15. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    CERN Document Server

    Andova, Suzana; Engelen, Luc; 10.4204/EPTCS.56.5

    2011-01-01

    A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre or post processing.

  16. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    Directory of Open Access Journals (Sweden)

    Suzana Andova

    2011-06-01

    Full Text Available A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre or post processing.

  17. The algorithm of verification of welding process for plastic pipes

    Science.gov (United States)

    Rzasinski, R.

    2017-08-01

    The study analyzes the process of butt welding of PE pipes in terms of the proper selection of connector parameters. The process was oriented to elements produced as a series of pipe types. Polymeric materials, commonly referred to as polymers or plastics, are synthetic materials produced from oil products by polyreaction of compounds of low molecular weight, called monomers. During the polyreactions, monomers combine to build a macromolecular material, named with the prefix poly- (polypropylene, polyethylene or polyurethane), creating particles in the solid state on the order of 0.2 to 0.4 mm. Finished products from polymers of virtually any shape and size are obtained by compression molding, injection molding, extrusion, laminating, centrifugal casting, etc. Only a thermoplastic, which softens at an elevated temperature, can be welded and thus connected via a clamp. Depending on the source and method of supplying heat, the welding processes include: contact welding, radiant welding, friction welding, dielectric welding and ultrasonic welding. The present analysis concerns contact welding. In connection with the development of a new generation of polyethylene, and the production of pipes with increasing dimensions (diameter, wall thickness), it is important to select the correct process parameters.

  18. Safety Verification of Piecewise-Deterministic Markov Processes

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer; Bujorianu, Manuela

    2016-01-01

    We consider the safety problem of piecewise-deterministic Markov processes (PDMP). These are systems that have deterministic dynamics and stochastic jumps, where both the time and the destination of the jumps are stochastic. Specifically, we solve a p-safety problem, where we identify the set...

  19. Electric Machine Analysis, Control and Verification for Mechatronics Motion Control Applications, Using New MATLAB Built-in Function and Simulink Model

    Directory of Open Access Journals (Sweden)

    Farhan A. Salem

    2014-05-01

    Full Text Available This paper proposes a new, simple and user-friendly MATLAB built-in function, together with mathematical and Simulink models, to be used to identify system-level problems early, to ensure that all design requirements are met and, generally, to simplify the Mechatronics motion control design process, including performance analysis and verification of a given electric DC machine, and proper controller selection and verification for a desired output speed or angle.
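
    The record above concerns MATLAB/Simulink tooling; for consistency with the other sketches in this compilation, here is an analogous armature-controlled DC motor model in Python using scipy.signal. The motor constants are textbook-style placeholder values, not taken from the paper.

        from scipy import signal

        # Hypothetical PMDC motor constants: R (ohm), L (H), J (kg m^2), b (N m s), K (= Kt = Ke).
        R, L, J, b, K = 1.0, 0.5, 0.01, 0.1, 0.01

        # Voltage-to-speed transfer function: K / ((J s + b)(L s + R) + K^2).
        num = [K]
        den = [J * L, J * R + b * L, b * R + K ** 2]
        motor = signal.TransferFunction(num, den)

        # Open-loop step response of the speed, for quick performance analysis.
        t, w = signal.step(motor)
        print(f"steady-state speed approx. {w[-1]:.3f} rad/s per volt")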

  20. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  1. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  2. Engineering within the assembly, verification, and integration (AIV) process in ALMA

    Science.gov (United States)

    Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene

    2010-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase, the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation and verifies that functional and performance requirements are met. This paper aims to describe those aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned and potential space for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered

  3. Formal verification technique for grid service chain model and its application

    Institute of Scientific and Technical Information of China (English)

    XU Ke; WANG YueXuan; WU Cheng

    2007-01-01

    Ensuring the correctness and reliability of large-scale resource sharing and complex job processing is an important task for grid applications. From a formal-method perspective, a grid service chain model based on state Pi calculus is proposed in this work as the theoretical foundation for service composition and collaboration in the grid. Following the idea of the Web Service Resource Framework (WSRF), state Pi calculus enables the life-cycle management of system states by associating the actions in the original Pi calculus with system states. Moreover, the model checking technique is exploited for the design-time and run-time logical verification of grid service chain models. A grid application scenario of the dynamic analysis of material deformation structure is also provided to show the effectiveness of the proposed work.

  4. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  5. Building a medical image processing algorithm verification database

    Science.gov (United States)

    Brown, C. Wayne

    2000-06-01

    The design of a database containing head Computed Tomography (CT) studies is presented, along with a justification for the database's composition. The database will be used to validate software algorithms that screen normal head CT studies from studies that contain pathology. The database is designed to have the following major properties: (1) a size sufficient for statistical viability, (2) inclusion of both normal (no pathology) and abnormal scans, (3) inclusion of scans due to equipment malfunction, technologist error, and uncooperative patients, (4) inclusion of data sets from multiple scanner manufacturers, (5) inclusion of data sets from different gender and age groups, and (6) three independent diagnoses of each data set. Designed correctly, the database will provide a partial basis for FDA (United States Food and Drug Administration) approval of image processing algorithms for clinical use. Our goal for the database is the proof of viability of screening head CTs for normal anatomy using computer algorithms. To put this work into context, a classification scheme for 'computer aided diagnosis' systems is proposed.

  6. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Science.gov (United States)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri

    2017-09-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤ 116 and half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.

  7. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Directory of Open Access Journals (Sweden)

    Fleming Michael

    2017-01-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤ 116 and half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.

  8. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
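
    To make the backward-then-forward propagation concrete, here is a minimal Metropolis sampler in Python; the single porosity parameter, the toy forward model, the prior bounds, and the Gaussian likelihood are illustrative assumptions, not the CRESP groundwater model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" data, standing in for the head/chemistry observations.
obs = np.array([1.2, 0.9, 1.1, 1.0])

def forward(porosity):
    """Toy forward model mapping the parameter to predicted observations."""
    return 2.0 * porosity * np.ones_like(obs)

def log_posterior(porosity):
    # Uniform prior on [0.1, 1.0]; Gaussian likelihood with sigma = 0.1.
    if not 0.1 <= porosity <= 1.0:
        return -np.inf
    resid = obs - forward(porosity)
    return -0.5 * np.sum((resid / 0.1) ** 2)

# Metropolis random walk: uncertainty flows backward from the data onto the
# parameter; the accepted samples then carry it forward into predictions.
x, lp, samples = 0.5, log_posterior(0.5), []
for _ in range(20000):
    cand = x + 0.05 * rng.standard_normal()
    lp_cand = log_posterior(cand)
    if np.log(rng.random()) < lp_cand - lp:
        x, lp = cand, lp_cand
    samples.append(x)

posterior = np.array(samples[5000:])                     # discard burn-in
predictions = np.array([forward(p) for p in posterior])  # forward propagation
print(posterior.mean(), posterior.std(), predictions.mean())
```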

  9. Second order closure integrated puff (SCIPUFF) model verification and evaluation study. Technical memo

    Energy Technology Data Exchange (ETDEWEB)

    Nappo, C.J.; Eckman, R.M.; Rao, K.S.; Herwehe, J.A.; Gunter, L.

    1998-06-01

    This report summarizes a verification of the SCIPUFF model as described in the draft report PC-SCIPUFF Version 0.2 Technical Documentation by Sykes et al. The verification included a scientific review of the model physics and parameterizations described in the report, and checks for their internal usage and consistency with current practices in atmospheric dispersion modeling. This work is intended to examine the scientific basis and defensibility of the model for the intended application. A related task is an assessment of the model's general capabilities and limitations. A line-by-line verification of the computer source code was not possible; however, the code was checked with a commercial code analyzer. About 47 potential coding inconsistencies were identified.

  10. On-Ground Processing of Yaogan-24 Remote Sensing Satellite Attitude Data and Verification Using Geometric Field Calibration.

    Science.gov (United States)

    Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun

    2016-07-30

    Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem whereby the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach focuses on three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, ensuring the reliability of the observation data quality and the convergence and stability of the parameter estimation model. In addition, both the Euler angle and the quaternion could be used to build a mathematical fitting model, while the orthogonal polynomial fitting model is more suitable for modeling the attitude parameters. Furthermore, compared to the image geometric processing results based on on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%.

  11. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  12. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  13. Verification of a Probabilistic Model for A Distribution System with Integration of Dispersed Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte;

    2008-01-01

    In order to assess the present and predict the future distribution system performance using a probabilistic model, verification of the model is crucial. This paper illustrates the error caused by using traditional Monte Carlo (MC) based probabilistic load flow (PLF) when involving tap...... obtained from the developed probabilistic model....

  14. Numerical verification of similar Cam-clay model based on generalized potential theory

    Institute of Scientific and Technical Information of China (English)

    钟志辉; 杨光华; 傅旭东; 温勇; 张玉成

    2014-01-01

    From mathematical principles, the generalized potential theory can be employed to create constitutive models of geomaterials directly. The similar Cam-clay model, which is created based on the generalized potential theory, has fewer assumptions, a clearer mathematical basis, and better computational accuracy. Theoretically, it is more scientific than the traditional Cam-clay models. The particle flow code PFC3D was used to carry out numerical tests to verify the rationality and practicality of the similar Cam-clay model. The verification process was as follows: 1) creating the soil sample for the numerical test in PFC3D, and then simulating the conventional triaxial compression test, isotropic compression test, and isotropic unloading test with PFC3D; 2) determining the parameters of the similar Cam-clay model from the results of the above tests; 3) predicting the sample's behavior in triaxial tests under different stress paths with the similar Cam-clay model, and comparing the predictions with those of the Cam-clay model and the modified Cam-clay model. The analysis results show that the similar Cam-clay model has relatively high prediction accuracy, as well as good practical value.

  15. A Survey of Workflow Modeling Approaches and Model Verification

    Institute of Scientific and Technical Information of China (English)

    杨东; 王英林; 张申生; 傅谦

    2003-01-01

    Workflow technology is widely used in business process modeling, software process modeling, as well as enterprise information integration. At present, there exist a variety of workflow modeling approaches, which differ in ease of modeling, expressiveness, and formalism. In this paper, the modeling approaches most used in research projects and workflow products are compared, and the verification of workflow models is also addressed. We argue that an ideal workflow modeling approach is a hybrid one, i.e., the integration of the above approaches.

  16. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    This project was successfully executed to provide valuable adsorption data and improve a comprehensive model developed in previous work by the authors. Data obtained were used in an integrated computer program to predict the behavior of adsorption columns. The model is supported by experimental data and has been shown to predict capture of off gas similar to that evolving during the reprocessing of nuclear waste. The computer program structure contains (a) equilibrium models of off-gases with the adsorbate; (b) mass-transfer models to describe off-gas mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle; and (c) incorporation of these models into fixed bed adsorption modeling, which includes advection through the bed. These models are being connected with the MOOSE (Multiphysics Object-Oriented Simulation Environment) software developed at the Idaho National Laboratory through DGOSPREY (Discontinuous Galerkin Off-gas SeParation and REcoverY) computer codes developed in this project. Experiments for iodine and water adsorption have been conducted on reduced silver mordenite (Ag0Z) for single layered particles. Adsorption apparatuses have been constructed to execute these experiments over a useful range of conditions for temperatures ranging from ambient to 250°C and water dew points ranging from -69 to 19°C. Experimental results were analyzed to determine mass transfer and diffusion of these gases into the particles and to determine which models best describe the single and binary component mass transfer and diffusion processes. The experimental results were also used to demonstrate the capabilities of the comprehensive models developed to predict single-particle adsorption and transients of the adsorption-desorption processes in fixed beds. Models for adsorption and mass transfer have been developed to mathematically describe adsorption kinetics and transport via diffusion and advection
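
    The model chain described here (equilibrium model, particle mass transfer, advection through the bed) can be sketched in a few lines; the following Python fragment is a toy fixed-bed simulation with an assumed Langmuir equilibrium and linear-driving-force uptake, not the project's DGOSPREY implementation, and every parameter value is illustrative.

```python
import numpy as np

# Toy fixed-bed sorption column: upwind advection + linear-driving-force (LDF)
# uptake with Langmuir equilibrium. All parameter values are illustrative.
nz, L, v = 100, 0.1, 1.0e-3        # cells, bed length (m), velocity (m/s)
dz, dt = L / nz, 0.01              # grid spacing (m), time step (s)
k_ldf, q_max, K = 0.05, 1.0, 5.0   # LDF rate (1/s), capacity, Langmuir constant
c_in = 1.0                         # feed concentration (arbitrary units)

c = np.zeros(nz)                   # fluid-phase concentration profile
q = np.zeros(nz)                   # adsorbed-phase loading profile
for step in range(1, 30001):
    q_eq = q_max * K * c / (1.0 + K * c)      # Langmuir equilibrium loading
    uptake = k_ldf * (q_eq - q)               # linear driving force
    q += dt * uptake
    upstream = np.concatenate(([c_in], c[:-1]))
    c += dt * (-v * (c - upstream) / dz - uptake)
    if step % 10000 == 0:                     # breakthrough curve at the outlet
        print(f"t = {step * dt:5.0f} s, outlet/feed = {c[-1] / c_in:.3f}")
```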

  17. Kinetic model of ductile iron solidification with experimental verification

    Directory of Open Access Journals (Sweden)

    W. Kapturkiewicz

    2009-10-01

    A solidification model for ductile iron, including a Weibull formula for the nodule count, is presented. From this model, the following can be determined: cooling curves, kinetics of austenite and eutectic nucleation, austenite and eutectic growth velocities, volume fractions, and the distribution of Si and P in both austenite and eutectic grains across the casting section. In the developed model of nodular graphite iron casting solidification, the correctness of the mathematical model has been experimentally verified in the range of the most significant factors, which include the temperature field, the value of maximum undercooling, and the graphite nodule count interrelated with the casting cross-section. The literature offers practically no data on such a confrontation of process model and simulation program.
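
    The abstract does not give the Weibull formula itself; the sketch below assumes one plausible form, in which the fraction of activated nucleation sites follows a Weibull CDF of the maximum undercooling, with invented constants rather than the paper's fitted values.

```python
import math

# Assumed Weibull-type relation: activated nodule count rises with the maximum
# undercooling dT toward a site-saturation limit. N_max, b, c are invented.
def nodule_count(dT, N_max=400.0, b=8.0, c=2.0):
    """Nodules per mm^2 as a Weibull CDF of maximum undercooling dT (K)."""
    return N_max * (1.0 - math.exp(-((dT / b) ** c)))

# Thinner casting sections cool faster, undercool more, and show more nodules.
for dT in (2.0, 5.0, 10.0, 20.0):
    print(f"dT = {dT:4.1f} K -> {nodule_count(dT):6.1f} nodules/mm^2")
```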

  18. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    Science.gov (United States)

    2012-04-01


  19. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw attention to the need for the verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed, the paper presents models for internal freeze/thaw damage (internal cracking including...

  20. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package of programs. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model can be drawn up in the following stages: formulation of the problem, analysis of the process being modeled, production of the model, design verification, and validation and implementation of the model. This article presents an economic model built with mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct cost, the total cost, and the links between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphical representation and interpretation of the results achieved for our specific problem.

  1. Range verification methods in particle therapy: underlying physics and Monte Carlo modelling

    Directory of Open Access Journals (Sweden)

    Aafke Christine Kraan

    2015-07-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in-vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including beta+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modelling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modelling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  2. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.
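
    A common way to compare measured and pre-calculated secondary-particle depth profiles is to look at the shift of the distal falloff; the sketch below does this for two synthetic Gaussian profiles (the 50% falloff level and the profile shapes are illustrative assumptions, not a validated clinical procedure).

```python
import numpy as np

def distal_falloff(depth, profile, level=0.5):
    """Depth at which the profile drops to `level` of its maximum, distal side."""
    imax = int(np.argmax(profile))
    target = level * profile[imax]
    for i in range(imax, len(profile) - 1):
        if profile[i] >= target > profile[i + 1]:
            # linear interpolation between the bracketing samples
            frac = (profile[i] - target) / (profile[i] - profile[i + 1])
            return depth[i] + frac * (depth[i + 1] - depth[i])
    return depth[-1]

depth = np.linspace(0.0, 200.0, 400)                    # mm
expected = np.exp(-0.5 * ((depth - 120.0) / 8.0) ** 2)  # pre-calculated (toy)
measured = np.exp(-0.5 * ((depth - 123.5) / 8.0) ** 2)  # detected (toy)

shift = distal_falloff(depth, measured) - distal_falloff(depth, expected)
print(f"estimated range shift: {shift:.1f} mm")         # ~3.5 mm
```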

  3. A solenoid-based active hydraulic engine mount: modelling, analysis, and verification

    OpenAIRE

    Hosseini, Ali

    2010-01-01

    The focus of this thesis is on the design, modelling, identification, simulation, and experimental verification of a low-cost solenoid-based active hydraulic engine mount. To build an active engine mount, a commercial on-off solenoid is modified to be used as an actuator and embedded inside a hydraulic engine mount. The hydraulic engine mount is modelled and tested, the solenoid actuator is modelled and identified, and finally the models are integrated to obtain the analytical model of the...

  4. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  5. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  6. Vacuum assisted resin transfer molding (VARTM): Model development and verification

    Science.gov (United States)

    Song, Xiaolan

    2003-06-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform. Flow of resin through the preform is modeled as flow through porous media. Darcy's law combined with the continuity equation for an incompressible Newtonian fluid forms the basis of the flow model. During the infiltration process, it is well accepted that the total pressure is shared by the resin pressure and the pressure supported by the fiber network. With the progression of the resin, the net pressure applied to the preform decreases as a result of increasing local resin pressure. This leads to the springback of the preform, and is called the springback mechanism. On the other hand, the lubrication effect of the resin causes the rearrangement of the fiber network and an increase in the preform compaction. This is called the wetting compaction mechanism. The thickness change of the preform is determined by the relative magnitude of the springback and wetting deformation mechanisms. In the compaction model, the transverse equilibrium equation is used to calculate the net compaction pressure applied to the preform, and the compaction test results are fitted to give the compressive constitutive law of the preform. The Finite Element/Control Volume (FE/CV) method is adopted to find the flow front location and the fluid pressure. The code features simultaneous integration of 1-D, 2-D and 3-D element types in a single simulation, and thus enables efficient modeling of the flow in complex mold
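
    For the simplest one-dimensional, constant-pressure case, Darcy's law plus continuity give a closed-form flow-front position, x(t) = sqrt(2KΔPt/(φμ)); the sketch below evaluates it with invented preform and resin properties, not the values of this study.

```python
import math

# 1-D constant-pressure infiltration: Darcy's law + continuity give
# x(t) = sqrt(2*K*dP*t / (phi*mu)). Property values are invented.
K = 2.0e-10    # preform permeability (m^2)
mu = 0.2       # resin viscosity (Pa.s)
phi = 0.5      # preform porosity (-)
dP = 9.0e4     # vacuum-driven pressure difference (Pa)
L = 0.5        # flow length to fill (m)

def front_position(t):
    """Flow-front location (m) after t seconds of infiltration."""
    return math.sqrt(2.0 * K * dP * t / (phi * mu))

t_fill = phi * mu * L ** 2 / (2.0 * K * dP)   # invert x(t_fill) = L
print(f"front at 10 min: {front_position(600.0):.3f} m, "
      f"fill time: {t_fill / 60.0:.1f} min")
```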

  7. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence L. [Syracuse Univ., NY (United States); Lin, Ronghong [Syracuse Univ., NY (United States); Nan, Yue [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Sharma, Ketki [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-04-29

    The project has made progress toward developing a comprehensive modeling capability for the capture of target species in off gas evolved during the reprocessing of nuclear fuel. The effort has integrated experimentation, model development, and computer code development for adsorption and absorption processes. For adsorption, a modeling library has been initiated to include (a) equilibrium models for uptake of off-gas components by adsorbents, (b) mass transfer models to describe mass transfer to a particle, diffusion through the pores of the particle and adsorption on the active sites of the particle, and (c) interconnection of these models to fixed bed adsorption modeling which includes advection through the bed. For single-component equilibria, a Generalized Statistical Thermodynamic Adsorption (GSTA) code was developed to represent experimental data from a broad range of isotherm types; this is equivalent to a Langmuir isotherm in the two-parameter case, and was demonstrated for Kr on INL-engineered sorbent HZ PAN, water sorption on molecular sieve A sorbent material (MS3A), and Kr and Xe capture on metal-organic framework (MOF) materials. The GSTA isotherm was extended to multicomponent systems through application of a modified spreading pressure surface activity model and generalized predictive adsorbed solution theory; the result is the capability to estimate multicomponent adsorption equilibria from single-component isotherms. This advance, which enhances the capability to simulate systems related to off-gas treatment, has been demonstrated for a range of real-gas systems in the literature and is ready for testing with data currently being collected for multicomponent systems of interest, including iodine and water on MS3A. A diffusion kinetic model for sorbent pellets involving pore and surface diffusion as well as external mass transfer has been established, and a methodology was developed for determining unknown diffusivity parameters from transient
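
    Since the two-parameter GSTA case reduces to a Langmuir isotherm, a single-component equilibrium fit can be sketched as below; the uptake data points are invented for illustration, not the project's measured Kr/Xe/water data.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_max, K):
    """Two-parameter GSTA case = Langmuir isotherm, q = q_max*K*p/(1 + K*p)."""
    return q_max * K * p / (1.0 + K * p)

# Invented single-component uptake data (pressure, loading), for illustration.
p = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])      # kPa
q = np.array([0.30, 0.52, 0.80, 1.20, 1.45, 1.62])  # mol/kg

(q_max, K), _ = curve_fit(langmuir, p, q, p0=(2.0, 0.5))
print(f"q_max = {q_max:.2f} mol/kg, K = {K:.2f} 1/kPa")
```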

  8. A New Speaker Verification Method with GlobalSpeaker Model and Likelihood Score Normalization

    Institute of Scientific and Technical Information of China (English)

    张怡颖; 朱小燕; 张钹

    2000-01-01

    In this paper a new text-independent speaker verification method, GSMSV, is proposed based on likelihood score normalization. In this novel method a global speaker model is established to represent the universal features of speech and normalize the likelihood score. Statistical analysis demonstrates that this normalization method can remove common factors of speech and bring the differences between speakers into prominence. As a result, the equal error rate is decreased significantly, the verification procedure is accelerated, and system adaptability to speaking speed is improved.
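
    The core of the method, as described, is normalizing the claimed speaker model's log-likelihood by that of a global speaker model; a minimal sketch, with two Gaussians standing in for the trained models and an arbitrary decision threshold:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Normalized score: log p(X | claimed speaker) - log p(X | global model).
# The two Gaussians stand in for the trained speaker and global speaker models.
speaker_model = multivariate_normal(mean=[1.0, 0.5], cov=np.eye(2) * 0.5)
global_model = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2) * 2.0)

def verify(features, threshold=0.5):
    """Accept the claimed identity if the mean normalized score exceeds threshold."""
    score = float(np.mean(speaker_model.logpdf(features)
                          - global_model.logpdf(features)))
    return score, score > threshold

rng = np.random.default_rng(1)
genuine = rng.normal([1.0, 0.5], 0.7, size=(200, 2))     # frames, claimed speaker
impostor = rng.normal([-0.5, -0.5], 0.7, size=(200, 2))  # frames, someone else
print(verify(genuine))
print(verify(impostor))
```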

  9. Numerical Verification of the Weak Turbulent Model for Swell Evolution

    CERN Document Server

    Korotkevich, A O; Resio, D; Zakharov, V E

    2007-01-01

    We performed numerical simulation of an ensemble of nonlinearly interacting free gravity waves (swell) by two different methods: solution of the primordial dynamical equations describing potential flow of an ideal fluid with a free surface, and solution of the kinetic Hasselmann equation describing the wave ensemble in the framework of the theory of weak turbulence. Comparison of the results demonstrates the applicability of the weak turbulent approach. In both cases we observed effects predicted by this theory: frequency downshift, angular spreading, and formation of the Zakharov-Filonenko spectrum $I_{\omega} \sim \omega^{-4}$. One result of our article is that physical processes in finite-size laboratory wave tanks and in the ocean are quite different, and the results of such laboratory experiments can be applied to modeling of ocean phenomena only with extra care. We also present an estimate of the minimum size of a laboratory installation that allows modeling of open ocean surface wave dynamics.

  10. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  11. Ecological dynamic model of grassland and its practical verification

    Institute of Scientific and Technical Information of China (English)

    ZENG Xiaodong

    2005-01-01

    Based on physico-biophysical considerations, mathematical analysis, and some approximate formulations generally adopted in meteorology and ecology, an ecological dynamic model of grassland is developed. The model consists of three interactive variables, i.e., the biomass of living grass, the biomass of wilted grass, and the soil wetness. The major biophysical processes are represented by parameterization formulas, and the model parameters can be determined inversely by using observational climatological and ecological data. Some major parameters are adjusted by this method to fit the (incomplete) data for the Inner Mongolia grassland, and other secondary parameters are estimated through sensitivity studies. The model results agree well with reality, e.g., (i) the maintenance of grassland requires a minimum amount of annual precipitation (approximately 300 mm); (ii) there is a significant relationship between the annual precipitation and the biomass of living grass; and (iii) overgrazing will eventually result in desertification. A specific emphasis is put on the shading effect of the wilted grass accumulated on the soil surface: it effectively reduces the soil surface temperature and the evaporation, and hence benefits the maintenance of grassland and the reduction of water loss from the soil.
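
    A toy three-variable system in the spirit of the described model (living grass, wilted grass, soil wetness, with a shading term from the wilted layer) can be integrated as below; all functional forms and constants are illustrative assumptions, not the published parameterizations.

```python
# Toy daily-step dynamics for living grass B, wilted grass W, soil wetness S.
# All functional forms and constants below are illustrative assumptions.
def step(B, W, S, precip, grazing, dt=1.0):
    growth = 0.05 * B * S * (1.0 - B / 2.0)  # moisture-limited logistic growth
    wilting = 0.01 * B                       # living grass turns into wilted grass
    decay = 0.005 * W                        # wilted layer decomposes
    shading = 1.0 / (1.0 + 2.0 * W)          # wilted layer suppresses evaporation
    evap = 0.02 * S * shading
    B = max(B + dt * (growth - wilting - grazing * B), 0.0)
    W = max(W + dt * (wilting - decay), 0.0)
    S = min(max(S + dt * (precip - evap), 0.0), 1.0)
    return B, W, S

for grazing in (0.005, 0.05):                # moderate vs. heavy grazing pressure
    B, W, S = 1.0, 0.2, 0.5
    for day in range(3 * 365):               # three years of daily steps
        B, W, S = step(B, W, S, precip=0.01, grazing=grazing)
    print(f"grazing {grazing}: biomass after 3 years = {B:.2f}")
```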

  12. The Parametric Model for PLC Reference Channels and its Verification in the Real PLC Environment

    OpenAIRE

    2008-01-01

    For the expansion of PLC systems, it is necessary to have a detailed knowledge of the PLC transmission channel properties. This contribution briefly discusses the characteristics of the PLC environment and a classification of PLC transmission channels. The main part is focused on the parametric model for PLC reference channels and its verification in the real PLC environment, utilizing experimental measurements.

  13. Towards a Generic Information Data Model for Verification, Validation & Accreditation VV&A

    NARCIS (Netherlands)

    Roza, Z.C.; Voogd, J.M.; Giannoulis, C.

    2008-01-01

    The Generic Methodology for Verification, Validation and Acceptance (GM-VV) is intended to provide a common generic framework for making formal and well balanced acceptance decisions on a specific usage of models, simulations and data. GM-VV will provide the international M&S community with a Verifica

  14. A Verification and Analysis of the USAF/DoD Fatigue Model and Fatigue Management Technology

    Science.gov (United States)

    2005-11-01

    The Windows® software application of the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) applied model, the Fatigue Avoidance Scheduling Tool (FAST™), was re-engineered as a clone from the SAFTE specification. The verification considered nine sleep/wake schedules that were

  15. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  16. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  17. Case study of verification, validation, and testing in the Automated Data Processing (ADP) system development life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, C.A.

    1990-05-01

    Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by the organizational participants in the Department of Veterans Affairs (VA) that conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, and these were then evaluated against VV&T standards.

  18. Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features

    Science.gov (United States)

    Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed

    2012-01-01

    during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.

  19. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
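
    At its simplest, the kind of traditional accuracy-based evaluation LVT automates boils down to computing summary statistics between model and observation series; a minimal sketch (the soil-moisture numbers are synthetic, and LVT itself offers far richer diagnostics):

```python
import numpy as np

def evaluation_metrics(model, obs):
    """A few of the traditional accuracy measures an LVT-style evaluation reports."""
    err = model - obs
    return {
        "bias": float(err.mean()),
        "rmse": float(np.sqrt((err ** 2).mean())),
        "corr": float(np.corrcoef(model, obs)[0, 1]),
    }

rng = np.random.default_rng(2)
obs = rng.uniform(0.1, 0.4, size=1000)           # e.g. in-situ soil moisture
model = obs + rng.normal(0.02, 0.03, size=1000)  # simulation with bias + noise
print(evaluation_metrics(model, obs))
```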

  1. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-02-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  3. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still is built for the evaporation as well as the condensation process in ANSYS CFD. Simulation results such as water temperature, vapor temperature, and distillate output are compared with actual experimental results of the hemispherical solar still for the climate conditions of Mehsana (latitude 23°59′ and longitude 72°38′). Water temperature and distillate output were in good agreement with the actual experimental results. The study shows that ANSYS CFD is a very powerful and efficient tool for the design and comparison of hemispherical solar stills.

  4. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of
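
    The stochastic approach described can be illustrated with a tiny Monte Carlo loop; the condition list, occurrence probabilities, and impact scores below are invented for illustration and bear no relation to iMED data or the IMM's actual risk metrics.

```python
import numpy as np

# Invented condition list: (per-mission occurrence probability, impact score).
conditions = {
    "back pain":   (0.30, 0.05),
    "skin rash":   (0.15, 0.02),
    "renal stone": (0.02, 0.60),
}

rng = np.random.default_rng(3)

def simulate_mission():
    """Sample which conditions occur and accumulate their mission impact."""
    return sum(impact for p_occ, impact in conditions.values()
               if rng.random() < p_occ)

impacts = np.array([simulate_mission() for _ in range(100_000)])
print(f"mean impact: {impacts.mean():.3f}, "
      f"95th percentile: {np.percentile(impacts, 95):.3f}")
```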

  5. PLM-based Approach for Design Verification and Validation using Manufacturing Process Knowledge

    Directory of Open Access Journals (Sweden)

    Luis Toussaint

    2010-02-01

    Out of 100 hours of engineering work, only 20 are dedicated to real engineering and 80 are spent on what is considered routine activity. Readjusting the ratio of innovative vs. routine work is a considerable challenge in the product lifecycle management (PLM) strategy. Therefore, the main objective is to develop an approach to accelerate routine processes in engineering design. The proposed methodology, called FabK, consists of capturing manufacturing knowledge and applying it to the design verification and validation of new engineering designs. The approach is implemented in a Web-based PLM prototype and a Computer Aided Design system. A series of experiments from an industrial case study is introduced to provide significant results.

  6. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification techniques, it is necessary to develop plant models that describe the behavior of those systems. However, developing a plant model requires the designer to make decisions concerning the granularity and level of abstraction of the models, the modeling approach to consider (global or modular), and the strategies for simulation and formal verification tasks. This paper highlights some aspects that can be considered when making those decisions. For this purpose, a case study is presented, and important aspects concerning the above issues are illustrated and discussed.

  7. Verification of three-dimensional neutron kinetics model of TRAP-KS code regarding reactivity variations

    Energy Technology Data Exchange (ETDEWEB)

    Uvakin, Maxim A.; Alekhin, Grigory V.; Bykov, Mikhail A.; Zaitsev, Sergei I. [EDO 'GIDROPRESS', Moscow Region, Podolsk (Russian Federation)

    2016-09-15

    This work deals with TRAP-KS code verification. TRAP-KS is used for coupled neutronic and thermal-hydraulic calculations of VVER reactors. The three-dimensional neutron kinetics model enables consideration of space effects produced by variations of the energy field and feedback parameters. This feature has to be investigated especially for asymmetrical variations of core multiplying properties, power fluctuations, and strong local perturbation insertions. The presented work consists of three test definitions. First, an asymmetrical control rod (CR) ejection during power operation is defined. This process leads to a fast reactivity insertion with a short power spike. As the second task, xenon oscillations are considered. Here, a small negative reactivity insertion decreases the power and induces spatial oscillations of the xenon concentration. In the late phase, these oscillations are suppressed by external actions. As the last test, an international code comparison for a hypothetical main steam line break (V1000CT-2, task 2) was performed. This scenario is interesting for its asymmetrical positive reactivity insertion, caused by the decreasing coolant temperature in the affected loop.

  8. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  9. Modelling and Verification of Multiple UAV Mission Using SMV

    CERN Document Server

    Sirigineedi, Gopinadh; White, Brian A; Zbikowski, Rafal

    2010-01-01

    Model checking has been used to verify the correctness of digital circuits, security protocols, and communication protocols, as they can be modelled by means of finite-state transition models. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.

  10. Modelling and Verification of Multiple UAV Mission Using SMV

    Directory of Open Access Journals (Sweden)

    Gopinadh Sirigineedi

    2010-03-01

    Model checking has been used to verify the correctness of digital circuits, security protocols, and communication protocols, as they can be modelled by means of finite-state transition models. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.
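
    What a model checker like SMV does for an EF (exists-finally) property can be sketched as explicit-state reachability over a Kripke structure; the mission states, transitions, and labels below are invented for illustration, not the paper's UAV model.

```python
from collections import deque

# Tiny hand-built Kripke structure for a single UAV; states, transitions, and
# atomic-proposition labels are invented for illustration.
transitions = {
    "search": {"search", "track"},
    "track":  {"track", "notify", "search"},
    "notify": {"search"},
}
labels = {"search": set(), "track": {"target_seen"}, "notify": {"target_reported"}}

def ef(prop, init="search"):
    """Check the CTL property EF prop: some path from init reaches a prop-state."""
    seen, queue = {init}, deque([init])
    while queue:
        state = queue.popleft()
        if prop in labels[state]:
            return True
        for nxt in transitions[state] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

print(ef("target_reported"))   # True: a target report is reachable from "search"
```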

  11. 49 CFR 40.169 - Where is other information concerning the role of MROs and the verification process found in this...

    Science.gov (United States)

    2010-10-01

    ... MROs and the verification process found in this regulation? 40.169 Section 40.169 Transportation Office... concerning the role of MROs and the verification process found in this regulation? You can find more information concerning the role of MROs in several sections of this part: § 40.3—Definition. §§...

  12. 49 CFR 40.21 - May an employer stand down an employee before the MRO has completed the verification process?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false May an employer stand down an employee before the... Responsibilities § 40.21 May an employer stand down an employee before the MRO has completed the verification process? (a) As an employer, you are prohibited from standing employees down, except consistent with...

  13. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009 'Standards for Models and Simulations'.

  14. Process Document for the joint ETV/NOWATECH verification of the Sorbisense GSW40 passive sampler

    Science.gov (United States)

    Nordic Water Technology Verification Center’s (NOWATECH) DHI Water Monitoring Center (DHI WMC), a pilot Environmental Technology Verification (ETV) program in the European Union, and the United States Environmental Protection Agency ETV (US EPA ETV) program’s Advanced Monitoring ...

  15. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid;

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  16. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS (Registered Trademark) finite element model, a verified MSC.Nastran (Trademark) model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran (Trademark) model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran (Trademark) single-bay model to Abaqus (Trademark) is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  17. Verification of five pharmacogenomics-based warfarin administration models

    Directory of Open Access Journals (Sweden)

    Meiqin Lin

    2016-01-01

    Conclusions: Since none of the models ranked high on all three criteria considered, the impact of various factors should be thoroughly considered before selecting the most appropriate model for the region's population.

  18. Radial seepage model and verification for oil in oilseeds processing by cylinder press

    Institute of Scientific and Technical Information of China (English)

    刘汝宽; 柯佳见; 肖志红; 李培旺; 张爱华; 李昌珠

    2016-01-01

    There are two kinds of methods for oilseed processing: mechanical pressing and solvent extraction. In mechanical pressing, screw pressing is widely used in industry for large-scale oilseeds, leaving a low residual oil content in the cake, while hydraulic pressing is suitable for special oilseeds at low temperature. One-dimensional pressing is the foundation of mechanical pressing, especially for new types of oilseeds with high oil and protein content. In one-dimensional pressing, oilseeds are squeezed and ruptured under axial compression and the lateral constraint of the barrel, and the oil in the cells is gradually squeezed out, leaving only the cake inside the cylinder barrel. To quantify the flow state of oil in cold pressing by cylinder press, a one-dimensional pressing model was established through simplifying assumptions, and a force analysis of an oilseed element was carried out. A seepage model of the pressing process was then built from Darcy's percolation law and Terzaghi's consolidation theory; it identifies the main factors governing oil expression as barrel inner diameter, material layer height, pressing pressure, pressing time, oil viscosity and material porosity. In practice, pressing pressure and pressing time are easy to control, and oil viscosity and porosity can be changed by setting the pressing parameters, whereas the barrel dimensions of a press are not easily changed, so the choice of barrel radius is particularly important. On this basis, an oil yield model was derived from the seepage model and used to analyze the influence of barrel radius on oil yield, showing that the barrel inner diameter should not be too large. Oil yield experiments with different barrel diameters were compared with the model predictions, with a maximum error of 2.10%. The results provide a reference for the manufacture of cylinder-type low-temperature pre-pressing equipment and for the optimization and selection of its process parameters.
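
    Since the abstract names Darcy's law as the core of the seepage model, a rough scaling sketch may help fix ideas. The snippet below is our own order-of-magnitude illustration, not the paper's model: Darcy's law gives a superficial radial velocity v = (k/mu)(dP/dr), so the time for oil to seep from the axis to the barrel wall scales like phi*mu*R^2/(k*dP), quadratic in the barrel radius R. All parameter values are assumed for illustration only.

        def drain_time(R, k=1e-14, mu=0.06, phi=0.3, dP=40e6):
            """Order-of-magnitude radial drainage time in seconds.
            R: barrel radius (m), k: cake permeability (m^2),
            mu: oil viscosity (Pa*s), phi: porosity,
            dP: pressing pressure (Pa); all values assumed."""
            return phi * mu * R ** 2 / (k * dP)

        for R_mm in (20, 40, 80):
            print(R_mm, "mm ->", round(drain_time(R_mm / 1000.0)), "s")
        # Doubling R quadruples the drainage time, consistent with the
        # abstract's caution that the barrel inner diameter should not be
        # too large for a fixed pressing time.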

  19. Automatic Verification of Biochemical Network Using Model Checking Method

    Institute of Scientific and Technical Information of China (English)

    Jinkyung Kim; Younghee Lee; Il Moon

    2008-01-01

    This study focuses on automatic searching and verifying methods for the reachability, transition logics and hierarchical structure in all possible paths of biological processes using model checking. The automatic search and verification of alternative paths within complex and large networks in biological processes can provide a considerable number of solutions, which is difficult to handle manually. Model checking is an automatic method for verifying whether a circuit or a condition, expressed as a concurrent transition system, satisfies a set of properties expressed in a temporal logic, such as computational tree logic (CTL). This article demonstrates that model checking is feasible in biochemical network verification and shows certain advantages over simulation for querying and searching of special behavioral properties in biochemical processes.

  20. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    Full Text Available A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar-composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
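
    For readers unfamiliar with the two traditional scores named above, both derive from the 2x2 contingency table of a yes/no rain event: hits a, false alarms b, misses c and correct rejections d. The sketch below is a minimal illustration with invented counts; the formulas are the standard ones (chance-expected hits a_r = (a+b)(a+c)/n for the ETS, and ln(ad/bc) for the log-odds ratio), not values from this study.

        import math

        def ets(a, b, c, d):
            # Equitable Threat Score: threat score corrected for chance hits.
            n = a + b + c + d
            a_random = (a + b) * (a + c) / n
            return (a - a_random) / (a + b + c - a_random)

        def log_odds_ratio(a, b, c, d):
            # ln(odds of a hit divided by odds of a false alarm).
            return math.log((a * d) / (b * c))

        a, b, c, d = 52, 18, 23, 407   # hypothetical daily rain >= 1 mm counts
        print("ETS =", round(ets(a, b, c, d), 3))              # ~0.503
        print("LOR =", round(log_odds_ratio(a, b, c, d), 3))   # ~3.93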

  1. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Science.gov (United States)

    Mittermaier, M. P.

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar-composite were used. The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.

  2. On the verification of PGD reduced-order models

    OpenAIRE

    Pled, Florent; Chamoin, Ludovic; Ladevèze, Pierre

    2014-01-01

    International audience; In current computational mechanics practice, multidimensional as well as multiscale or parametric models encountered in a wide variety of scientific and engineering fields often require either the resolution of significantly large complexity problems or the direct calculation of very numerous solutions of such complex models. In this framework, the use of model order reduction allows to dramatically reduce the computational requirements engendered by the increasing mod...

  3. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

    An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations......

  4. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  5. Verification of a fully coupled FE model for tunneling under compressed air

    Energy Technology Data Exchange (ETDEWEB)

    Oettl, G.; Stark, R.F.; Hofstetter, G. [Innsbruck Univ. (Austria). Inst. for Structural Analysis and Strength of Materials

    2001-07-01

    This paper deals with the verification of a fully coupled finite element model for tunneling under compressed air. The formulation is based on mixture theory, treating the soil as a three-phase medium with the constituents: deformable porous soil skeleton, water and air. Starting with a brief outline of the governing equations, results of numerical simulations of different laboratory tests and of a large-scale in-situ test are presented and compared with experimental data. (orig.)

  6. A Formal Model for Compliance Verification of Service Compositions

    NARCIS (Netherlands)

    Groefsema, Heerko; van Beest, Nick; Aiello, Marco

    2016-01-01

    Business processes design and execution environments increasingly need support from modular services in service compositions to offer the flexibility required by rapidly changing requirements. With each evolution, however, the service composition must continue to adhere to laws and regulations, resu

  7. A Formal Model for Compliance Verification of Service Compositions

    NARCIS (Netherlands)

    Groefsema, Heerko; van Beest, Nick; Aiello, Marco

    2016-01-01

    Business processes design and execution environments increasingly need support from modular services in service compositions to offer the flexibility required by rapidly changing requirements. With each evolution, however, the service composition must continue to adhere to laws and regulations,

  8. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2016-01-01

    Using bounded model checking (BMC) and inductive reasoning, it is verified that the generated model instance satisfies the generated safety properties. Using this method, we are able to verify the safety properties for model instances corresponding to railway networks of industrial size. Experiments show that BMC is also...

  9. Mask synthesis and verification based on geometric model for surface micro-machined MEMS

    Institute of Scientific and Technical Information of China (English)

    LI Jian-hua; LIU Yu-sheng; GAO Shu-ming

    2005-01-01

    Traditional MEMS (microelectromechanical system) design methodology is not a structured method and has become an obstacle to creative MEMS design. In this paper, a novel method of mask synthesis and verification for surface micro-machined MEMS is proposed, which is based on the geometric model of a MEMS device. The emphasis is on synthesizing the masks on the basis of the layer model generated from the geometric model of the MEMS device. The method comprises several steps: the correction of the layer model, the generation of initial masks and final masks including multi-layer etch masks, and mask simulation. Finally, some test results are given.

  10. 3D MODELING FOR UNDERWATER ARCHAEOLOGICAL DOCUMENTATION: METRIC VERIFICATIONS

    Directory of Open Access Journals (Sweden)

    S. D’Amelio

    2015-04-01

    Full Text Available The survey in underwater environments has always presented considerable operative and technical difficulties, and this has sometimes made it difficult to use the survey techniques commonly employed for the documentation of Cultural Heritage in dry environments. This study evaluates the capability and accuracy of the Autodesk 123D Catch software for the reconstruction of a three-dimensional model of an object in an underwater context. The subjects of the study are models generated from sets of photographs and from sets of frames extracted from video sequences. The study is based on a comparative method, using a reference model obtained with laser scanning.

  11. Multiple verification in computational modeling of bone pathologies

    CERN Document Server

    Liò, Pietro; Paoletti, Nicola; 10.4204/EPTCS.67.8

    2011-01-01

    We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as a defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows deriving three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alert the patient and monitor him/her more closely to detect the onset of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could address the...

  12. Image Smearing Modeling and Verification for Strapdown Star Sensor

    Institute of Scientific and Technical Information of China (English)

    WANG Haiyong; ZHOU Wenrui; CHENG Xuan; LIN Haoyu

    2012-01-01

    To further extend the study of celestial attitude determination with strapdown star sensors from the static into the dynamic field, one prerequisite is to generate precise dynamic simulated star maps. First, a neat analytical solution of the smearing trajectory caused by spacecraft attitude maneuver is deduced, whose parameters cover the geometric size of the optics, the three-axis angular velocities and the CCD integration time. Then, for the first time, the mathematical law and method are derived for synthesizing the smearing-trajectory formulae with the static Gaussian distribution function (GDF) model; the key is a line integral of the static GDF, attenuated by a factor 1/Ls (Ls is the arc length of the smearing trajectory), along the smearing trajectory. The dynamic smearing model is thus obtained, also in analytical form. After that, three sets of typical simulated maps and data are generated from this dynamic model, manifesting the expected smearing effects and remaining compatible with the linear model as its special case of no boresight rotation. Finally, model validity tests on a rate turntable are carried out, resulting in a mean correlation coefficient of 0.920 0 between the camera images and the corresponding model-simulated ones with the same parameters. This similarity verifies the validity of the dynamic smearing model. This model, after parameter calibration, can serve as a front-end loop of the ground semi-physical simulation system for celestial attitude determination with strapdown star sensors.
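
    The synthesis rule in this record reduces to a concrete numerical recipe: integrate the static Gaussian point-spread function along the smearing trajectory and scale by 1/Ls. The sketch below is our own numerical illustration of that rule on an invented circular-arc trajectory, not the authors' code; the PSF width and trajectory parameters are assumptions.

        import numpy as np

        sigma = 1.2                       # assumed Gaussian PSF width, pixels

        def gauss(x, y, cx, cy):
            # Static (unsmeared) Gaussian star image centred at (cx, cy).
            return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

        # Hypothetical smearing trajectory: a short circular arc, finely sampled.
        t = np.linspace(0.0, 1.0, 200)
        cx = 16 + 4 * np.cos(0.5 * np.pi * t)
        cy = 16 + 4 * np.sin(0.5 * np.pi * t)
        ds = np.hypot(np.diff(cx), np.diff(cy))   # arc-length elements
        Ls = ds.sum()                             # total smear length

        yy, xx = np.mgrid[0:32, 0:32].astype(float)
        img = np.zeros((32, 32))
        for k in range(len(ds)):                  # line integral along the arc
            img += gauss(xx, yy, cx[k], cy[k]) * ds[k]
        img /= Ls                                 # the 1/Ls attenuation factor
        print(round(float(img.max()), 4), round(float(Ls), 2))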

  13. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    Fragmentary abstract; the recoverable content concerns the accuracy of model forecasts of currents in coastal areas and notes that the MVV module is implemented as part of the Geospatial Analysis and Model Evaluation Software. The surviving citation fragments reference Marques et al. (2010) on the dynamics of the Patos Lagoon coastal plume, and work on multiple hurricane beds in the northern Gulf of Mexico (Marine Geology, Volume 210, Issues 1-4).

  14. Verification modeling study for the influential factors of secondary clarifier

    OpenAIRE

    Gao, Haiwen

    2016-01-01

    A numerical Quasi 3-D model of secondary clarifier is applied to verify the data obtained through the literature and analyze the influential factors for secondary clarifiers. The data from the papers provide the input parameters for the model. During this study, several influential factors (density waterfall; surface overflow rate; solids loading rate; solids-settling characteristics; mixed liquor suspended solid; clarifier geometry) are tested. The results show that there are some difference...

  15. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure, and increase monotonically with mean stress. Very little modulus degradation was discernible from the elastic results, even at mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding in all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including impact prediction.
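
    The modulus-extraction step mentioned above (elastic moduli from unload/reload loops) is, at its simplest, a least-squares slope fit on one leg of a loop. A minimal sketch with invented stress-strain pairs, purely to illustrate the procedure rather than reproduce the report's data:

        import numpy as np

        # Hypothetical unloading leg of an axial stress-strain loop.
        strain = np.array([0.0105, 0.0103, 0.0101, 0.0099, 0.0097])
        stress = np.array([62.0, 54.5, 47.1, 39.8, 32.4])   # MPa

        E = np.polyfit(strain, stress, 1)[0]   # slope = elastic modulus, MPa
        print("E ~", round(E / 1000.0, 1), "GPa")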

  16. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    Science.gov (United States)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit used to communicate with the cpu and the arithmetic processing unit used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher order logic is demonstrated.

  17. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM) algorithm. Model ensembles are generated with the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology, and considers the entire ensemble or uncertainty range in the approach.
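
    One widely used distribution-oriented check of the kind referenced above is the rank (Talagrand) histogram, which asks whether observations fall evenly among the ordered ensemble members. The sketch below uses synthetic data to show the mechanics; it illustrates the general technique, not the paper's specific metric suite.

        import numpy as np

        rng = np.random.default_rng(0)
        n_ens, n_times = 20, 500
        ens = rng.normal(0.0, 1.0, size=(n_times, n_ens))  # ensemble members
        obs = rng.normal(0.0, 1.3, size=n_times)           # "observed" truth

        # Rank of each observation within its ensemble: 0..n_ens inclusive.
        ranks = (ens < obs[:, None]).sum(axis=1)
        hist = np.bincount(ranks, minlength=n_ens + 1)
        print(hist)   # a U-shape flags an under-dispersive ensemble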

  18. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real-world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification, validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that continue to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling, so open questions about the continuum mechanical theory and mathematical analysis of multiphase flow also persist. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we present a publicly available verification and validation test set for air-water-structure interaction problems, as well as computational and physical model results including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats are also presented in order to facilitate future comparative studies and reproduction of scientific results.

  19. FURTHER OPTIMISATIONS OF CONSTANT Q CEPSTRAL PROCESSING FOR INTEGRATED UTTERANCE AND TEXT-DEPENDENT SPEAKER VERIFICATION

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md

    2016-01-01

    Many authentication applications involving automatic speaker verification (ASV) demand robust performance using short-duration, fixed or prompted text utterances. Text constraints not only reduce the phone-mismatch between enrollment and test utterances, which generally leads to improved performa...

  20. Further optimisations of constant Q cepstral processing for integrated utterance and text-dependent speaker verification

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md

    2016-01-01

    Many authentication applications involving automatic speaker verification (ASV) demand robust performance using short-duration, fixed or prompted text utterances. Text constraints not only reduce the phone-mismatch between enrollment and test utterances, which generally leads to improved performa...

  1. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  2. Local model for magnet-superconductor mechanical interaction: Experimental verification

    Science.gov (United States)

    Diez-Jimenez, Efren; Perez-Diaz, Jose-Luis; Garcia-Prada, Juan Carlos

    2011-03-01

    Several models exist for calculating superconducting repulsion forces in the Meissner state that are based on the method of images. The method of images, however, is limited to a small number of geometrical configurations that can be solved exactly, and the physical interpretation of the method is under discussion. A general local model based on the London equations and Maxwell's equations has been developed to describe the mechanics of the superconductor-permanent magnet system. Due to its differential form, this expression can be easily implemented in a finite elements analysis and, consequently, is easily applicable to any shape of superconductor in the Meissner state. It can solve both forces and torques. This paper reports different experiments undertaken in order to test the model's validity. The vertical forces and the angle of equilibrium between a magnet and a superconductor were measured, and a positive agreement between the experiments and theoretical calculations was found.

  3. CFD modeling of pharmaceutical isolators with experimental verification of airflow.

    Science.gov (United States)

    Nayan, N; Akay, H U; Walsh, M R; Bell, W V; Troyer, G L; Dukes, R E; Mohan, P

    2007-01-01

    Computational fluid dynamics (CFD) models have been developed to predict the airflow in a transfer isolator using a commercial CFD code. In order to assess the ability of the CFD approach in predicting the flow inside an isolator, hot wire anemometry measurements and a novel experimental flow visualization technique consisting of helium-filled glycerin bubbles were used. The results obtained have been shown to agree well with the experiments and show that CFD can be used to model barrier systems and isolators with practical fidelity. This indicates that CFD can and should be used to support the design, testing, and operation of barrier systems and isolators.

  4. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build a multi-view base process model.

  5. Verification of the Naval Oceanic Vertical Aerosol Model During Fire

    NARCIS (Netherlands)

    Davidson, K.L.; Leeuw, G. de; Gathman, S.G.; Jensen, D.R.

    1990-01-01

    The Naval Oceanic Vertical Aerosol Model (NOVAM) has been formulated to estimate the vertical structure of the optical and infrared extinction coefficients in the marine atmospheric boundary layer (MABL), for wavelengths between 0.2 and 40 um. NOVAM was designed to predict, utilizing a set of routin

  6. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  7. Methods for the Update and Verification of Forest Surface Model

    Science.gov (United States)

    Rybansky, M.; Brenova, M.; Zerzan, P.; Simon, J.; Mikita, T.

    2016-06-01

    The digital terrain model (DTM) represents the bare-earth surface without any objects like vegetation and buildings. In contrast to a DTM, a digital surface model (DSM) represents the earth's surface including all objects on it. The DTM mostly does not change as frequently as the DSM; the most important changes of the DSM occur in forest areas due to vegetation growth. Using LIDAR technology, the canopy height model (CHM) is obtained by subtracting the DTM from the corresponding DSM. The DSM is calculated from the first-pulse echo and the DTM from the last-pulse echo data. The main problem in using DSM and CHM data is the currency of the airborne laser scanning. This paper describes a method of calculating changes in the CHM and DSM data using the relation between canopy height and tree age. To obtain an up-to-date reference model of the canopy height, photogrammetric and trigonometric measurements of single trees were used. Comparing the heights of corresponding trees on aerial photographs of various ages, statistical sets of tree growth rates were obtained. These statistical data and the LIDAR data were compared with the growth curve of a spruce forest corresponding to a similar natural environment (soil quality, climate characteristics, geographic location, etc.) to derive the updating characteristics.
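
    The CHM construction named above is a per-cell raster subtraction. A tiny numpy sketch with invented 2x2 elevation grids (metres), clipping at zero so that sensor noise cannot yield negative canopy heights:

        import numpy as np

        dtm = np.array([[312.1, 312.4], [312.8, 313.0]])  # bare earth (last echo)
        dsm = np.array([[330.6, 312.3], [329.9, 331.2]])  # surface (first echo)
        chm = np.clip(dsm - dtm, 0.0, None)               # canopy height model
        print(chm)   # ~18 m canopy next to a 0 m gap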

  8. Verification-Driven Slicing of UML/OCL Models

    DEFF Research Database (Denmark)

    Shaikh, Asadullah; Clarisó Viladrosa, Robert; Wiil, Uffe Kock;

    2010-01-01

    computational complexity can limit their scalability. In this paper, we consider a specific static model (UML class diagrams annotated with unrestricted OCL constraints) and a specific property to verify (satisfiability, i.e., “is it possible to create objects without violating any constraint?”). Current...

  9. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

    Based on conventional mass transfer models developed for oxygen, the non-linear ASCE method, the 2-point method, and a one-parameter linear-regression method were evaluated on carbon dioxide stripping data. For values of KLaCO2 below approximately 1.5/h, the 2-point and ASCE methods fit the experimental data well, but the fit breaks down at higher values of KLaCO2. How to correct KLaCO2 for gas-phase enrichment remains to be determined. The one-parameter linear regression model allowed C*CO2 to vary over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed-C*CO2 assumptions.
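
    The 2-point method referenced above follows from the standard first-order transfer model dC/dt = KLa (C* - C), which integrates to an explicit two-sample formula. A minimal sketch with invented stripping data (the paper's exact variant and corrections may differ):

        import math

        def kla_two_point(c1, c2, t1, t2, c_star):
            # KLa (1/h) from concentrations c1, c2 (mg/L) at times t1, t2 (h),
            # assuming a constant saturation/equilibrium value c_star.
            return math.log((c1 - c_star) / (c2 - c_star)) / (t2 - t1)

        # Hypothetical CO2 stripping run: C falls from 20 to 8 mg/L in 1 h,
        # with an assumed equilibrium concentration of 0.5 mg/L.
        print(round(kla_two_point(20.0, 8.0, 0.0, 1.0, 0.5), 3), "per hour")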

  10. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model's validity, both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced-form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern's entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As a hypothesized structural sign pattern has lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.
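
    To make the entropy idea concrete, the sketch below is our own Monte Carlo construction (not the paper's algorithm): sample structural coefficients consistent with a hypothesized sign pattern for A and B in Ay = Bx + e, record which reduced-form sign patterns they can induce, and compute the Shannon entropy of that distribution. Fewer attainable reduced-form patterns means lower entropy, i.e. a more informative and more falsifiable hypothesis.

        import numpy as np

        rng = np.random.default_rng(1)
        sign_A = np.array([[1, -1], [1, 1]])   # hypothesized signs of A
        sign_B = np.array([[1, 0], [0, 1]])    # hypothesized signs of B

        counts = {}
        n_draws = 20000
        for _ in range(n_draws):
            A = sign_A * rng.uniform(0.1, 2.0, (2, 2))
            B = sign_B * rng.uniform(0.1, 2.0, (2, 2))
            P = np.linalg.solve(A, B)          # reduced form: Pi = A^-1 B
            key = tuple(np.sign(P).astype(int).ravel())
            counts[key] = counts.get(key, 0) + 1

        p = np.array(list(counts.values()), dtype=float) / n_draws
        entropy = -(p * np.log2(p)).sum()
        print(len(counts), "attainable sign patterns; entropy =",
              round(float(entropy), 3), "bits")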

  11. Computational reverse shoulder prosthesis model: Experimental data and verification.

    Science.gov (United States)

    Martins, A; Quental, C; Folgado, J; Ambrósio, J; Monteiro, J; Sarmento, M

    2015-09-18

    The reverse shoulder prosthesis aims to restore the stability and function of pathological shoulders, but the biomechanical aspects of the geometrical changes induced by the implant are yet to be fully understood. Considering a large-scale musculoskeletal model of the upper limb, the aim of this study is to evaluate how the Delta reverse shoulder prosthesis influences the biomechanical behavior of the shoulder joint. In this study, the kinematic data of an unloaded abduction in the frontal plane and an unloaded forward flexion in the sagittal plane were experimentally acquired through video-imaging for a control group, composed of 10 healthy shoulders, and a reverse shoulder group, composed of 3 reverse shoulders. Synchronously, the EMG data of 7 superficial muscles were also collected. The muscle force sharing problem was solved through the minimization of the metabolic energy consumption. The evaluation of the shoulder kinematics shows an increase in the lateral rotation of the scapula in the reverse shoulder group, and an increase in the contribution of the scapulothoracic joint to the shoulder joint. Regarding the muscle force sharing problem, the musculoskeletal model estimates an increased activity of the deltoid, teres minor, clavicular fibers of the pectoralis major, and coracobrachialis muscles in the reverse shoulder group. The comparison between the muscle forces predicted and the EMG data acquired revealed a good correlation, which provides further confidence in the model. Overall, the shoulder joint reaction force was lower in the reverse shoulder group than in the control group.

  12. Model-Based Verification and Validation of Spacecraft Avionics

    Science.gov (United States)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  13. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  14. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These ...

  15. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used increasingly to evaluate possible damage, identify potential flood zones and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed by means of the Russian software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of a hydrodynamic model based on a comparison of the actual flooded area, determined by automated interpretation of satellite images from different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the North Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower boundaries of the model, respectively, it is possible to calculate the flooded area by means of STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
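
    Once both extents are rasterised to a common grid, the comparison described above reduces to overlap statistics on boolean masks. A small sketch (our own illustration, with toy masks standing in for the classified satellite image and the STREAM-2D output) using a critical-success-index style score:

        import numpy as np

        observed  = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 1]], dtype=bool)
        simulated = np.array([[0, 1, 1], [1, 1, 0], [0, 0, 1]], dtype=bool)

        hits   = np.logical_and(observed, simulated).sum()
        misses = np.logical_and(observed, ~simulated).sum()
        falses = np.logical_and(~observed, simulated).sum()
        csi = hits / (hits + misses + falses)   # 1.0 means perfect overlap
        print("flood extent agreement (CSI) =", round(float(csi), 2))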

  16. Using the SAL technique for spatial verification of cloud processes: A sensitivity analysis

    CERN Document Server

    Weniger, Michael

    2016-01-01

    The feature-based spatial verification method SAL is applied to cloud data, i.e. two-dimensional spatial fields of total cloud cover and spectral radiance. Model output is obtained from the COSMO-DE forward operator SynSat and compared to SEVIRI satellite data. The aim of this study is twofold. First, to assess the applicability of SAL to this kind of data, and second, to analyze the role of external object identification algorithms (OIA) and the effects of observational uncertainties on the resulting scores. As a feature-based method, SAL requires external OIA. A comparison of three different algorithms shows that the threshold level, which is a fundamental part of all studied algorithms, induces high sensitivity and unstable behavior of object-dependent SAL scores (i.e. even very small changes in parameter values can lead to large changes in the resulting scores). An in-depth statistical analysis reveals significant effects on distributional quantities commonly used in the interpretation of SAL, e.g. median...
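
    As one concrete instance of the threshold sensitivity being analyzed, the sketch below computes a thresholded variant of SAL's amplitude component, A = 2(F - O)/(F + O), on synthetic fields for several object-threshold levels. This is our own simplification: the S and L components additionally require the external object identification the study examines.

        import numpy as np

        rng = np.random.default_rng(2)
        obs = rng.gamma(2.0, 1.0, size=(64, 64))               # "satellite" field
        mod = np.clip(obs * 1.15 + rng.normal(0.0, 0.3, obs.shape), 0.0, None)

        for thr in (0.5, 1.0, 2.0, 4.0):                       # threshold levels
            F = np.where(mod >= thr, mod, 0.0).mean()          # model mean
            O = np.where(obs >= thr, obs, 0.0).mean()          # observed mean
            print(thr, "->", round(float(2 * (F - O) / (F + O)), 3))
        # Even this simplest component drifts with the threshold choice,
        # in line with the instability reported for object-dependent scores.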

  17. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  18. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    Science.gov (United States)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework, as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  19. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar Saavedra, J.A.; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  20. An analysis of the U.S. Navy verification, validation, and accreditation (VV&A) process for modeling and simulation (M&S) used for operational test (OT) of surface ships and weapons

    OpenAIRE

    Griffith, Anthony A.; Locke, W. Michael.

    2006-01-01

    In this climate of declining budgets and resources, models and simulations (M&S) have become very beneficial to the U.S. Navy. However, the U.S. Navy's investment in, and use of, M&S for addressing critical operational issues (COIs) within a warship's operational test (OT) program would not be practical unless the particular M&S was determined to be a credible representation of that which would be physically tested. Commander Operational Test and Evaluation Force (COMOPTEVFOR) is res...

  1. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    Full Text Available A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components: a deformation model involving elastic and plastic deformations; a damage model; and a failure model. The model is driven by tabular data that is generated either using laboratory tests or via virtual testing. A unidirectional composite, T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  2. Numerical Modelling of Wind Waves. Problems, Solutions, Verifications, and Applications

    CERN Document Server

    Polnikov, Vladislav

    2011-01-01

    The time-space evolution of the wave field is described by the transport equation for the two-dimensional wave energy spectrum density S(x, t), distributed in space x and time t. The forcing of this equation, named the source function F, depends on both the wave spectrum S and the external wave-making factors: the local wind W(x, t) and the local current U(x, t). The source function comprises the physical mechanisms responsible for the evolution of the wave spectrum. Three terms are customarily distinguished in the function F: the wind-wave energy exchange mechanism, In; the energy-conserving mechanism of nonlinear wave-wave interactions, Nl; and the wave energy loss mechanism, Dis. Differences in the mathematical representation of the source-function terms determine the general differences between wave models. The problem is to derive analytical representations for the source-function terms said above from the fundamental wave equations. Based on publications of numerous authors and on the last two decades of the author's studies, th...
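
    In standard notation (assumed here; the record gives the relation only in words, and the group-velocity advection form of the left-hand side is our assumption), the transport equation and the three-way split of its source function read:

        % spectral wave energy balance with group-velocity advection
        \frac{\partial S(\mathbf{x},t)}{\partial t}
          + \mathbf{c}_g \cdot \nabla_{\mathbf{x}} S(\mathbf{x},t)
          = F\bigl(S, \mathbf{W}, \mathbf{U}\bigr) = In + Nl + Dis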

  3. Verification of model for heat and mass transfer process in cross flow heat-source tower

    Institute of Scientific and Technical Information of China (English)

    文先太; 梁彩华; 刘成兴; 张小松

    2012-01-01

    As a new heat and mass transfer device, the heat-source tower plays an important role in the heat-source tower heat pump system: in winter it absorbs heat from the ambient air and supplies it to the evaporator of the heat pump. There are similarities and differences in heat and mass transfer between a heat-source tower and a cooling tower. Six primary differences are pointed out: heat resistance, physical properties of the liquid, latent heat percentage, circulating water/liquid flow rate, the effect of liquid drift on the whole system, and the direction and magnitude of heat transfer. Based on these similarities and differences, a mathematical model of heat and mass transfer in the cross-flow heat-source tower is proposed, and the model is confirmed by experiments. The results show that the latent heat percentage of the cross-flow heat-source tower is less than 35% and that the deviation in heat transfer rate between simulation and experiment is less than 10%. The model is thus accurate enough to simulate the heat transfer characteristics of the heat-source tower.

  4. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

    This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into an equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  5. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  6. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles preventing model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  7. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  8. Business Process Modeling: Blueprinting

    OpenAIRE

    Al-Fedaghi, Sabah

    2017-01-01

    This paper presents a flow-based methodology for capturing processes specified in business process modeling. The proposed methodology is demonstrated through re-modeling of an IBM Blueworks case study. While the Blueworks approach offers a well-proven tool in the field, this should not discourage workers from exploring other ways of thinking about effectively capturing processes. The diagrammatic representation presented here demonstrates a viable methodology in this context. It is hoped this...

  9. Verification of precipitation forecasts by the DWD limited area model LME over Cyprus

    Directory of Open Access Journals (Sweden)

    K. Savvidou

    2007-01-01

    Full Text Available A comparison is made between the precipitation forecasts by the non-hydrostatic limited area model LME of the German Weather Service (DWD) and observations from a network of rain gauges in Cyprus. This is a first attempt to carry out a preliminary verification and evaluation of the LME precipitation forecasts over the area of Cyprus. For the verification, model forecasts and observations were used covering an eleven month period, from 1/2/2005 till 31/12/2005. The observations were made by three Automatic Weather Observing Systems (AWOS) located at Larnaka and Paphos airports and at Athalassa synoptic station, as well as at 6, 6 and 8 rain gauges within a radius of about 30 km around these stations, respectively. The observations were compared with the model outputs, separately for each of the three forecast days. The "probability of detection" (POD) of a precipitation event and the "false alarm rate" (FAR) were calculated. From the selected cases of the forecast precipitation events, the average forecast precipitation amounts in the area around the three stations were compared with the measured ones. An attempt was also made to evaluate the model's skill in predicting the spatial distribution of precipitation and, in this respect, the geographical position of the maximum forecast precipitation amount was contrasted to the position of the corresponding observed maximum. Maps with monthly precipitation totals observed by a local network of 150 rain gauges were compared with the corresponding forecast precipitation maps.
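
    The two categorical scores used in the study are simple functions of a 2x2 contingency table of forecast versus observed precipitation events. A minimal sketch, with invented counts rather than the paper's data:

        def pod(hits, misses):
            """Probability of detection: fraction of observed events that were forecast."""
            return hits / (hits + misses)

        def far(hits, false_alarms):
            """False alarm rate/ratio: fraction of forecast events that did not occur."""
            return false_alarms / (hits + false_alarms)

        hits, misses, false_alarms = 42, 18, 11   # illustrative counts
        print(f"POD = {pod(hits, misses):.2f}, FAR = {far(hits, false_alarms):.2f}")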

  10. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  11. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    Science.gov (United States)

    1979-01-01

    Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  12. Particle Tracking Model Transport Process Verification: Diffusion Algorithm

    Science.gov (United States)

    2015-07-01

    In the model's diffusion equations, Π is a random number uniformly distributed between 0 and 1 and U is the free-stream flow velocity. The verification compares the areas under the two distribution curves (PTM/analytical), and "Corr Coef" (correlation coefficient) is a statistical, pair-wise comparison; the comparisons are provided in Table 2. All of the types of comparisons, including the correlation coefficients, show strong agreement between the
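
    Although the report's exact displacement equation is not recoverable here, the standard random-walk form of a particle-tracking diffusion step, driven by a uniform random number as in the fragment above, can be sketched as follows; the coefficient values are illustrative assumptions.

        import math
        import random

        D = 0.01   # diffusion coefficient, m^2/s (illustrative)
        dt = 1.0   # time step, s (illustrative)

        def diffusion_step(x):
            u = random.random()                                   # uniform in [0, 1]
            return x + (2.0 * u - 1.0) * math.sqrt(6.0 * D * dt)  # variance = 2*D*dt

        # Verification idea: ensemble variance should grow like 2*D*t.
        xs = [0.0] * 10_000
        for _ in range(100):
            xs = [diffusion_step(x) for x in xs]
        var = sum(x * x for x in xs) / len(xs)
        print(f"sample variance {var:.3f} vs theoretical {2 * D * dt * 100:.3f}")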

  13. Formal Verification of a Secure Model for Building E-Learning Systems

    Directory of Open Access Journals (Sweden)

    Farhan M Al Obisat

    2016-06-01

    Full Text Available The Internet is considered a common medium for E-learning, connecting several parties (instructors and students) that may be far away from each other. Both wired and wireless networks are used in this learning environment to facilitate mobile access to educational systems. This learning environment requires secure connections and data exchange. An E-learning model was implemented and evaluated by conducting student experiments. Before the approach is deployed in the real world, a formal verification of the model was completed, showing that no unreachability case exists. The model, which concentrates on the security of e-content, was successfully validated using the SPIN Model Checker, and no errors were found.

  14. Truth in Complex Adaptive Systems Models Should Be Based on Proof by Constructive Verification

    Science.gov (United States)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. 'Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  15. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-03-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from different parameter uncertainty estimation methods. The Generalized Likelihood Uncertainty Estimation (GLUE) method, a modified version of GLUE, and the Shuffle Complex Evolution Metropolis (SCEM) are used to generate model ensembles for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of parameter uncertainty, one that is commensurate with the dimension of the ensembles themselves. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
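
    One of the simplest probabilistic verification tools in this family is the rank histogram, which checks whether observations are statistically indistinguishable from the ensemble members. A minimal sketch on synthetic data (the paper evaluates streamflow ensembles, not the Gaussian draws assumed here):

        import numpy as np

        rng = np.random.default_rng(0)
        n_times, n_members = 500, 10
        ensemble = rng.normal(0.0, 1.0, size=(n_times, n_members))
        obs = rng.normal(0.0, 1.0, size=n_times)   # same distribution -> flat histogram

        # Rank of each observation among the ensemble members (0 .. n_members).
        ranks = (ensemble < obs[:, None]).sum(axis=1)
        hist = np.bincount(ranks, minlength=n_members + 1)
        print("rank histogram (approximately flat => reliable ensemble):", hist)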

  16. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

    Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the Finite Volume Method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to verify that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside the top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the top wall in the top-wall-driven square cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: results were compared with the numerical results by Ghia et al. (1982), and good agreement was found. Next, CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection; good agreement was found with the experimental results by Khalili et al. (2008). After the ability of the CFD solver was proved through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
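
    The two mesh-convergence quantities named above have standard closed forms. A minimal sketch with illustrative solution values, not the study's data:

        import math

        f1, f2, f3 = 1.021, 1.084, 1.210   # fine, medium, coarse grid solutions
        r = 2.0                            # grid refinement ratio
        Fs = 1.25                          # common safety factor for 3-grid studies

        # Observed order of convergence from three systematically refined grids.
        p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

        # Grid Convergence Index for the fine-grid solution.
        gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1.0)
        print(f"observed order p = {p:.2f}, GCI(fine) = {100 * gci_fine:.2f}%")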

  17. Verification and optimization of a PLC control schedule

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika; Fehnker, Ansgar

    2002-01-01

    We report on the use of model checking techniques for both the verification of a process control program and the derivation of optimal control schedules. Most of this work has been carried out as part of a case study for the EU VHS project (Verification of Hybrid Systems), in which the program for a

  18. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) to develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list; 2) to facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction method of data analysis. Findings – A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely revenue model; value proposition; value configuration; target customers, and strategic...

  19. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  20. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    Full Text Available This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The approach is illustrated by an automatic assembly system.

  1. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  2. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning, but also those configuration changes made on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The subject of the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that is rigid enough to provide reliable verification of on-board safety and that likewise provides enough

  3. Vibratory response modeling and verification of a high precision optical positioning system.

    Energy Technology Data Exchange (ETDEWEB)

    Barraza, J.; Kuzay, T.; Royston, T. J.; Shu, D.

    1999-06-18

    A generic vibratory-response modeling program has been developed as a tool for designing high-precision optical positioning systems. Based on multibody dynamics theory, the system is modeled as rigid-body structures connected by linear elastic elements, such as complex actuators and bearings. The full dynamic properties of each element are determined experimentally or theoretically, then integrated into the program as inertial and stiffness matrices. Utilizing this program, the theoretical and experimental verification of the vibratory behavior of a double-multilayer monochromator support and positioning system is presented. Results of parametric design studies that investigate the influence of support floor dynamics and highlight important design issues are also presented. Overall, good matches between theory and experiment demonstrate the effectiveness of the program as a dynamic modeling tool.
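
    The core computation behind such a model can be summarized in a few lines: once the inertial (M) and stiffness (K) matrices of the connected rigid bodies are assembled, the natural frequencies follow from the generalized eigenvalue problem K v = w^2 M v. The matrices below are invented placeholders, not the monochromator system's.

        import numpy as np
        from scipy.linalg import eigh

        M = np.diag([2.0, 1.5, 1.0])                 # lumped inertia, kg
        K = np.array([[ 400.0, -150.0,    0.0],
                      [-150.0,  300.0, -100.0],
                      [   0.0, -100.0,  200.0]])     # stiffness, N/m

        w2, modes = eigh(K, M)                       # eigenvalues are w^2
        freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
        print("natural frequencies (Hz):", np.round(freqs_hz, 2))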

  4. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain "mapping" between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that certain functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes the time component into account when modeling business processes. An example is used throughout the paper to illustrate the proposed method.
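
    The reachability analysis described above can be pictured on an ordinary (untimed) Petri net; the sketch below enumerates all reachable markings of a three-step process by breadth-first search. The net, place names, and transitions are invented for illustration, and the TPN time component is deliberately omitted.

        from collections import deque

        # transition name -> (pre-places consumed, post-places produced)
        transitions = {
            "start":   ({"p_init": 1}, {"p_task": 1}),
            "do_task": ({"p_task": 1}, {"p_done": 1}),
            "finish":  ({"p_done": 1}, {"p_end": 1}),
        }

        def enabled(m, pre):
            return all(m.get(p, 0) >= n for p, n in pre.items())

        def fire(m, pre, post):
            m = dict(m)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        def reachable(initial):
            """Breadth-first enumeration of all reachable markings."""
            seen, queue = set(), deque([initial])
            while queue:
                m = queue.popleft()
                key = frozenset(m.items())
                if key in seen:
                    continue
                seen.add(key)
                for pre, post in transitions.values():
                    if enabled(m, pre):
                        queue.append(fire(m, pre, post))
            return seen

        marks = reachable({"p_init": 1})
        print(len(marks), "reachable markings; end reachable:",
              any(dict(m).get("p_end", 0) >= 1 for m in marks))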

  5. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, have not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines.

  6. Verification of seiching processes in a large and deep lake (Trichonis, Greece)

    Directory of Open Access Journals (Sweden)

    I. ZACHARIAS

    2012-12-01

    Full Text Available A computational analysis of the periods and structure of the surface seiches of Lake Trichonis in Greece, and its experimental verification from three simultaneous water gauge recordings mounted along the shores at Myrtia, Panetolio and Trichonio, is given. The first five theoretical modes are calculated with a finite difference code of the tidal equations, which yields the eigenperiods and the co-range and co-tidal lines that are graphically displayed and discussed in detail. Experimental verifications are from recordings taken during spring. Visual inspection of the records permits identification of the five lowest-order modes, including inter-station phase shifts. Power spectral analysis of two time series and inter-station phase difference and coherence spectra allow the identification of the same five modes. Agreement between the theoretically predicted and the experimentally determined periods was excellent for most of the calculated modes.

  7. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high quality results for the well-behaved cases, relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances, predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
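
    A minimal sketch of the flavor of this analysis: fit the error ansatz E(h) = A*h^p to solution differences under several assumption sets (here, simply every pair of grids), then report the median of the resulting convergence-order estimates instead of a single least-squares value. The grid spacings and error norms are invented for illustration, and the real method's optimization constraints are omitted.

        import numpy as np

        h = np.array([0.08, 0.04, 0.02, 0.01])            # grid spacings
        err = np.array([3.1e-2, 8.2e-3, 2.3e-3, 9.0e-4])  # error norms (illustrative)

        estimates = []
        for i in range(len(h) - 1):
            for j in range(i + 1, len(h)):     # each grid pair = one assumption set
                p = np.log(err[i] / err[j]) / np.log(h[i] / h[j])
                estimates.append(p)

        print(f"median convergence order: {np.median(estimates):.2f} "
              f"(spread {min(estimates):.2f}..{max(estimates):.2f})")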

  8. Applied and engineering versions of the theory of elastoplastic processes of active complex loading part 2: Identification and verification

    Science.gov (United States)

    Peleshko, V. A.

    2016-06-01

    The deviator constitutive relation of the proposed theory of plasticity has a three-term form (the stress, stress rate, and strain rate vectors formed from the deviators are collinear) and, in the specialized (applied) version, in addition to the simple loading function, contains four dimensionless constants of the material determined from experiments along a two-link strain trajectory with an orthogonal break. The proposed simple mechanism is used to calculate the constants of the model for four metallic materials that differ significantly in composition and in mechanical properties; the obtained constants do not deviate much from their average values (over the four materials). The latter are taken as universal constants in the engineering version of the model, which thus requires only one basic experiment, i.e., a simple loading test. If the material exhibits the strengthening property in cyclic circular deformation, then the model contains an additional constant determined from the experiment along a strain trajectory of this type. (In the engineering version of the model, the cyclic strengthening effect is not taken into account, which imposes a certain upper bound on the difference between the length of the strain trajectory arc and the module of the strain vector.) We present the results of model verification using the experimental data available in the literature about combined loading along two- and multi-link strain trajectories with various lengths of links and angles of breaks, with plane curvilinear segments of various constant and variable curvature, and with three-dimensional helical segments of various curvature and twist. (All in all, we use more than 80 strain programs; the materials are low- and medium-carbon steels, brass, and stainless steel.) These results prove that the model can be used to describe the process of arbitrary active (in the sense of nonnegative capacity of the shear) combined loading and final unloading of originally

  9. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  10. SYSTEM-COGNITIVE MODEL OF FORECASTING THE DEVELOPMENT OF DIVERSIFIED AGRO-INDUSTRIAL CORPORATIONS. PART II. SYNTHESIS AND MODEL VERIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-11-01

    Full Text Available In this article, in accordance with the methodology of Automated System-Cognitive analysis (ASC-analysis), we examine the implementation of the third stage of ASC-analysis: synthesis and verification of forecasting models of the development of diversified agro-industrial corporations. In this step, we synthesize and verify 3 statistical and 7 system-cognitive models: ABS (matrix of absolute frequencies); PRC1 and PRC2 (matrices of conditional and unconditional distributions); INF1 and INF2 (private criterion: the amount of knowledge according to A. Kharkevich); INF3 (private criterion: the chi-square test, i.e., the difference between the actual and the theoretically expected absolute frequencies); INF4 and INF5 (private criterion: ROI, Return On Investment); and INF6 and INF7 (private criterion: the difference between conditional and unconditional probability, i.e., the coefficient of relationship). The reliability of the created models was assessed with a proposed metric that is similar to the known F-test but does not assume a normal distribution, linearity of the modeled object, or independence and additivity of the acting factors. The accuracy of the obtained models was high enough to address the subsequent problems of identification, forecasting and decision making, as well as studies of the modeled object by studying its model, scheduled for consideration in future articles.

  11. Toward automated parasitic extraction of silicon photonics using layout physical verifications

    Science.gov (United States)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2016-08-01

    A physical verification flow for the layout of silicon photonic circuits is suggested. Simple empirical models are developed to estimate the bend power loss and the coupled power in photonic integrated circuits fabricated on standard SOI wafers. These models are utilized in the physical verification flow of the circuit layout to verify reliable fabrication using any electronic design automation tool. The models are accurate compared with electromagnetic solvers, yet closed form, circumventing the need for any EM solver in the verification process and hence dramatically reducing the time of the verification process.
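
    A sketch of the kind of closed-form empirical model such a flow relies on; the functional form and the fitted coefficients below are assumptions for illustration, not the paper's calibrated values.

        import math

        A_DB = 0.5   # fitted loss prefactor, dB (assumed)
        B_UM = 1.2   # fitted decay constant, 1/um (assumed)

        def bend_loss_db(radius_um):
            """Estimated power loss of one 90-degree SOI waveguide bend, in dB."""
            return A_DB * math.exp(-B_UM * radius_um)

        # A layout checker can flag bends whose radius breaks the loss budget.
        for r in (2.0, 5.0, 10.0):
            print(f"R = {r:4.1f} um -> {bend_loss_db(r):.4f} dB")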

  12. Wave dispersion in the hybrid-Vlasov model: verification of Vlasiator

    CERN Document Server

    Kempf, Yann; von Alfthan, Sebastian; Vaivads, Andris; Palmroth, Minna; Koskinen, Hannu E J

    2013-01-01

    Vlasiator is a new hybrid-Vlasov plasma simulation code aimed at simulating the entire magnetosphere of the Earth. The code treats ions (protons) kinetically through Vlasov's equation in the six-dimensional phase space while electrons are a massless charge-neutralizing fluid [M. Palmroth et al., Journal of Atmospheric and Solar-Terrestrial Physics 99, 41 (2013); A. Sandroos et al., Parallel Computing 39, 306 (2013)]. For first global simulations of the magnetosphere, it is critical to verify and validate the model by established methods. Here, as part of the verification of Vlasiator, we characterize the low-β plasma wave modes described by this model and compare with the solution computed by the Waves in Homogeneous, Anisotropic Multicomponent Plasmas (WHAMP) code [K. Rönnmark, Kiruna Geophysical Institute Reports 179 (1982)], using dispersion curves and surfaces produced with both programs. The match between the two fundamentally different approaches is excellent in the low-frequency, long wavelength...

  13. Formal Modeling and Verification of Context-Aware Systems using Event-B

    Directory of Open Access Journals (Sweden)

    Hong Anh Le

    2014-12-01

    Full Text Available Context awareness is a computing paradigm that makes applications responsive and adaptive to their environment. Formal modeling and verification of context-aware systems are challenging issues in their development, as such systems are complex and uncertain. In this paper, we propose an approach that uses the formal method Event-B to model and verify such systems. First, we specify a context-aware system's components, such as context data entities, context rules and context relations, using Event-B notions. In the next step, we use the Rodin platform to verify the system's desired properties, such as context constraint preservation. The approach aims to benefit from the natural representation of context-awareness concepts in Event-B and from the proof obligations generated by the refinement mechanism to ensure the correctness of systems. We illustrate the use of our approach on a scenario of an Adaptive Cruise Control system.

  14. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Directory of Open Access Journals (Sweden)

    Paweł Drapikowski

    2016-06-01

    Full Text Available This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
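
    Measuring area, volume, and the S/V ratio from a triangle mesh reduces to summing triangle areas and signed tetrahedron volumes. A minimal sketch in which a closed unit cube stands in for a scanned plant model:

        import numpy as np

        def mesh_area_volume(vertices, faces):
            v = vertices[faces]                                    # (n_faces, 3, 3)
            cross = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
            area = 0.5 * np.linalg.norm(cross, axis=1).sum()
            # Signed tetrahedra against the origin; the mesh must be closed,
            # with consistently oriented (outward-facing) triangles.
            volume = abs(np.einsum("ij,ij->i", v[:, 0], cross).sum()) / 6.0
            return area, volume

        vertices = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                             [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
        faces = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                          [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
        area, volume = mesh_area_volume(vertices, faces)
        print(f"area = {area:.2f}, volume = {volume:.2f}, S/V = {area/volume:.2f}")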

  15. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    2008-01-01

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time, respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introduced.

  17. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse, small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by the equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed here for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under uniform excitation; extension of the model and corresponding results under a more generalized excitation will be presented in an upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by the model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception, and also match the experimental results of the cMUT cell reasonably well. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  18. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture

  19. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    This report verifies the 1.33-kilometer domain model performance for the 2014 warm season (May-September). Verification statistics were computed using the Model Evaluation Tools, which compared the model forecasts to observations. The mean error values were close to 0 and the root mean square error values were less than 1.8 for mean sea-level pressure (millibars), temperature (Kelvin), dewpoint temperature (Kelvin), and wind speed (meters per second), all very small differences between the forecast and observations considering the normal magnitudes of the parameters. The precipitation forecast verification results showed consistent under-forecasting of the precipitation object size. This could be an artifact of calculating the statistics for each hour rather than for the entire 12-hour period. The AMU will continue to generate verification statistics for the 1.33-kilometer WRF-EMS domain as data become available in future cool and warm seasons. More data will produce more robust statistics and reveal a more accurate assessment of model performance. Once the formal task was complete, the AMU conducted additional work to better understand the wind direction results. The results were stratified diurnally and by wind speed to determine what effects the stratifications would have on the model wind direction verification statistics. The results are summarized in the addendum at the end of this report. In addition to verifying the model's performance, the AMU also made the output available in the Advanced Weather Interactive Processing System II (AWIPS II). This allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations AWIPS II client computers and conduct real-time subjective analyses. In the future, the AMU will implement an updated version of the WRF-EMS model that incorporates local data assimilation. This model will also run in real-time and be made available in AWIPS II.
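
    The two point-verification statistics quoted above are one-liners; a minimal sketch on invented forecast/observation pairs:

        import numpy as np

        forecast = np.array([22.1, 23.4, 21.8, 24.0, 22.7])  # illustrative values
        observed = np.array([22.4, 23.1, 22.0, 23.5, 23.0])

        mean_error = np.mean(forecast - observed)             # bias
        rmse = np.sqrt(np.mean((forecast - observed) ** 2))   # root mean square error
        print(f"ME = {mean_error:+.2f}, RMSE = {rmse:.2f}")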

  20. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model is described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, as in estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth, because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  1. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in their operation and in the way their results are achieved. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model checking methods are discussed and described; the pros and cons of each method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, together with some kinds of tools that can be applied to the software when using dynamic analysis methods. Based on this work, a conclusion is drawn that describes the most relevant problems of the analysis techniques, methods of their solutions and

  2. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  3. Development and verification of a screening model for surface spreading of petroleum

    Science.gov (United States)

    Hussein, Maged; Jin, Minghui; Weaver, James W.

    2002-08-01

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transport of these chemicals is required for environmental risk assessment and for remedial measure design. The present paper discusses the formulation and application of the Oil Surface Flow Screening Model (OILSFSM) for predicting the surface flow of oil by taking into account infiltration and evaporation. Surface flow is simulated using a semi-analytical model based on the lubrication theory approximation of viscous flow. Infiltration is simulated using a version of the Green and Ampt infiltration model, which is modified to account for oil properties. Evaporation of volatile compounds is simulated using a compositional model that accounts for the changes in the fraction of each compound in the spilled oil. The coupling between surface flow, infiltration and evaporation is achieved by incorporating the infiltration and evaporation fluxes into the global continuity equation of the spilled oil. The model was verified against numerical models for infiltration and analytical models for surface flow. The verification study demonstrates the applicability of the model.
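
    The infiltration submodel builds on the classical Green and Ampt relation, in which the infiltration rate decays as cumulative infiltration grows: f = K * (1 + psi * d_theta / F). A minimal sketch of the standard water form follows (the paper's version is modified for oil properties, which is not reproduced here); parameter values are illustrative.

        K = 1.0e-6        # hydraulic conductivity, m/s (illustrative)
        psi = 0.11        # wetting-front suction head, m (illustrative)
        d_theta = 0.3     # moisture deficit: porosity minus initial water content

        def infiltration_rate(F):
            """Green-Ampt infiltration rate for cumulative infiltration F (m)."""
            return K * (1.0 + psi * d_theta / F)

        F, dt = 1.0e-4, 60.0   # initial cumulative infiltration (m), time step (s)
        for step in range(3):
            f = infiltration_rate(F)
            F += f * dt        # explicit update of cumulative infiltration
            print(f"t = {(step + 1) * dt:4.0f} s  f = {f:.3e} m/s  F = {F:.3e} m")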

  4. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  5. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life. To solve different sensor networking related issues, researchers have put a great deal of effort into coming up with innovative ideas. The book introduces current technological trends, particularly in node organization, and provides implementation details of each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  6. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  7. Study of verification, validation, and testing in the automated data processing system at the Department of Veterans Affairs

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, A. (Argonne National Lab., IL (USA). Energy Systems Div.); Formento, J.W.; Hill, L.G.; Riemer, C.A. (Argonne National Lab., IL (USA). Environmental Assessment and Information Sciences Div.)

    1990-01-01

    Argonne National Laboratory (ANL) studied the role of verification, validation, and testing (VV&T) in the Department of Veterans Affairs (VA) automated data processing (ADP) system development life cycle (SDLC). In this study, ANL reviewed and compared standard VV&T practices in the private and government sectors with those in the VA. The methodology included extensive interviews with, and surveys of, users, analysts, and staff in the Systems Development Division (SDD) and Systems Verification and Testing Division (SV&TD) of the VA, as well as representatives of private and government organizations, and a review of ADP standards. The study revealed that VA's approach to VV&T already incorporates some industry practices -- in particular, the use of an independent organization that relies on the acceptability of test results to validate a software system. Argonne recommends that the role of SV&TD be limited to validation and acceptance testing (defined as formal testing conducted independently to determine whether a software system satisfies its acceptance criteria). It also recommends that the role of the SDD be expanded to include verification testing (defined as formal testing or reevaluation conducted by the developer to determine whether a software development satisfies design criteria). Integrated systems testing should be performed by Operations in a production-like environment under stressful situations to assess how trouble-free and acceptable the software is to the end user. A separate, independent quality assurance group should be responsible for ADP auditing and for helping to establish policies for managing software configurations, and should report directly to the VA central office. Finally, and of no less importance, an in-house training program and procedures manual should be instituted for the entire SDLC for all involved staff; it should incorporate or reference ADP standards.

  8. Process analysis of two-layered tube hydroforming with analytical and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Seyedkashi, S. M. Hossein [The University of Birjand, Birjand (Iran, Islamic Republic of); Panahizadeh R, Valiollah [Shahid Rajaee Teacher Training University, Tehran (Iran, Islamic Republic of); Xu, Haibin; Kim, Sang Yun; Moon, Young Hoon [Pusan National University, Busan (Korea, Republic of)

    2013-01-15

    Two-layered tubular joints are suitable for special applications. Designing and manufacturing two-layered components requires sufficient knowledge of the tube material behavior during the hydroforming process. In this paper, hydroforming of two-layered tubes is investigated analytically, and the results are verified experimentally. The aim of this study is to derive an analytical model that can be used in process design. Fundamental equations are written for both the outer and inner tubes, and the total forming pressure is obtained from these equations. Hydroforming experiments are carried out on two different combinations of materials for the inner and outer tubes, case 1: copper/aluminum and case 2: carbon steel/stainless steel. The experimental results are in good agreement with the theoretical model, which can therefore be used to estimate the forming pressure needed to avoid wrinkling.

  9. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for a needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
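
    The core idea is easy to sketch. The toy Python below is an illustration of the swarm concept, not the authors' implementation: it launches several independently seeded, depth-bounded random searches over a small state graph and stops the whole swarm as soon as any worker reaches a bad state. A real swarm would launch many differently configured model-checker runs instead.

```python
# Swarm-style search: many small, diversified searches run in parallel, and
# the whole swarm stops as soon as any worker finds a violation. The search
# itself is a placeholder (a restarting random walk over a toy state graph).
import multiprocessing as mp
import random

def worker(seed, depth_limit, bad_states, graph, start, result_q):
    """One swarm member: a depth-bounded randomized search with its own seed."""
    rng = random.Random(seed)
    for _ in range(10_000):                      # restart loop
        state = start
        for _ in range(depth_limit):
            if state in bad_states:
                result_q.put((seed, state))      # counterexample found
                return
            successors = graph.get(state, [])
            if not successors:
                break
            state = rng.choice(successors)

if __name__ == "__main__":
    graph = {0: [1, 2], 1: [0, 3], 2: [3], 3: [4], 4: []}   # toy state space
    q = mp.Queue()
    procs = [mp.Process(target=worker, args=(s, 8, {4}, graph, 0, q))
             for s in range(8)]                  # 8 diversified searches
    for p in procs:
        p.start()
    seed, state = q.get()                        # first hit wins
    print(f"worker seeded {seed} reached bad state {state}")
    for p in procs:
        p.terminate()
```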

  10. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business process handled by a workflow model, and it plays a critical role in analyzing, executing and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult issue in the workflow field, and we study it intensively in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer constraint rules that help avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in the workflow prototype system e-ScopeWork.
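
    The flavor of a graph-reduction check can be conveyed in a few lines. The sketch below implements only the simplest conceivable rule, collapsing purely sequential nodes, and is not the paper's rule set: if repeated reduction shrinks the workflow graph to a single start-to-end edge, no error of this kind is present, while an irreducible remainder localizes a suspect region of the schema.

```python
# Verification by graph reduction, minimal form: repeatedly collapse nodes
# that have exactly one incoming and one outgoing edge.
def reduce_sequences(edges):
    """edges: set of (src, dst) pairs; returns the irreducible remainder."""
    changed = True
    while changed:
        changed = False
        nodes = {n for e in edges for n in e}
        for n in nodes:
            ins = [e for e in edges if e[1] == n]
            outs = [e for e in edges if e[0] == n]
            if len(ins) == 1 and len(outs) == 1:
                edges.discard(ins[0])
                edges.discard(outs[0])
                edges.add((ins[0][0], outs[0][1]))   # bypass the node
                changed = True
                break
    return edges

flow = {("start", "a"), ("a", "b"), ("b", "end")}
print(reduce_sequences(set(flow)))   # {('start', 'end')} -> fully reducible
```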

  11. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model imply different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the application of the proposed method.

  12. Modeling the Magnetospheric X-ray Emission from Solar Wind Charge Exchange with Verification from XMM-Newton Observations

    Science.gov (United States)

    2016-08-26

    [Abstract garbled in extraction; the recoverable fragments are:] An MHD-based model of the terrestrial magnetosphere is used to model X-ray emission from solar wind charge exchange, and newly calculated model count rates are compared against XMM-Newton observations. (Published in Journal of Geophysical Research: Space Physics; authors affiliated with the Department of Physics and Astronomy, University of Leicester, UK, and the Finnish Meteorological Institute, Helsinki, Finland.)

  13. ENSO Forecasts in the North American Multi-Model Ensemble: Composite Analysis and Verification

    Science.gov (United States)

    Chen, L. C.

    2015-12-01

    In this study, we examine precipitation and temperature forecasts during El Nino/Southern Oscillation (ENSO) events in six models in the North American Multi-Model Ensemble (NMME), including the CFSv2, CanCM3, CanCM4, FLOR, GEOS5, and CCSM4 models, by comparing the model-based ENSO composites to the observed. The composite analysis is conducted using the 1982-2010 hindcasts for each of the six models with selected ENSO episodes based on the seasonal Ocean Nino Index (ONI) just prior to the date the forecasts were initiated. Two sets of composites are constructed over the North American continent: one based on precipitation and temperature anomalies, the other based on their probability of occurrence in a tercile-based system. The composites apply to monthly mean conditions in November, December, January, February, and March, respectively, as well as to the five-month aggregates representing the winter conditions. For the anomaly composites, we use the anomaly correlation coefficient and root-mean-square error against the observed composites for evaluation. For the probability composites, unlike conventional probabilistic forecast verification assuming binary outcomes for the observations, both model and observed composites are expressed in probability terms. Performance metrics for such validation are limited. Therefore, we develop a probability anomaly correlation measure and a probability score for assessment, so the results are comparable to the anomaly composite evaluation. We found that all NMME models predict ENSO precipitation patterns well during wintertime; however, some models have large discrepancies between the model temperature composites and the observed. The skill is higher for the multi-model ensemble, as well as the five-month aggregates. Compared to the anomaly composites, the probability composites have superior skill in predicting ENSO temperature patterns and are less sensitive to the sample used to construct the composites, suggesting that ...

  14. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

    The article presents the verification of the forward kinematics of the Fanuc AM100iB robot. The developed kinematic model of the machine was verified using tests on an actual robot. The tests consisted of positioning the robot, operating in the mode of controlling the values of the natural angles, at selected points of its workspace, and reading the coordinate values of the TCP in the robot's global coordinate system from the operator panel. Validation of the model consisted of entering the same natural-angle values used for positioning the robot into the inputs of the machine's CAE model, calculating the coordinate values of its TCP, and then comparing the results obtained with the values read. These results are an introduction to the partial verification of the dynamic model of the analysed device.

  15. Real-time Performance Verification of Core Protection and Monitoring System with Integrated Model for SMART Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Bon-Seung; Kim, Sung-Jin; Hwang, Dae-Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In keeping with these purposes, a real-time model of the digital core protection and monitoring systems for simulator implementation was developed on the basis of the SCOPS and SCOMS algorithms. In addition, important features of the software models were explained for the application to the SMART simulator, and the real-time performance of the models linked with a DLL was examined for various simulation scenarios. In this paper, a real-time performance verification of the core protection and monitoring software for the SMART simulator is performed with the integrated simulator model. Various DLL connection tests were done for software algorithm changes. In addition, typical accident scenarios of SMART were simulated with 3KEYMASTER, and the simulated results were compared with those of the DLL-linked core protection and monitoring software. Each calculational result showed good agreement.

  16. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource-demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on...
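
    The estimation step at the heart of such learning algorithms can be illustrated compactly. The sketch below is a minimal stand-in for the cited algorithm (which additionally handles state merging and convergence guarantees): it estimates MDP transition probabilities from observed traces by maximum likelihood, i.e., frequency counts.

```python
# Maximum-likelihood estimation of MDP transitions from observed
# (state, action, next_state) triples.
from collections import Counter, defaultdict

def estimate_mdp(traces):
    counts = defaultdict(Counter)
    for trace in traces:
        for s, a, s_next in trace:
            counts[(s, a)][s_next] += 1
    return {sa: {s2: n / sum(c.values()) for s2, n in c.items()}
            for sa, c in counts.items()}

traces = [[("idle", "send", "wait"), ("wait", "ack", "idle")],
          [("idle", "send", "wait"), ("wait", "ack", "lost")]]
print(estimate_mdp(traces))
# {('idle','send'): {'wait': 1.0}, ('wait','ack'): {'idle': 0.5, 'lost': 0.5}}
```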

  17. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid-lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last in this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest, grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data both for summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  18. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    Science.gov (United States)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes and other discontinuities, often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherently chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and the atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally low in both experiments using ...

  19. Manufactured solutions and the verification of three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2013-01-01

    The manufactured solution technique is used for the verification of computational models in many fields. In this paper, we construct manufactured solutions for the three-dimensional, isothermal, nonlinear Stokes model for flows in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet, basal sliding parameters, and ice softness. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests. The upper surface is altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Simulation results from the computational model show good convergence to the manufactured analytic solution.
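
    The essence of the manufactured solution technique can be shown on a much simpler equation. The sketch below uses a 1-D heat equation as a stand-in for the paper's nonlinear Stokes system: a solution u is chosen, pushed through the PDE operator symbolically, and the residual becomes the source term that, when added to the solver, forces it to reproduce u exactly so that convergence can be measured.

```python
# Method of manufactured solutions in miniature, for u_t - u_xx = f.
import sympy as sp

x, t = sp.symbols("x t")
u = sp.sin(sp.pi * x) * sp.exp(-t)                # chosen "manufactured" solution
pde_residual = sp.diff(u, t) - sp.diff(u, x, 2)   # apply the PDE operator to u
f = sp.simplify(pde_residual)                     # source term to add to the solver
print(f)    # (pi**2 - 1)*exp(-t)*sin(pi*x)
```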

  1. Verification of a 1-dimensional model for predicting shallow infiltration at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Hevesi, J.; Flint, A.L. [Geological Survey, Mercury, NV (United States); Flint, L.E. [Foothill Eng. Consultants, Mercury, NV (United States)

    1994-12-31

    A characterization of net infiltration rates is needed for site-scale evaluation of groundwater flow at Yucca Mountain, Nevada. Shallow infiltration caused by precipitation may be a potential source of net infiltration. A 1-dimensional finite difference model of shallow infiltration with a moisture-dependent evapotranspiration function and a hypothetical root-zone was calibrated and verified using measured water content profiles, measured precipitation, and estimated potential evapotranspiration. Monthly water content profiles obtained from January 1990 through October 1993 were measured by geophysical logging of 3 boreholes located in the alluvium channel of Pagany Wash on Yucca Mountain. The profiles indicated seasonal wetting and drying of the alluvium in response to winter season precipitation and summer season evapotranspiration above a depth of 2.5 meters. A gradual drying trend below a depth of 2.5 meters was interpreted as long-term redistribution and/or evapotranspiration following a deep infiltration event caused by runoff in Pagany Wash during 1984. An initial model, calibrated using the 1990 to 1992 record, did not provide a satisfactory prediction of water content profiles measured in 1993 following a relatively wet winter season. A re-calibrated model using a modified, seasonally-dependent evapotranspiration function provided an improved fit to the total record. The new model provided a satisfactory verification using water content changes measured at a distance of 6 meters from the calibration site, but was less satisfactory in predicting changes at a distance of 18 meters.
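
    A hedged sketch of this class of model is shown below: an explicit 1-D finite-difference water balance with diffusive redistribution and a moisture-dependent evapotranspiration sink confined to a shallow root zone. All parameter values are illustrative placeholders, not the calibrated Pagany Wash values.

```python
# Explicit 1-D finite-difference moisture model with a moisture-dependent
# ET sink in a hypothetical 2.5 m root zone (illustrative parameters only).
import numpy as np

nz, dz, dt = 50, 0.1, 0.01                 # 5 m column in 0.1 m cells; time in days
theta = np.full(nz, 0.08)                  # initial volumetric water content
D = 0.02                                   # effective diffusivity (m^2/day), assumed
et_pot = 0.004                             # potential ET (m/day), assumed
z = (np.arange(nz) + 0.5) * dz

for step in range(1000):
    lap = np.zeros(nz)
    lap[1:-1] = (theta[2:] - 2 * theta[1:-1] + theta[:-2]) / dz**2
    # ET sink scales with relative wetness and acts only above 2.5 m depth
    sink = np.where(z < 2.5, et_pot / 2.5 * (theta / 0.30), 0.0)
    theta = np.clip(theta + dt * (D * lap - sink), 0.01, 0.30)
    if step % 200 == 0:
        theta[0] += 0.01                   # episodic infiltration pulse at surface
```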

  2. Chatter reduction in boring process by using piezoelectric shunt damping with experimental verification

    Science.gov (United States)

    Yigit, Ufuk; Cigeroglu, Ender; Budak, Erhan

    2017-09-01

    Chatter is a self-excited type of vibration that develops during machining due to process-structure dynamic interactions resulting in modulated chip thickness. Chatter is an important problem as it results in poor surface quality, reduced productivity and shortened tool life. The stability of a cutting process is strongly influenced by the frequency response function (FRF) at the cutting point. In this study, the effect of piezoelectric shunt damping on chatter vibrations in a boring process is studied. In the piezoelectric shunt damping method, an electrical impedance is connected to a piezoelectric transducer bonded on the cutting tool. The electrical impedance of the circuit consisting of the piezoceramic transducer and the passive shunt is tuned to the desired natural frequency of the cutting tool in order to maximize damping. The optimum damping is achieved in analytical and finite element models (FEM) by using a genetic algorithm focusing on the real part of the tool-point FRF rather than its amplitude. Later, a practical boring bar is considered, for which the optimum circuit parameters are obtained by FEM. Afterwards, the effect of the optimized piezoelectric shunt damping on the dynamic rigidity and the absolute stability limit of the cutting process is investigated experimentally by modal analysis and cutting tests. It is shown both theoretically and experimentally that applying piezoelectric shunt damping significantly increases the absolute stability limit in boring operations.
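
    The basic tuning rule implied by the abstract is simple to state: a series RL shunt across the piezoceramic forms an electrical resonator with the transducer capacitance, and its resonance is matched to the targeted structural mode. The sketch below computes the inductance from that matching condition; the frequency and capacitance values are assumptions, and the paper instead optimizes the circuit parameters with a genetic algorithm against the real part of the tool-point FRF.

```python
# Resonant-shunt tuning: match the electrical resonance 1/sqrt(L*C_p)
# to the targeted structural mode (illustrative values).
import math

f_n = 1200.0            # targeted bending mode of the boring bar, Hz (assumed)
C_p = 50e-9             # piezo transducer capacitance, F (assumed)

omega = 2 * math.pi * f_n
L = 1.0 / (omega**2 * C_p)            # inductance placing resonance at f_n
print(f"L = {L*1e3:.1f} mH")
```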

  3. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    CERN Document Server

    Locke, C

    2008-01-01

    The Monte Carlo (MC) method provides the most accurate dose calculations to date in heterogeneous media and complex geometries, which spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This process involves MC dose calculations for the treatment plans produced clinically. To perform these calculations, a number of treatment plan parameters specifying radiation beam and patient geometries need to be transferred to MC codes such as BEAMnrc and DOSXYZnrc. Extracting these parameters from DICOM files is not a trivial task, one that has previously been performed mostly using Matlab-based software. This paper describes the DICOM tags that contain information required for MC modeling of conformal and IMRT plans, and reports the development of an in-house DICOM interface through a library (named Vega) of platform-independent, object-oriented C++ codes. The Vega library is small and succinct, offering just the fundamental functions for reading/modifying/writing DICOM files in a ...
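
    The Vega library itself is C++, but the kind of tag extraction it performs can be illustrated with pydicom. The attribute names below are standard DICOM RT Plan keywords; "rtplan.dcm" is a placeholder path, not a file from the paper.

```python
# Reading beam-geometry parameters from a DICOM RT Plan with pydicom.
import pydicom

ds = pydicom.dcmread("rtplan.dcm")           # placeholder RT Plan file
for beam in ds.BeamSequence:
    cp0 = beam.ControlPointSequence[0]       # first control point of the beam
    print(beam.BeamName,
          "gantry:", cp0.GantryAngle,
          "collimator:", cp0.BeamLimitingDeviceAngle,
          "MLC banks:", len(cp0.BeamLimitingDevicePositionSequence))
```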

  4. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty and to incorporate that in the design, operation and control of urban drainage structures.

  5. Model based correction of placement error in EBL and its verification

    Science.gov (United States)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of placement error is charging. The DISPLACE software corrects the placement error for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for this. The extracted model parameters were used to verify the correction. As an ultimate test, a sophisticated layout very different from the calibration mask was used for verification. The placement corrections were predicted by DISPLACE, and a good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.
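
    The correction step itself is conceptually one line: once the simulator predicts a beam-deflection field over the mask, each figure placement is pre-shifted by the negative of the predicted displacement. The numbers below are stand-ins for DISPLACE output, not real data.

```python
# Pre-compensating placements with a predicted charging-deflection field.
import numpy as np

xy_nm = np.array([[10e6, 20e6], [55e6, 80e6]])     # nominal placements (nm)
shift_nm = np.array([[12.0, -4.0], [20.0, 7.0]])   # predicted deflections (nm)
corrected = xy_nm - shift_nm                       # write the data pre-shifted
print(corrected)
```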

  6. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

    In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
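
    Operator splitting of this kind is easy to illustrate. In the sketch below, a first-order upwind transport step alternates with a reaction step; the first-order decay is only a placeholder for the geochemical engine call that Alquimia would broker.

```python
# Operator splitting in miniature: advance transport, then react each cell.
import numpy as np

def transport_step(c, u, dx, dt):
    """First-order upwind advection of concentration c (u > 0)."""
    c = c.copy()
    c[1:] -= u * dt / dx * (c[1:] - c[:-1])
    return c

def reaction_step(c, k, dt):
    """Placeholder first-order decay standing in for the geochemical engine."""
    return c * np.exp(-k * dt)

c = np.zeros(100)
c[0] = 1.0                                   # inflow pulse
for _ in range(50):
    c = transport_step(c, u=1.0, dx=1.0, dt=0.5)   # CFL = 0.5, stable
    c = reaction_step(c, k=0.05, dt=0.5)
```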

  7. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated, model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects occurring in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation, and extreme control of these at each stage of the integrated OPC/MDP/mask/silicon lithography flow. The important potential sources of variation we focus on here originate in VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron-beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early, and thus reduce the likelihood of costly long-loop iterations between the OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where robustness to intrinsic variations may be improved by modification of the OPC, choice of mask technology, or judicious design of VSB shots and dose assignment.
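
    The edge log-slope metric mentioned above can be demonstrated on a synthetic 1-D dose profile; a steeper log-slope at the pattern edge means more margin against dose variation. The error-function edge shape and the blur value below are illustrative assumptions, not data from the paper.

```python
# Edge log-slope of a synthetic deposited-dose profile: d(ln D)/dx at the
# nominal (50% dose) edge position.
import numpy as np
from scipy.special import erf

x = np.linspace(-50, 50, 1001)                        # position, nm
blur = 12.0                                           # beam blur (nm), assumed
dose = 0.5 * (1 + erf(x / (blur * np.sqrt(2))))       # normalized edge profile
ils = np.gradient(np.log(np.clip(dose, 1e-9, None)), x)
edge = np.argmin(np.abs(dose - 0.5))                  # nominal edge at 50% dose
print(f"log-slope at edge: {ils[edge]:.4f} 1/nm")
```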

  8. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

    Currently the ATHLET code is widely applied for modelling several WWER-type power plants with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons to experimental data. Consideration of circulation in the water inventory of the secondary side proves to be necessary. (orig.). 3 refs.

  9. Performance and Probabilistic Verification of Regional Parameter Estimates for Conceptual Rainfall-runoff Models

    Science.gov (United States)

    Franz, K.; Hogue, T.; Barco, J.

    2007-12-01

    Identification of appropriate parameter sets for simulation of streamflow in ungauged basins has become a significant challenge for both operational and research hydrologists. This is especially difficult in the case of conceptual models, when model parameters typically must be "calibrated" or adjusted to match streamflow conditions in specific systems (i.e., some of the parameters are not directly observable). This paper addresses the performance and uncertainty associated with transferring conceptual rainfall-runoff model parameters between basins within large-scale ecoregions. We use the National Weather Service's (NWS) operational hydrologic model, the SACramento Soil Moisture Accounting (SAC-SMA) model. A Multi-Step Automatic Calibration Scheme (MACS), using the Shuffled Complex Evolution (SCE) algorithm, is used to optimize SAC-SMA parameters for a group of watersheds with extensive hydrologic records from the Model Parameter Estimation Experiment (MOPEX) database. We then explore "hydroclimatic" relationships between basins to facilitate regionalization of parameters for an established ecoregion in the southeastern United States. The impact of regionalized parameters is evaluated via standard model performance statistics as well as through generation of hindcasts and probabilistic verification procedures to evaluate streamflow forecast skill. Preliminary results show climatology ("climate neighbor") to be a better indicator of transferability than physical similarity or proximity ("nearest neighbor"). The mean and median of all the parameters within the ecoregion are the poorest choices for the ungauged basin. The choice of regionalized parameter set affected the skill of the ensemble streamflow hindcasts; however, all parameter sets show little skill in forecasts after five weeks (i.e., climatology is as good an indicator of future streamflows). In addition, the optimum parameter set changed seasonally, with the "nearest neighbor" showing the highest skill in the ...
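
    The "climate neighbor" selection the abstract favors reduces to a nearest-neighbor search in climate space. In the sketch below, the basin data are invented placeholders; the calibrated parameter set of the closest donor would then be transferred to the ungauged basin.

```python
# Donor-basin selection by climatological similarity (illustrative data).
import numpy as np

donors = {                                 # (mean annual precip mm, aridity index)
    "basin_A": np.array([1100.0, 0.8]),
    "basin_B": np.array([600.0, 1.6]),
    "basin_C": np.array([950.0, 1.0]),
}
params = {"basin_A": {...}, "basin_B": {...}, "basin_C": {...}}  # calibrated sets
target = np.array([900.0, 1.1])            # climatology of the ungauged basin

scale = np.std(np.array(list(donors.values())), axis=0)   # normalize features
best = min(donors, key=lambda b: np.linalg.norm((donors[b] - target) / scale))
print("climate neighbor:", best)           # -> basin_C; transfer params[best]
```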

  10. Modular process modeling for OPC

    Science.gov (United States)

    Keck, M. C.; Bodendorf, C.; Schmidtling, T.; Schlief, R.; Wildfeuer, R.; Zumpe, S.; Niehoff, M.

    2007-03-01

    Modular OPC modeling, describing mask, optics, resist and etch processes separately, is an approach to keep the effort for OPC manageable. By exchanging single modules of a modular OPC model, a fast response to process changes during process development is possible. At the same time, effort can be reduced, since only single modular process steps have to be re-characterized as input for OPC modeling as the process is adjusted and optimized. Commercially available OPC tools for full-chip processing typically make use of semi-empirical models. The goal of our work is to investigate to what extent these OPC tools can be applied for modeling single process steps as separate modules. For an advanced gate-level process, we analyze the modeling accuracy over different process conditions (focus and dose) when combining models for each process step - optics, resist and etch - from different single processes into a model describing the total process.

  11. Remaining Sites Verification Package for the 100-B-14:1 Process Sewer, Waste Site Reclassification Form 2004-005

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-02-22

    The 100-B-14:1 subsite encompasses the former process sewer main associated with the 105-B Reactor Building, 108-B Chemical Pumphouse and Tritium Separation Facility, 184-B Boiler House and the 100-B water treatment facilities, as well as the feeder lines associated with the 108-B facility, formerly discharging to the 116-B-7 Outfall Structure. The subsite has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  12. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results obtained in COMSOL Multiphysics using the "Finer" grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm-1) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of the load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in the thickness of the load within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal fluctuations from 26.58 to 26.35 kHz and of the electrical admittance from 0.86 to 0.44 mS.

  13. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification are presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines the rheological-dynamical analogy and damage mechanics. Within the framework of this approach, the key continuum parameters such as the creep coefficient, Poisson's ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of the uniformly accelerated motion of the load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Project of the Ministry of Science of the Republic of Serbia, No. ON 174027: Computational Mechanics in Structural Engineering, and No. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: investigation and environmental assessment of possible applications]

  14. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code, we use a propagating underdriven detonation wave in 1-D. This is about the only test case for which an accurate solution can be determined based on the theoretical structure of the solution. The solution consists of a steady ZND reaction-zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
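
    The quantitative accuracy measure named here is compact enough to state in code: the discrete L2 norm of the pressure error against the exact solution. The profile below is a toy stand-in for solver output, not the PBX 9502 solution.

```python
# Discrete L2 norm of the difference between a numerical and an exact solution.
import numpy as np

def l2_error(p_numeric, p_exact, dx):
    return np.sqrt(np.sum((p_numeric - p_exact) ** 2) * dx)

x = np.linspace(0.0, 0.08, 801)                                 # 80 mm domain
p_exact = np.where(x < 0.05, 30e9 * (1 - x / 0.05) + 5e9, 5e9)  # toy profile, Pa
p_numeric = p_exact + 1e7 * np.sin(400 * np.pi * x)             # mock solver error
print(f"L2 error: {l2_error(p_numeric, p_exact, x[1] - x[0]):.3e}")
```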

  15. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  17. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets, however uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  18. The MODUS approach to formal verification

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert

    2014-01-01

    ...in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among the existing open-source model verification engines, model verification is performed, producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development...

  19. A Formal Verification Methodology for Checking Data Integrity

    CERN Document Server

    Umezawa, Yasushi

    2011-01-01

    Formal verification techniques have been playing an important role in pre-silicon validation processes. One of the most important points in performing formal verification is to define good verification scopes; we should define clearly what is to be verified formally on the designs under test. We considered the following three practical requirements when we defined the scope of formal verification: the target should be (a) hard to verify, (b) small enough to handle, and (c) easy to understand. Our novel approach is to break down generic system-level properties into stereotyped block-level properties and to define requirements for verifiable RTL. Consequently, each designer, rather than a verification expert, can describe properties of the design easily, and formal model checking can be applied systematically and thoroughly to all the leaf modules. During the development of a component chip for server platforms, we focused on RAS (Reliability, Availability, and Serviceability) features and described more than 2000 properties in...

  20. The construction of the milling process simulation models

    Science.gov (United States)

    Ślusarczyk, Ł.

    2016-09-01

    The paper presents the possibilities of using computer-based techniques in the field of machine cutting processes, mostly analytical and numerical modeling of the milling process for the austenitic high-alloy chromium-nickel steel X 5 CrNi 18-10, together with experimental verification of the results. The study was mostly focused on measuring and assessing deformations in a given sample under a specific load. The simulations were executed in modern computer simulation software that supports such activities, including NX by Siemens and Simulia Abaqus. The selection of parameters was based on real values measured during the milling process.

  1. A Simplification of a Real-Time Verification Problem

    CERN Document Server

    Saha, Indranil; Roy, Suman; 10.1007/978-3-540-75596-8_21

    2010-01-01

    We revisit the problem of real-time verification with dense dynamics using timeout and calendar based models, and simplify this to a finite-state verification problem. To overcome the complexity of verification of real-time systems with dense dynamics, Dutertre and Sorea proposed timeout and calendar based transition systems to model the behavior of real-time systems and verified safety properties using k-induction in association with bounded model checking. In this work, we introduce a specification formalism for these models in terms of Timeout Transition Diagrams and capture their behavior in terms of the semantics of Timed Transition Systems. Further, we discuss a technique which reduces the problem of verification of qualitative temporal properties on the infinite state space of (a large fragment of) these timeout and calendar based transition systems into that on clockless finite-state models, through a two-step process comprising digitization and canonical finitary reduction. This technique enables us to ve...

  2. Modular Verification of Interactive Systems with an Application to Biology

    Directory of Open Access Journals (Sweden)

    P. Milazzo

    2011-01-01

    We propose sync-programs, an automata-based formalism for the description of biological systems, and a modular verification technique for this formalism that allows properties expressed in the universal fragment of CTL to be verified on suitably chosen fragments of models, rather than on whole models. As an application we show the modelling of the lac operon regulation process and the modular verification of some properties. Verification of properties is performed by using the NuSMV model checker, and we show that by applying our modular verification technique we can verify properties in shorter times than those necessary to verify the same properties in the whole model.

  3. Applying multi-physics requirements and loads in FEM analysis and testing—The JET KL11 endoscope design verification process

    Energy Technology Data Exchange (ETDEWEB)

    Zauner, C., E-mail: zauner@krp-m.de [KRP-Mechatec Engineering GbR, D-85748 Garching (Germany); Klammer, J. [KRP-Mechatec Engineering GbR, D-85748 Garching (Germany); Hartl, M.; Kampf, D. [Kayser-Threde GmbH, D-81379 Munich (Germany); Huber, A.; Mertens, Ph.; Schweer, B.; Terra, A. [Institute of Energy and Climate Research – Plasma Physics, Forschungszentrum Jülich, EURATOM Association, Trilateral Euregio Cluster, D-52425 Jülich (Germany); Balshaw, N. [Euratom-CCFE Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom)

    2013-10-15

    Considering multi-physics requirements and loads in the early design phase, as well as during the later experimental verification, is especially important for the design of fusion devices due to the extreme environmental conditions and loads. Typical disciplines in the design of fusion devices are thermodynamics, structural mechanics, electro-magnetics, and optics. The interaction of these disciplines, as well as an efficient approach to implementing this interaction in numerical and experimental simulations, is presented as applied to the new JET KL11 divertor endoscope design and verification process. The endoscope's first pictures already showed the very good performance of the instrument.

  4. A New Integrated Weighted Model in SNOW-V10: Verification of Continuous Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of four continuous variables generated from an integrated weighted model and the underlying Numerical Weather Prediction (NWP) models. Real-time monitoring of fast-changing weather conditions and the provision of short-term forecasts, or nowcasts, in complex terrain within coastal regions is challenging to do with sufficient accuracy. A recently developed weighting, evaluation, bias correction and integration system was used in the Science of Nowcasting Olympic Weather for Vancouver 2010 project to generate integrated weighted forecasts (INTW) out to 6 h. INTW forecasts were generated with in situ observation data and background gridded forecast data from the Canadian high-resolution deterministic NWP system, with three nested grids at 15-, 2.5- and 1-km horizontal grid-spacing configurations. In this paper, the four variables of temperature, relative humidity, wind speed and wind gust are treated as continuous variables for verifying the INTW forecasts. Fifteen sites were selected for the comparison of model performance. The results of the study show that integrating surface observation data with the NWP forecasts produces better statistical scores than using either the NWP forecasts or an objective analysis of observed data alone. Overall, integrated observation and NWP forecasts improved forecast accuracy for the four continuous variables. The mean absolute errors from the INTW forecasts for the entire test period (12 February to 21 March 2010) are smaller than those from the NWP forecasts with all three configurations. The INTW is the best and most consistent performer among all models, regardless of location and variable analyzed.
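
    The weighting, bias-correction and integration chain can be sketched as follows. Using inverse trailing errors as weights is an assumption about the scheme, not its published formulation, and all numbers are illustrative.

```python
# Bias-corrected, skill-weighted blend of several NWP inputs (illustrative).
import numpy as np

def intw_blend(forecasts, recent_errors, recent_biases):
    """forecasts: current model values; errors/biases from a trailing window."""
    corrected = forecasts - recent_biases               # remove systematic bias
    weights = 1.0 / np.maximum(recent_errors, 1e-6)     # skill-based weights
    return np.sum(weights * corrected) / np.sum(weights)

fcst = np.array([2.1, 1.4, 1.8])       # temperature from 15/2.5/1 km runs, degC
mae = np.array([1.2, 0.9, 0.7])        # trailing mean absolute error per model
bias = np.array([0.5, 0.2, -0.1])      # trailing mean bias per model
print(f"INTW nowcast: {intw_blend(fcst, mae, bias):.2f} degC")
```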

  5. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and a gradient orientation pyramid (GOP) feature representation for this problem. First, we use the AAM on the dataset and generate the AAM images; we then compute the representation of gradient orientations on a hierarchical model, which yields the GOP appearance. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public-domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.
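
    A minimal version of the GOP feature computation is sketched below: gradient orientations are extracted at several progressively smoothed and downsampled scales and stacked into one feature vector, which would then feed the SVM. The pyramid depth and smoothing are assumptions, not the paper's exact settings.

```python
# Gradient orientation pyramid: per-pixel gradient orientation at several
# Gaussian-smoothed, downsampled scales, concatenated into one feature vector.
import numpy as np
from scipy.ndimage import gaussian_filter

def gop_features(image, levels=3):
    feats = []
    img = image.astype(float)
    for _ in range(levels):
        gy, gx = np.gradient(img)
        feats.append(np.arctan2(gy, gx).ravel())          # orientation map
        img = gaussian_filter(img, sigma=1.0)[::2, ::2]   # next pyramid level
    return np.concatenate(feats)

face = np.random.default_rng(0).random((64, 64))   # stand-in for an AAM image
print(gop_features(face).shape)    # (64*64 + 32*32 + 16*16,) = (5376,)
```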

  6. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    Science.gov (United States)

    Hillen, F.; Höfle, B.; Ehlers, M.; Reinartz, P.

    2014-02-01

    In this paper, the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis are simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context, we study the potential of the agent-based modelling technique for the verification of people trajectories.

  7. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, which makes a complex approach to talent selection a rather difficult process. Attempts to identify predispositions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17-20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17-20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain the mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in triathlon for men aged 17-20 years, which comprises seventeen indicators (tests) and explains 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests of predispositions in the triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), plus aerobic and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0). Running predispositions were a very significant factor (-0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (-0.61; 0.63) and cycling (0.53; 0...

  8. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time. This is quite understandable given the size and complexity of soil-structure systems. Three important aspects of ESSI modeling are the consistent following of input seismic energy and of a number of energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator. Extensive work has been done on a verified and validated suite for the ESSI Simulator. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. This simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used for applying the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns. This damping is applied to the layer of elements outside of the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results for different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models that are highly efficient in terms of computational time is developed and implemented in the ESSI Simulator.

  9. FEM modeling for 3D dynamic analysis of deep-ocean mining pipeline and its experimental verification

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    3D dynamic analysis models of a 1000 m deep-ocean mining pipeline, including the steel lift pipe, pump, buffer and flexible hose, were established by the finite element method (FEM). The coupling effect of the steel lift pipe and the flexible hose, and the main external loads on the pipeline, were considered in the models, such as gravity, buoyancy, hydrodynamic forces, internal and external fluid pressures, concentrated suspension buoyancy on the flexible hose, and the torsional moment and axial force induced by pump operation. Relevant FEM models and solution techniques were developed for the various 3D transient behaviors of the integrated deep-ocean mining pipeline, including the towing motions of track-keeping operations and the launch process of the pipeline. Meanwhile, an experimental verification system in a towing water tank, with characteristics similar to the designed mining pipeline, was developed to verify the accuracy of the FEM models and the dynamic simulation. The experimental results show that the experimental records and the simulated pipe stresses coincide. Based on further simulations of the 1000 m deep-ocean mining pipeline, the results show that, to form a saddle-shaped configuration, the total concentrated suspension buoyancy on the flexible hose should be 95%-105% of the gravity of the flexible hose in water, with the first suspension point carrying 1/3 of the total buoyancy and the second suspension point carrying 2/3 of the total buoyancy. When the towing velocity of the mining system is less than 0.5 m/s, the towing track of the buffer coincides with the set route of the ship on the whole, and the configuration of the flexible hose is also kept well.

  10. On Activity modelling in process modeling

    Directory of Open Access Journals (Sweden)

    Dorel Aiordachioaie

    2001-12-01

    The paper looks at the dynamic feature of the meta-models of the process modelling process: time. Some principles are considered and discussed as main dimensions of any modelling activity: the compatibility of the substances, the equipresence of phenomena, and the solvability of the model. The activity models are considered and represented at the meta-level.

  11. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of that high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION (registered trademark) and Online Dynamics' Autolev (trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and, specifically, to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while yielding a closed-form solution easily employed using COTS control hardware.
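
    For readers unfamiliar with the simplification, the small-angle step works as in the generic linearization below (an illustrative form only; the actual ARIS joint-angle kinematics are more involved):

```latex
% Small-angle linearization: for |theta| << 1 the trigonometric terms in a
% link's endpoint position reduce to expressions linear in theta.
\sin\theta \approx \theta, \qquad \cos\theta \approx 1
\quad\Longrightarrow\quad
\mathbf{r}(\theta)
  = \mathbf{r}_0 + \ell\begin{pmatrix}\cos\theta\\ \sin\theta\end{pmatrix}
  \approx \mathbf{r}_0 + \ell\begin{pmatrix}1\\ \theta\end{pmatrix}.
```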

  12. Feature-Aware Verification

    CERN Document Server

    Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk

    2011-01-01

    A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions (situations in which the combination of features leads to emergent and possibly critical behavior) are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    Science.gov (United States)

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  14. An empirical model for independent dose verification of the Gamma Knife treatment planning.

    Science.gov (United States)

    Phaisangittisakul, Nakorn; Ma, Lijun

    2002-09-01

    A formalism for independent dose verification of Gamma Knife treatment planning is developed. It is based on the approximation that the isodose distribution for a single shot is in the shape of an ellipsoid in three-dimensional space. The dose profiles for a phantom along each of the three major axes are fitted to a function which contains terms that represent the contributions from a point source, extrafocal scattering, and a flat background. The fitting parameters are extracted for all four helmet collimators, at various shot locations, and with different skull shapes. The 33 parameters of a patient's skull shape obtained from the Skull Scaling Instrument measurements are modeled for individual patients. The relative doses for a treatment volume, in the form of a 31 x 31 x 31 matrix of points, are extracted from the treatment planning system, the Leksell GammaPlan (LGP). Our model evaluates the relative doses using the same input parameters as the LGP: skull measurement data, shot location, weight, gamma-angle of the head frame, and helmet collimator size. For 29 single-shot cases, the discrepancy in dose at the focus point between the calculation and the LGP is found to be within -1% to 2%. For multi-shot cases, the value and the coordinate of the maximum dose point from the calculation agree with the LGP results within +/-7% and +/-3 mm, respectively. In general, the calculated doses agree with the LGP calculations within +/-10% for off-center locations. Results calculated with this method for the dimension and location of the 50% isodose line are in good agreement with results from the LGP. Therefore, this method can serve as a useful tool for secondary quality assurance of Gamma Knife treatment plans.

  15. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  16. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
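
    The three M3 steps can be sketched in a few lines. In the sketch below the sample compositions and end members are invented, and an unconstrained least squares stands in for a proper mixing solver (which would typically force proportions to be non-negative and sum to one):

      # Step 1: PCA; step 2: mixing proportions; step 3: mass balance.
      import numpy as np

      # Rows: groundwater samples; columns: measured constituents (illustrative).
      samples = np.array([[10.0, 2.0, 15.0],
                          [40.0, 1.0, 60.0],
                          [25.0, 1.5, 37.0]])

      # Step 1: principal component analysis on centred data; the number of
      # significant components suggests how many end members the data support.
      centred = samples - samples.mean(axis=0)
      _, singular_values, _ = np.linalg.svd(centred, full_matrices=False)

      # Step 2: mixing proportions relative to assumed reference waters.
      end_members = np.array([[5.0, 3.0, 8.0],     # e.g. dilute meteoric water
                              [50.0, 0.5, 75.0]])  # e.g. deep saline water
      props, *_ = np.linalg.lstsq(end_members.T, samples.T, rcond=None)

      # Step 3: mass balance; residuals not explained by mixing are read as
      # reaction sources (+) and sinks (-), reported per constituent.
      reactions = samples - props.T @ end_members
      print(singular_values, props.T, reactions, sep="\n")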

  18. Functional verification coverage measurement and analysis

    CERN Document Server

    Piziali, Andrew

    2007-01-01

    This book addresses a means of quantitatively assessing functional verification progress. Without this process, design and verification engineers, and management, are left guessing whether or not they have completed verifying the device they are designing.

  19. Synergy between Emissions Verification for Climate and Air Quality: Results from Modeling Analysis over the Contiguous US using CMAQ

    Science.gov (United States)

    Liu, Z.; Bambha, R.; Pinto, J. P.; Zeng, T.; Michelsen, H. A.

    2013-12-01

    The synergy between emissions-verification exercises for fossil-fuel CO2 and traditional air pollutants (TAPs, e.g., NOx, SO2, CO, and PM) stems from the common physical processes underlying the generation, transport, and perturbations of their emissions. Better understanding and characterization of this synergetic relationship are of great interest and benefit for science and policy. To this end, we have been developing a modeling framework that allows for studying CO2 along with TAPs on regional through urban scales. The framework is based on the EPA Community Multi-Scale Air Quality (CMAQ) modeling system and has been implemented on a domain over the contiguous US, where abundant observational data and complete emissions information are available. In this presentation, we will show results from a comprehensive analysis of atmospheric CO2 and an array of TAPs observed from multiple networks and platforms (in situ and satellite observations) and those simulated by CMAQ over the contiguous US for the full year of 2007. We will first present the model configurations and input data used for CMAQ CO2 simulations and the results from model evaluations [1]. In light of the unique properties of CO2 compared to TAPs, we tested the sensitivity of model-simulated CO2 to different initial and boundary conditions, biosphere-atmosphere bidirectional fluxes, and fossil-fuel emissions. We then examined the variability of CO2 and TAPs simulated by CMAQ and observed from the NOAA ESRL tall-tower network, the EPA AQS network, and satellites (e.g., SCIAMACHY and OMI) at various spatial and temporal scales. Finally, we diagnosed in CMAQ the roles of fluxes and transport in regulating the covariance between CO2 and TAPs manifested in both surface concentrations and column-integrated densities. We will discuss the implications of these results for understanding the trends and characteristics of fossil-fuel emissions by exploiting and combining currently available observational and modeling

  20. Ontology Model Verification Approach Based on OCL

    Institute of Scientific and Technical Information of China (English)

    钱鹏飞; 王英林; 张申生

    2015-01-01

    In this paper, by combining set and relation theory with the ontology model and by introducing and extending the Object Constraint Language (OCL) from object-oriented technology, we present an OCL-based ontology verification method. The method abstracts an ontology definition meta-model (ODM), based on set and relation theory, from a large number of ontology models. The ontology model is divided into "entity-related elements" and "constraint-rule-related elements", and a series of OCL extension functions provide the formalised expression of these two kinds of ontology model elements, so as to accomplish OCL-based formalised ontology model verification. Finally, the use of this verification approach for ontology model conflict detection and resolution is further discussed through a verification example, the vehicle management ontology slice of the Baosteel information sharing platform.

  1. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    The work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the way the actuators operate is verified against the logic relationships assumed in the control system. The actuators of the considered control system were three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of operation of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data exchange tool in the form of an OPC server.

  2. The experimental verification of the condition of the magnetic material caused by different technological processes

    CERN Document Server

    Tumanski, S

    2000-01-01

    The changes in electrical steel parameters caused by different technological processes have been tested using the magnetovision method. The effects of cutting into strips, stamping of shapes, laser scribing, annealing, bending and stressing have been investigated.

  3. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    -dependent. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types: target-wrong and impostor-wrong while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  5. GREAT Process Modeller user manual

    OpenAIRE

    Rueda, Urko; España, Sergio; Ruiz, Marcela

    2015-01-01

    This report contains instructions to install, uninstall and use GREAT Process Modeller, a tool that supports Communication Analysis, a communication-oriented business process modelling method. GREAT allows creating communicative event diagrams (i.e. business process models), specifying message structures (which describe the messages associated with each communicative event), and automatically generating a class diagram (representing the data model of an information system that would support suc...

  6. Forecast Verification for North American Mesoscale (NAM) Operational Model over Karst/Non-Karst regions

    Science.gov (United States)

    Sullivan, Z.; Fan, X.

    2014-12-01

    Karst is defined as a landscape that contains especially soluble rocks, such as limestone, gypsum, and marble, in which caves, underground water systems, sinkholes, vertical shafts, and subterranean river systems form over time. The cavities and voids within a karst system affect the hydrology of the region and, consequently, can affect the moisture and energy budget at the surface, planetary boundary layer development, convection, and precipitation. Carbonate karst landscapes comprise about 40% of land areas over the continental U.S. east of Tulsa, Oklahoma. Currently, due to the lack of knowledge of the effects karst has on the atmosphere, no existing weather model can represent karst landscapes or simulate their impact. One way to assess the impact of a karst region on the atmosphere is to check the performance of existing weather models over karst and non-karst regions. The North American Mesoscale (NAM) operational forecast is the best example for this, as its historical forecasts were archived. Variables such as precipitation, maximum/minimum temperature, dew point, evapotranspiration, and surface winds were taken into account when checking the model performance over karst versus non-karst regions. The forecast verification focused on the five-year period 2007-2011. Surface station observations, a gridded observational dataset, and the North American Regional Reanalysis (for certain variables with insufficient observations) were used. Thirteen regions of differing climate, size, and landscape composition were chosen across the Contiguous United States (CONUS) for the investigation. Equitable threat score (ETS), frequency bias (fBias), and root-mean-square error (RMSE) scores were calculated and analyzed for precipitation. RMSE and mean bias (Bias) were analyzed for other variables. ETS, fBias, and RMSE scores show generally a pattern of lower forecast skill, a greater magnitude of error, and a greater under-prediction of precipitation over karst than
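
    The precipitation scores named above have standard contingency-table definitions; a minimal sketch, with made-up counts, is:

      # Hits/misses etc. come from comparing forecast and observed exceedances
      # of a precipitation threshold; the counts below are illustrative.
      def ets_and_fbias(hits, false_alarms, misses, correct_negatives):
          n = hits + false_alarms + misses + correct_negatives
          hits_random = (hits + false_alarms) * (hits + misses) / n
          ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
          fbias = (hits + false_alarms) / (hits + misses)
          return ets, fbias

      print(ets_and_fbias(hits=120, false_alarms=60, misses=80, correct_negatives=740))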

  7. Verification of a laboratory-based dilation model for in situ conditions using continuum models

    Institute of Scientific and Technical Information of China (English)

    G. Walton; M.S. Diederichs; L.R. Alejano; J. Arzúa

    2014-01-01

    With respect to constitutive models for continuum modeling applications, the post-yield domain remains the area of greatest uncertainty. Recent studies based on laboratory testing have led to the development of a number of models for brittle rock dilation, which account for both the plastic shear strain and confining stress dependencies of this phenomenon. Although these models are useful in providing an improved understanding of how dilatancy evolves during a compression test, there has been relatively little work performed examining their validity for modeling brittle rock yield in situ. In this study, different constitutive models for rock dilation are reviewed and then tested, in the context of a number of case studies, using a continuum finite-difference approach (FLAC). The uncertainty associated with the modeling of brittle fracture localization is addressed, and the overall ability of mobilized dilation models to replicate in situ deformation measurements and yield patterns is evaluated.

  8. Operative temperature and thermal comfort in the sun - Implementation and verification of a model for IDA ICE

    DEFF Research Database (Denmark)

    Karlsen, Line; Grozman, Grigori; Heiselberg, Per Kvols;

    2015-01-01

    This paper presents the implementation of a mean radiant temperature (MRT) model for IDA Indoor Climate and Energy (IDA ICE). The new feature of the model is that it includes the effect of shortwave radiation in the room and contributes to a more comprehensive prediction of operative temperature, e.g. for a person exposed to direct sunlight. The verification... comfort of persons affected by direct solar radiation. This may further have implications for the predicted energy use and design of the façade, since e.g. an enlarged need for local cooling or use of dynamic solar shading might be discovered....

  9. Experimental Verification of the Physical Model for Droplet-Particles Cleaning in Pulsed Bias Arc Ion Plating

    Institute of Scientific and Technical Information of China (English)

    Yanhui ZHAO; Guoqiang LIN; Chuang DONG; Lishi WEN

    2005-01-01

    It has been reported that the application of pulsed biases in arc ion plating can effectively eliminate droplet particles. The present paper aims at experimental verification of a physical model we proposed previously, which is based on particle charging and repulsion in the pulsed plasma sheath. An orthogonal experiment was designed for this purpose, using the electrical parameters of the pulsed bias for the deposition of TiN films on stainless steel substrates. The effects of these parameters on the amount and the size distribution of the particles were analyzed, and the results provide sufficient evidence for the physical model.

  10. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    Science.gov (United States)

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre, operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  11. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the grounds for classification. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  12. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The programs in question are Odeon 3.0 Industrial and Odeon 3.0 Combined, which allow the modelling of point sources, surface sources and line sources. Combining these three source types it is possible to model huge machinery in an easy and visually clear way. Traditionally room acoustic simulations have been aimed at auditorium acoustics. The aim of the simulations has been to model the room acoustic measuring setup consisting...

  13. Focus points and convergent process operators (A proof strategy for protocol verification)

    NARCIS (Netherlands)

    Groote, J.F.; Springintveld, J.

    2008-01-01

    We present a strategy for finding algebraic correctness proofs for communication systems. It is described in the setting of μCRL [11], which is, roughly, ACP [2,3] extended with a formal treatment of the interaction between data and processes. The strategy has already been applied successfully in [4

  14. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.
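
    The paper writes the ATM model in PROMELA and checks LTL formulas with SPIN; as a language-neutral illustration of what such a check does, the toy sketch below enumerates every reachable state of a hypothetical ATM transition system and tests the safety property "cash is never dispensed without a verified PIN" (the states, transitions, and property are invented, not taken from the paper):

      # Each state is (mode, pin_verified); TRANSITIONS maps a state to its successors.
      TRANSITIONS = {
          ("idle", False): [("card_inserted", False)],
          ("card_inserted", False): [("pin_ok", True), ("idle", False)],
          ("pin_ok", True): [("dispensing", True), ("idle", False)],
          ("dispensing", True): [("idle", False)],
      }

      def violates(state):
          mode, pin_verified = state
          return mode == "dispensing" and not pin_verified

      def holds(initial):
          seen, stack = set(), [initial]
          while stack:  # depth-first search over the reachable state space
              state = stack.pop()
              if state in seen:
                  continue
              seen.add(state)
              if violates(state):
                  return False  # counterexample state reached
              stack.extend(TRANSITIONS.get(state, []))
          return True  # no reachable state violates the property

      print(holds(("idle", False)))  # True for this toy model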

  15. Part weight verification between simulation and experiment of plastic part in injection moulding process

    Science.gov (United States)

    Amran, M. A. M.; Idayu, N.; Faizal, K. M.; Sanusi, M.; Izamshah, R.; Shahir, M.

    2016-11-01

    In this study, the main objective is to determine the percentage difference in part weight between experimental and simulation work. The effect of process parameters on the weight of the plastic part is also investigated. The process parameters involved were mould temperature, melt temperature, injection time and cooling time. Autodesk Simulation Moldflow software was used to run the simulation of the plastic part. The Taguchi method was selected as the design of experiments to conduct the experiment. Then, the simulation result was validated against the experimental result. It was found that the minimum and maximum percentage differences in part weight between simulation and experimental work are 0.35% and 1.43%, respectively. In addition, the most significant parameter affecting part weight is the mould temperature, followed by melt temperature, injection time and cooling time.
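
    For illustration, the reported difference is a simple relative error; the weights below are invented, chosen only to land inside the reported 0.35%-1.43% band:

      simulated_g, measured_g = 5.62, 5.70  # hypothetical part weights in grams
      diff_percent = abs(simulated_g - measured_g) / measured_g * 100
      print(f"{diff_percent:.2f} %")  # 1.40 %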

  16. Design and verification of shielding for the advanced spent fuel conditioning process facility.

    Science.gov (United States)

    Cho, I J; Kook, D H; Kwon, K C; Lee, E P; Choung, W M; You, G S

    2008-05-01

    An Advanced spent fuel Conditioning Process Facility (ACPF) has recently been constructed by modifying previously unused cells. The ACPF is a hot cell with two rooms located in the basement of the Irradiated Materials Experiment Facility (IMEF) at the Korea Atomic Energy Research Institute. It is intended to demonstrate the advanced spent fuel conditioning process proposed in Korea, an electrolytic reduction process that converts spent oxide fuels into a metallic form. The ACPF was designed with a high-density concrete shield wall more than 90 cm thick to handle 1.38 PBq (37,430 Ci) of radioactive materials with dose rates lower than 10 µSv/h in the operational areas (7,000 zone) and 150 µSv/h in the service areas (8,000 zone). In Monte Carlo calculations with a design-basis source inventory, the results for the bounding wall showed a maximum dose rate of 3 µSv/h at the exterior surface of the ACPF for gamma radiation and 0.76 µSv/h for neutrons. All the bounding structures of the ACPF were investigated to check the shielding performance of the facility and to ensure its radiation safety. A test was performed with a 2.96 TBq (80 Ci) 60Co source unit, and the test results were compared with the calculation results. A few failure points were discovered and carefully fixed to meet the design criteria. After fixing the problems, the failure points were rechecked and the safety of the shielding structures was confirmed. In conclusion, it was confirmed that all the investigated parts of the ACPF passed the shielding safety limits in this verification program and that the ACPF is ready to fulfill its tasks for the advanced spent fuel conditioning process.

  17. Wallas' Four-Stage Model of the Creative Process: More than Meets the Eye?

    Science.gov (United States)

    Sadler-Smith, Eugene

    2015-01-01

    Based on a detailed reading of Graham Wallas' "Art of Thought" (1926) it is argued that his four-stage model of the creative process (Preparation, Incubation, Illumination, Verification), in spite of holding sway as a conceptual anchor for many creativity researchers, does not reflect accurately Wallas' full account of the creative…

  18. BPMN Impact on Process Modeling

    OpenAIRE

    Polak, Przemyslaw

    2013-01-01

    Recent years have seen a huge rise in the popularity of BPMN in the area of business process modeling, especially among business analysts. This notation has characteristics that distinguish it significantly from previously popular process modeling notations, such as EPC. The article contains an analysis of some important characteristics of BPMN and provides the author's conclusions on the impact that the popularity and specificity of BPMN can have on the practice of process modeling. Author's obse...

  19. Assembly, integration, and verification (AIV) in ALMA: series processing of array elements

    Science.gov (United States)

    Lopez, Bernhard; Jager, Rieks; Whyborn, Nicholas D.; Knee, Lewis B. G.; McMullin, Joseph P.

    2012-09-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an aperture synthesis array in the (sub)millimeter wavelength range. It is the responsibility of ALMA AIV to deliver the fully assembled, integrated, and verified antennas (array elements) to the telescope array. After an initial phase of infrastructure setup, AIV activities began when the first ALMA antenna and subsystems became available in mid-2008. During the second semester of 2009 a project-wide effort was made to put into operation a first 3-antenna interferometer at the Array Operations Site (AOS). In 2010 the AIV focus was the transition from event-driven activities towards routine series production. Also, due to the ramp-up of operations activities, AIV underwent an organizational change from an autonomous department into a project within a strong matrix management structure. When the subsystem deliveries stabilized in early 2011, steady-state series processing could be achieved in an efficient and reliable manner. The challenge today is to maintain this production pace until completion towards the end of 2013. This paper describes the way ALMA AIV evolved successfully from the initial phase to the present steady state of array element series processing. It elaborates on the different project phases and their relationships, presents processing statistics, illustrates the lessons learned and relevant best practices, and concludes with an outlook on the path towards completion.

  20. Experimental determination of temperatures of the inner wall of a boiler combustion chamber for the purpose of verification of a CFD model

    Directory of Open Access Journals (Sweden)

    Petr Trávníček

    2011-01-01

    Full Text Available The paper focuses on a non-destructive method for the determination of temperatures in a boiler combustion chamber. This method is significant mainly for CFD (Computational Fluid Dynamics) simulations of combustion processes, where it is advisable to verify the data calculated by the CFD software against actually measured data. Verification of the method was based on reference combustion equipment (130 kW) that burns a mixture of waste sawdust and shavings originating from the production of wooden furniture. Measuring temperatures inside the combustion chamber is, mainly because of the high temperatures, highly demanding and requires a special type of temperature sensor. Furthermore, in standard operation it is not possible to install such sensors without structural alterations of the boiler. Therefore, a special experimental device was constructed for the determination of these temperatures, exploiting a thermal imaging system to monitor the surface temperature of the outer wall of the reference boiler. Temperatures on the wall of the boiler combustion chamber were determined on the basis of data measured by the experimental device together with data from the thermal imaging system. These values may serve for the verification of the respective CFD model of the combustion equipment.

  1. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. Also, an important impact of this factor on pre-volitional stages of aggressive behavior was identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not negative emotions. These results correspond with the theory of self-regulation, where behavior focused on goal attainment is accompanied by positive emotions (see, for example, Cacioppo, Gardner & Berntson, 1999; Carver, 2004). Results confirmed not only a sufficient model fit, but also explained 53% of the variance of behavioral desire, 68% of intention and 37% of behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample. Some results could be affected by the lower sample size. For the future we recommend using other types of aggressive behavior in verifying the EMGB and applying a more complex incorporation of inhibition into the model. Lastly, this study is correlational in character; therefore further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  2. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6: Temperature Modeling of Applegate Lake Using CE-QUAL-W2. A Report on the Development, Calibration, Verification, and Application of the Model. Environmental Laboratory. Tammy L. Threadgill, Daniel F. Turner, Laurie A. Nicholas, Barry W. Bunch, Dorothy H. Tillman, and David L. Smith. May 2017. Approved for public release; distribution is unlimited. The U.S. Army Engineer Research and...

  3. Remaining Sites Verification Package for the 100-F-26:12, 1.8-m (72-in.) Main Process Sewer Pipeline, Waste Site Reclassification Form 2007-034

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Capron

    2008-04-29

    The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.

  4. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already validates user input. By providing rapid and helpful responses while users write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
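
    A sketch of the validation-with-tests style described here is given below; the card name, bound, and messages are hypothetical, not NJOY21's actual input schema:

      def validate_temperature(card: dict) -> list:
          """Return human-readable problems found in a temperature card."""
          errors = []
          temperature = card.get("temperature")
          if temperature is None:
              errors.append("temperature: required value is missing")
          elif temperature <= 0:
              errors.append(f"temperature: must be positive, got {temperature}")
          return errors

      # Test-driven development: each validation rule ships with a test that
      # pins down the exact feedback a user sees while writing an input file.
      def test_negative_temperature_is_rejected():
          assert validate_temperature({"temperature": -300.0}) == [
              "temperature: must be positive, got -300.0"
          ]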

  5. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1], operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring long-term trending, and satellite retrieved products [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements in a time series and angular dependency. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is a very useful tool for cross-sensor verifications. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between measurements of the two instruments. The CRTM is particularly useful to reduce the difference between instruments for climate studies [4]. In this study, we will carry out the assessment of the Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and data for Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive bands. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high quality radiosondes were launched when Suomi NPP flew over NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data includes air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q
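
    The double-difference method mentioned above reduces to one subtraction: differencing the observed-minus-simulated biases of two instruments cancels error shared through the common radiative transfer model. The brightness temperatures below are invented, not AEROSE data:

      obs_a, sim_a = 287.4, 286.9  # instrument A: observed vs CRTM-simulated (K)
      obs_b, sim_b = 286.8, 286.1  # instrument B: observed vs CRTM-simulated (K)
      double_difference = (obs_a - sim_a) - (obs_b - sim_b)
      print(f"double difference: {double_difference:.2f} K")  # -0.20 K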

  6. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is a part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, construction and verification of airport simulation model using Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  7. Radiolysis Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq-, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, and the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. As compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
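
    A heavily reduced sketch of how such a radiolytic balance can be integrated is shown below; a single production term and a single first-order consumption term stand in for the report's 30-40 retained reactions, and both rate parameters are assumed:

      from scipy.integrate import solve_ivp

      g_h2o2 = 1.0e-9    # radiolytic H2O2 production, mol/(L*s)  (assumed)
      k_cons = 2.0e-3    # pseudo-first-order H2O2 consumption, 1/s (assumed)

      def rhs(t, y):
          h2o2 = y[0]
          return [g_h2o2 - k_cons * h2o2]

      sol = solve_ivp(rhs, (0.0, 5.0e4), [0.0])
      print(sol.y[0, -1])  # approaches the steady state g_h2o2 / k_cons = 5e-7 M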

  8. Verification of topological relationship in 2-D grain growth process by simulation

    Institute of Scientific and Technical Information of China (English)

    Chao Wang; Guoquan Liu; Ya Sun; Xiangge Qin

    2004-01-01

    Behaviors of the quasi-steady state grain size distribution and the corresponding topological relationship were investigated using the Potts Monte Carlo method to simulate the normal grain growth process. The observed quasi-steady state grain size distribution is well fit by the Weibull function rather than the Hillert distribution. It is also found that the grain size and the average number of grain sides are not linearly related. The deviation of the quasi-steady state grain size distribution from the Hillert distribution may be attributed to this nonlinearity in the relation between the average number of grain sides and the grain size. The results also support the reasonableness of the relationship deduced by Mullins between the grain size distribution and the average number of grain sides.
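
    Fitting a Weibull function to a normalized grain-size sample, as done for the quasi-steady state distribution, can be sketched as follows (the sample below is synthetic, not simulation output):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      grain_sizes = rng.weibull(2.5, size=2000)      # stand-in for simulated grains
      normalized = grain_sizes / grain_sizes.mean()  # R / <R>

      shape, loc, scale = stats.weibull_min.fit(normalized, floc=0.0)
      print(f"Weibull shape={shape:.2f}, scale={scale:.2f}")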

  9. Verification and Validation of the Spalart-Allmaras Turbulence Model for Strand Grids

    Science.gov (United States)

    2013-01-01

    ...unpredictable, thus making its prediction and simulation difficult. Nobel Laureate Richard Feynman famously described turbulence as "the most important...

  10. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao;

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  11. A mathematical model of the nickel converter: Part I. Model development and verification

    Science.gov (United States)

    Kyllo, A. K.; Richards, G. G.

    1991-04-01

    A mathematical model of the nickel converter has been developed. The primary assumption of the model is that the three phases in the converter are in thermal and chemical equilibrium. All matte, slag, and gas in the converter is brought to equilibrium at the end of each of a series of short time steps throughout an entire charge. An empirical model of both the matte and slag is used to characterize the activity coefficients in each phase. Two nickel sulfide species were used to allow for the modeling of sulfur-deficient mattes. A heat balance is carried out over each time step, considering the major heat flows in the converter. The model was validated by a detailed comparison with measured data from six industrial charges. The overall predicted mass balance was shown to be close to that seen in actual practice, and the heat balance gave a good fit of converter temperature up to the last two or three blows of a charge. At this point, reactions in the converter begin to deviate strongly from “equilibrium,” probably due to the converter reactions coming under liquid-phase mass-transfer control. While the equilibrium assumption does work, it is not strictly valid, and the majority of the charge is probably under gas-phase mass-transfer control.

  12. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  13. Verification and process oriented validation of the MiKlip decadal prediction system

    Directory of Open Access Journals (Sweden)

    Frank Kaspar

    2016-12-01

    Full Text Available Decadal prediction systems are designed to become a valuable tool for decision making in different sectors of the economy, administration and politics. Progress in decadal predictions is also expected to improve our scientific understanding of the climate system. The German Federal Ministry for Education and Research (BMBF) therefore funds the German national research project MiKlip (Mittelfristige Klimaprognosen). A network of German research institutions contributes to the development of the system by conducting individual research projects. This special issue presents a collection of papers with results of the evaluation activities within the first phase of MiKlip. They document the improvements of the MiKlip decadal prediction system that were achieved during the first phase. Key aspects are the role of initialization strategies, model resolution and ensemble size. Additional topics are the evaluation of specific weather parameters in selected regions and the use of specific observational datasets for the evaluation.

  14. Manufactured solutions and the numerical verification of isothermal, nonlinear, three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2012-07-01

    Full Text Available The technique of manufactured solutions is used for verification of computational models in many fields. In this paper we construct manufactured solutions for models of three-dimensional, isothermal, nonlinear Stokes flow in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet and other model parameters. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests and altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Results from the computational model show excellent agreement with the manufactured analytic solutions.
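
    The manufactured-solutions recipe itself is compact: choose a solution, derive the matching source term symbolically, then feed both to the solver under test. The sketch below applies it to a 1-D Poisson problem rather than the paper's nonlinear Stokes system, and the chosen solution is arbitrary:

      import sympy as sp

      x = sp.symbols("x")
      u_manufactured = sp.sin(sp.pi * x)       # chosen exact solution (assumed)
      source = -sp.diff(u_manufactured, x, 2)  # f = -u'' so that -u'' = f holds

      print(sp.simplify(source))               # pi**2*sin(pi*x)
      # A solver for -u'' = f with u(0) = u(1) = 0 should then recover
      # u_manufactured at its discretization's order of accuracy.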

  15. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  16. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  17. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  18. Distinct neural correlates for pragmatic and semantic meaning processing: an event-related potential investigation of scalar implicature processing using picture-sentence verification.

    Science.gov (United States)

    Politzer-Ahles, Stephen; Fiorentino, Robert; Jiang, Xiaoming; Zhou, Xiaolin

    2013-01-15

    The present study examines the brain-level representation and composition of meaning in scalar quantifiers (e.g., some), which have both a semantic meaning (at least one) and a pragmatic meaning (not all). We adopted a picture-sentence verification design to examine event-related potential (ERP) effects of reading infelicitous quantifiers for which the semantic meaning was correct with respect to the context but the pragmatic meaning was not, compared to quantifiers for which the semantic meaning was inconsistent with the context and no additional pragmatic meaning is available. In the first experiment, only pragmatically inconsistent quantifiers, not semantically inconsistent quantifiers, elicited a sustained posterior negative component. This late negativity contrasts with the N400 effect typically elicited by nouns that are incongruent with their context, suggesting that the recognition of scalar implicature errors elicits a qualitatively different ERP signature than the recognition of lexico-semantic errors. We hypothesize that the sustained negativity reflects cancellation of the pragmatic inference and retrieval of the semantic meaning. In our second experiment, we found that the process of re-interpreting the quantifier was independent from lexico-semantic processing: the N400 elicited by lexico-semantic violations was not modulated by the presence of a pragmatic inconsistency. These findings suggest that inferential pragmatic aspects of meaning are processed using different mechanisms than lexical or combinatorial semantic aspects of meaning, that inferential pragmatic meaning can be realized rapidly, and that the computation of meaning involves continuous negotiation between different aspects of meaning.

  19. FEM Analysis and Experimental Verification of the Integral Forging Process for AP1000 Primary Coolant Pipe

    Science.gov (United States)

    Wang, Shenglong; Yu, Xiaoyi; Yang, Bin; Zhang, Mingxian; Wu, Huanchun

    2016-10-01

    AP1000 primary coolant pipes must be manufactured by integral forging technology according to the designer, Westinghouse Electric Co. The size and special shape of these large pipes lead to nonuniform distributions of temperature, effective stress, and effective strain during shaping. This paper presents a three-dimensional finite element simulation (3D FEM) of the integral forging process and qualitatively evaluates the likelihood of forging defects. By analyzing the evolution histories of the three field variables, we concluded that the initial forging temperature should be strictly controlled within the interval 1123 K to 1423 K (850 °C to 1150 °C) to avoid second-phase precipitation. In the hard deformation zones, small strains do not contribute to recrystallization, resulting in coarse grains. Conversely, in the free deformation zone, large strains can contribute to dynamic recrystallization, favoring grain refinement and the closure of voids. Cracks are likely to appear on the workpiece surface, however, when forging leads to large deformations. Based on the simulation results, an eligible workpiece with good mechanical properties, few macroscopic defects, and favorable grain size has been successfully forged in experiments at an industrial scale, which validates the FEM simulation.

  20. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  1. Analysing the Logic and Rigor in the Process of Verification of the HACCP Plan

    Institute of Scientific and Technical Information of China (English)

    秦红

    2013-01-01

    The HACCP system is a systematic, preventive food safety control system that is receiving growing attention from export food processing enterprises; it has been widely applied in many enterprises, with clear quality-improvement results. With the continued development of HACCP, more and more enterprises are applying for HACCP verification or certification, whether or not domestic or foreign authorities require it. Enterprises currently draw up their HACCP plans largely following the model given in the "HACCP Guidance" compiled by the US Seafood HACCP Alliance for Training and Education, whose requirements are very strict. This paper analyses the logic and rigor of verifying an HACCP plan under this model, with the aim of helping auditors improve the HACCP plan verification process and ensuring effective implementation of the HACCP plan.

  2. Modeling Software Processes and Artifacts

    NARCIS (Netherlands)

    van den Berg, Klaas; Bosch, Jan; Mitchell, Stuart

    1997-01-01

    The workshop on Modeling Software Processes and Artifacts explored the application of object technology in process modeling. After the introduction and the invited lecture, a number of participants presented their position papers. First, an overview is given on some background work, and the aims, as

  3. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
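
    The verification-plan structure described above maps naturally onto a small data model. The sketch below, in Python rather than SysML, is illustrative only: the requirement text, owner, and event names are hypothetical, not LSST data. It shows the Plan, Activity, and Event decomposition the record describes.

```python
# Illustrative data model (not LSST's actual SysML model) of the
# verification-plan structure described in the record above.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Method(Enum):
    INSPECTION = "inspection"
    ANALYSIS = "analysis"
    DEMONSTRATION = "demonstration"
    TEST = "test"


@dataclass
class VerificationPlan:
    requirement_id: str              # the requirement being verified
    verification_requirement: str
    success_criteria: str
    methods: List[Method]
    level: str                       # e.g. "subsystem" or "system"
    owner: str


@dataclass
class VerificationActivity:
    plan: VerificationPlan
    method: Method                   # one activity per method of the plan


@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)
    depends_on: List["VerificationEvent"] = field(default_factory=list)


# Usage with hypothetical content: one plan, its activity, one event.
plan = VerificationPlan("REQ-0042", "Verify image quality budget is met",
                        "Measured value below threshold", [Method.TEST],
                        "system", "Systems Engineering")
event = VerificationEvent("Integrated test campaign",
                          [VerificationActivity(plan, Method.TEST)])
```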

  4. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
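
    As a rough illustration of what such a wrapper does, the sketch below drives one MET comparison from Python (the SPoRT scripts themselves are Perl). It assumes MET's grid_stat tool and its conventional invocation, grid_stat FCST OBS CONFIG -outdir DIR; all file names below are hypothetical placeholders.

```python
# Minimal sketch of a MET wrapper in the spirit of the SPoRT scripts.
# Assumes the grid_stat tool's conventional positional arguments:
#   grid_stat FCST_FILE OBS_FILE CONFIG_FILE [-outdir DIR]
# All paths below are hypothetical placeholders.
import subprocess
from pathlib import Path


def run_grid_stat(fcst_file: str, obs_file: str, config: str, outdir: str) -> None:
    """Run one grid_stat comparison of a local model forecast vs. an analysis."""
    Path(outdir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["grid_stat", fcst_file, obs_file, config, "-outdir", outdir],
        check=True,
    )


if __name__ == "__main__":
    # Verify one local WRF-EMS forecast hour against a gridded analysis.
    run_grid_stat("wrfout_d01_2014-01-01_12.nc",
                  "analysis_2014-01-01_12.nc",
                  "GridStatConfig_EMS",
                  "met_out/2014010112")
```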

  5. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing or proposed processes reported in the scientific literature, and the model parameters that drive the main dynamic behavior can be identified, giving a better understanding of this type of process. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid, among others. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes; in this way, the best outcome of the process can be obtained by understanding the types of modification that are required for process optimization. An effective evaluation...

  6. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  7. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  8. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  9. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of the CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is then verified by inputting the HP to KeYmaera. The advantage of the approach is that it models the CPS intuitively and verifies its safety rigorously while avoiding state-space explosion.

  10. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  11. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  12. Verification of mathematical models for calculation of viscosity of molten oxide systems

    Directory of Open Access Journals (Sweden)

    S. Rosypalová

    2014-06-01

    Full Text Available The subject of this work is the comparison of numerically obtained values of dynamic viscosity, computed using different types of mathematical models, with experimentally measured viscosity data for oxide systems. The ternary system SiO2-CaO-Al2O3, which represents a simplified basis of the casting powders used in the technological process, was subjected to the experiments. Experimental research on dynamic viscosity is highly limited by its complexity, which is why model studies play such an important role in this field. For the mathematical calculation of viscosity, the NPL, Iida, and Urbain models were chosen. The results of the simulations were compared with the experimentally obtained viscosity values.

  13. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore, we analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...

  14. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore, we analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on all the single names included in the iTraxx Europe index. The performances are compared with those of the classical CIR...

  15. Quantitative Verification in Practice

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.

    2010-01-01

    Soon after the birth of model checking, the first theoretical achievements were reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities

  16. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sarkar, Avik [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sundaresan, Sankaran [Princeton Univ., NJ (United States)

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.
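
    The idea of a filtered drag model can be summarized in a few lines: a coarse-grid cell evaluates a homogeneous (microscopic) drag closure and scales it by a sub-grid correction that accounts for unresolved clustering. The sketch below uses the standard Wen-Yu closure for the microscopic part, but the correction function h() is a made-up placeholder, not the correlation of Sarkar et al.; only its qualitative role is representative.

```python
# Generic sketch of applying a sub-grid "filtered" correction to a
# homogeneous drag closure in a coarse-grid simulation. The function
# h_filtered() is a made-up placeholder, NOT the correlation from
# Sarkar et al.; only its role (0 < h <= 1, with stronger attenuation
# for larger filter sizes and intermediate solids fractions) is meant
# to mirror the idea of a filtered drag model.
import numpy as np


def wen_yu_drag(phi_s, rho_g, mu_g, d_p, slip):
    """Homogeneous (microscopic) Wen-Yu drag coefficient [kg/m^3/s]."""
    phi_g = 1.0 - phi_s
    re = phi_g * rho_g * d_p * abs(slip) / mu_g
    cd = 24.0 / np.maximum(re, 1e-12) * (1.0 + 0.15 * re**0.687)
    return 0.75 * cd * phi_g * phi_s * rho_g * abs(slip) / d_p * phi_g**-2.65


def h_filtered(phi_s, filter_size, d_p):
    """Placeholder sub-grid correction: attenuates drag as clusters form."""
    dimless = filter_size / d_p
    return 1.0 / (1.0 + 0.05 * dimless * np.sin(np.pi * np.clip(phi_s, 0.0, 0.6) / 0.6))


beta_micro = wen_yu_drag(phi_s=0.2, rho_g=1.2, mu_g=1.8e-5, d_p=75e-6, slip=0.3)
beta_coarse = h_filtered(0.2, filter_size=8 * 75e-6, d_p=75e-6) * beta_micro
print(f"microscopic drag {beta_micro:.0f}, filtered drag {beta_coarse:.0f}")
```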

  17. Modelling of CWS combustion process

    Science.gov (United States)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary CWS combustion regimes are provided. During modelling, the problem of determining the possible equilibrium composition of products obtainable from CWS combustion at different temperatures is solved.

  18. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC problem, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic-process insights and the reverse design approach to arrive at the final process design and controller design decisions. The developed methodology is illustrated through the design of: (a) a single reactor, (b) a single separator, and (c) a reactor-separator-recycle system, and is shown to provide effective solutions...

  19. Formal Verification of UML Profile

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notation, the structural view is modeled by class, component, and object diagrams, and the behavioral view is modeled by activity, use case, state, and sequence diagrams. However, UML does not provide a formal syntax, and therefore its semantics is not formally definable; to assure correctness, we need to incorporate semantic reasoning through verification, specification, and refinement, and to incorporate these into the development process. Our research motivation is to make the structural view easier to work with and to suggest formal techniques and methods which can best be applied to or used for UML-based system development. We investigate the tools and methods which are broadly used for formal

  20. Arms control verification: The technologies that make it possible

    Energy Technology Data Exchange (ETDEWEB)

    Tsipis, K.; Hafemeister, D.W.; Janeway, P.

    1986-01-01

    This book presents papers on arms control verification. Topics considered include the politics of treaty verification and compliance, national security, remote sensing, image processing, image enhancement by digital computer, charge-coupled device image sensors, radar imaging, infrared surveillance, monitoring, seismological aspects, satellite verifications, seismic verification, and verifying a fissile material production freeze.

  1. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  2. Nanofibre distribution in composites manufactured with epoxy reinforced with nanofibrillated cellulose: model prediction and verification

    Science.gov (United States)

    Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina

    2016-07-01

    In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.
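
    A toy version of such a scattering-based Monte Carlo is sketched below: nanofibre properties are drawn from probability density functions and each fibre is advanced through the fabric until a randomly triggered capture event. All distribution parameters and the capture rule are illustrative assumptions, not the paper's model.

```python
# Toy Monte Carlo in the spirit of the scattering model described above:
# nanofibres are drawn from probability density functions and advanced
# step by step through the fabric until captured. The capture rule and
# all parameter values are illustrative, not those of the paper.
import random


def penetration_depth(vf_fabric, n_fibres=10_000, step=1.0, max_depth=100.0):
    """Return the mean penetration depth (um) of nanofibres into the fabric."""
    depths = []
    for _ in range(n_fibres):
        length = random.lognormvariate(0.0, 0.5)   # nanofibre length (um), assumed PDF
        depth = 0.0
        while depth < max_depth:
            depth += step
            # Capture probability per step grows with fabric volume fraction
            # and with nanofibre length (longer fibres bridge pores sooner).
            if random.random() < vf_fabric * min(1.0, 0.2 * length):
                break
        depths.append(depth)
    return sum(depths) / len(depths)


print(penetration_depth(vf_fabric=0.5))    # dense fibre bundle: shallow penetration
print(penetration_depth(vf_fabric=0.05))   # open inter-bundle channel: deep penetration
```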

  3. DISCRETE DYNAMIC MODEL OF BEVEL GEAR – VERIFICATION THE PROGRAM SOURCE CODE FOR NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-06-01

    Full Text Available This article presents a new physical and mathematical model of a bevel gear for studying the influence of design parameters and operating factors on the dynamic state of the gear transmission. It discusses the process of verifying proper operation of the authors' calculation program used to determine the solutions of the dynamic model of the bevel gear, and presents the block diagram of the computing algorithm that was used to create the program for the numerical simulation. The program source code is written in an interactive environment for scientific and engineering calculations, MATLAB

  4. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept of a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in the absence of control, and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system.

  5. Modelling and verification of single slope solar still using ANSYS-CFX

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [Research Scholar, Kadi Sarvavishwavidyalaya University, Gandhinagar (India); Shah, P.K. [Principal, Silver Oak College of Engineering and Technology, Ahmedabad (India)

    2011-07-01

    Solar distillation is an easy, small-scale, and cost-effective technique for providing safe water. It requires an energy input as heat, and solar radiation can be the source of energy. A solar still is a device which uses the process of solar distillation. Here, a two-phase, three-dimensional model of the evaporation and condensation processes in a solar still was built and simulated using ANSYS CFX. Simulation results were compared with experimental data from a single-basin solar still under the climate conditions of Mehsana (23°12′ N, 72°30′). There is good agreement between the experimental and simulation results for distillate output, water temperature, and heat transfer coefficients. Overall, the study shows that ANSYS CFX is a powerful tool for diagnosis as well as analysis of solar stills.

  6. Model-based risk analysis of coupled process steps.

    Science.gov (United States)

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regard to process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts, and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and the design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  7. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for the raw material supply of processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation towards achieving a cumulative effect for the integrated structure, which serves as the criterion function; it is maximized by optimizing capacities, the volumes of raw material deliveries and their quality characteristics, the costs of industrial processing of the raw materials, and the demand for dairy products.

  8. Adsorption and biodegradation of 2-chlorophenol by mixed culture using activated carbon as a supporting medium-reactor performance and model verification

    Science.gov (United States)

    Lin, Yen-Hui

    2016-12-01

    A non-steady-state mathematical model system for the kinetics of adsorption and biodegradation of 2-chlorophenol (2-CP) by attached and suspended biomass on activated carbon process was derived. The mechanisms in the model system included 2-CP adsorption by activated carbon, 2-CP mass transport diffusion in biofilm, and biodegradation by attached and suspended biomass. Batch kinetic tests were performed to determine surface diffusivity of 2-CP, adsorption parameters for 2-CP, and biokinetic parameters of biomass. Experiments were conducted using a biological activated carbon (BAC) reactor system with high recycled rate to approximate a completely mixed flow reactor for model verification. Concentration profiles of 2-CP by model predictions indicated that biofilm bioregenerated the activated carbon by lowering the 2-CP concentration at the biofilm-activated carbon interface as the biofilm grew thicker. The removal efficiency of 2-CP by biomass was approximately 98.5% when 2-CP concentration in the influent was around 190.5 mg L-1 at a steady-state condition. The concentration of suspended biomass reached up to about 25.3 mg L-1 while the thickness of attached biomass was estimated to be 636 μm at a steady-state condition by model prediction. The experimental results agree closely with the results of the model predictions.
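
    One building block of such a model is the growth kinetics of the suspended biomass, conventionally described by Monod kinetics. The sketch below simulates only that component for a batch reactor; adsorption and biofilm diffusion are omitted, and the parameter values are assumed for illustration rather than taken from the study.

```python
# Minimal sketch of one component of the model above: Monod growth of
# suspended biomass on 2-CP in a batch reactor. Adsorption on activated
# carbon and diffusion into the biofilm are omitted; parameter values
# are illustrative assumptions, not the study's fitted biokinetics.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Y, kd = 0.15, 25.0, 0.6, 0.01   # 1/h, mg/L, mg/mg, 1/h (assumed)


def monod(t, y):
    S, X = y                                  # substrate (2-CP) and biomass, mg/L
    mu = mu_max * S / (Ks + S)                # specific growth rate
    return [-mu * X / Y,                      # substrate consumption
            (mu - kd) * X]                    # growth minus decay


sol = solve_ivp(monod, (0.0, 72.0), [190.5, 5.0])   # influent-like 2-CP level
print(f"2-CP after 72 h: {sol.y[0, -1]:.1f} mg/L, biomass: {sol.y[1, -1]:.1f} mg/L")
```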

  9. Multi-dimensional boron transport modeling in subchannel approach: Part I. Model selection, implementation and verification of COBRA-TF boron tracking model

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, Ozkan Emre, E-mail: ozdemir@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Avramova, Maria N., E-mail: mna109@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Sato, Kenya, E-mail: kenya_sato@mhi.co.jp [Mitsubishi Heavy Industries (MHI), Kobe (Japan)

    2014-10-15

    Highlights: ► Implementation of multidimensional boron transport model in a subchannel approach. ► Studies on cross flow mechanism, heat transfer and lateral pressure drop effects. ► Verification of the implemented model via code-to-code comparison with CFD code. - Abstract: The risk of reflux condensation especially during a Small Break Loss Of Coolant Accident (SB-LOCA) and the complications of tracking the boron concentration experimentally inside the primary coolant system have stimulated and subsequently have been a focus of many computational studies on boron tracking simulations in nuclear reactors. This paper presents the development and implementation of a multidimensional boron transport model with Modified Godunov Scheme within a thermal-hydraulic code based on a subchannel approach. The cross flow mechanism in multiple-subchannel rod bundle geometry as well as the heat transfer and lateral pressure drop effects are considered in the performed studies on simulations of deboration and boration cases. The Pennsylvania State University (PSU) version of the COBRA-TF (CTF) code was chosen for the implementation of three different boron tracking models: First Order Accurate Upwind Difference Scheme, Second Order Accurate Godunov Scheme, and Modified Godunov Scheme. Based on the performed nodalization sensitivity studies, the Modified Godunov Scheme approach with a physical diffusion term was determined to provide the best solution in terms of precision and accuracy. As a part of the verification and validation activities, a code-to-code comparison was carried out with the STAR-CD computational fluid dynamics (CFD) code and presented here. The objective of this study was two-fold: (1) to verify the accuracy of the newly developed CTF boron tracking model against CFD calculations; and (2) to investigate its numerical advantages as compared to other thermal-hydraulics codes.
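
    The schemes compared above are all discretizations of the 1-D boron advection equation ∂c/∂t + u ∂c/∂x = 0. The sketch below implements the simplest of them, the first-order upwind scheme, for a deboration transient; the Modified Godunov scheme with a physical diffusion term, preferred by the authors, is not reproduced. Grid spacing, velocity, and boron concentrations are illustrative.

```python
# First-order upwind advection of boron concentration in 1-D, the
# simplest of the three schemes named above. All numbers illustrative.
import numpy as np

nx, dx, dt = 200, 0.05, 0.02    # cells, cell size (m), time step (s)
u = 1.0                          # axial coolant velocity (m/s); CFL = u*dt/dx = 0.4
c = np.full(nx, 1500.0)          # initial boron concentration (ppm)


def step_upwind(c, c_inlet):
    """One explicit upwind step for u > 0 with a fixed inlet concentration."""
    c_next = c.copy()
    c_next[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])
    c_next[0] = c_inlet
    return c_next


for _ in range(300):             # deboration: unborated coolant enters at x = 0
    c = step_upwind(c, c_inlet=0.0)

# The sharp deboration front is smeared by numerical diffusion, which is
# the motivation for the higher-order Godunov-type schemes in the paper.
print(f"cells in transition band: {((c > 75) & (c < 1425)).sum()}")
```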

  10. Design and verification of a simple 3D dynamic model of speed skating which mimics observed forces and motions.

    Science.gov (United States)

    van der Kruk, E; Veeger, H E J; van der Helm, F C T; Schwab, A L

    2017-09-14

    Advice about the optimal coordination pattern for an individual speed skater could be addressed by simulation and optimization of a biomechanical speed skating model. But before getting to this optimization approach, one needs a model that can reasonably match observed behaviour. Therefore, the objective of this study is to present a verified three-dimensional inverse skater model with minimal complexity, which models the speed skating motion on the straights. The model simulates the upper body transverse translation of the skater together with the forces exerted by the skates on the ice. The input of the model is the changing distance between the upper body and the skate, referred to as the leg extension (Euclidean distance in 3D space). Verification shows that the model mimics the observed forces and motions well. The model is most accurate for the position and velocity estimation (respectively 1.2% and 2.9% maximum residuals) and least accurate for the force estimations (underestimation of 4.5-10%). The model can be used to further investigate variables in the skating motion. For this, the input of the model, the leg extension, can be optimized to obtain a maximal forward velocity of the upper body.

  11. EXPERIMENTAL VERIFICATION OF THE THREE-DIMENSIONAL THERMAL-HYDRAULIC MODELS IN THE BEST-ESTIMATE CODE BAGIRA.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinichenko, S.D.; Kroshilin, A.E.; Kroshilin, V.E.; Smirnov, A.V.; Kohut, P.

    2004-03-15

    In this paper we present verification results of the BAGIRA code that was performed using data from integral thermal-hydraulic experimental test facilities as well as data obtained from operating nuclear power plants. BAGIRA is a three-dimensional numerical best-estimate code that includes non-homogeneous modeling. Special consideration was given to the recently completed experimental data from the PSB-VVER integral test facility (EREC, Electrogorsk, Russia)--a new Russian large-scale four-loop unit, which has been designed to model the primary circuits of VVER-1000 type reactors. It is demonstrated that the code BAGIRA can be used to analyze nuclear reactor behavior under normal and accident conditions.

  12. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have recently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches that cover the verification activities at the register-transfer level (RTL), the gate level, and the place-and-route stage. Writing test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck that demands many activities and much time. Indeed, verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system in an NPP, in which all design modules of the system are verified simultaneously using MATLAB/Simulink HDL co-simulation models, and discussed how this approach can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  13. Experimental verification of a precooled mixed gas Joule-Thomson cryoprobe model

    Science.gov (United States)

    Passow, Kendra Lynn; Skye, Harrison; Nellis, Gregory; Klein, Sanford

    2012-06-01

    Cryosurgery is a medical technique that uses a cryoprobe to apply extreme cold to undesirable tissue such as cancers. Precooled Mixed Gas Joule-Thomson (pMGJT) cycles with Hampson-style recuperators are integrated with the latest generation of cryoprobes to create more powerful and compact instruments. Selection of gas mixtures for these cycles is not a trivial process; the focus of this research is the development of a detailed model that can be integrated with an optimization algorithm to select optimal gas mixtures. A test facility has been constructed to experimentally tune and verify this model. The facility uses a commercially available cryoprobe system that was modified to integrate measurement instrumentation sufficient to determine the performance of the system and its component parts. Spatially resolved temperature measurements allow detailed measurements of the heat transfer within the recuperator and therefore computation of the spatially resolved conductance. These data can be used to study the multiphase, multicomponent heat transfer process in the complicated recuperator geometry. The optimization model has been expanded to model the pressure drop associated with the flow to more accurately predict the performance of the system. The test facility has been used to evaluate the accuracy and usefulness of this improvement.

  14. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summarized in this article. These problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  15. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  16. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct the process parameters (variables) of the next stage; both the analysis of the process and the correction of the process parameters at the next stage should take this staged structure into account. The methods are applied to process control of a multi-stage production process having 25 variables, one output variable, and three or more stages. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multivariate variation in the data.

  17. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    The paper first provides an integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability, and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management, and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  18. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the dose absorbed by mice in vivo, replica mice are mostly used as alternatives. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Elemental inks used as the 3D printer materials were selected to correspond to mouse tissue. To represent the lung, the selected material was used partially, together with an air layer. To verify material equivalence, the photon attenuation characteristics were simply compared with those of a super-flex bolus. In the case of the lung, the Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and verified its photon attenuation characteristics in practice. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to develop, 3D printer-based small preclinical animal phantoms would increase the reliability of absorbed-dose verification in small animals for preclinical studies.

  19. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.

  20. Modeling, Simulation and Dynamics Analysis Issues of Electric Motor, for Mechatronics Applications, Using Different Approaches and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Ahmad A. Mahfouz

    2013-04-01

    Full Text Available The accurate control of motion is a fundamental concern in mechatronics applications, where placing an object in the exact desired location, with the exact required force and torque, at exactly the right time, is essential for efficient system operation. Accurate modeling, simulation, and dynamics analysis of actuators for mechatronics motion control applications is therefore of great concern. This paper addresses the different approaches used to derive mathematical models of, build corresponding Simulink models for, and analyze the dynamics of the basic open-loop electric DC motor system used in mechatronics motion control applications, particularly for the design, construction, and control of a mechatronic robot arm with a single degree of freedom, with verification by MATLAB/Simulink. To simplify and accelerate the process of DC motor sizing, selection, dynamic analysis, and evaluation for different motion applications, different mathematical models in terms of output position, speed, current, acceleration, and torque, as well as corresponding Simulink models, a supporting MATLAB m-file, and general function block models, are introduced. The introduced models were verified using MATLAB/Simulink. These models are intended for research purposes as well as for application in the educational process. This paper is Part I of the writers' research on mechatronics motion control; the ultimate goal of this research addresses the design, modeling, simulation, dynamics analysis, and controller selection and design issues of a mechatronics single-joint robot arm, where an electric DC motor is used and a control system is selected and designed to move the robot arm to a desired output position θ, corresponding to the applied input voltage V_in, while satisfying all required design specifications.
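
    The open-loop DC motor model referred to above is standard: the armature electrical equation and the rotor mechanical equation, coupled through the torque and back-EMF constants. A minimal simulation sketch follows; the parameter values are illustrative, not those of a specific motor.

```python
# Standard open-loop armature-controlled DC motor model, as derived in
# papers of this kind:
#   L di/dt = V - R i - Ke w,   J dw/dt = Kt i - b w,   dtheta/dt = w
# Parameter values below are illustrative, not a specific motor.
import numpy as np
from scipy.integrate import solve_ivp

R, L = 1.0, 0.5        # armature resistance (ohm) and inductance (H)
Kt, Ke = 0.01, 0.01    # torque constant (N m/A) and back-EMF constant (V s/rad)
J, b = 0.01, 0.1       # rotor inertia (kg m^2) and viscous friction (N m s)


def dc_motor(t, x, V_in):
    theta, w, i = x                       # position, speed, armature current
    return [w,
            (Kt * i - b * w) / J,         # mechanical equation
            (V_in - R * i - Ke * w) / L]  # electrical equation


sol = solve_ivp(dc_motor, (0.0, 3.0), [0.0, 0.0, 0.0], args=(12.0,), max_step=1e-3)
print(f"steady-state speed at 12 V: {sol.y[1, -1]:.2f} rad/s")
```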

  1. Modeling and experimental verification of tubular product formation during spray forming

    Institute of Scientific and Technical Information of China (English)

    LIU Dong-ming; ZHAO Jiu-zhou; LI Mu-sen

    2009-01-01

    A mathematical model is formulated to predict the shape evolution and the final geometry of a tubular product prepared by spray forming. The effects of several important processing parameters on the shape evolution of the tube are investigated, and the model is validated against experiments on spray-formed large-diameter tubes. The experimental and modeling results show that there are three distinct regions in the preform, i.e., the left transition region, the middle uniform-diameter region, and the right transition region. The results show that the atomization parameters a_s and b_s, the traversing speed v of the substrate, the outer diameter D_0 of the substrate, and the initial deposition distance d_0 play important roles in the contour and the wall thickness of the spray-formed tube, but the angular velocity ω of the substrate has little effect on the buildup of the deposit. After a certain time from the beginning of the process, the deposit comes into a steady growth state. In addition, an equation is provided to estimate the wall thickness of the deposit in the steady growth state based on mass conservation.

  2. Security Policy Development: Towards a Life-Cycle and Logic-Based Verification Model

    Directory of Open Access Journals (Sweden)

    Luay A. Wahsheh

    2008-01-01

    Full Text Available Although security plays a major role in the design of software systems, security requirements and policies are usually added to an already existing system, not created in conjunction with the product. As a result, there are often numerous problems with the overall design. In this paper, we discuss the relationship between software engineering, security engineering, and policy engineering, and present a security policy life-cycle: an engineering methodology for policy development in high-assurance computer systems. The model provides system security managers with a procedural engineering process to develop security policies. We also present an executable Prolog-based model as a formal specification and knowledge representation method, using a theorem prover to verify system correctness with respect to security policies in their life-cycle stages.

  3. Verification Test of the SURF and SURFplus Models in xRage: Part III, Effect of Mesh Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-15

    The previous studies used an underdriven detonation wave in 1-dimension (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin are compared to a highly resolved 1-D simulation in cylindrical geometry.

  4. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS: TRI-DIM FILTER CORP. PREDATOR II MODEL 8VADTP123C23

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Predator II, Model 8VADTP123C23CC000 air filter for dust and bioaerosol filtration manufactured by Tri-Dim Filter Corporation. The pressure drop across the filter was 138 Pa clean and...

  6. Modeling of the Hydroentanglement Process

    Directory of Open Access Journals (Sweden)

    Ping Xiang

    2006-11-01

    Full Text Available Mechanical performance of hydroentangled nonwovens is determined by the degree of fiber entanglement, which depends on parameters of the fibers, fiberweb, forming surface, water jet, and process speed. This paper develops a computational fluid dynamics model of the hydroentanglement process. Extensive comparison with experimental data showed that the degree of fiber entanglement is linearly related to the flow vorticity induced in the fiberweb by the impinging water jets. The fiberweb is modeled as a porous material of uniform porosity, and the actual geometry of the forming wires is accounted for in the model. Simulation results are compared with experimental data for a Perfojet® sleeve and four woven forming surfaces. Additionally, the model is used to predict the effect of fiberweb thickness on the degree of fiber entanglement for different forming surfaces.
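
    The reported linear relation between entanglement and vorticity suggests a simple post-processing recipe: compute the vorticity field from the CFD velocity components and regress measured entanglement against its mean magnitude. The sketch below performs the vorticity part on a synthetic velocity field; the flow field is a placeholder, not the paper's CFD result, and no entanglement data are reproduced.

```python
# Sketch of the post-processing implied above: compute 2-D flow vorticity
# (w_z = dv/dx - du/dy) from a velocity field and summarize its magnitude.
# The velocity field below is a synthetic placeholder, not the paper's CFD.
import numpy as np

x = y = np.linspace(0.0, 1e-3, 64)                        # 1 mm fiberweb patch
X, Y = np.meshgrid(x, y, indexing="ij")
u = np.sin(2e3 * np.pi * X) * np.cos(2e3 * np.pi * Y)     # synthetic jet-induced flow
v = -np.cos(2e3 * np.pi * X) * np.sin(2e3 * np.pi * Y)

du_dy = np.gradient(u, y, axis=1)
dv_dx = np.gradient(v, x, axis=0)
vorticity = dv_dx - du_dy

omega = np.abs(vorticity).mean()                          # mean vorticity magnitude
print(f"mean |vorticity|: {omega:.1f} 1/s")
# With entanglement measurements E_i at several operating points, the linear
# relation E = a*omega + b would then be fitted, e.g. with np.polyfit(..., 1).
```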

  7. Multi-layer VEB modeling: capturing interlayer etch process effects for multi-patterning process

    Science.gov (United States)

    Hu, Lin; Jung, Sunwook; Li, Jianliang; Kim, Young; Bar, Yuval; Lobb, Granger; Liang, Jim; Ogino, Atsushi; Sturtevant, John; Bailey, Todd

    2016-03-01

    Self-Aligned Via (SAV) process is commonly used in back end of line (BEOL) patterning. As the technology node advances, tightening CD and overlay specs require continuous improvement in model accuracy of the SAV process. The traditional single-layer Variable Etch Bias (VEB) model is capable of describing the micro-loading and aperture effects associated with the reactive ion etch (RIE), but it does not include effects from under layers. For the SAV etch, a multi-layer VEB model is needed to account for the etch restriction from metal trenches. In this study, we characterize via post-etch dimensions through pitch and through metal trench widths, and show that VEB model prediction accuracy for SAV CDs after SAV formation can be significantly improved by applying a multi-layer scheme. Using a multi-layer VEB, it is demonstrated that the output via size changes with varying trench dimensions, which matches the silicon results. The model also reports via shape post-etch as a function of the trench environment, where elliptical vias are correctly produced. The multi-layer VEB model can be applied to both multi-layer correction and verification in a full-chip flow. This paper also suggests that the multi-layer VEB model can be used in other FEOL layers with interlayer etch process effects, such as gate cut, supporting the robustness of the new model.

  8. Modified Claus process probabilistic model

    Energy Technology Data Exchange (ETDEWEB)

    Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)

    2006-03-15

    A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)
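
    The probabilistic approach the abstract describes (sample uncertain inputs, propagate them through a deterministic plant model, inspect the output distribution) can be sketched in a few lines of Python. The input distributions and the toy recovery function below are hypothetical illustrations of the method, not the paper's actual Claus model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical uncertain inputs for a straight-through Claus unit:
    # H2S fraction in the acid gas and air/acid-gas ratio.
    h2s_frac = rng.normal(0.90, 0.02, n)     # mol fraction of H2S in feed
    air_ratio = rng.normal(2.38, 0.05, n)    # near the stoichiometric target

    def sulfur_recovery(h2s, air):
        """Placeholder deterministic plant model: recovery peaks at the
        stoichiometric air ratio and degrades quadratically away from it."""
        return 0.96 * h2s / 0.90 - 0.5 * (air - 2.38) ** 2

    recovery = sulfur_recovery(h2s_frac, air_ratio)
    print(f"mean recovery = {recovery.mean():.3f}, "
          f"5th-95th percentiles = {np.percentile(recovery, [5, 95])}")
    ```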

  9. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also focus on the specific components.

  10. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...

  11. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...

  12. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity...... is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  13. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity...... is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  14. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...... specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity...

  15. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...... specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity...

  16. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are suggested every year by industry and academic researchers, which are claimed to improve the accuracy of measurements [8], [9]. In the absence of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time-consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as productivity enhancement tool during their research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.
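
    A behavioural pulse-oximeter model ultimately reduces red and infrared photoplethysmograms to an oxygen saturation estimate. The Python sketch below shows the classic ratio-of-ratios computation on synthetic signals; the waveforms and the linear calibration SpO2 = 110 - 25R are illustrative approximations only, not the paper's Simulink model (real devices use calibration tables fitted to clinical data).

    ```python
    import numpy as np

    fs = 125.0                                   # sample rate (Hz)
    t = np.arange(0, 10, 1 / fs)                 # 10 s of samples

    # Hypothetical photoplethysmography signals: DC baseline plus a small
    # cardiac AC component at ~1.2 Hz for the red and infrared channels.
    red = 1.00 + 0.010 * np.sin(2 * np.pi * 1.2 * t)
    ir  = 1.20 + 0.020 * np.sin(2 * np.pi * 1.2 * t)

    def ac_dc(sig):
        """Peak-to-peak AC amplitude normalized by the DC level."""
        return (sig.max() - sig.min()) / sig.mean()

    # Ratio of ratios and a commonly quoted empirical calibration line.
    R = ac_dc(red) / ac_dc(ir)
    spo2 = 110.0 - 25.0 * R
    print(f"R = {R:.3f}, estimated SpO2 = {spo2:.1f} %")
    ```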

  17. Experimental verification of bridge seismic damage states quantified by calibrating analytical models with empirical field data

    Institute of Scientific and Technical Information of China (English)

    Swagata Banerjee; Masanobu Shinozuka

    2008-01-01

    Bridges are one of the most vulnerable components of a highway transportation network system subjected to earthquake ground motions. Prediction of the resilience and sustainability of bridge performance in a probabilistic manner provides valuable information for pre-event system upgrading and post-event functional recovery of the network. The current study integrates bridge seismic damageability information obtained through empirical, analytical and experimental procedures and quantifies threshold limits of bridge damage states consistent with the physical damage description given in HAZUS. Experimental data from a large-scale shaking table test are utilized for this purpose. This experiment was conducted at the University of Nevada, Reno, where a research team from the University of California, Irvine, participated. Observed experimental damage data are processed to identify and quantify bridge damage states in terms of rotational ductility at bridge column ends. In parallel, a mechanistic model for fragility curves is developed in such a way that the model can be calibrated against empirical fragility curves that have been constructed from damage data obtained during the 1994 Northridge earthquake. This calibration quantifies threshold values of bridge damage states and makes the analytical study consistent with damage data observed in past earthquakes. The mechanistic model is transportable and applicable to most types and sizes of bridges. Finally, calibrated damage state definitions are compared with those obtained using experimental findings. Comparison shows excellent consistency among results from analytical, empirical and experimental observations.

  18. Linear Modeling, Simulation and Experimental Verification of a Pressure Regulator for CNG Injection Systems

    Directory of Open Access Journals (Sweden)

    Dirk Hübner

    2008-08-01

    The number of motor vehicles powered by internal combustion engines keeps growing despite shrinking oil reserves. As a result, compressed natural gas (CNG) is gaining currency as an emerging combustion engine fuel. To this day, CNG systems – e.g., in passenger cars – are not fully integrated into the development process as conducted by vehicle or engine manufacturers. Instead, they are usually "adapted in" at a downstream stage by small, specialized companies. The present paper initially outlines the state of the art in advanced gas injection technologies. Especially the development towards sequential injection systems is described. A pressure regulator for CNG driven combustion engines is examined in detail, given its role as a highly sensitive and critical system component. Based on a precise theoretical analysis, a linear model of this pressure regulator is derived and subjected to dynamic simulation. The analytical approach is accompanied by an experimental investigation of the device. On a test rig developed at the Trier University of Applied Sciences, the static and dynamic features of the pressure regulator can be measured with the requisite precision. The comparison of measured and simulated data yields a validation of the dynamic simulation model. With the approaches developed it is now possible for the first time to model, simulate and optimize single- or multi-stage pressure regulators for CNG driven engines with less effort and higher accuracy.

  19. System-level modeling and verification of a micro pitch-tunable grating

    Science.gov (United States)

    Lv, Xianglian; Xu, Jinghui; Yu, Yiting; He, Yang; Yuan, Weizheng

    2010-10-01

    Micro pitch-tunable gratings based on microelectromechanical systems (MEMS) technology can modulate the grating period dynamically by controlling the drive voltage. The device is so complex that it cannot be modelled and simulated by the FEA method or by an analytical macromodel alone. In this paper, a new hybrid system-level modeling method is presented. First, the grating was decomposed into functional components such as the grating beams, supporting beams and electrostatic comb-driver. The block Arnoldi algorithm was used to obtain the numerical macromodel of the grating beams and supporting beams; the analytical macromodels, called multi-port elements (MPEs), of the comb-driver and other parts were also established, and the elements were connected together to form a hybrid network representing the system-level model of the grating in MEME Garden, a MEMS CAD tool developed by the Micro and Nano Electromechanical Systems Laboratory, Northwestern Polytechnical University. Both frequency- and time-domain simulations were implemented. The grating was fabricated using a silicon-on-glass (SOG) process. The measured working displacement is 16.5 μm at a driving voltage of 40 V. The simulation result is 17.6 μm, which shows an acceptable agreement with the measurement result within the error tolerance of 6.7%. The method proposed in this paper can solve the voltage-displacement simulation problem of this kind of complex grating. It can also be adapted to simulations of similar MEMS/MOEMS devices.

  20. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  1. Representations used by mathematics student teachers in mathematical modeling process

    Directory of Open Access Journals (Sweden)

    Aytuğ Özaltun

    2014-02-01

    The purpose of this study is to determine the representations used by mathematics student teachers in the steps of the mathematical modeling process, based on their solutions of problems formed in the context of different classifications of modeling. The study was conducted with fifteen secondary mathematics student teachers given a Mathematical Modeling course. The participants were separated into five collaboration groups of three students. Data were collected from the detailed written papers given by the groups for the problems and from GeoGebra solution files. The groups benefited from verbal, algebraic, figural, tabular and dynamic representations while they were solving the problems. Considering all steps of the process, the groups mostly used verbal and algebraic representations. While they used only verbal representation in analyzing the problem, they benefited mostly from verbal and then figural representations in establishing the systematic structure. Algebraic and then verbal representations were used most in the steps of mathematization, meta-mathematization and mathematical analysis. In the steps of interpretation/evaluation and model verification, the groups mainly benefited from verbal and then algebraic representations. Further research into why particular representations are preferred in specific steps of the mathematical modeling process is suggested. Key words: Mathematical modeling, modeling problems, mathematics student teachers, representations.

  2. Network Model Building (Process Mapping)

    OpenAIRE

    Blau, Gary; Yih, Yuehwern

    2004-01-01

    12 slides. Provider notes: see the Project Planning video (Windows Media). Posted at the bottom are Gary Blau's slides. Before watching, please note that "process mapping" and "modeling" are mentioned in the video and notes; here they are meant to refer to the NSCORT "project plan".

  3. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    Science.gov (United States)

    Amna, S.; Samreen, N.; Khalid, B.; Shamim, A.

    2013-06-01

    Depending upon the topography, there is an extreme variation in the temperature of Pakistan. Heat waves are weather-related events with a significant impact on humans, including socioeconomic activities and health issues, which change according to the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) for the purpose of modeling seasonal weather hind-casts of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data incorporated constituted the statistical temperature records of 32 years for the months of June, July and August. This study was based on the EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts by using the standard probabilistic measures of Brier score, Brier skill score, cross validation and the relative operating characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models have significant results when particular initial conditions are omitted.
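
    The probabilistic verification measures named here are short formulas. The Python sketch below computes the Brier score and the Brier skill score (against climatology) for hypothetical ensemble exceedance probabilities and binary observations; the numbers are illustrative only.

    ```python
    import numpy as np

    # Hypothetical forecasts: probability that a heat-wave threshold is
    # exceeded (fraction of ensemble members) and observed outcomes (0/1).
    p_fcst = np.array([0.9, 0.7, 0.2, 0.6, 0.1, 0.8, 0.3, 0.95])
    obs    = np.array([1,   1,   0,   0,   0,   1,   1,   1])

    bs = np.mean((p_fcst - obs) ** 2)        # Brier score (0 = perfect)

    # Brier skill score with the climatological base rate as reference.
    p_clim = obs.mean()
    bs_ref = np.mean((p_clim - obs) ** 2)
    bss = 1.0 - bs / bs_ref                  # > 0 means skill over climatology
    print(f"BS = {bs:.3f}, BSS = {bss:.3f}")
    ```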

  4. Modeling of the reburning process

    Energy Technology Data Exchange (ETDEWEB)

    Rota, R.; Bonini, F.; Servida, A.; Morbidelli, M.; Carra, S. [Politecnico di Milano, Milano (Italy). Dip. di Chimica Fisica Applicata

    1997-07-01

    Reburning has become a popular method of abating NOx emission in power plants. Its effectiveness is strongly affected by the interaction between gas phase chemistry and combustion chamber fluid dynamics. Both the mixing of the reactant streams and the elementary reactions in the gas phase control the overall kinetics of the process. This work developed a model coupling a detailed kinetic mechanism to a simplified description of the fluid dynamics of the reburning chamber. The model was checked with reference to experimental data from the literature. Detailed kinetic modeling was found to be essential to describe the reburning process, since the fluid dynamics of the reactor have a strong influence on reactions within. 20 refs., 9 figs., 3 tabs.

  5. THE FLOOD RISK IN THE LOWER GIANH RIVER: MODELLING AND FIELD VERIFICATION

    Directory of Open Access Journals (Sweden)

    NGUYEN H. D.

    2016-03-01

    Problems associated with flood risk represent a highly topical issue in Vietnam. The case of the lower Gianh River in the central area of Vietnam, with a watershed area of 353 km², is particularly interesting. In this area, periodically subject to flood risk, the scientific question is strongly linked to risk management. In addition, flood risk is the consequence of the hydrological hazard of an event and the damages related to this event. For this reason, our approach is based on hydrodynamic modelling using Mike Flood to simulate the runoff during a flood event. Unfortunately the data in the studied area are quite limited. Our computation of the flood risk is based on a three-step modelling process, using rainfall data coming from 8 stations, cross sections, the topographic map and the land-use map. The first step consists of creating a 1-D model using Mike 11, in order to simulate the runoff in the minor river bed. In the second step, we use Mike 21 to create a 2-D model to simulate the runoff in the flood plain. The last step allows us to couple the two models in order to precisely describe the variables for the hazard analysis in the flood plain (the water level, the speed, the extent of the flooding). Moreover the model is calibrated and verified using observational data of the water level at hydrologic stations and field control data (on the one hand flood height measurements, on the other hand interviews with the community and with the local councillors). We then generate GIS maps in order to improve flood hazard management, which allows us to create flood hazard maps by coupling the flood plain map and the runoff speed map. Our results show that the flood peak, caused by typhoon Nari, reached more than 6 m on October 16th 2013 at 4 p.m., with the flooded area extending over 149 km², and that the typhoon constitutes an extreme flood hazard for 11.39% of the area, a very high hazard for 10.60%, high for 30.79%, medium for 31.91% and a light flood hazard for 15

  6. Verification of the exponential model of body temperature decrease after death in pigs.

    Science.gov (United States)

    Kaliszan, Michal; Hauser, Roman; Kaliszan, Roman; Wiczling, Paweł; Buczyński, Janusz; Penkowski, Michal

    2005-09-01

    The authors have conducted a systematic study in pigs to verify the models of post-mortem body temperature decrease currently employed in forensic medicine. Twenty-four hour automatic temperature recordings were performed in four body sites starting 1.25 h after pig killing in an industrial slaughterhouse under typical environmental conditions (19.5-22.5 degrees C). The animals had been randomly selected under a regular manufacturing process. The temperature decrease time plots drawn starting 75 min after death for the eyeball, the orbit soft tissues, the rectum and muscle tissue were found to fit the single-exponential thermodynamic model originally proposed by H. Rainy in 1868. In view of the actual intersubject variability, the addition of a second exponential term to the model was demonstrated to be statistically insignificant. Therefore, the two-exponential model for death time estimation frequently recommended in the forensic medicine literature, even if theoretically substantiated for individual test cases, provides no advantage as regards the reliability of estimation in an actual case. The improvement of the precision of time of death estimation by the reconstruction of an individual curve on the basis of two dead body temperature measurements taken 1 h apart or taken continuously for a longer time (about 4 h), has also been proved incorrect. It was demonstrated that the reported increase of precision of time of death estimation due to use of a multiexponential model, with individual exponential terms to account for the cooling rate of the specific body sites separately, is artifactual. The results of this study support the use of the eyeball and/or the orbit soft tissues as temperature measuring sites at times shortly after death. A single-exponential model applied to the eyeball cooling has been shown to provide a very precise estimation of the time of death up to approximately 13 h after death. For the period thereafter, a better estimation of the time
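
    For reference, the single-exponential model the study verifies has the standard Newtonian-cooling form below; inverting it gives the post-mortem interval estimate. The symbols follow the usual convention (T_a ambient temperature, T_0 body temperature at death, k a site-specific cooling constant); this is the generic textbook form, not the authors' fitted parameterization.

    ```latex
    \[
      T(t) \;=\; T_a + \left(T_0 - T_a\right) e^{-k t},
      \qquad
      t \;=\; \frac{1}{k}\,\ln\!\frac{T_0 - T_a}{T(t) - T_a}.
    \]
    ```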

  7. Description and verification of a novel flow and transport model for silicate-gel emplacement

    Science.gov (United States)

    Walther, Marc; Solpuker, Utku; Böttcher, Norbert; Kolditz, Olaf; Liedl, Rudolf; Schwartz, Frank W.

    2014-02-01

    We present a novel approach for the numerical simulation of the gelation of silicate solutions under density-dependent flow conditions. The method utilizes an auxiliary, non-density-dependent solute that is subject to a linear decay function to provide the temporal information used to describe the viscosity change of the fluid. By comparing the modeling results to experimental data, we are able to simulate the behavior and the gelation process of the injected solute for three different compositions, including the long-term stability of the gelled area and the non-gelation of low concentrations due to hydrodynamic dispersion. This approach can also be used for other types of solutes with this gelling property and is useful in a variety of applications in geological, civil and environmental engineering.
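
    The auxiliary-tracer idea can be paraphrased in a few lines of Python: a solute value decays linearly with time, and the fluid viscosity is switched once it crosses a gelation threshold. The decay rate, gel time and viscosities below are hypothetical placeholders, not the calibrated values from the paper.

    ```python
    # Minimal sketch of the auxiliary-tracer scheme, under assumed values.
    c0, t_gel = 1.0, 3600.0           # initial tracer value, gel time (s)
    lam = c0 / t_gel                  # linear decay rate so that c(t_gel) = 0
    mu_sol, mu_gel = 1.0e-3, 1.0      # viscosities of solution and gel (Pa s)

    def viscosity(t):
        c = max(c0 - lam * t, 0.0)    # linearly decaying auxiliary tracer
        return mu_sol if c > 0.0 else mu_gel

    for t in (0.0, 1800.0, 3600.0, 7200.0):
        print(f"t = {t:6.0f} s, tracer = {max(c0 - lam * t, 0.0):.2f}, "
              f"viscosity = {viscosity(t):.3g} Pa s")
    ```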

  8. Neural network modeling for dynamic pulsed GTAW process with wire filler based on MATLAB

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Double-sided weld pool shapes were determined by multiple welding parameters and wire feed parameters during pulsed GTAW with wire filler. Aiming at such a system with multiple inputs and outputs, an effective modeling method, consisting of the impulse signal design, model structure and parameter identification and verification, was developed based on MATLAB software. Then, dynamic neural network models, TDNNM (Topside dynamic neural network model) and BHDNNM (Backside width and topside height dynamic neural network model), were established to predict double-sided shape parameters of the weld pool. The characteristic relationship of the welding process was simulated and analyzed with the models.

  9. Constraining millennial scale dynamics of a Greenland tidewater glacier for the verification of a calving criterion based numerical model

    Science.gov (United States)

    Lea, J.; Mair, D.; Rea, B.; Nick, F.; Schofield, E.

    2012-04-01

    The ability to successfully model the behaviour of Greenland tidewater glaciers is pivotal to understanding the controls on their dynamics and potential impact on global sea level. However, to have confidence in the results of numerical models in this setting, the evidence required for robust verification must extend well beyond the existing instrumental record. Perhaps uniquely for a major Greenland outlet glacier, both the advance and retreat dynamics of Kangiata Nunata Sermia (KNS), Nuuk Fjord, SW Greenland over the last ~1000 years can be reasonably constrained through a combination of geomorphological, sedimentological and archaeological evidence. It is therefore an ideal location to test the ability of the latest generation of calving criterion based tidewater models to explain millennial scale dynamics. This poster presents geomorphological evidence recording the post-Little Ice Age maximum dynamics of KNS, derived from high-resolution satellite imagery. This includes evidence of annual retreat moraine complexes suggesting controlled rather than catastrophic retreat between pinning points, in addition to a series of ice dammed lake shorelines, allowing detailed interpretation of the dynamics of the glacier as it thinned and retreated. Pending ground truthing, this evidence will contribute towards the calibration of results obtained from a calving criterion numerical model (Nick et al, 2010), driven by an air temperature reconstruction for the KNS region determined from ice core data.

  10. Multi-Mode GF-3 Satellite Image Geometric Accuracy Verification Using the RPC Model.

    Science.gov (United States)

    Wang, Taoyang; Zhang, Guo; Yu, Lei; Zhao, Ruishan; Deng, Mingjun; Xu, Kai

    2017-09-01

    The GaoFen-3 (GF-3) satellite is the first C-band multi-polarization synthetic aperture radar (SAR) imaging satellite with a resolution up to 1 m in China. It is also the only SAR satellite of the High-Resolution Earth Observation System designed for civilian use. There are 12 different imaging modes to meet the needs of different industry users. However, to use SAR satellite images for related applications, they must possess high geometric accuracy. In order to verify the geometric accuracy achieved by the different modes of GF-3 images, we analyze the SAR geometric error sources and perform geometric correction tests based on the RPC model with and without ground control points (GCPs) for five imaging modes. These include the spotlight (SL), ultra-fine strip (UFS), Fine Strip I (FSI), Full polarized Strip I (QPSI), and standard strip (SS) modes. Experimental results show that the check point residuals are large and consistent without GCPs, but the root mean square error of the independent checkpoints for the case of four corner control points is better than 1.5 pixels, achieving a similar level of geometric positioning accuracy to that of international satellites. We conclude that the GF-3 satellite can be used for high-accuracy geometric processing and related industry applications.
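
    For context, the RPC (rational polynomial coefficient) model used in such geometric correction expresses normalized image coordinates as ratios of third-order polynomials in normalized ground coordinates. This is the standard general form (each polynomial carries 20 coefficients), with bias compensation estimated from GCPs refining the image-space fit:

    ```latex
    \[
      r \;=\; \frac{P_1(B, L, H)}{P_2(B, L, H)},
      \qquad
      c \;=\; \frac{P_3(B, L, H)}{P_4(B, L, H)},
      \qquad
      P_i(B, L, H) \;=\; \sum_{j+k+l \,\le\, 3} a^{(i)}_{jkl}\, B^{j} L^{k} H^{l},
    \]
    % (B, L, H): normalized latitude, longitude and height;
    % (r, c): normalized row and column image coordinates.
    ```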

  11. Erythrocyte lysis in isotonic solution of ammonium chloride: theoretical modeling and experimental verification.

    Science.gov (United States)

    Chernyshev, Andrey V; Tarasov, Peter A; Semianov, Konstantin A; Nekrasov, Vyacheslav M; Hoekstra, Alfons G; Maltsev, Valeri P

    2008-03-07

    A mathematical model of erythrocyte lysis in an isotonic solution of ammonium chloride is presented in the framework of a statistical approach. The model is used to evaluate several parameters of mature erythrocytes (volume, surface area, hemoglobin concentration, number of anionic exchangers on the membrane, elasticity and critical tension of the membrane) through their sphering and lysis measured by a scanning flow cytometer (SFC). The SFC allows measuring the light-scattering pattern (indicatrix) of an individual cell over the angular range from 10 degrees to 60 degrees. Comparison of the experimentally measured and theoretically calculated light-scattering patterns allows discrimination of spherical from non-spherical erythrocytes and evaluation of volume and hemoglobin concentration for individual spherical cells. Three different processes were applied for erythrocyte sphering: (1) colloid osmotic lysis in an isotonic solution of ammonium chloride, (2) isovolumetric sphering in the presence of sodium dodecyl sulphate and albumin in neutrally buffered isotonic saline, and (3) an osmotic fragility test in hypotonic media. For the hemolysis in ammonium chloride, the evolution of the distributions of sphered erythrocytes over volume and hemoglobin content was monitored in real-time experiments. The analysis of experimental data was performed in the context of a statistical approach, taking into account that the parameters of erythrocytes vary from cell to cell.

  12. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Eduardo R. B. Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    translated into a representation read by VCC, a software verifier for C. We successfully verified several MPI programs in a running time that is independent of the number of processes or other input parameters. This contrasts with alternative techniques, notably model checking and runtime verification...

  13. Remaining Sites Verification Package for the 100-C-9:1 Main Process Sewer Collection Line, Waste Site Reclassification Form 2004-012

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-06-11

    The 100-C-9:1 main process sewer pipeline, also known as the twin box culvert, was a dual reinforced process sewer that collected process effluent from the 183-C and 190-C water treatment facilities, discharging at the 132-C-2 Outfall. For remedial action purposes, the 100-C-9:1 waste site was subdivided into northern and southern sections. The 100-C-9:1 subsite has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.

  14. Verification of the computer code KORSAR taking into account the effect of noncondensables on thermal-hydraulic processes

    Science.gov (United States)

    Gudoshnikov, A. N.; Migrov, Yu. A.

    2008-11-01

    Calculations to verify the Russian computer code KORSAR were carried out for the B4.1 experimental operating conditions, in which nitrogen was supplied to the reactor coolant (primary) circuit of a reactor plant model, and which were simulated at the PKL III integral test facility. It is shown that dissolution of gases in coolant has an essential effect on the thermal-hydraulic processes during long-term passive removal of heat from the primary to secondary coolant circuit of the reactor plant model under the conditions of natural circulation.

  15. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

    Background: The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods: We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results: Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion: We conclude that even the presence of conserved processes is

  16. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for a one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid and entrained liquid. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. Fluid/structure interaction, generally, includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving the wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions of the flow conditions. The pilot code was programmed so that the source terms of the governing equations and numerical solution schemes can be easily tested. The mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect the code stability. A further study would be required to enhance the code capability in this regard.

  17. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach that has been studied recently for verifying knowledge bases is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  18. Monte Carlo modeling of HD120 multileaf collimator on Varian TrueBeam linear accelerator for verification of 6X and 6X FFF VMAT SABR treatment plans.

    Science.gov (United States)

    Bergman, Alanah M; Gete, Ermias; Duzenli, Cheryl; Teke, Tony

    2014-05-08

    A Monte Carlo (MC) validation of the vendor-supplied Varian TrueBeam 6 MV flattened (6X) phase-space file and the first implementation of the Siebers-Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient-specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open-field percentage depth doses (PDDs), profiles, and output factors (OF), 2) adapting the Siebers-Keall MLC model to match the new HD120-MLC geometry and material composition, 3) determining the absolute dose conversion factor for the MC calculation, and 4) validating this entire linac/MLC in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm2. Measured and MC profiles show agreement in the 50% field width and the 80%-20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for the 2 to 40 cm2 square fields agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrate that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criteria (3%/3 mm). For the 6X FFF beam, > 94% of voxels meet this criteria. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans.
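
    The 3%/3 mm pass criterion quoted above is evaluated with the gamma index as conventionally defined; a voxel passes when γ ≤ 1. With dose tolerance ΔD = 3% and distance-to-agreement Δd = 3 mm, for a measured point r_m with dose D_m compared against the calculated distribution D_c:

    ```latex
    \[
      \gamma(\mathbf{r}_m) \;=\;
      \min_{\mathbf{r}_c}
      \sqrt{\frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^{2}}{\Delta d^{2}}
            \;+\;
            \frac{\bigl(D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m)\bigr)^{2}}{\Delta D^{2}}}
    \]
    ```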

  19. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, mass transfer, kinetic (reaction) and luminic (radiation) steps. In this way, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing which step controls the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. Thus, it is assumed that the reactor is divided in two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether the reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using titania Degussa P-25 as catalyst, is studied as a model reaction. The preliminary results obtained are presented here, showing that, in this case, the reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  20. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatinlike protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  1. Face Processing: Models For Recognition

    Science.gov (United States)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
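
    One widely used appearance-based approach in this line of work represents faces by their projections onto the principal components of a training set and recognizes by nearest neighbour in that low-dimensional space. The Python sketch below is a minimal illustration of that idea using random placeholder images; it is not claimed to be the authors' exact method.

    ```python
    import numpy as np

    # PCA-based ("eigenface"-style) sketch; random placeholders stand in
    # for a gallery of aligned grayscale face images.
    rng = np.random.default_rng(1)
    n_faces, h, w = 40, 32, 32
    gallery = rng.random((n_faces, h * w))        # flattened face images

    mean_face = gallery.mean(axis=0)
    A = gallery - mean_face                       # mean-centred data matrix

    # Principal components via SVD; rows of Vt span the "face space".
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    k = 10                                        # face-space dimensionality
    eigenfaces = Vt[:k]

    def project(img):
        return eigenfaces @ (img - mean_face)     # coordinates in face space

    # Recognition = nearest neighbour among projected gallery faces.
    probe = gallery[7] + 0.01 * rng.random(h * w) # noisy copy of face 7
    dists = np.linalg.norm(project(probe) - (A @ eigenfaces.T), axis=1)
    print("best match:", int(np.argmin(dists)))   # expect 7
    ```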

  2. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process.......This chapter covers the basic principles of steady state modelling and simulation using a number of case studies. Two principal approaches are illustrated that develop the unit operation models from first principles as well as through application of standard flowsheet simulators. The approaches...

  3. Switching Processes in Queueing Models

    CERN Document Server

    Anisimov, Vladimir V

    2008-01-01

    Switching processes, invented by the author in 1977, is the main tool used in the investigation of traffic problems from automotive to telecommunications. The title provides a new approach to low traffic problems based on the analysis of flows of rare events and queuing models. In the case of fast switching, averaging principle and diffusion approximation results are proved and applied to the investigation of transient phenomena for wide classes of overloading queuing networks.  The book is devoted to developing the asymptotic theory for the class of switching queuing models which covers  mode

  4. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply the tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short and long-term effects of aircraft emissions on atmospheric chemistry and climate, and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources, for example, via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, its known radioactive half-life, and its freedom from chemical production or loss and from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.
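
    Radon's value as a transport test is easy to see in a toy model: a constant surface source, first-order radioactive decay with the known Rn-222 half-life of about 3.8 days, and no chemistry or scavenging. The two-box Python sketch below (boundary layer and upper troposphere, with a hypothetical exchange rate) produces the kind of vertical-distribution signal a 3-D transport model must reproduce.

    ```python
    import numpy as np

    half_life = 3.8 * 86400.0                # Rn-222 half-life (s)
    lam = np.log(2.0) / half_life            # decay constant (1/s)
    k_ex = 1.0 / (20.0 * 86400.0)            # hypothetical BL -> UT exchange (1/s)
    source = 1.0                             # surface emission (arbitrary units/s)

    dt, n_bl, n_ut = 600.0, 0.0, 0.0
    for _ in range(int(30 * 86400 / dt)):    # integrate 30 days, forward Euler
        flux = k_ex * n_bl                   # vertical convective export
        n_bl += dt * (source - lam * n_bl - flux)
        n_ut += dt * (flux - lam * n_ut)

    print(f"near-steady-state BL/UT radon ratio: {n_bl / n_ut:.1f}")
    ```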

  5. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  6. Analytical design model for a piezo-composite unimorph actuator and its verification using lightweight piezo-composite curved actuators

    Science.gov (United States)

    Yoon, K. J.; Park, K. H.; Lee, S. K.; Goo, N. S.; Park, H. C.

    2004-06-01

    This paper describes an analytical design model for a layered piezo-composite unimorph actuator and its numerical and experimental verification using a LIPCA (lightweight piezo-composite curved actuator) that is lighter than other conventional piezo-composite type actuators. The LIPCA is composed of top fiber composite layers with high modulus and low CTE (coefficient of thermal expansion), a middle PZT ceramic wafer, and base layers with low modulus and high CTE. The advantages of the LIPCA design are to replace the heavy metal layer of THUNDER by lightweight fiber-reinforced plastic layers without compromising the generation of high force and large displacement and to have design flexibility by selecting the fiber direction and the number of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use a resin prepreg system. A piezo-actuation model for a laminate with piezo-electric material layers and fiber composite layers is proposed to predict the curvature and residual stress of the LIPCA. To predict the actuation displacement of the LIPCA with curvature, a finite element analysis method using the proposed piezo-actuation model is introduced. The predicted deformations are in good agreement with the experimental ones.

  7. Combustion Process Modelling and Control

    Directory of Open Access Journals (Sweden)

    Vladimír Maduda

    2007-10-01

    Full Text Available This paper deals with realization of combustion control system on programmable logic controllers. Control system design is based on analysis of the current state of combustion control systems in technological device of raw material processing area. Control system design is composed of two subsystems. First subsystem is represented by software system for measured data processing and for data processing from simulation of the combustion mathematical model. Outputs are parameters for setting of controller algorithms. Second subsystem consists from programme modules. The programme module is presented by specific control algorithm, for example proportional regulation, programmed proportional regulation, proportional regulation with correction on the oxygen in waste gas, and so on. According to the specific combustion control requirements it is possible built-up concrete control system by programme modules. The programme modules were programmed by Automation studio that is used for development, debugging and testing software for B&R controllers.

  8. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

    The simulation of robot systems is becoming very popular, especially with the lowering of the cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to find the solution of the inverse kinematics problem. This angle could be optimized with respect to a given specified key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle in the task space). Modeling and simulation of robots could be achieved using any of the following models: the geometrical model (positions, postures), the kinematic model and the dynamic model. To do so, the modelization of a 2-R robot type is implemented. Our main tasks are comparing two robot postures with the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink softwares are used to check the theory and the robot motion simulation. This could be easily generalized to a 3-R robot and possibly therefore to any serial robot (Scara, Puma, etc.). The verification of the obtained results by both softwares allows us to qualitatively evaluate and underline the validity of the chosen model and obtain the right conclusions. The results of the simulations are discussed and an agreement between the two softwares is certainly obtained.
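
    The closed-form inverse kinematics of a planar 2-R arm, which the abstract alludes to, is compact enough to sketch directly. The Python below computes both elbow postures for a target point and checks them with forward kinematics; the link lengths and the target are hypothetical values.

    ```python
    import numpy as np

    l1, l2 = 0.5, 0.3                     # link lengths (m), illustrative
    x, y = 0.4, 0.4                       # desired end-effector position (m)

    # Elbow angle from the law of cosines; clip guards against round-off.
    c2 = np.clip((x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2), -1.0, 1.0)
    for elbow in (+1.0, -1.0):            # elbow-up / elbow-down postures
        t2 = elbow * np.arccos(c2)
        t1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(t2), l1 + l2 * np.cos(t2))
        # Forward-kinematics check that the pose reaches (x, y).
        fx = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
        fy = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
        print(f"theta1 = {np.degrees(t1):7.2f} deg, theta2 = {np.degrees(t2):7.2f} deg,"
              f" reaches ({fx:.3f}, {fy:.3f})")
    ```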

  9. Verification SEBAL and Hargreaves –Samani Models to Estimate Evapotranspiration by Lysimeter Data

    Directory of Open Access Journals (Sweden)

    Ali Morshedi

    2017-02-01

    .272 mm/day and 0.700 for the d index, respectively. Similar indices for the Hargreaves-Samani model were 1.003, 0.580 and 0.290 mm/day, and 0.917 for the d index. For the HS model, the results show that the RMSE, MAE and MBE values were 0.813, 0.477 and 0.206 mm/day, and 0.930 for the d index, during the entire growing period (185 days). Conclusion: The results showed the efficiency and reliability of the SEBAL model in processing satellite visible, near-infrared and thermal-infrared bands. Irrigation water requirements and ET estimates vary during the growth of various plants, and thus the complete time series of satellite imagery is required to estimate the total and annual evapotranspiration.
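
    The agreement statistics quoted here (RMSE, MAE, MBE and Willmott's index of agreement d) are straightforward to compute. The Python sketch below evaluates them for hypothetical lysimeter (observed) versus model (predicted) daily ET values; the numbers are illustrative only.

    ```python
    import numpy as np

    obs  = np.array([3.1, 4.0, 5.2, 6.1, 5.8, 4.4])   # lysimeter ET (mm/day)
    pred = np.array([3.4, 3.8, 5.9, 6.5, 5.2, 4.9])   # model ET (mm/day)

    err  = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))      # root mean square error
    mae  = np.mean(np.abs(err))            # mean absolute error
    mbe  = np.mean(err)                    # mean bias error

    # Willmott's index of agreement d (1 = perfect agreement).
    om = obs.mean()
    d = 1.0 - np.sum(err ** 2) / np.sum((np.abs(pred - om) + np.abs(obs - om)) ** 2)

    print(f"RMSE = {rmse:.3f}, MAE = {mae:.3f}, MBE = {mbe:+.3f} mm/day, d = {d:.3f}")
    ```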

  10. Performance study of a heat pump dryer system for speciality crops - Pt. 2: model verification

    Energy Technology Data Exchange (ETDEWEB)

    Adapa, P.K.; Schoenau, G.J.; Sokhansanj, S. [University of Saskatchewan (Canada). College of Engineering

    2002-07-01

    The experimental and predicted performance data of a heat pump dryer system are reported. Chopped alfalfa was dried in a cabinet dryer in batches and also by emulating continuous bed drying using two heat pumps operating in parallel. Results showed that alfalfa was dried from an initial moisture content of 70% (wb) to a final moisture content of 10% (wb). The batch drying took about 4.5 h while continuous bed drying took 4 h to dry the same amount of material. The average air velocity inside the dryer was 0.36 m s⁻¹. Low temperatures (30-45°C) for safe drying of specialty crops were achieved experimentally. The heat pump drying system used in this study was about 50% more efficient in recovering the latent heat from the dryer exhaust compared to conventional dryers. The specific moisture extraction rate (SMER) was maximum when the relative humidity stayed above 40%. The dryer was shown to be capable of a SMER of between 0.5 and 1.02 kg kW⁻¹ h⁻¹. It was concluded that continuous bed drying is potentially a better option than batch drying because high process air humidity ratios at the entrance of the evaporator and constant moisture extraction rate and specific moisture extraction rate values can be maintained. An uncertainty analysis confirmed the accuracy of the model. (author)
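
    SMER is simply kilograms of water removed per kilowatt-hour of energy input; a minimal Python sketch, with hypothetical batch numbers, is shown below for reference.

    ```python
    # Specific moisture extraction rate (SMER) = water removed / energy input.
    water_removed_kg = 18.0     # moisture evaporated from a batch (hypothetical)
    energy_input_kwh = 22.0     # energy drawn by the heat pump dryer (hypothetical)

    smer = water_removed_kg / energy_input_kwh
    print(f"SMER = {smer:.2f} kg/kWh")   # reported range was 0.5-1.02 kg/kWh
    ```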

  11. Dynamic CT myocardial perfusion imaging: detection of ischemia in a porcine model with FFR verification

    Science.gov (United States)

    Fahmi, Rachid; Eck, Brendan L.; Vembar, Mani; Bezerra, Hiram G.; Wilson, David L.

    2014-03-01

    Dynamic cardiac CT perfusion (CTP) is a high-resolution, non-invasive technique for assessing myocardial blood flow (MBF), which in concert with coronary CT angiography enables CT to provide a unique, comprehensive, fast analysis of both coronary anatomy and functional flow. We assessed perfusion in a porcine model with and without coronary occlusion. To induce occlusion, each animal underwent left anterior descending (LAD) stent implantation and angioplasty balloon insertion. The normal flow condition was obtained with the balloon completely deflated. Partial occlusion was induced by balloon inflation against the stent, with FFR used to assess the extent of occlusion. Prospective ECG-triggered partial scan images were acquired at end systole (45% R-R) using a multi-detector CT (MDCT) scanner. Images were reconstructed using FBP and a hybrid iterative reconstruction (iDose4, Philips Healthcare). Processing included: beam hardening (BH) correction, registration of image volumes using 3D cubic B-spline normalized mutual information, and spatio-temporal bilateral filtering to reduce partial scan artifacts and noise variation. Absolute blood flow was calculated with a deconvolution-based approach using singular value decomposition (SVD). The arterial input function was estimated from the left ventricle (LV) cavity. Regions of interest (ROIs) were identified in healthy and ischemic myocardium and compared in normal and occluded conditions. Under-perfusion was detected in the correct LAD territory, and flow reduction agreed well with FFR measurements. Flow was reduced, on average, in LAD territories by 54%.
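    The deconvolution step lends itself to a compact sketch. The following is a minimal truncated-SVD implementation of the general technique (a Toeplitz convolution matrix built from the AIF, a regularized pseudo-inverse, and flow taken as the peak of the flow-scaled residue function); the truncation threshold and the absence of unit conversion are assumptions, not the authors' processing chain.

```python
import numpy as np

def mbf_svd(aif, tac, dt, lam=0.2):
    """Blood flow by truncated-SVD deconvolution of a tissue curve with the AIF.

    aif : arterial input function samples (e.g. from the LV cavity)
    tac : tissue enhancement curve for one myocardial ROI
    dt  : sampling interval (s)
    lam : truncation threshold as a fraction of the largest singular value
    """
    aif = np.asarray(aif, float)
    tac = np.asarray(tac, float)
    n = len(aif)
    # lower-triangular Toeplitz matrix so that tac ~= A @ residue
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > lam * s[0], 1.0 / s, 0.0)   # regularization
    residue = Vt.T @ (s_inv * (U.T @ tac))
    return residue.max()   # peak of the flow-scaled residue function ~ MBF
```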

  12. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    CERN Document Server

    Guo, Dongya; Peng, Wenxi; Cui, Xingzhu; Zhang, Chengmo; Liu, Yaqing; Liang, Xiaohua; Dong, Yifan; Wang, Jinzhou; Gao, Min; Yang, Jiawei; Zhang, Jiayu; Li, Chunlai; Zou, Yongliao; Zhang, Guangliang; Zhang, Liyan; Fu, Xiaohui

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was done for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations < 15 wt. % (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be < 50 mm.

  13. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    Institute of Scientific and Technical Information of China (English)

    GUO Dong-Ya; WANG Huan-Yu; PENG Wen-Xi; CUI Xing-Zhu; ZHANG Cheng-Mo; LIU Ya-Qing; LIANG Xiao-Hua

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of the Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was performed for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations < 15 wt.% (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be < 50 mm.

  14. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, MODUS performs model verification by producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  15. The Photometric Calibration of the Dark Energy Survey (DES): Results from the Summer 2013 Re-processing of the DES Science Verification Data

    Science.gov (United States)

    Tucker, Douglas L.; Allam, S. S.; Annis, J. T.; Armstrong, R.; Bauer, A.; Bernstein, G.; Burke, D.; Fix, M.; Foust, W.; Gruendl, R. A.; Head, H.; Kuehn, K.; Kuhlmann, S.; Li, T.; Lin, H.; Rykoff, E. S.; Smith, J.; Wester, W.; Wyatt, S.; Yanny, B.; Energy Survey, Dark

    2014-01-01

    The Dark Energy Survey (DES) -- a five-year 5000 sq deg grizY survey of the Southern sky to probe the parameters of dark energy -- recently began operations using the new 3 sq deg DECam imager on the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory. In order to achieve its science goals, the DES has tight requirements on both its relative and absolute photometric calibrations. The 5-year requirements are (1) an internal (relative) photometric calibration of 2% rms, (2) an absolute color calibration of 0.5%, and (3) an absolute flux calibration of 0.5% (in i-band relative to BD+17 4708). In preparation for DES operations, the instrument+telescope underwent a period of Science Verification between November 2012 and February 2013. These Science Verification (SV) data were quickly processed to determine whether the image data were being produced with sufficient quality and efficiency to meet DES science goals. These data were also useful for initial science, and they were re-processed and re-calibrated during Summer 2013. The photometric goals for the Summer 2013 re-processing of the DES SV were intentionally more relaxed than the requirements for the final 5-year survey: (1) an all-sky internal (relative) calibration goal of 3%, (2) an absolute color goal of 3%, and (3) an absolute flux goal of 3%. Here, we describe the results from the photometric calibration of the Summer 2013 re-processing of the DES SV data, the lessons learned, and plans for the future.

  16. Kinematic Modeling and Simulation of a 2-R Robot by Using Solid Works and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

    Full Text Available Simulation of robot systems, which is becoming very popular, especially with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Modelling a scene containing a robot and the objects it manipulates involves, whether for the object or the robot, the following models: the geometric one, the kinematic one and the dynamic one. To this end, the modelling of a 2-R robot type is implemented. The main tasks are comparing two robot postures following the same trajectory (path) for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. SolidWorks and MATLAB/Simulink software are used to check the theory and simulate the robot motion. The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the validity of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two software packages is obtained.

  17. Informational model verification of ZVS Buck quasi-resonant DC-DC converter

    Science.gov (United States)

    Vakovsky, Dimiter; Hinov, Nikolay

    2016-12-01

    The aim of the paper is to create a polymorphic informational model of a ZVS Buck quasi-resonant DC-DC converter for the modeling purposes of the object. Flexible open standards for setting, storing, publishing and exchanging data in a distributed information environment are applied in creating the model. The created model is useful for generating many variants of different types, with different configurations of the composing elements and different inner models of the examined object.

  18. Verification and Validation of the Coastal Modeling System. Report 2: CMS-Wave

    Science.gov (United States)

    2011-12-01

    wave models in this category are ideal for generation, growth and transformation of wind-waves over large distances (fetches) in regional-scale ... quantitative model-to-data intercomparison or model-to-model intercomparison. Both evaluations involve assessment of the methods and data required for ... combined wind and wave modeling capabilities of CMS-Wave in a large tidally-dominated inlet environment with an energetic wave climate. Extensive field

  19. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Science.gov (United States)

    Ohno, M.; Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K.; Wada, Y.; Nakazawa, K.; Mimura, T.; Kataoka, J.; Ichinohe, Y.; Uchida, Y.; Katsuragawa, M.; Yoneda, H.; Sato, G.; Sato, R.; Kawaharada, M.; Harayama, A.; Odaka, H.; Hayashi, K.; Ohta, M.; Watanabe, S.; Kokubun, M.; Takahashi, T.; Takeda, S.; Kinoshita, M.; Yamaoka, K.; Tajima, H.; Yatsu, Y.; Uchiyama, H.; Saito, S.; Yuasa, T.; Makishima, K.

    2016-09-01

    The Hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system of the avalanche photodiode in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV, and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector.

  20. Modelling Real World Using Stochastic Processes and Filtration

    Directory of Open Access Journals (Sweden)

    Jaeger Peter

    2016-03-01

    Full Text Available First we give an implementation in Mizar [2] of basic important definitions of stochastic finance, i.e. filtration ([9], pp. 183 and 185), adapted stochastic process ([9], p. 185) and predictable stochastic process ([6], p. 224). Second we give some concrete formalization and verification of real-world examples.
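    The definitions being formalized are standard; in conventional notation (paraphrased here, not the Mizar source) they read:

```latex
% A filtration on a measurable space (\Omega, \mathcal{F}), indexed by T:
\forall\, s, t \in T:\quad s \le t \;\Longrightarrow\; \mathcal{F}_s \subseteq \mathcal{F}_t \subseteq \mathcal{F}.
% A process X = (X_t)_{t \in T} is adapted to (\mathcal{F}_t)_{t \in T} iff
\forall\, t \in T:\quad X_t \ \text{is}\ \mathcal{F}_t\text{-measurable},
% and, in discrete time, predictable iff X_t is \mathcal{F}_{t-1}-measurable.
```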

  1. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  2. Hysteresis modelling and experimental verification of a Fe–Ga alloy magnetostrictive actuator

    Science.gov (United States)

    Wei, Zhu; Lei Xiang, Bian; Gangli, Chen; Shuxin, Liu; Qinbo, Zhou; Xiaoting, Rui

    2017-03-01

    In order to accurately describe the asymmetric rate- and bias-dependent hysteresis of a Fe–Ga alloy magnetostrictive actuator, a comprehensive model is put forward, composed of a phenomenological model, which describes hysteresis using a modified Bouc–Wen hysteresis operator, and a theoretical model, which represents the dynamic characteristics. An experimental system is set up to verify the performance of the comprehensive model. Results show that the modified Bouc–Wen model can effectively describe the dynamic and hysteresis characteristics of the Fe–Ga alloy magnetostrictive actuator. The results highlight significantly improved accuracy in the modelling of the magnetostrictive actuator.
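    The classic Bouc–Wen operator underlying the paper's modified version is a single ODE and is easy to sketch. The snippet below integrates the unmodified operator with illustrative parameters; the paper's asymmetric rate-bias-dependent terms are not reproduced here.

```python
import numpy as np

def bouc_wen(x, dt, A=1.0, beta=0.5, gamma=0.5, n=1.0):
    """Classic (unmodified) Bouc-Wen hysteresis operator.

    x  : input displacement samples (here: the actuator drive signal)
    dt : sample period
    Returns the hysteretic internal variable z for each sample."""
    z = np.zeros_like(x)
    for k in range(1, len(x)):
        dx = (x[k] - x[k - 1]) / dt
        dz = (A * dx
              - beta * abs(dx) * abs(z[k - 1]) ** (n - 1) * z[k - 1]
              - gamma * dx * abs(z[k - 1]) ** n)
        z[k] = z[k - 1] + dz * dt
    return z

t = np.linspace(0, 2, 2001)
drive = np.sin(2 * np.pi * t)           # one illustrative drive cycle
loop = bouc_wen(drive, dt=t[1] - t[0])  # plotting loop vs. drive shows the hysteresis loop
```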

  3. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; a background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use code in order to test biological hypotheses.

  4. Modelling and Simulation of Variable Speed Thruster Drives with Full-Scale Verification

    Directory of Open Access Journals (Sweden)

    Jan F. Hansen

    2001-10-01

    Full Text Available In this paper, considerations about modelling and simulation of variable speed thruster drives are made, with comparison to full-scale measurements from the Varg FPSO. For special purpose vessels with electric propulsion operating in DP (Dynamic Positioning) mode, the thruster drives are essential for vessel operation. Different modelling strategies for thruster drives are discussed. An advanced thruster drive model with a dynamic motor model and the field vector control principle is shown. Simulations are performed with both the advanced model and a simplified model, and these are compared with full-scale measurements from the Varg FPSO. The simulation results correspond well with the measurements, for both the simplified model and the advanced model.

  5. Experimental verification of optical models of graphene with multimode slab waveguides.

    Science.gov (United States)

    Chang, Zeshan; Chiang, Kin Seng

    2016-05-01

    We compare three optical models of graphene, namely, the interface model, the isotropic model, and the anisotropic model, and verify them experimentally with two multimode slab waveguide samples operating at the wavelengths of 632.8 and 1536 nm. By comparing the calculated graphene-induced losses and the measurement data, we confirm that the interface model and the anisotropic model give correct results for both the transverse electric (TE) and transverse magnetic (TM) modes, while the isotropic model gives correct results only for the TE modes. With the experimental data, we also quantitatively verify the widely used expression for the surface conductivity of graphene in the optical regime. Our findings clarify the issue of modeling graphene in the analysis of graphene-incorporated waveguides and offer deeper insight into the optical properties of graphene for waveguide applications.
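    The widely used expression for graphene's interband surface conductivity in the optical regime is the universal value sigma_0 = e^2/(4*hbar), which for a single suspended layer gives the familiar pi*alpha (about 2.3%) absorption per pass; a two-line numerical check:

```python
from scipy.constants import e, hbar, pi, alpha

sigma0 = e**2 / (4 * hbar)  # universal interband optical conductivity of graphene, in siemens
print(sigma0)               # ~6.08e-5 S
print(pi * alpha)           # ~0.0229: absorption of one free-standing layer at normal incidence
```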

  6. Analysis of the State of the Art Contingency Analysis Model (SOTACA), Air Module Verification

    Science.gov (United States)

    1990-03-01

    Contents: general information, model uses, proponent and users, system requirements, history, reasons to use, model operation, preprocessor, area files ... Definition File (ADF), theater data, munitions and target effects data, air data, Decision Threshold File (DTF), scenario, test cases, null case ... The report presents general information, model uses, proponent and users, system requirements, model history, and reasons to use SOTACA. This section also covers the four phases of

  7. Mathematical modeling and microbiological verification of ohmic heating of a multicomponent mixture of particles in a continuous flow ohmic heater system with electric field parallel to flow.

    Science.gov (United States)

    Kamonpatana, Pitiya; Mohamed, Hussein M H; Shynkaryk, Mykola; Heskitt, Brian; Yousef, Ahmed E; Sastry, Sudhir K

    2013-11-01

    To accomplish continuous flow ohmic heating of a low-acid food product, sufficient heat treatment needs to be delivered to the slowest-heating particle at the outlet of the holding section. This research was aimed at developing mathematical models for sterilization of a multicomponent food in a pilot-scale ohmic heater with the electric field oriented parallel to the flow, and at validating microbial inactivation by inoculated particle methods. The model involved 2 sets of simulations, one for determination of fluid temperatures, and a second for evaluating the worst-case scenario. A residence time distribution study was conducted using radio frequency identification methodology to determine the residence time of the fastest-moving particle from a sample of at least 300 particles. Thermal verification of the mathematical model showed good agreement between calculated and experimental fluid temperatures (P > 0.05) at the heater and holding tube exits, with a maximum error of 0.6 °C. To achieve a specified target lethal effect at the cold spot of the slowest-heating particle, the required holding tube length was predicted to be 22 m for a 139.6 °C process temperature, a volumetric flow rate of 1.0 × 10⁻⁴ m³/s and a tube diameter of 0.05 m. To verify the model, a microbiological validation test was conducted using at least 299 chicken-alginate particles inoculated with Clostridium sporogenes spores per run. The inoculated pack study indicated the absence of viable microorganisms at the target treatment and their presence for a sub-target treatment, thereby verifying the model predictions.
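    The target lethal effect is conventionally quantified by integrating the lethal rate over the cold-spot temperature history. The sketch below uses the standard F-value expression with the conventional reference temperature and z-value; both, and the temperature profile, are illustrative assumptions rather than the paper's data.

```python
def f_value(temps_c, dt_min, t_ref=121.1, z=10.0):
    """Process lethality F (equivalent minutes at t_ref) by trapezoidal
    integration of the standard lethal rate 10**((T - t_ref)/z)."""
    lethal = [10.0 ** ((T - t_ref) / z) for T in temps_c]
    return sum(0.5 * (a + b) * dt_min for a, b in zip(lethal, lethal[1:]))

# hypothetical cold-spot history of the slowest-heating particle, 1-min steps
profile = [90, 110, 125, 135, 139, 139.6, 139.6, 138, 120]  # deg C
print(f_value(profile, dt_min=1.0))
```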

  8. Principles of polymer processing modelling

    Directory of Open Access Journals (Sweden)

    Agassant Jean-François

    2016-01-01

    Full Text Available Polymer processing involves three thermo-mechanical stages: plastication of solid polymer granules or powder into a homogeneous fluid, which is shaped under pressure in moulds or dies, and finally cooled and eventually drawn to obtain the final plastic part. The physical properties of polymers (high viscosity, non-linear rheology, low thermal diffusivity) as well as the complex shape of most plastic parts make modelling a challenge. Several examples (film blowing, extrusion dies, injection moulding, blow moulding) are presented and discussed.

  9. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust. Th

  10. METHANOGENESIS AND SULFATE REDUCTION IN CHEMOSTATS: II. MODEL DEVELOPMENT AND VERIFICATION

    Science.gov (United States)

    A comprehensive dynamic model is presented that simulates methanogenesis and sulfate reduction in a continuously stirred tank reactor (CSTR). This model incorporates the complex chemistry of anaerobic systems. A salient feature of the model is its ability to predict the effluent ...
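    Models of this type are typically assembled from Monod-kinetics mass balances for each guild of organisms; a single-substrate CSTR building block looks like the following (all parameter values are illustrative, not from the paper):

```python
from scipy.integrate import solve_ivp

# Minimal Monod-growth CSTR: one biomass X and one substrate S.
D, S_in = 0.1, 10.0            # dilution rate (1/h), feed substrate (g/L)
mu_max, Ks, Y = 0.4, 0.5, 0.2  # kinetic and yield parameters

def cstr(t, y):
    X, S = y                              # biomass, substrate
    mu = mu_max * S / (Ks + S)            # Monod specific growth rate
    return [(mu - D) * X,                 # growth minus washout
            D * (S_in - S) - mu * X / Y]  # feed/outflow minus consumption

sol = solve_ivp(cstr, (0, 200), [0.1, 10.0], rtol=1e-8)
print(sol.y[:, -1])  # steady state: S* = Ks*D/(mu_max - D), X* = Y*(S_in - S*)
```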

  11. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  12. Derivation, calibration and verification of macroscopic model for urban traffic flow. Part 1

    CERN Document Server

    Kholodov, Yaroslav A; Kholodov, Aleksandr S; Vasiliev, Mikhail O; Kurzhanskiy, Alexander A

    2016-01-01

    In this paper we present a second-order hydrodynamic traffic model that generalizes the existing second-order models of Payne-Whitham, Zhang and Aw-Rascle. In the proposed model, we introduce the pressure equation describing the dependence of "traffic pressure" on traffic density. The pressure equation is constructed for each road segment from the fundamental diagram, which is estimated using measurements from traffic detectors. We show that the properties of any phenomenological model are fully defined by its pressure equation. We verify the proposed model through simulations of the Interstate 580 freeway segment in California, USA, with traffic measurements from the Performance Measurement System (PeMS).
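    As an illustration of building a pressure relation from detector data, the sketch below fits a Greenshields fundamental diagram by least squares and takes the pressure as the velocity deficit relative to free flow. This is one common construction in second-order models, not necessarily the authors' pressure equation, and the detector samples are invented.

```python
import numpy as np

# Detector samples of (density veh/km, flow veh/h); values are illustrative.
rho = np.array([10, 25, 40, 60, 80, 100.0])
q   = np.array([950, 2100, 2900, 3200, 2800, 2000.0])

# Fit a Greenshields fundamental diagram Q(rho) = vf*rho*(1 - rho/rho_max)
# by least squares in the parameters (vf, vf/rho_max).
A = np.column_stack([rho, -rho**2])
(vf, c), *_ = np.linalg.lstsq(A, q, rcond=None)
rho_max = vf / c

# One common "traffic pressure": the velocity deficit below free flow,
# p(rho) = vf - Q(rho)/rho, which for Greenshields is vf*rho/rho_max.
p = lambda r: vf - vf * (1 - r / rho_max)
print(vf, rho_max, p(50.0))
```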

  13. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin;

    2012-01-01

    In this study, we propose a Continuous Time Markov Chain model for the availability of n-node clusters of a Distributed Rendering System. The model is an infinite one; we formalized it and, based on it, implemented a software tool that can automatically build models in the PRISM language. With the tool, whenever the number of nodes n and related parameters vary, we can create the PRISM model file rapidly and then use the PRISM model checker to verify related system properties. At the end of this study, we analyzed and verified the availability distributions of the Distributed Cluster Rendering System.
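    The kind of generator described, which emits a fresh PRISM model file whenever n and the rates change, can be sketched with simple templating. The birth-death CTMC below is a stand-in for the paper's richer availability model; the rates, the steady-state property, and the file name are all illustrative.

```python
def prism_cluster_model(n, fail_rate=1e-3, repair_rate=1e-2):
    """Emit a simple n-node repairable-cluster CTMC in the PRISM language.

    A stand-in for the kind of generator the paper describes; the real
    system's model is richer than this birth-death sketch."""
    return f"""ctmc

const int N = {n};
const double lam = {fail_rate}; // per-node failure rate
const double mu  = {repair_rate}; // repair rate

module cluster
  up : [0..N] init N;
  [fail]   up > 0 -> up*lam : (up' = up - 1);
  [repair] up < N -> mu     : (up' = up + 1);
endmodule

// availability: long-run probability that at least one node is up,
// checked with the steady-state query  S=? [ up > 0 ]
"""

with open("cluster.sm", "w") as f:
    f.write(prism_cluster_model(4))
```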

  14. A Review of Process Modeling Language Paradigms

    Institute of Scientific and Technical Information of China (English)

    MA Qin-hai; GUAN Zhi-min; LI Ying; ZHAO Xi-nan

    2002-01-01

    Process representation or modeling plays an important role in business process engineering. Process modeling languages can be evaluated by the extent to which they provide constructs useful for representing and reasoning about the aspects of a process, and are subsequently chosen for a certain purpose. This paper reviews process modeling language paradigms and points out their advantages and disadvantages.

  15. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Energy Technology Data Exchange (ETDEWEB)

    Ohno, M., E-mail: ohno@hep01.hepl.hiroshima-u.ac.jp [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y. [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K. [Department of Physics, University of Tokyo, Tokyo 113-0033 (Japan); and others

    2016-09-21

    The Hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5–80 keV) and soft gamma-rays (60–600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system of the avalanche photodiode in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV, and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector. - Highlights: • Details of the development of the signal processing system for ASTRO-H are presented. • A digital filter in an FPGA is applied instead of a discrete analog circuit. • The expected performance is verified after integration into the satellite.

  16. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
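    The quantitative core of such verification benchmarks is usually the observed order of accuracy, computed from errors against the benchmark solution on successively refined grids; a minimal sketch (the error values are invented):

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grids, the basic
    quantitative measure used in code verification studies."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# errors vs. an exact benchmark solution on h and h/2 grids (made-up values)
print(observed_order(4.0e-3, 1.1e-3))  # ~1.86, consistent with a 2nd-order scheme
```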

  17. Verification in referral-based crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Victor Naroditskiy

    Full Text Available Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  18. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  19. Fingerprint verification based on wavelet subbands

    Science.gov (United States)

    Huang, Ke; Aviyente, Selin

    2004-08-01

    Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae detection based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial improvement of image quality and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature for each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm in this paper is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
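    A moment-matching GGD fit per subband is straightforward to sketch. The code below estimates (alpha, beta) from the first two absolute moments and assembles a feature vector over the detail subbands; the wavelet choice, decomposition level, and the random stand-in image are assumptions, not the paper's setup.

```python
import numpy as np
import pywt
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_fit(coeffs):
    """Moment-matching fit of a Generalized Gaussian Density to wavelet
    coefficients: returns (alpha, beta) of p(x) ~ exp(-(|x|/alpha)**beta)."""
    c = np.abs(np.ravel(coeffs))
    m1, m2 = c.mean(), (c**2).mean()
    # E|x|^2 / E[x^2] = Gamma(2/b)^2 / (Gamma(1/b) Gamma(3/b)); solve for b
    ratio = lambda b: gamma(2 / b) ** 2 / (gamma(1 / b) * gamma(3 / b)) - m1**2 / m2
    beta = brentq(ratio, 0.05, 10.0)
    alpha = m1 * gamma(1 / beta) / gamma(2 / beta)
    return alpha, beta

# per-subband GGD parameters as a fingerprint feature vector
image = np.random.rand(128, 128)  # stand-in for a fingerprint image
_, *details = pywt.wavedec2(image, "db4", level=3)
features = [ggd_fit(d) for level in details for d in level]
```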

  20. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.