WorldWideScience

Sample records for model verification process

  1. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  2. BProVe: A formal verification framework for business process models

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

Business Process Modelling has acquired increasing relevance in software development. Available notations, such as BPMN, permit describing the activities of complex organisations. On the one hand, this narrows the communication gap between domain experts and IT specialists; on the other hand, it clarifies the characteristics of software systems introduced to provide automatic support for such activities. Nevertheless, the lack of formal semantics hinders the automatic verification of relevant properties. This paper presents a novel verification framework for BPMN 2.0, called BProVe. It is based on an operational semantics, implemented using MAUDE, devised to make the verification general and effective. A complete tool chain, based on the Eclipse modelling environment, allows for rigorous modelling and analysis of Business Processes. The approach has been validated using more than one...

  3. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer which takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles to experimentally verify the model were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed

  4. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

Requirement Assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" To ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts": the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement-verification best practices.

  5. Experimental verification of the energetic model of the dry mechanical reclamation process

    Directory of Open Access Journals (Sweden)

    R. Dańko

    2008-04-01

The paper presents experimental results for the dry mechanical reclamation process, which constituted the basis for verifying the energetic model of this process developed by the author from Rittinger's deterministic hypothesis of the crushing process. Used foundry sands with bentonite, with water glass from the floster technology, and used sands with furan FL 105 resin were employed in the reclamation tests. In the mechanical and mechanical-cryogenic reclamation, a wide range of treatment times and reclamation conditions influencing the intensity of the reclamation process, covering all parameters used in industrial devices, was applied. The developed theoretical model constitutes a new tool for selecting optimal reclamation treatment times for a given spent foundry sand at the assumed process intensity realized in rotor reclaimers with leaves or rods as grinding elements mounted horizontally on the rotor axis.
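Rittinger's hypothesis, on which the energetic model above is based, relates comminution energy to the new surface area created; a common textbook statement (the symbols below are generic, not necessarily the author's exact notation) is:

```latex
% Rittinger's comminution law: specific crushing energy is proportional to
% the new surface area created, i.e. inversely proportional to particle size.
% d_1: characteristic feed (initial) grain size, d_2: product grain size,
% C_R: material-dependent Rittinger constant.
E = C_R \left( \frac{1}{d_2} - \frac{1}{d_1} \right)
```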

  6. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

… Business Process Model and Notation (BPMN). The automated analysis of business processes is done by means of quantitative probabilistic model checking, which allows verification of validation and performance properties through use of an algorithm for the translation of business process models into a format amenable...

  7. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  8. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

Hand dermatitis-associated fingerprint change is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria and the presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis.
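The decision rule described in this abstract can be sketched as a simple classifier. This is a hypothetical re-implementation for illustration only: the function name, risk labels, and thresholds other than the stated 25% major criterion are assumptions, not the authors' fitted model.

```python
# Hedged sketch of the prediction rule from the abstract: one major criterion
# (dystrophy area >= 25%) and two minor criteria (long horizontal/vertical lines).
# Labels paraphrase the abstract's risk categories; they are not clinical output.

def fingerprint_failure_risk(dystrophy_area_pct: float,
                             long_horizontal_lines: bool,
                             long_vertical_lines: bool) -> str:
    """Classify the risk that fingerprint verification fails."""
    if dystrophy_area_pct >= 25:                      # major criterion met
        return "almost always fails"
    minor = int(long_horizontal_lines) + int(long_vertical_lines)
    if minor == 2:                                    # both minor criteria
        return "high risk"
    if minor == 1:                                    # one minor criterion
        return "low risk"
    return "almost always passes"                     # no criteria met
```

For example, a patient with 30% dystrophy area falls in the "almost always fails" category regardless of the minor criteria.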

  9. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

Process models were developed to characterize waste vitrification at West Valley in terms of process operating constraints and achievable glass compositions. The need to verify compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to it will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  10. Process verification of a hydrological model using a temporal parameter sensitivity analysis

    OpenAIRE

    M. Pfannerstill; B. Guse; D. Reusser; N. Fohrer

    2015-01-01

    To ensure reliable results of hydrological models, it is essential that the models reproduce the hydrological process dynamics adequately. Information about simulated process dynamics is provided by looking at the temporal sensitivities of the corresponding model parameters. For this, the temporal dynamics of parameter sensitivity are analysed to identify the simulated hydrological processes. Based on these analyses it can be verified if the simulated hydrological processes ...

  11. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  12. BProVe: Tool support for business process verification

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    This demo introduces BProVe, a tool supporting automated verification of Business Process models. BProVe analysis is based on a formal operational semantics defined for the BPMN 2.0 modelling language, and is provided as a freely accessible service that uses open standard formats as input data...

  13. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  14. Verification of Stochastic Process Calculi

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya

… process calculi. The description of a system in the syntax of a particular stochastic process calculus can be analysed in a compositional way, without expanding the state space by explicitly resolving all the interdependencies between the subsystems, which may lead to the state space explosion problem. In support of this claim we have developed analysis methods that belong to a particular type of Static Analysis: Data Flow / Pathway Analysis. These methods have previously been applied to a number of non-stochastic process calculi. In this thesis we are lifting them to the stochastic calculus … description of a system. The presented methods have a clear application in the areas of embedded systems, (randomised) protocols run between a fixed number of parties, etc.

  15. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
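The core balance in a hydrostatic column model of this kind can be sketched as follows. This is a generic form assuming stacked fluid columns (nitrogen, oil, brine) between the wellhead and the cavern, not necessarily the exact formulation used in the report:

```latex
% Wellhead pressure as cavern pressure minus the weight of the fluid columns
% above it; rho_i and h_i are the density and vertical extent of each fluid
% segment (e.g. N2, crude oil, brine), g is gravitational acceleration.
P_{\mathrm{wellhead}} = P_{\mathrm{cavern}} - g \sum_i \rho_i \, h_i
```

Under this balance, movement of the nitrogen/oil interface changes the column heights $h_i$, which is why interface position and wellhead pressure can be predicted together.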

  16. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  17. Towards agent-based modelling and verification of collaborative business processes : An approach centred on interactions and behaviours

    NARCIS (Netherlands)

    Stuit, M.; Szirbik, N.

    2009-01-01

    This paper presents the process-oriented aspects of a formal and visual agent-based business process modeling language. The language is of use for (networks of) organizations that elect or envisage multi-agent systems for the support of collaborative business processes. The paper argues that the

  18. 300 Area Process Trenches Verification Package

    International Nuclear Information System (INIS)

    Lerch, J.A.

    1998-03-01

    The purpose of this verification package is to document achievement of the remedial action objectives for the 300 Area Process Trenches (300 APT) located within the 300-FF-1 Operable Unit (OU). The 300 APT became active in 1975 as a replacement for the North and South Process Pond system that is also part of the 300-FF-1 OU. The trenches received 300 Area process effluent from the uranium fuel fabrication facilities. Waste from the 300 Area laboratories that was determined to be below discharge limits based on monitoring performed at the 307 retention basin was also released to the trenches. Effluent flowed through the headworks sluice gates, down a concrete apron, and into the trenches. From the beginning of operations in 1975 until 1993, a continuous, composite sampler was located at the headwork structure to analyze process effluent at the point of discharge to the environment

  19. Formal Verification of Effectiveness of Control Activities in Business Processes

    Science.gov (United States)

    Arimoto, Yasuhito; Iida, Shusaku; Futatsugi, Kokichi

Dealing with risks in business processes is an important issue for achieving companies' goals. This paper introduces a method for applying a formal method to the analysis of risks and control activities in business processes, in order to evaluate control activities consistently and exhaustively, and to enable scientific discussion of the result of the evaluation. We focus on document flows in business activities, and on control activities and risks related to documents, because documents play important roles in business. In our method, document flows including control activities are modeled, and the OTS/CafeOBJ Method is used to verify that risks of document falsification are avoided by the control activities in the model. The verification is done by interaction between humans and the CafeOBJ system with theorem proving, which makes it possible to discuss the result scientifically, because the interaction gives rigorous reasons why the result is derived from the verification.

  20. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

… V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety-related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Code developers developing computer programs perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for …

  1. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

The goal of this research is to provide a framework enabling the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality, and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model-checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils those properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined using a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
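To illustrate the kind of temporal-logic statement such a framework checks, here is a generic CTL "response" pattern of the sort produced by property specification patterns. The predicate names are invented for illustration and are not taken from the paper:

```latex
% CTL response pattern: in all reachable states (AG), if a treatment has been
% ordered, then on every execution path it is eventually (AF) administered.
\mathit{AG}\,\bigl(\mathit{ordered} \rightarrow \mathit{AF}\,\mathit{administered}\bigr)
```

A model checker either confirms that the guideline's statechart model satisfies the property or returns a counterexample trace, which is how definition inconsistencies are detected.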

  2. 21 CFR 120.25 - Process verification for certain processors.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Process verification for certain processors. 120... Pathogen Reduction § 120.25 Process verification for certain processors. Each juice processor that relies... covered by this section, processors shall take subsamples according to paragraph (a) of this section for...

  3. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    There have been very few mesoscale modelling studies of the Indian monsoon, with focus on the verification and intercomparison of the operational real time forecasts. With the exception of Das et al (2008), most of the studies in the literature are either the case studies of tropical cyclones and thunderstorms or the sensitivity ...

  4. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article. Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  5. Modelling and experimental verification of pressure prediction in the in-mould coating (IMC) process for injection moulded parts

    Science.gov (United States)

    Bhagavatula, Narayan; Castro, Jose M.

    2007-03-01

    In-mould coating (IMC) is carried out by injecting a liquid low viscosity thermosetting material onto the surface of the injection moulded part after it has solidified, while it is still in the mould. The coating then cures and adheres to the substrate. Due to its successful application to exterior body panels made from compression moulded sheet moulding compound, IMC is being developed as an environmentally friendly technology that would ultimately replace painting of injection moulded thermoplastic parts. In the short term, however, we believe IMC has the potential of being a substitute for primers. In a previous paper, we presented a Hele-Shaw based mathematical model to simulate the coating flow during the IMC process assuming the coating to be a power law fluid and using the traditional no-slip boundary condition. This model adequately predicted the fill patterns but did not predict pressures correctly. This deviation has been attributed to slip (or apparent slip) at the wall, as often found in flow through microchannels, due to the microscopic length scale of the flow, and the existence of a high shear rate Newtonian plateau for the coating viscosity. In this paper we present a mathematical model that includes the slip boundary condition and the Carreau viscosity model to properly describe the rheological behaviour of the coating material at high shear rates. The rheological and slip parameters of the coating material were measured using a customized micro slit rheometer. The predicted pressures are compared with experimental results obtained using our IMC pilot facility. It was found that the predicted pressures and coating thickness agree well with the measured values.
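The Carreau viscosity model mentioned in this abstract takes a standard form that captures both the low-shear behaviour and the high-shear-rate Newtonian plateau; the parameter symbols below are the conventional ones, not necessarily the authors' notation, and the Navier slip relation is likewise a common generic statement of a slip boundary condition:

```latex
% Carreau model: eta_0 (zero-shear viscosity), eta_inf (infinite-shear
% plateau viscosity), lambda (relaxation time), n (power-law index),
% \dot{\gamma} (shear rate).
\eta(\dot{\gamma}) = \eta_\infty + (\eta_0 - \eta_\infty)
  \left[ 1 + (\lambda \dot{\gamma})^2 \right]^{\frac{n-1}{2}}

% Navier slip at the wall: slip velocity proportional to wall shear stress,
% with slip coefficient beta.
u_s = \beta \, \tau_w
```

As $\dot{\gamma} \to \infty$ the bracketed factor drives $\eta$ toward the plateau $\eta_\infty$, which is the high-shear Newtonian behaviour the abstract says the power-law model missed.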

  6. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

Formal model verification has proven a powerful tool for verifying and validating the properties of a system. Central to this class of techniques is the construction of an accurate formal model for the system being investigated. Unfortunately, manual construction of such models can be a resource-demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning...
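The basic idea of estimating MDP transition probabilities from observed behaviors can be sketched with simple frequency counts. This is a minimal illustration under assumed data shapes, not the paper's active-learning algorithm, which additionally chooses which inputs to try in order to minimize the data required:

```python
from collections import Counter, defaultdict

# Minimal sketch: estimate MDP transition probabilities from observed
# (state, action, next_state) triples by relative frequency. Hypothetical
# helper, not the algorithm proposed in the paper.

def estimate_transitions(traces):
    """traces: iterable of sequences of (state, action, next_state) triples."""
    counts = defaultdict(Counter)
    for trace in traces:
        for s, a, s_next in trace:
            counts[(s, a)][s_next] += 1          # tally observed outcomes
    model = {}
    for (s, a), outcome_counts in counts.items():
        total = sum(outcome_counts.values())
        model[(s, a)] = {s_next: n / total       # normalize to probabilities
                         for s_next, n in outcome_counts.items()}
    return model
```

With two traces `[("s0","a","s1"), ("s1","a","s0")]` and `[("s0","a","s0")]`, the estimate for `("s0","a")` splits evenly between `s0` and `s1`.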

  7. CFD modeling and experimental verification of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler operating at 90-170 Hz

    Science.gov (United States)

    Zhao, Yibo; Yu, Guorui; Tan, Jun; Mao, Xiaochen; Li, Jiaqi; Zha, Rui; Li, Ning; Dang, Haizheng

    2018-03-01

This paper presents the CFD modeling and experimental verification of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler (MCSPTC) operating at 90-170 Hz. It uses neither double-inlet nor multi-bypass; the inertance tube with a gas reservoir is the only phase-shifter. The effects of the frequency on flow and heat transfer processes in the pulse tube are investigated. The results indicate that too low a frequency leads to strong mixing between warm and cold fluids, significantly deteriorating the cooling performance, whereas too high a frequency produces downward-sloping streams flowing from the warm end toward the axis, almost puncturing the gas displacer from the warm end, thereby creating larger radial temperature gradients and thus undermining the cooling performance. The influence of the pulse tube length on the temperature and velocity when the frequency is much higher than the optimum is also discussed. An MCSPTC with an overall mass of 1.1 kg was built and tested. With an input electric power of 59 W and operating at 144 Hz, it achieves a no-load temperature of 61.4 K and a cooling capacity of 1.0 W at 77 K. The trends in the measured results are in good agreement with the simulations. These studies will help to thoroughly understand the underlying mechanism of the inertance MCSPTC operating at very high frequencies.

  8. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

… The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  9. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  10. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock-and-tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF

  11. TU electric reactor model verification

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1989-01-01

    Power reactor benchmark calculations using the code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles include gadolinia as a burnable absorber, natural uranium axial blankets, and increased water-to-fuel ratio. The calculated results for both low-power physics tests (boron end points, control rod worths, and isothermal temperature coefficients) and full-power operation (power distributions and boron letdown) are compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important physics parameters for power reactors

  12. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  13. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

    Full Text Available The results of nitrogen oxide reduction in the combustion process with application of primary methods are presented in the paper. The reduction of NOx emission by recirculation of combustion gases, staging of fuel and of air was investigated, and then the reduction of NOx emission by simultaneous use of the above-mentioned primary methods with pulsatory disturbances. The investigations comprise numerical modeling of NOx reduction and experimental verification of the obtained numerical calculation results.

  14. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
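The record does not describe how IVSEM actually aggregates its subsystems, but a common way to combine per-technology detection probabilities into an integrated figure is the complement rule under an independence assumption. The sketch below uses that rule with hypothetical probabilities; it is an illustration of the general idea, not IVSEM's algorithm.

```python
# Sketch: fusing per-technology detection probabilities for one event,
# assuming the technologies detect independently (all values hypothetical).

def integrated_p_detect(p_by_tech):
    """P(at least one technology detects) = 1 - product of miss probabilities."""
    p_miss = 1.0
    for p in p_by_tech.values():
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

p_by_tech = {               # hypothetical per-subsystem probabilities
    "seismic": 0.80,
    "infrasound": 0.40,
    "radionuclide": 0.55,
    "hydroacoustic": 0.10,
}

print(round(integrated_p_detect(p_by_tech), 4))  # → 0.9514
```

Note how a weak subsystem still raises the integrated probability, which is the "synergy among the technologies" the abstract refers to.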

  15. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  16. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    Full Text Available Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in most modern train techniques of the transportation system. How to ensure the security of MVB has become an important issue. Traditional testing cannot ensure system correctness. The MVB system modeling and verification are the concern of this paper. Petri Net and model checking methods are used to verify the MVB system. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of MVB. Synchronous and asynchronous methods are proposed to describe the entities and communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.
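Neither the Master Transfer protocol nor the M3C platform is detailed in this record, but the core model-checking idea can be illustrated: enumerate all reachable states of a model and assert a safety property over them. The sketch below uses a hypothetical two-node mastership automaton (all transitions invented) and checks that two masters can never coexist.

```python
# Sketch: explicit-state reachability checking of a safety property on a
# hypothetical two-node mastership automaton. The states and transitions
# are invented for illustration; they are not the MVB protocol.

def step(state):
    a, b = state
    succs = set()
    if a == "slave": succs.add(("candidate", b))
    if b == "slave": succs.add((a, "candidate"))
    # A candidate may become master only if the other node is not master.
    if a == "candidate" and b != "master": succs.add(("master", b))
    if b == "candidate" and a != "master": succs.add((a, "master"))
    if a == "master": succs.add(("slave", b))   # master hands over
    if b == "master": succs.add((a, "slave"))
    return succs

def reachable(init):
    """Breadth-first exploration of the full state space."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        for t in step(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

states = reachable(("slave", "slave"))
# Safety property: never two masters simultaneously.
assert all(s != ("master", "master") for s in states)
print(len(states), "reachable states, no double-master state")
```

Real model checkers (including Petri-net based ones) apply the same exhaustive exploration, with symbolic or partial-order techniques to cope with state-space explosion.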

  17. Verification of A Process Based Hydrograph Separation

    Science.gov (United States)

    Fackel, P.; Naef, F.

    Depending on soil structure, geology, topography, etc., runoff during intense precipitation is formed either by HOF (Hortonian overland flow), SOF (saturated overland flow), SSF (subsurface flow) or DP (deep percolation). A methodology was developed to delineate the different dominant runoff processes in varying intensities (1 very fast to 3 retarded) in a catchment. In the small Rohr catchment (2.1 km2) the dominant runoff processes were mapped with the help of soil investigations, sprinkling experiments, etc. Also a series of aerial infrared photographs were taken after a heavy rainfall event to observe the drying of the catchment, indicating the extent of saturated areas. To verify the mapping we installed a hydrometric measurement network. Discharge was measured from a tile drain system, from areas where only SOF was expected, and from the whole catchment. Shallow groundwater wells were installed in areas where SOF and SSF of different intensities were mapped. Wells on SOF 1 areas showed saturation after 30 mm of precipitation while SOF 2 and 3 soils needed a significantly higher amount of rainfall for saturation. Water levels in wells in SSF areas didn't rise to ground level even after large amounts of rainfall. Discharge from the SOF areas corresponded well to measured soil saturation. The runoff from the drainage system responded very fast and strongly to precipitation, indicating preferential flow through the soil. In a parallel project, the EAWAG investigated pesticide transport in the Rohr catchment. They applied the pesticide atrazine plus a labelling substance to 12 maize fields and measured their concentrations in runoff with a high temporal resolution. Based on the runoff process map the IHW model Qarea simulates the contribution of each runoff process to the storm hydrograph separately, and thus implicitly yields a process-based hydrograph separation. Due to the high spatial resolution of the process map it was possible to calculate the spatial
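The record breaks off and does not give Qarea's formulation. A minimal sketch of the underlying idea, total storm hydrograph as the sum of per-process contributions, can be written with one linear reservoir per mapped process area; all areas, rate constants, and the storm itself are hypothetical.

```python
# Sketch: process-based hydrograph separation as a sum of per-process
# linear reservoirs. Areas and recession constants k are hypothetical;
# Qarea's actual formulation is not described in this record.

def route(rain_mm_per_h, area_km2, k, dt=1.0):
    """Linear reservoir: dS/dt = P*A - k*S, Q = k*S (explicit Euler)."""
    S, Q = 0.0, []
    for p in rain_mm_per_h:
        inflow = p * area_km2          # arbitrary but consistent units
        S += dt * (inflow - k * S)
        Q.append(k * S)
    return Q

rain = [0, 5, 10, 5, 0, 0, 0, 0]       # hypothetical storm (mm/h)
processes = {                           # process -> (area, recession k)
    "HOF":  (0.1, 0.9),                 # fast Hortonian overland flow
    "SOF1": (0.4, 0.6),                 # very fast saturated overland flow
    "SSF":  (1.2, 0.2),                 # retarded subsurface flow
}

per_process = {name: route(rain, a, k) for name, (a, k) in processes.items()}
total = [sum(qs) for qs in zip(*per_process.values())]
print([round(q, 2) for q in total])
```

Because each process is routed separately, the separation of the simulated hydrograph into process contributions comes for free, which is the point made in the abstract.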

  18. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    Science.gov (United States)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  19. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  20. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    In this paper, we combine formal modeling and analysis of infrastructures of organizations with sociological explanation to provide a framework for insider threat analysis. We use the higher order logic (HOL) proof assistant Isabelle/HOL to support this framework. In the formal model, we exhibit and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider theory constructed in Isabelle that implements this process of social explanation. To validate that the social explanation is generally useful for the analysis of insider threats and to demonstrate our framework, we model and verify the insider threat patterns of entitled independent and Ambitious Leader.

  1. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report: describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorate (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  2. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for safety software of Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are preparation of software planning documentation, verification of Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS comprises technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing the integrated system test plan, software safety analysis, and software configuration management. Also, SDS V and V of DRPS comprises technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing the integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS comprises traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis at the SRS phase uses the Hazard and Operability (HAZOP) method, at the SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at the implementation phase it uses FTA. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  3. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies.
    · Discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection;
    · Introduces Verilog in depth, from both implementation and verification points of view;
    · Demonstrates how to use IP in applications such as memory controllers and SoC buses;
    · Describes a new ver...

  4. Verification of Orthogrid Finite Element Modeling Techniques

    Science.gov (United States)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically those with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such "short cuts" is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam and shell element model. Results show that the shell element model performs the best, but that the simpler beam and mixed beam and shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  5. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  6. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  7. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  8. Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border

    Science.gov (United States)

    Rijsdijk, M.; van Hinsbergh, W. H. M.; Witteveen, W.; ten Buuren, G. H. M.; Schakelaar, G. A.; Poppinga, G.; van Persie, M.; Ladiges, R.

    2013-08-01

    Quite often in the verification of cadastral borders, owners of the parcels involved are not able to attend at the appointed moment in time. New appointments have to be made in order to complete the verification process, and as a result costs and throughput times often grow beyond what is considered to be acceptable. To improve the efficiency of the verification process an experiment was set up that refrains from the conventional terrestrial methods for border verification. The central research question was formulated as "How useful are Unmanned Aerial Systems in the juridical verification process of cadastral borders of ownership at het Kadaster in the Netherlands?". For the experiment, operational evaluations were executed at two different locations. The first operational evaluation took place at the Pyramid of Austerlitz, a flat area with a 30 m high pyramid built by troops of Napoleon, with low civilian attendance. Two subsequent evaluations were situated in a small neighbourhood in the city of Nunspeet, where the cadastral situation had recently changed as a result of twenty new houses that were built. Initially a mini-UAS of the KLPD was used to collect photo datasets with less than 1 cm spatial resolution. In a later stage the commercial service provider Orbit Gis was hired. During the experiment four different software packages were used for processing the photo datasets into accurate geo-referenced ortho-mosaics. In this article more details are given on the experiments carried out. Attention is paid to the mini-UAS platforms (AscTec Falcon 8, Microdrone MD-4), the cameras used, the photo collection plan, the usage of ground control markers and the calibration of the cameras. Furthermore the results and experiences of the different SFM software packages used (Visual SFM/Bundler, PhotoScan, PhotoModeler and the Orbit software) will be shared.

  9. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented where this model measures and validates at the ...

  10. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    Full Text Available This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  11. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
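As a concrete illustration of the fine-grained testing the authors advocate, here is a unit test for a hypothetical numerical kernel, a Tetens-style saturation vapor pressure approximation. The kernel is a stand-in, not a routine from any actual climate model; the point is that such checks run in milliseconds without a full simulation.

```python
import math

# Hypothetical kernel: Tetens-style saturation vapor pressure (hPa)
# over liquid water; a stand-in, not taken from any climate model.
def saturation_vapor_pressure(t_celsius):
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Fine-grained "unit" checks: a known value, monotonicity in
# temperature, and a physical bound. No climate run required.
def test_kernel():
    assert abs(saturation_vapor_pressure(0.0) - 6.112) < 1e-9
    temps = [-30, -10, 0, 10, 25, 40]
    es = [saturation_vapor_pressure(t) for t in temps]
    assert all(a < b for a, b in zip(es, es[1:]))   # monotone in T
    assert all(e > 0 for e in es)                   # positive pressure

test_kernel()
print("kernel tests passed")
```

Property-style checks like monotonicity are often more robust for scientific code than hard-coded output comparisons, since they survive benign refactoring and compiler/precision changes.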

  12. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Full Text Available Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple and highly efficient and have a favourable volume-to-load-capacity ratio. However, there exist fields of application where the threat of toxic contamination with the hydraulic fluid contents must be avoided, for example, the food or pharmacy industries. A solution here can be a Pneumatic Adaptive Absorber (PAA, which is characterized by high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be defined as polytropic, i.e., characterized by constant-in-time thermodynamic coefficients. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.
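The paper's thermodynamic model is not reproduced in this record; the following is a deliberately simplified isothermal sketch of the core idea, gas migrating between two chambers through a valve whose flow coefficient is updated at every time step rather than held constant. All parameters and the valve law are hypothetical.

```python
# Sketch: two-chamber pneumatic absorber with valve-controlled gas
# migration, integrated with explicit Euler. Isothermal idealization in
# scaled units; all parameters are hypothetical, not the paper's model.

def simulate(p1, p2, v1, v2, valve_coeff, steps=1000, dt=1e-4):
    """Mass flow ~ valve_coeff(t) * (p1 - p2); isothermal so dp = dm / V."""
    history = []
    for i in range(steps):
        c = valve_coeff(i * dt)           # coefficient updated in time
        dm = c * (p1 - p2) * dt           # flow from chamber 1 to 2
        p1 -= dm / v1
        p2 += dm / v2
        history.append((p1, p2))
    return history

# Hypothetical control law: the valve progressively closes,
# slowing pressure equalization between the chambers.
hist = simulate(p1=5.0, p2=1.0, v1=1.0, v2=1.0,
                valve_coeff=lambda t: 50.0 * max(0.0, 1.0 - 5.0 * t))
p1_end, p2_end = hist[-1]
print(round(p1_end, 3), round(p2_end, 3))
```

Replacing the isothermal pressure update with an energy balance per chamber (and recomputing heat-transfer coefficients each step) would move this toward the non-polytropic treatment the abstract describes.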

  13. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    ABSTRACT: Improving validation and verification (V&V) has been one of the most important tasks of the century. However, we have few measures or management interventions to make such improvement possible, and it is difficult to identify patterns that should be followed by developers because systems and processes in an.

  14. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  15. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  16. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker Verification

    OpenAIRE

    Sarkar, A. K.; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. ...
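The scoring step implied by the abstract, comparing a test utterance against the target-speaker model versus the pass-phrase dependent background model, can be sketched as a log-likelihood ratio. Real TD-SV systems use GMMs or i-vector models; the 1-D Gaussians and all numbers below are purely illustrative stand-ins.

```python
import math

# Sketch: likelihood-ratio scoring against a pass-phrase dependent
# background model (PBM). 1-D Gaussians stand in for the GMMs used
# in practice; every value here is illustrative.

def log_gauss(x, mean, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def llr_score(features, target, pbm):
    """Mean log-likelihood ratio: target speaker model vs. PBM."""
    return sum(log_gauss(x, *target) - log_gauss(x, *pbm)
               for x in features) / len(features)

target_model = (1.0, 0.5)   # (mean, var) adapted to the target speaker
pbm_model = (0.0, 1.0)      # background model for this pass-phrase

genuine = [0.9, 1.1, 0.8, 1.2]     # features near the target model
impostor = [-0.2, 0.1, 0.0, -0.1]  # features near the background

print(llr_score(genuine, target_model, pbm_model) >
      llr_score(impostor, target_model, pbm_model))  # → True
```

The decision is then a threshold on the score; deriving the PBM from a text-independent background model by adaptation, as the abstract describes, keeps the ratio focused on speaker identity rather than pass-phrase content.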

  17. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by a slow, laminar porous medium with small fissures and usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed, almost exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure, which improves our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of heterogeneity knowledge (especially of conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and plastic drainage pipes (1-D conduits). This model enables knowledge of the full heterogeneity structure, including the position of the different sand layers as well as conduit location and geometry. Moreover, we know the geometry of the conduit perforations, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions, such as different

  18. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    Full Text Available The article presents the results of verification of the dynamic model of a drive system with gears. Tests were carried out on the real object in different operating conditions. For the same assumed conditions, simulation studies were also carried out. Comparison of the results obtained from those two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations with use of the dynamic model.

  19. Fossil Fuel Emission Verification Modeling at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Cameron-Smith, P; Kosovic, B; Guilderson, T; Monache, L D; Bergmann, D

    2009-08-06

    We have an established project at LLNL to develop the tools needed to constrain fossil fuel carbon dioxide emissions using measurements of the carbon-14 isotope in atmospheric samples. In Figure 1 we show the fossil fuel plumes from Los Angeles and San Francisco for two different weather patterns. Obviously, a measurement made at any given location is going to depend on the weather leading up to the measurement. Thus, in order to determine the GHG emissions from some region using in situ measurements of those GHGs, we use state-of-the-art global and regional atmospheric chemistry-transport codes to simulate the plumes: the LLNL-IMPACT model (Rotman et al., 2004) and the WRFCHEM community code (http://www.wrf-model.org/index.php). Both codes can use observed (a.k.a. assimilated) meteorology in order to recreate the actual transport that occurred. The measured concentration of each tracer at a particular spatio-temporal location is a linear combination of the plumes from each region at that location (for non-reactive species). The challenge is to calculate the emission strengths for each region that fit the observed concentrations. In general, this is difficult because there are errors in the measurements and modeling of the plumes. We solve this inversion problem using the strategy illustrated in Figure 2. The Bayesian inference step combines the a priori estimates of the emissions, and their uncertainty, for each region with the results of the observations, and their uncertainty, and an ensemble of model-predicted plumes for each region, and their uncertainty. The result is the mathematical best estimate of the emissions and their errors. In the case of non-linearities, or if we are using a statistical sampling technique such as a Markov Chain Monte Carlo technique, then the process is iterated until it converges (i.e., reaches stationarity).
For the Bayesian inference we can use both a direct inversion capability, which is fast but requires assumptions of linearity and
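
    The linear, non-iterative case of the Bayesian inference step described above can be sketched in a few lines. Everything below — the plume footprint matrix, the error variances, and the two-region setup — is an invented illustration under simplifying assumptions (diagonal observation-error and prior covariances), not data or code from the LLNL project:

```python
# Hedged sketch of a linear Bayesian emission inversion: posterior mean of
# emissions x given observations y = H x + noise, a prior x_prior, and
# diagonal error covariances. All numbers are illustrative assumptions.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def bayesian_inversion(H, y, x_prior, sigma_obs, sigma_prior):
    """Posterior mean: (H^T R^-1 H + B^-1)^-1 (H^T R^-1 y + B^-1 x_prior)."""
    m, n = len(H), len(H[0])
    A = [[sum(H[k][i] * H[k][j] for k in range(m)) / sigma_obs ** 2
          + (1.0 / sigma_prior ** 2 if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    rhs = [sum(H[k][i] * y[k] for k in range(m)) / sigma_obs ** 2
           + x_prior[i] / sigma_prior ** 2 for i in range(n)]
    return solve(A, rhs)

# Two hypothetical source regions observed at three sites.
H = [[0.8, 0.1],   # plume footprint of each region per unit emission
     [0.2, 0.7],
     [0.5, 0.5]]
truth = [10.0, 4.0]
y = [sum(h * t for h, t in zip(row, truth)) for row in H]   # noise-free obs
x_post = bayesian_inversion(H, y, [8.0, 5.0], sigma_obs=0.5, sigma_prior=5.0)
print([round(x, 2) for x in x_post])
```

    With noise-free observations and a weak prior, the posterior mean recovers the true emissions closely; tightening `sigma_prior` pulls the estimate toward the prior instead.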

  20. Study on formal modeling and verification of safety computer platform

    Directory of Open Access Journals (Sweden)

    Xi Wang

    2016-05-01

    Full Text Available With the development of automatic control and communication technology, communication-based train control systems are adopted by more and more urban mass transit systems to automatically supervise the train speed so that it follows a desired trajectory. Taking reliability, availability, maintainability, and safety into consideration, a 2 × 2-out-of-2 safety computer platform is usually utilized as the hardware platform of safety-critical subsystems in communication-based train control. To enhance the safety integrity level of the safety computer platform, safety-related logics have to be verified before integrating them into practical systems. Therefore, a significant problem in developing a safety computer platform is how to guarantee that system behaviors will satisfy the functional requirements as well as respond to external events and processes within the required time limits. Based on a qualitative and quantitative analysis of functional and timing characteristics, this article introduces a formal modeling and verification approach for this real-time system. In the proposed method, a timed automata network model of the 2 × 2-out-of-2 safety computer platform is built, and system requirements are specified and formalized as computation tree logic properties, which can be verified by the UPPAAL model checker.
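
    UPPAAL checks timed automata against (T)CTL properties by exploring the reachable state space. As a drastically reduced, untimed sketch of the same idea, the following explicit-state search checks an invariant (the CTL safety property "AG safe") on a toy model of a 2-out-of-2 channel pair. The model, states, and property are invented for illustration and are far simpler than the platform in the article:

```python
from collections import deque

# Toy explicit-state safety check: each of two channels is 'ok' or 'failed',
# and 2-out-of-2 logic may enable the output only when both channels agree
# and are healthy. Property (AG safe): the output is never enabled while a
# channel has failed.

def successors(state):
    a, b, out = state
    moves = []
    if a == "ok":
        moves.append(("failed", b, False))   # channel a fails; output dropped
    if b == "ok":
        moves.append((a, "failed", False))   # channel b fails; output dropped
    if a == "ok" and b == "ok":
        moves.append((a, b, True))           # both healthy: output may enable
    return moves

def is_safe(state):
    a, b, out = state
    return not (out and "failed" in (a, b))

def check_invariant(initial, safe):
    """BFS over the reachable states; return a counterexample trace or None."""
    seen, frontier = {initial}, deque([(initial, [initial])])
    while frontier:
        state, trace = frontier.popleft()
        if not safe(state):
            return trace
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

cex = check_invariant(("ok", "ok", False), is_safe)
print("property holds" if cex is None else f"counterexample: {cex}")
```

    UPPAAL additionally tracks clock zones so that properties like "respond within the time limit" can be verified; this sketch shows only the untimed reachability core.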

  1. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    Science.gov (United States)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, which records a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations across design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.

  2. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of the model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus, pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  3. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available The broad use of handwritten signatures for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
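
    The final matching stage of such a pipeline can be sketched independently of the feature extractor. In a real system the descriptors would come from SURF/SIFT in an image-processing library; here the descriptors are tiny hand-made vectors standing in for extracted features, and the ratio threshold and match count are hypothetical tuning parameters, not values from the paper:

```python
import math

# Sketch of descriptor matching for signature verification: match each query
# descriptor to a stored template with a Lowe-style nearest-neighbor ratio
# test, then accept if enough descriptors match. Descriptors and thresholds
# are illustrative assumptions.

def euclid(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def match_count(query, reference, ratio=0.8):
    """Count descriptors whose best match is clearly better than the second best."""
    matches = 0
    for q in query:
        dists = sorted(euclid(q, r) for r in reference)
        if len(dists) >= 2 and dists[0] < ratio * dists[1]:
            matches += 1
    return matches

def verify_signature(query, reference, min_matches=3):
    """Accept the signature if enough descriptors match the stored template."""
    return match_count(query, reference) >= min_matches

# Hypothetical stored template and two probes (a close one and a distant one).
template = [[0.1, 0.9], [0.8, 0.2], [0.5, 0.5], [0.3, 0.7]]
genuine  = [[0.12, 0.88], [0.79, 0.22], [0.51, 0.48], [0.28, 0.71]]
forgery  = [[0.9, 0.9], [0.1, 0.1], [0.95, 0.05], [0.0, 1.0]]
print(verify_signature(genuine, template), verify_signature(forgery, template))
```

    Real SURF/SIFT descriptors are 64- or 128-dimensional, and production systems typically add geometric consistency checks on the matched keypoints, but the accept/reject logic has this shape.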

  4. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language, that have been prototypically implemented in the Dymola and Open-Modelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  5. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Science.gov (United States)

    2013-09-12

    ... Process'' flowchart. PHMSA is using this notice to announce the revised ``Integrity Verification Process'' flowchart and extend the comment period from September 9, 2013, to October 7, 2013. DATES: The closing date...

  6. Mesoscale model forecast verification during monsoon 2008

    Indian Academy of Sciences (India)

    The systematic error in the 850 hPa temperature indicates that largely the WRF model forecasts feature warm bias and the MM5 model forecasts feature cold bias. Features common to all the three models include warm bias over northwest India and cold bias over southeast peninsula. The 850 hPa specific humidity forecast ...

  7. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Full Text Available Model checking has successfully been applied to the verification of security protocols, but the modeling process is always tedious, and proficient knowledge of formal methods is needed, although the final verification can be automatic depending on the specific tool. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented, using an expandable tool called PAT, to do WSN-specific security verification. At first, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort at employing model checking for the automatic analysis of an authentication protocol for WBAN.

  8. Model of supercapacitor with laboratory verification

    Directory of Open Access Journals (Sweden)

    Simić Mitar S.

    2014-01-01

    Full Text Available In this paper the historical development and the most important production technologies of supercapacitors are presented. A simple model of a supercapacitor, based on parameters obtained from the manufacturer's datasheet, is also realized. The proposed model is verified with charge, discharge and self-discharge tests, with the obtained results close to the experimental ones.
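
    The abstract does not give the circuit, but a common datasheet-based supercapacitor model uses the rated capacitance with an equivalent series resistance (ESR) and a leakage resistance for self-discharge. The sketch below simulates charge and self-discharge for such a model; all parameter values are illustrative assumptions, not the paper's:

```python
# Hedged sketch of a datasheet-based supercapacitor model: capacitance C,
# series resistance, and a leakage resistance for self-discharge, integrated
# with a simple Euler scheme. Parameter values are illustrative.

def simulate(v0, v_src, c, r_series, r_leak, t_end, dt=0.01):
    """Euler integration of the capacitor voltage under constant-voltage charging."""
    v, t = v0, 0.0
    while t < t_end:
        i_charge = (v_src - v) / r_series    # current through the series path
        i_leak = v / r_leak                  # self-discharge current
        v += dt * (i_charge - i_leak) / c
        t += dt
    return v

C, R_LEAK = 10.0, 5000.0                     # 10 F, 5 kOhm leakage (assumed)
# Charge from 0 V toward 2.7 V through a 1 Ohm series path (ESR + charger).
v_charged = simulate(0.0, 2.7, C, r_series=1.0, r_leak=R_LEAK, t_end=60.0)
# Then rest at open circuit for an hour: only leakage discharges the cell.
v_rest = simulate(v_charged, 0.0, C, r_series=1e12, r_leak=R_LEAK, t_end=3600.0)
print(round(v_charged, 2), round(v_rest, 2))
```

    The charge phase approaches the source voltage with time constant roughly R·C, and the rest phase decays slowly with time constant R_leak·C — the qualitative behavior the charge and self-discharge tests in the paper verify.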

  9. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour of a re...

  10. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  11. User verification of the FRBR conceptual model

    OpenAIRE

    Pisanski, Jan; Žumer, Maja

    2015-01-01

    Purpose - The paper aims to build on a previous study of mental models of the bibliographic universe, which found that the Functional Requirements for Bibliographic Records (FRBR) conceptual model is intuitive. Design/methodology/approach - A total of 120 participants were presented with a list of bibliographic entities and six graphs each. They were asked to choose the graph they thought best represented the relationships between the entities described. Findings - The graph based on the FRBR ...

  12. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...... as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practical experiments with the robot, the UKF is demonstrated to improve the reliability of the sensor signals...
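
    The paper's filter is an Unscented Kalman Filter over a full nonlinear vehicle model; as a much-reduced sketch of the underlying predict/update cycle, here is a 1-D linear Kalman filter fusing dead-reckoned motion with a noisy position sensor. The robot, noise levels, and motion model are invented for illustration (a UKF replaces the linear update with sigma-point propagation through the nonlinear dynamics):

```python
import random

# Minimal 1-D Kalman filter: predict with the commanded motion, then blend in
# a noisy position measurement. All numbers are illustrative assumptions.

def kalman_step(x, p, u, z, q=0.01, r=0.25):
    # predict: move by commanded displacement u; process noise variance q
    x_pred, p_pred = x + u, p + q
    # update: blend in the position measurement z; sensor noise variance r
    k = p_pred / (p_pred + r)            # Kalman gain
    return x_pred + k * (z - x_pred), (1 - k) * p_pred

random.seed(1)
x_est, p_est, x_true = 0.0, 1.0, 0.0
for _ in range(50):
    x_true += 0.1                        # robot advances 0.1 m per step
    z = x_true + random.gauss(0.0, 0.5)  # noisy position sensor
    x_est, p_est = kalman_step(x_est, p_est, 0.1, z)
print(round(x_est, 2), round(x_true, 2))
```

    The estimate tracks the true position far more tightly than the raw sensor, which is the sense in which such a filter "improves the reliability of the sensor signals".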

  13. A verification system survival probability assessment model test methods

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

    Subject to the limitations of funding and test conditions, the number of test sub-samples for a large complex system is often small. Under single-sample conditions, making an accurate evaluation of performance is important for the reinforcement of complex systems. The technical maturity of the assessment model can be significantly improved if the model can be experimentally validated and evaluated. This paper presents a test method for a verification-system survival-probability assessment model; by means of the sample test results of the test system, the method verifies the correctness of the assessment model and of the a priori information. (authors)

  14. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    , the configuration of the execution platform and the mapping of the application onto this platform. The computational model provides a basis for formal analysis of systems. The model is translated to timed automata and a tool for system verification and simulation has been developed using Uppaal as backend. We...... a discrete model of computation for such systems and characterize the size of the computation tree it suffices to consider when checking for schedulability. Analysis of multiprocessor system on chips is a major challenge due to the freedom of interrelated choices concerning the application level...

  15. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  16. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

    Full Text Available Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as a UPPAAL model. The model is used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
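
    The quantity the paper studies with a UPPAAL model also has a well-known analytic baseline: the calculation in the original Bitcoin paper for an attacker holding a fraction q of the hash power who tries to overtake a chain that is z blocks ahead. The sketch below implements that formula (not the paper's UPPAAL model):

```python
import math

# Nakamoto's estimate of a successful double spend: the honest chain is
# z blocks ahead; the attacker mines a fraction q of all blocks.

def double_spend_probability(q, z):
    """Probability the attacker ever catches up from z blocks behind."""
    p = 1.0 - q                  # honest fraction of the hash power
    lam = z * q / p              # expected attacker progress while z blocks pass
    s = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        # having mined k blocks, the attacker still trails by z - k and must
        # win a gambler's-ruin race with per-step success ratio q/p
        s -= poisson * (1.0 - (q / p) ** (z - k))
    return s

for z in (0, 1, 2, 6):
    print(z, round(double_spend_probability(0.1, z), 6))
```

    With q = 0.1, the success probability drops from certainty at zero confirmations to well under 0.1% after six, which is why six confirmations is the customary waiting rule.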

  17. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree to (1) ensure that it can be run without user intervention and (2) to ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  18. Verification of the NWP models operated at ICM, Poland

    Science.gov (United States)

    Melonek, Malgorzata

    2010-05-01

    Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since then, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office. It was a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present, two non-hydrostatic NWP models are running in a quasi-operational regime. The main new UM model, with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is run four times a day. Second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System), developed by the US Naval Research Laboratory and configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is run twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e. COAMPS computed on the 13 km mesh and UM, are verified against observations from Polish synoptic stations. The verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and wind direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, the Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. In the case of precipitation, contingency tables for different thresholds are computed and some of the verification scores, such as FBI, ETS, POD and FAR, are graphically presented. The verification sample covers nearly one year.
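
    The scores named above are all short formulas. The sketch below computes them over made-up data: ME/MAE/RMSE for a continuous variable, and the categorical scores (POD, FAR, FBI, ETS) from a 2x2 precipitation contingency table; the numeric values are invented, not ICM's:

```python
import math

# Standard forecast-verification scores on illustrative data.

def continuous_scores(forecasts, observations):
    errs = [f - o for f, o in zip(forecasts, observations)]
    n = len(errs)
    me = sum(errs) / n                               # mean error (bias)
    mae = sum(abs(e) for e in errs) / n              # mean absolute error
    rmse = math.sqrt(sum(e * e for e in errs) / n)   # root mean squared error
    return me, mae, rmse

def categorical_scores(hits, misses, false_alarms, correct_negatives):
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                     # probability of detection
    far = false_alarms / (hits + false_alarms)       # false alarm ratio
    fbi = (hits + false_alarms) / (hits + misses)    # frequency bias index
    hits_random = (hits + misses) * (hits + false_alarms) / n
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return pod, far, fbi, ets                        # ETS: equitable threat score

me, mae, rmse = continuous_scores([12.1, 10.4, 8.9, 15.0], [11.5, 10.9, 9.4, 14.2])
pod, far, fbi, ets = categorical_scores(30, 10, 20, 140)
print(round(me, 3), round(mae, 3), round(rmse, 3))
print(round(pod, 2), round(far, 2), round(fbi, 2), round(ets, 3))
```

    ME exposes systematic bias (the warm/cold biases mentioned in record 6 above are exactly nonzero ME), while ETS discounts the hits a random forecast would score by chance.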

  19. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns the formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
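
    The pattern-to-formula idea can be sketched as a small template lookup: each workflow pattern maps to a temporal-logic template, and a model is translated pattern by pattern into a specification. The pattern names, templates, and activities below are illustrative inventions, not the paper's actual pattern catalogue:

```python
# Sketch of automatic specification generation from workflow patterns:
# each pattern is a temporal-logic template instantiated with activity names.
# Pattern names and templates here are illustrative assumptions.

TEMPLATES = {
    # sequence A -> B: whenever A completes, B eventually follows
    "sequence":  "G({a} -> F({b}))",
    # exclusive choice: A is followed by exactly one of B, C
    "xor_split": "G({a} -> F(({b} & !{c}) | (!{b} & {c})))",
    # parallel split: A is followed by both B and C
    "and_split": "G({a} -> (F({b}) & F({c})))",
}

def generate_specification(workflow):
    """Turn a list of (pattern, activities) pairs into temporal-logic formulas."""
    spec = []
    for pattern, acts in workflow:
        names = dict(zip("abc", acts))
        spec.append(TEMPLATES[pattern].format(**names))
    return spec

# Hypothetical BPMN fragment: receive an order, check stock, then either
# ship or reject.
workflow = [
    ("sequence", ["ReceiveOrder", "CheckStock"]),
    ("xor_split", ["CheckStock", "Ship", "Reject"]),
]
spec = generate_specification(workflow)
for formula in spec:
    print(formula)
```

    The generated set of formulas is the "logical specification" that the deductive engine (semantic tableaux, in the paper) then reasons over, so the user never writes temporal logic by hand.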

  20. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate those uncertainties through the model, so that one can make predictive estimates with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification

  1. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematically based method to check formalized requirements automatically against the system.

  2. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  3. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for the efficient verification of SoS. We address three relevant aspects of systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the Functional Mock-up Interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions and to guide the evolution of the SoS.
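
    The core trick of statistical model checking — the reason it scales where exhaustive verification does not — is that it simulates the stochastic model many times and estimates the probability that a run satisfies the property, with a statistical bound on the estimation error. A minimal sketch, with an invented toy system standing in for an SoS simulation:

```python
import math
import random

# Sketch of statistical model checking: estimate P(property) by Monte Carlo
# simulation, choosing the sample size from the Chernoff-Hoeffding bound so
# the estimate is within +/- eps with confidence 1 - delta.

def smc_estimate(simulate_run, satisfies, eps=0.05, delta=0.01, rng=None):
    """Return (probability estimate, number of runs)."""
    n = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))  # Hoeffding bound
    hits = sum(1 for _ in range(n) if satisfies(simulate_run(rng)))
    return hits / n, n

# Toy stochastic system (an invented example, not the paper's SoS): a message
# is retransmitted up to 3 times, each attempt succeeding with probability 0.7.
# The property under check is "the message is eventually delivered".
def simulate_run(rng):
    return any(rng.random() < 0.7 for _ in range(3))

rng = random.Random(42)
p_hat, n_runs = smc_estimate(simulate_run, lambda delivered: delivered, rng=rng)
print(n_runs, round(p_hat, 3))
```

    The true satisfaction probability here is 1 - 0.3³ ≈ 0.973; the bound dictates about a thousand runs for a ±0.05 estimate at 99% confidence, regardless of how large the underlying state space is.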

  4. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an Extended Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services; one has to depend on formal verification methods to ensure the correctness of the composed services. A few research works have been carried out in the surveyed literature on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system have been evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in terms of finding dead transitions and deadlock in contrast to the

  5. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  6. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery of similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both the double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is derived, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, as a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they perform only similarly to or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.
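For readers unfamiliar with the scores used above, a minimal sketch of the Brier score and the Brier skill score relative to a climatological reference follows. It does not reproduce the paper's ELR post-processing; the forecast and observation values are invented.

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against binary outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to a climatological forecast (the observed base rate);
    1 is a perfect forecast, 0 is no better than climatology."""
    base_rate = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([base_rate] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / bs_ref

probs    = [0.9, 0.1, 0.8, 0.2, 0.7]   # forecast precipitation probabilities
outcomes = [1,   0,   1,   0,   0]     # observed rain (1) / no rain (0)
```

With these toy numbers the Brier score is 0.118 and the skill score about 0.51, i.e. clearly better than climatology.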

  7. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

    In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five methods presented, one, the Gaussian mixture reduction proposed by Runnalls, clearly outperformed the others. This method of Gaussian reduction iteratively reduces the size of a GMM by successively merging pairs of component densities. Pairs are selected for merger by using a Kullback-Leibler based metric. Using Runnalls' method of reduction, we
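The pairwise merge at the heart of such mixture reduction can be sketched in one dimension: the merged component preserves the weight, mean, and variance of the pair, and a Runnalls-style KL-based cost ranks candidate pairs. The paper works with multivariate densities; this scalar sketch with invented numbers is only illustrative.

```python
import math

def merge_components(w1, mu1, var1, w2, mu2, var2):
    """Moment-preserving merge of two 1-D Gaussian mixture components: the
    merged component matches the total weight, mean and variance of the pair."""
    w = w1 + w2
    mu = (w1 * mu1 + w2 * mu2) / w
    var = (w1 * (var1 + (mu1 - mu) ** 2) + w2 * (var2 + (mu2 - mu) ** 2)) / w
    return w, mu, var

def merge_cost(w1, var1, w2, var2, var_merged):
    """1-D form of the Runnalls-style upper bound on the KL discrepancy
    introduced by a merge: 0.5*[(w1+w2) ln v_m - w1 ln v1 - w2 ln v2]."""
    return 0.5 * ((w1 + w2) * math.log(var_merged)
                  - w1 * math.log(var1) - w2 * math.log(var2))

# Merging two equally weighted unit-variance components at 0 and 2:
w, mu, var = merge_components(0.5, 0.0, 1.0, 0.5, 2.0, 1.0)
```

Iterating this merge on the cheapest pair until the target size is reached yields the hash GMM described above.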

  8. Development and verification of an agent-based model of opinion leadership.

    Science.gov (United States)

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy for speeding the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The
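The verification workflow described above (run a simple agent-based model, then sweep a parameter and check that the output moves in the posited direction) can be sketched as follows. This toy diffusion model and its parameters are invented and bear no relation to the authors' implementation.

```python
import random

def simulate(n_agents, leader_credibility, steps, seed=0):
    """Toy opinion-diffusion run: at each step, each uncertain agent consults
    the opinion leader and adopts the opinion with probability equal to the
    leader's perceived credibility."""
    rng = random.Random(seed)          # seeded for reproducible verification runs
    adopted = [False] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            if not adopted[i] and rng.random() < leader_credibility:
                adopted[i] = True
    return sum(adopted) / n_agents     # fraction of agents who adopted

# Parameter sweep: adoption should increase monotonically with credibility,
# matching the posited relationship in the conceptual model.
sweep = {c: simulate(100, c, steps=10) for c in (0.0, 0.2, 0.8)}
```

Checking such qualitative expectations across a sweep is exactly the kind of exploratory analysis the verification process above relies on.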

  9. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses a static assertion checking technique for verification of MISRA C programs. First, the functional specifications of the program are derived in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams which were given as input to a code generator. The verification of the generated C code was done at two levels: (i) verification against specifications derived from logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of the software, comprising roughly 15000 lines of code. (author)
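The pre-/post-condition annotation style described above can be sketched with a small contract decorator. Note the difference in mechanism: ACE discharges such assertions statically, whereas this sketch checks them at run time; the valve-control rule below is a hypothetical example, not taken from the reactor software.

```python
def check_contract(pre, post):
    """Attach a pre- and a post-condition to a function, mirroring the
    assertion (formal comment) annotations used with ACE. `pre` receives the
    arguments; `post` receives the result followed by the arguments."""
    def wrap(f):
        def g(*args):
            assert pre(*args), "precondition violated"
            result = f(*args)
            assert post(result, *args), "postcondition violated"
            return result
        return g
    return wrap

@check_contract(pre=lambda setpoint, reading: 0 <= reading <= 100,
                post=lambda out, setpoint, reading: out in ("OPEN", "CLOSE"))
def valve_command(setpoint, reading):
    # Hypothetical control rule, for illustration only.
    return "OPEN" if reading < setpoint else "CLOSE"
```

A static checker proves these conditions hold on all paths; the runtime version only detects violations on the inputs actually exercised.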

  10. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

    Full Text Available The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under the working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of temperature field formation in premises with gas infrared heater systems.
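Verification of this kind, comparing a computed temperature field against a known reference, can be illustrated with a minimal 1-D conduction example unrelated to the authors' conjugate model: an explicit finite-difference solution between two isothermal walls must reproduce the linear analytic steady-state profile. All values below are invented.

```python
def steady_wall_temperature(t_left, t_right, n_cells, n_steps=20000, r=0.25):
    """March 1-D heat conduction to steady state with an explicit
    finite-difference scheme and fixed wall temperatures;
    r = alpha*dt/dx**2 must be <= 0.5 for stability."""
    T = [0.5 * (t_left + t_right)] * n_cells   # uniform initial field
    for _ in range(n_steps):
        T = ([t_left]
             + [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
                for i in range(1, n_cells - 1)]
             + [t_right])
    return T

# Verification check: the steady profile between isothermal walls must be
# linear, T(x) = T_left + (T_right - T_left) * x / L.
T = steady_wall_temperature(40.0, 20.0, n_cells=11)
```

The same logic, on a simpler problem, mirrors the paper's verification step: the numerical field is accepted only if it matches the known reference solution.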

  11. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

    This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J. Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that traditionally have been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  12. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams in excellent quality, but also for verification of electron beams. Furthermore, DLR was used as the basic instrument for image fusion of portal and verification film with the simulation film, respectively, for image processing in ''beams-eye-view'' verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. In photon beams, the quality improvement vs. conventional portal imaging may be dramatic, even more for high-energy beams (e.g., 15-MV photon beams) than for Co-60. In electron beams, excellent results may be easily obtained. By digital image fusion of one or more verification films with the simulation film or MRI planning film, more precise judgement even of small differences between simulation and verification films becomes possible. Using BEVV, it is possible to compare computer-aided simulation in rotating beams or conformation therapy with the treatment actually applied. The basic principle of BEVV is also suitable for dynamic multileaf collimation. (orig.) [de

  13. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

    …mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry dealing

  14. 3D Modeling with Photogrammetry by UAVs and Model Quality Verification

    Science.gov (United States)

    Barrile, V.; Bilotta, G.; Nunnari, A.

    2017-11-01

    This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria) concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years, this area was affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images was acquired through a UAV equipped with a digital camera and GPS. Subsequently, the processing for the production of a 3D georeferenced model was performed using the commercial software Agisoft PhotoScan. Similarly, a terrestrial laser scanning technique was used to produce a dense cloud and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between the image-based and range-based methods was performed.
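One simple way to compare two 3D models, as in the UAV-vs-laser-scan assessment above, is a mean cloud-to-cloud nearest-neighbour distance. The sketch below is a brute-force illustration with invented coordinates; real point clouds require a spatial index such as a k-d tree.

```python
def cloud_to_cloud_error(cloud_a, cloud_b):
    """Mean nearest-neighbour Euclidean distance from each point of cloud_a
    to cloud_b: a simple accuracy metric for comparing an image-based model
    against a range-based reference (brute force, O(len(a)*len(b)))."""
    def dist(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q)) ** 0.5
    return sum(min(dist(p, q) for q in cloud_b) for p in cloud_a) / len(cloud_a)

# Invented sample: a UAV-derived patch slightly offset in height from the
# laser-scan reference.
uav   = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.0), (0.0, 1.0, -0.1)]
laser = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
```

The metric is asymmetric; assessments usually report it in both directions or use a symmetric variant.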

  15. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with Microorganisms... § 381.94 Contamination with Microorganisms; process control verification criteria and testing; pathogen... maintaining process controls sufficient to prevent fecal contamination. FSIS shall take further action as...

  16. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually manually constructed by analysts, a review by other analysts is widely used for verifying constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the development of the RGGG (Reliability Graph with General Gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. It is also known that an RGGG model can be converted into an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.
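The block-failure view that RBDGG takes can be illustrated with the two elementary reliability-block constructs, series and parallel (RBDGG's general gates are considerably richer than this; the pump/valve example and its unavailabilities are invented).

```python
def series(*qs):
    """Unavailability of blocks in series: the system fails if any block
    fails, so availability is the product of block availabilities."""
    p = 1.0
    for q in qs:            # q = block unavailability
        p *= (1.0 - q)
    return 1.0 - p

def parallel(*qs):
    """Unavailability of redundant (parallel) blocks: all must fail."""
    p = 1.0
    for q in qs:
        p *= q
    return p

# Two redundant pumps (q = 0.1 each) feeding one valve (q = 0.01):
q_system = series(parallel(0.1, 0.1), 0.01)
```

Verifying a fault tree against such a block diagram amounts to checking that both models yield the same top-event probability for the same component data.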

  17. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On the one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions to this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view addressing all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
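The integrity-and-authenticity idea above can be sketched with Python's standard library. Note the simplification: a PKI-based scheme such as PKIDEV uses asymmetric digital signatures and a trusted time-stamping authority; here an HMAC over the evidence digest and a fixed placeholder timestamp stand in for both.

```python
import hashlib
import hmac

def seal_evidence(data: bytes, key: bytes):
    """Seal a piece of digital evidence: hash it, attach a timestamp, and
    authenticate both with an HMAC (a stand-in for the digital signature and
    secure time-stamp of a PKI-based scheme)."""
    digest = hashlib.sha256(data).hexdigest()
    timestamp = "2024-01-01T00:00:00Z"   # in practice, from a time-stamping authority
    tag = hmac.new(key, f"{digest}|{timestamp}".encode(), hashlib.sha256).hexdigest()
    return {"digest": digest, "timestamp": timestamp, "tag": tag}

def verify_evidence(data: bytes, key: bytes, record) -> bool:
    """Re-derive the digest and tag; any modification of the evidence or the
    timestamp invalidates the seal."""
    digest = hashlib.sha256(data).hexdigest()
    expected = hmac.new(key, f"{digest}|{record['timestamp']}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["tag"])

record = seal_evidence(b"disk-image-bytes", b"examiner-key")
```

Unlike a true signature, an HMAC key is shared, so it proves integrity to the key holder but not authorship to a third party; that gap is exactly what the PKI in PKIDEV closes.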

  18. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of the obtained analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of processes in a hybrid solid-propellant rocket engine for the quasi-steady-state flow regime was performed. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  19. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using Mathworks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using Mathworks Simscape tools, modeling time can be reduced, since no custom code would be developed. After careful research, the group came to the conclusion that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  20. Control and verification of industrial hybrid systems using models specified with the formalism χ

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor belt and of a biochemical plant for the production of ethanol are specified in the formalism χ. A verification of the closed-loop systems for those examples,

  1. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation, and the selected PBM is then used for the log likelihood ratio (LLR) calculation with respect to the claimant model. The proposed method incorporates the pass-phrase identification step in the LLR calculation, which is not considered in conventional standalone TD-SV systems. The performance of the proposed method is compared to conventional text-independent background model based TD-SV systems using either Gaussian mixture model (GMM)-universal background model (UBM) or Hidden Markov model (HMM)-UBM or i-vector paradigms. In addition, we consider two approaches to build PBMs: speaker-independent and speaker-dependent.
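The LLR decision at the core of such systems can be sketched with single 1-D Gaussians standing in for the claimant and background models (real TD-SV systems use GMMs or HMMs over cepstral feature vectors; all densities and feature values below are invented).

```python
import math

def gauss_logpdf(x, mu, var):
    """Log density of a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def llr(features, target, background):
    """Average log-likelihood ratio of a feature sequence between the
    claimant (target) model and the background model; accept the claim when
    the score exceeds a tuned threshold."""
    return sum(gauss_logpdf(x, *target) - gauss_logpdf(x, *background)
               for x in features) / len(features)

target, background = (1.0, 1.0), (0.0, 1.0)   # (mean, variance) pairs
genuine  = [0.9, 1.1, 1.0]    # features close to the claimant model
impostor = [-0.2, 0.1, 0.0]   # features close to the background
```

The PBM idea above changes only the denominator model: the background is selected per pass-phrase rather than being a single text-independent UBM.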

  2. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models will be described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
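The verification idea above, deriving an equilibrium analytically and checking it against simulation, can be sketched for a Hebbian learning rule. One common form is dW/dt = η·(a1·a2·(1 − W) − μ·W), whose equilibrium is W* = a1·a2 / (a1·a2 + μ); the paper's exact equations may differ, and the parameter values below are invented.

```python
def simulate_hebbian(w0, a1, a2, eta, mu, dt=0.01, steps=20000):
    """Euler simulation of the Hebbian rule
    dW/dt = eta * (a1*a2*(1 - W) - mu*W)."""
    w = w0
    for _ in range(steps):
        w += dt * eta * (a1 * a2 * (1 - w) - mu * w)
    return w

# Verification in the sense of the paper: the simulated trajectory must
# settle at the analytically derived equilibrium.
a1, a2, eta, mu = 0.8, 0.9, 0.5, 0.1
w_sim = simulate_hebbian(0.0, a1, a2, eta, mu)
w_eq  = a1 * a2 / (a1 * a2 + mu)   # stationary point: a1*a2*(1-W) = mu*W
```

A mismatch between w_sim and w_eq would indicate an implementation error, which is exactly the diagnostic the mathematical analysis provides.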

  3. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Contamination with microorganisms... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... controls sufficient to prevent fecal contamination. FSIS shall take further action as appropriate to ensure...

  4. 49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false What are MROs prohibited from doing as part of the... Verification Process § 40.151 What are MROs prohibited from doing as part of the verification process? As an... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...

  5. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. The customization of SaaS is usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable effects on the system and follows the related rules defined by the SaaS provider.

  6. Additional Model Datasets and Results to Accelerate the Verification and Validation of RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-11-01

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  7. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using the HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  8. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. 
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  9. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding, followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies considered are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance
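The factorial-design analysis mentioned above can be sketched for a full 2^k design: each main effect is the average response at a factor's high level minus the average at its low level. The weld-quality responses and the choice of factors below are hypothetical, purely to show the computation.

```python
def main_effects(response):
    """Main effects from a full 2^k factorial design, where `response` maps
    coded runs (tuples of -1/+1 per factor) to the measured response."""
    k = len(next(iter(response)))
    effects = []
    for j in range(k):
        hi = [y for run, y in response.items() if run[j] == +1]
        lo = [y for run, y in response.items() if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical weld-quality scores for a 2^2 design in
# (beam power, welding speed), coded -1/+1:
y = {(-1, -1): 80.0, (+1, -1): 90.0, (-1, +1): 70.0, (+1, +1): 84.0}
effects = main_effects(y)
```

Such a plan matches the report's suggestion that the initial verification experiments can be run with only four complete lid welds per technology.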

  10. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    International Nuclear Information System (INIS)

    Mueller, Christina; Oeberg, Tomas

    2004-08-01

Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters holding Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which is here taken to consist of sealing the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate the main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experience from other fields of the materials sciences suggests that both of these approaches are useful. The initial verification experiments of the two welding technologies are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and to optimise the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a database suitable for a baseline estimate of future performance. This estimate can ...
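The block-maxima extrapolation described in this abstract can be sketched numerically. In the toy example below the defect-size distribution, block size and return period are illustrative assumptions, not values from the report; a Gumbel distribution (the zero-shape member of the generalised extreme value family) is fitted by the method of moments:

```python
import math
import random
import statistics

random.seed(1)
# hypothetical weld-defect sizes (mm); distribution is an assumption
defects = [random.expovariate(2.0) for _ in range(5000)]

# block maxima: split the series into blocks of 50 and keep each maximum
block = 50
maxima = [max(defects[i:i + block]) for i in range(0, len(defects), block)]

# method-of-moments fit of a Gumbel distribution, the xi = 0 member of
# the generalised extreme value family
scale = statistics.stdev(maxima) * math.sqrt(6) / math.pi
loc = statistics.mean(maxima) - 0.5772156649 * scale

# extrapolate: the level exceeded on average once per 1000 blocks
return_level = loc - scale * math.log(-math.log(1 - 1 / 1000))
```

The return level illustrates the extreme value paradigm's rationale for extrapolation: it estimates a defect size beyond any yet observed from the tail behaviour of the fitted model.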

  11. Experimental verification of mathematical model of the heat transfer in exhaust system

    Directory of Open Access Journals (Sweden)

    Petković Snežana

    2011-01-01

Full Text Available A catalytic converter reaches its maximum efficiency only at working temperature. In the cold-start phase the efficiency of the catalyst is low, so the exhaust contains high levels of air pollutants. Optimising the exhaust system to shorten the time needed to reach the catalyst working temperature therefore reduces total vehicle emissions. Using mathematical models in the development of exhaust systems decreases total costs and development time. To be useful in the optimisation process, a mathematical model has to be experimentally verified and calibrated. Measurement installations have been developed and used for verification of the mathematical model of unsteady heat transfer in exhaust systems. Comparisons between experimental results and the mathematical model are presented in this paper. Based on the obtained results, it can be concluded that there is good agreement between the model and the experimental results.
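The verify-against-reference step such a study performs can be illustrated with a minimal sketch: a lumped-capacitance model of an exhaust-pipe wall warming up, integrated numerically and compared with its exact exponential response. All parameter values below are illustrative assumptions, not measured data:

```python
import math

# lumped-capacitance sketch of an exhaust-pipe wall warming up
h_A = 15.0               # convective conductance h*A, W/K (assumed)
m_c = 400.0              # thermal mass m*c, J/K (assumed)
T_gas, T0 = 600.0, 20.0  # gas and initial wall temperatures, deg C
tau = m_c / h_A          # time constant of the exact response

# explicit-Euler integration of m*c * dT/dt = h*A * (T_gas - T)
dt, T, t = 0.05, T0, 0.0
numeric = []
while t <= 120.0:
    numeric.append((t, T))
    T += dt * h_A * (T_gas - T) / m_c
    t += dt

# "verification": compare against the exact exponential response
err = max(abs(Tn - (T_gas + (T0 - T_gas) * math.exp(-ti / tau)))
          for ti, Tn in numeric)
```

Here the analytic solution plays the role the experiment plays in the paper: an independent reference against which the numerical model's error is quantified.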

  12. A program for verification of phylogenetic network models.

    Science.gov (United States)

    Gunawan, Andreas D M; Lu, Bingxin; Zhang, Louxin

    2016-09-01

Genetic material is transferred in a non-reproductive manner across species more frequently than commonly thought, particularly in the bacterial kingdom. On the one hand, extant genomes are thus more properly considered as a fusion product of both reproductive and non-reproductive genetic transfers, which has motivated researchers to adopt phylogenetic networks to study genome evolution. On the other hand, a gene's evolution is usually tree-like and has been studied for over half a century. Accordingly, the relationships between phylogenetic trees and networks are the basis for the reconstruction and verification of phylogenetic networks. One important problem in verifying a network model is determining whether or not certain existing phylogenetic trees are displayed in a phylogenetic network. This problem is formally called the tree containment problem; it is NP-complete even for binary phylogenetic networks. We design an exponential-time but efficient method for determining whether or not a phylogenetic tree is displayed in an arbitrary phylogenetic network. It is developed on the basis of the so-called reticulation-visible property of phylogenetic networks. A C program is available for download at http://www.math.nus.edu.sg/∼matzlx/tcp_package matzlx@nus.edu.sg Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
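The Hotelling observer described above can be illustrated with a toy two-bin example. The class means, identity covariance and sample sizes below are illustrative assumptions, not treaty-verification data; with identity covariance the optimal weights reduce to the mean difference:

```python
import random

random.seed(0)
# toy two-bin "detector" data for a binary discrimination task
mean0, mean1 = [5.0, 3.0], [6.0, 4.0]   # assumed class means

def sample(mean):
    # independent unit-variance Gaussian noise per bin (identity covariance)
    return [random.gauss(m, 1.0) for m in mean]

g0 = [sample(mean0) for _ in range(1000)]   # "not accountable" class
g1 = [sample(mean1) for _ in range(1000)]   # "accountable" class

# Hotelling template w = S^-1 (mean1 - mean0); S is the identity here,
# so the optimal weights are just the mean difference
w = [m1 - m0 for m0, m1 in zip(mean0, mean1)]
t0 = [sum(wi * gi for wi, gi in zip(w, g)) for g in g0]
t1 = [sum(wi * gi for wi, gi in zip(w, g)) for g in g1]

# area under the ROC curve, estimated as P(t1 > t0) over all pairs
auc = sum(a > b for a in t1 for b in t0) / (len(t0) * len(t1))
```

Thresholding the scalar test statistic `t` yields the binary decision; the pairwise AUC estimate is the same figure of merit the paper uses to report model performance.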

  14. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  15. Safety Verification of Piecewise-Deterministic Markov Processes

    DEFF Research Database (Denmark)

    Wisniewski, Rafael; Sloth, Christoffer; Bujorianu, Manuela

    2016-01-01

    We consider the safety problem of piecewise-deterministic Markov processes (PDMP). These are systems that have deterministic dynamics and stochastic jumps, where both the time and the destination of the jumps are stochastic. Specifically, we solve a p-safety problem, where we identify the set...
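A PDMP of the kind described, with deterministic flow between stochastic jumps, can be illustrated by a Monte Carlo estimate of the probability of entering an unsafe set. The dynamics, jump law, unsafe set and horizon below are illustrative assumptions, not the paper's p-safety method, which is analytic:

```python
import math
import random

random.seed(7)
# toy PDMP: deterministic flow x' = -x, jumps at exponential times that
# reset the state to a random destination; unsafe set is x >= 2
def unsafe_before(T=10.0, lam=0.5):
    x, t = 1.0, 0.0
    while t < T:
        dt = random.expovariate(lam)   # time to the next stochastic jump
        if t + dt >= T:
            return False
        t += dt
        x *= math.exp(-dt)             # deterministic flow only decays x,
                                       # so unsafety can only occur at jumps
        x = random.uniform(0.0, 3.0)   # stochastic jump destination
        if x >= 2.0:
            return True
    return False

runs = 5000
p_unsafe = sum(unsafe_before() for _ in range(runs)) / runs
```

A p-safety certificate would bound this probability from above; the simulation merely estimates it for one particular toy system.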

  16. The algorithm of verification of welding process for plastic pipes

    Science.gov (United States)

    Rzasinski, R.

    2017-08-01

The study analyses the process of butt welding of PE pipes in terms of the proper selection of joining parameters, oriented to components produced as a series of pipe types. Polymeric materials, commonly referred to as polymers or plastics, are synthetic materials produced from oil products by polyreactions of low-molecular-weight compounds called monomers. During the polyreactions the monomers combine into macromolecules, named with the prefix 'poly', such as polypropylene, polyethylene or polyurethane, forming solid-state particles on the order of 0.2 to 0.4 mm. Finished polymer products of virtually any shape and size are obtained by compression moulding, injection moulding, extrusion, laminating, centrifugal casting, etc. Only a thermoplastic can be welded, because it softens at elevated temperature and can thus be joined under clamping pressure. Depending on the source and method of supplying heat, the welding processes include contact welding, radiant welding, friction welding, dielectric welding and ultrasonic welding; the analysis here concerns contact welding. With the development of a new generation of polyethylene, and the production of pipes of increasing dimensions (diameter, wall thickness), correct selection of the process parameters is important.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT LASER TOUCH AND TECHNOLOGIES, LLC LASER TOUCH MODEL LT-B512

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Laser Touch model LT-B512 targeting device manufactured by Laser Touch and Technologies, LLC, for manual spray painting operations. The relative transfer efficiency (TE) improved an avera...

  18. Results of the Independent Verification and Validation Study for the D2-Puff Model

    National Research Council Canada - National Science Library

    Bowers, J

    1999-01-01

    .... The independent verification and validation (IV&V) study of D2-Puff Version 2.0.6 focused on these accreditation requirements and the implicit requirement that the model provide safe-sided hazard estimates...

  19. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve systems of differential equations describing the transport and decay of radionuclides, and it is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. These benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification, with equations and parameter values, so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the ...
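The kind of transport-and-decay equation system such a tool solves can be sketched with a minimal parent-daughter decay chain, integrated with explicit Euler and checked against the analytic Bateman solution. The decay constants and step size below are illustrative assumptions:

```python
import math

# parent -> daughter decay chain:
#   dn1/dt = -l1*n1
#   dn2/dt =  l1*n1 - l2*n2
l1, l2 = 0.3, 0.1        # decay constants, 1/yr (assumed)
dt, T = 0.001, 20.0
n1, n2 = 1.0, 0.0        # initial inventories
for _ in range(int(T / dt)):
    dn1 = -l1 * n1
    dn2 = l1 * n1 - l2 * n2
    n1 += dt * dn1
    n2 += dt * dn2

# analytic Bateman solution at t = T, used as the verification reference
n1_exact = math.exp(-l1 * T)
n2_exact = l1 / (l2 - l1) * (math.exp(-l1 * T) - math.exp(-l2 * T))
```

Comparing the integrated inventories against the closed-form reference is the same pattern of numerical-correctness verification the report applies to Tensit, only at toy scale.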

  20. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  1. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Science.gov (United States)

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  2. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Directory of Open Access Journals (Sweden)

    Fleming Michael

    2017-01-01

Full Text Available The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤116 with half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.

  3. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Science.gov (United States)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri

    2017-09-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤116 with half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.
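One of the validation quantities mentioned, the Maxwellian-averaged cross section (MACS), can be computed numerically. The sketch below uses an energy-independent toy cross section, for which the known analytic result MACS = (2/√π)·σ0 serves as a self-check; kT and σ0 are illustrative values, not library data:

```python
import math

# MACS in the convention
#   MACS = (2/sqrt(pi)) * (kT)**-2 * integral of sigma(E)*E*exp(-E/kT) dE
kT = 0.030       # MeV, a typical comparison temperature (assumed)
sigma0 = 1.0     # barn (assumed)

def sigma(E):
    return sigma0    # energy-independent toy cross section

# trapezoidal rule on a fine grid, truncated at 20 kT where the
# Maxwellian weight is negligible
N, Emax = 20000, 20 * kT
dE = Emax / N
total = 0.0
for i in range(N + 1):
    E = i * dE
    w = 0.5 if i in (0, N) else 1.0
    total += w * sigma(E) * E * math.exp(-E / kT)
macs = (2 / math.sqrt(math.pi)) * total * dE / kT**2
```

Swapping in a tabulated σ(E) turns the same integral into the quantity compared against the UKAEA MACS database.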

  4. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary exploration mission. RLS is composed of the SPU (Spectrometer Unit), iOH (Internal Optical Head) and ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design let a high level of laser trace reach the detector; although a certain level of laser trace is required for calibration purposes, too high a level degrades the signal-to-noise ratio and confounds some Raman peaks. After the breadboard campaign, some light design modifications were therefore implemented to fix the desired amount of laser trace, and after fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. The iOH design update for the engineering and qualification model (iOH EQM) as well as the assembly process are briefly described in this paper. In addition, the results of the integration verification and of the first functional tests, carried out with the RLS calibration target (CT), are reported.

  5. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
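The density determination described above follows directly from hydrostatics: two bubbler probes at known heights give the solution density from the differential pressure, ρ = ΔP/(g·Δh), and each probe's own reading then gives the liquid level. The pressure readings and probe heights below are illustrative, not plant data:

```python
G = 9.80665  # standard gravity, m/s^2

# two bubbler probes at different depths in the same tank (assumed values)
h_low, h_high = 0.10, 0.60        # probe tip heights above tank bottom, m
p_low, p_high = 13350.0, 7460.0   # bubbler gauge pressures, Pa

# solution density from the differential pressure between the probes
rho = (p_low - p_high) / (G * (h_high - h_low))
# liquid level above the tank bottom from the lower probe's own reading
level = p_low / (rho * G) + h_low
```

With probes at several heights, consistent densities computed between every probe pair indicate that the tank solution has reached homogeneity, which is the verification criterion the abstract describes.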

  6. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. For improving the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains, reflecting the real-time property of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters that avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both the normal condition and the abnormal condition influenced by a packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time feature of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor the high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  7. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. The newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP ...
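The backward propagation from data onto a parameter that MCMC performs can be illustrated with a minimal Metropolis sampler. The prior, likelihood, proposal width and "observations" below are illustrative assumptions, not Amchitka data:

```python
import math
import random

random.seed(3)
# posterior of a single model parameter (say, a porosity scale)
# conditioned on noisy observations
data = [0.18, 0.21, 0.17, 0.22, 0.20]   # hypothetical observations
sd = 0.02                               # assumed known observation noise

def log_post(theta):
    # flat prior on [0, 1] with a Gaussian likelihood
    if not 0.0 <= theta <= 1.0:
        return float("-inf")
    return -sum((d - theta) ** 2 for d in data) / (2 * sd ** 2)

theta, chain = 0.5, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.02)    # random-walk proposal
    if log_post(prop) - log_post(theta) > math.log(random.random()):
        theta = prop                          # Metropolis accept step
    chain.append(theta)

post_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

The chain starts far from the data-supported region and converges to the stationary posterior, after which its samples can be pushed forward through the model to obtain conditioned predictions.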

  8. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-12-10

... built a software development environment used to support the development and verification of on-board software for spacecraft ...

  9. Evaluation and verification of thermal stratification models for was

    African Journals Online (AJOL)

    USER

... The thermal stratification can be stable, persisting for months, or intermittent, appearing for a few hours in the day (Dor et al., 1993; Pedahzur et al., 1993; Torres et al., 1997). In waste stabilization ponds the ... water depth of 0.2 and thick sludge deposits. The WSP ...

  10. NbTi Strands Verification for ITER PF CICC Process Qualification of CNDA

    Science.gov (United States)

    Liu, F.; Liu, H.; Liu, S.; Liu, B.; Lei, L.; Wu, Y.

    2014-05-01

China is in charge of producing most of the Poloidal Field (PF) conductors for the International Thermonuclear Experimental Reactor (ITER). The execution of the PF conductors proceeds in three main phases. According to the ITER Procurement Arrangement (PA), the Domestic Agency (DA) is required to verify the room- and low-temperature acceptance tests carried out by the strand suppliers. As the reference laboratory of the Chinese DA (CNDA), the superconducting strand test laboratory of the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP) undertook the task of strand verification for ITER conductors. The verification tests include: diameter, nickel plating thickness, copper-to-non-copper volume ratio, twist pitch direction and length, standard critical current (Ic) and resistive transition index (n), residual resistance ratio (RRR), and hysteresis loss. 48 NbTi strands from 7 billets were supplied for the PF Cable-In-Conduit Conductor (CICC) process qualification. In total, 54 samples were measured. The verification level for the PF CICC process qualification was 100%. The test method, facility and results for each item are described in detail in this publication.

  11. Demonstrations and verification of debris bed models in the MEDICI reactor cavity code

    International Nuclear Information System (INIS)

    Trebilcock, W.R.; Bergeron, K.D.; Gorham-Bergeron, E.D.

    1984-01-01

The MEDICI reactor cavity model is under development at Sandia National Laboratories to provide a simple but realistic treatment of ex-vessel severe accident phenomena. Several demonstration cases have been run and are discussed as illustrations of the model's capabilities. Verification of the model against experiments has provided confidence in the model.

  12. On-Ground Processing of Yaogan-24 Remote Sensing Satellite Attitude Data and Verification Using Geometric Field Calibration

    Science.gov (United States)

    Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun

    2016-01-01

Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem that the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach comprises three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, which ensures the reliability of the observation data quality and the convergence and stability of the parameter estimation model. In addition, both Euler angles and quaternions can be used to build the mathematical fitting model, while an orthogonal polynomial fitting model is more suitable for modelling the attitude parameters. Furthermore, compared to the image geometric processing results based on on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%. PMID:27483287
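The idea of bidirectional filtering can be sketched with a zero-phase smoother: run a first-order recursive filter forward, then backward over the series, so the phase lag of each pass cancels. This is a generic illustration, not the paper's filter; the signal, noise level and filter constant are assumptions:

```python
import math
import random

random.seed(5)
# synthetic attitude-angle series: smooth truth plus measurement noise
t = [0.1 * i for i in range(200)]
truth = [0.01 * math.sin(0.5 * ti) for ti in t]            # rad (assumed)
noisy = [x + random.gauss(0.0, 0.002) for x in truth]      # measured series

def onepass(x, a=0.2):
    # first-order recursive low-pass filter
    y, s = [], x[0]
    for v in x:
        s += a * (v - s)
        y.append(s)
    return y

# forward pass, then a pass over the reversed result: zero net phase lag
smooth = onepass(onepass(noisy)[::-1])[::-1]

def rms(seq):
    return math.sqrt(sum(e * e for e in seq) / len(seq))

err_noisy = rms([n - u for n, u in zip(noisy, truth)])
err_smooth = rms([s - u for s, u in zip(smooth, truth)])
```

Zero-phase behaviour matters for attitude data because a one-directional filter would shift every attitude sample in time, biasing the subsequent geometric positioning.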

  13. Verification of the Skorohod-Olevsky Viscous Sintering (SOVS) Model

    Energy Technology Data Exchange (ETDEWEB)

    Lester, Brian T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-16

    Sintering refers to a manufacturing process through which mechanically pressed bodies of ceramic (and sometimes metal) powders are heated to drive densification, thereby removing the inherent porosity of green bodies. As the body densifies through the sintering process, the ensuing material flow leads to macroscopic deformations of the specimen, and as such the final configuration differs from the initial one. Therefore, as with any manufacturing step, there is substantial interest in understanding and being able to model the sintering process to predict deformation and residual stress. Efforts in this regard have been pursued for face seals, gear wheels, and consumer products like wash-basins. To understand the sintering process, a variety of modeling approaches have been pursued at different scales.

  14. Verification and improvement of a predictive model for radionuclide migration

    International Nuclear Information System (INIS)

    Miller, C.W.; Benson, L.V.; Carnahan, C.L.

    1982-01-01

    Prediction of the rates of migration of contaminant chemical species in groundwater flowing through toxic waste repositories is essential to the assessment of a repository's capability of meeting standards for release rates. A large number of chemical transport models, of varying degrees of complexity, have been devised for the purpose of providing this predictive capability. In general, the transport of dissolved chemical species through a water-saturated porous medium is influenced by convection, diffusion/dispersion, sorption, formation of complexes in the aqueous phase, and chemical precipitation. The reliability of predictions made with the models which omit certain of these processes is difficult to assess. A numerical model, CHEMTRN, has been developed to determine which chemical processes govern radionuclide migration. CHEMTRN builds on a model called MCCTM developed previously by Lichtner and Benson
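
    The interplay of convection, dispersion, and sorption listed above can be sketched with a minimal 1-D explicit finite-difference model in which linear equilibrium sorption appears as a retardation factor. This is a toy slice of what a code like CHEMTRN couples, not the CHEMTRN model itself; aqueous complexation and precipitation are omitted and all parameter values are invented.

    ```python
    import numpy as np

    def transport_step(c, v, D, R, dx, dt):
        """One explicit step of 1-D advection-dispersion with linear
        equilibrium sorption:  R*dc/dt = D*d2c/dx2 - v*dc/dx."""
        adv = -v * (c[1:-1] - c[:-2]) / dx                    # upwind advection (v > 0)
        disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2   # dispersion
        cn = c.copy()
        cn[1:-1] += dt * (adv + disp) / R                     # sorption retards both
        return cn

    c = np.zeros(101)
    c[0] = 1.0                       # constant-concentration source at the inlet
    for _ in range(150):             # invented parameters, chosen for stability
        c = transport_step(c, v=1e-3, D=1e-5, R=2.0, dx=0.01, dt=5.0)
    ```

    With R = 2 the concentration front travels at half the pore-water velocity, which is the basic retardation effect a sorption model contributes to such transport predictions.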

  15. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  16. Channel Verification Results for the SCME models in a Multi-Probe Based MIMO OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; S. Ashta, Jagjit

    2013-01-01

    , where the focus is on comparing results from various proposed methods. Channel model verification is necessary to ensure that the target channel models are correctly implemented inside the test area. This paper shows that all the key parameters of the SCME models, i.e., power delay profile, temporal...
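
    One of the key parameters named above, the power delay profile (PDP), is typically verified by averaging measured channel impulse responses and comparing the result tap-by-tap against the target model. A minimal sketch with a hypothetical 3-tap target (not actual SCME parameter values):

    ```python
    import numpy as np

    def power_delay_profile(cirs):
        """Average power delay profile over channel impulse response
        snapshots (rows = snapshots, columns = delay bins)."""
        return np.mean(np.abs(cirs) ** 2, axis=0)

    # Hypothetical 3-tap target PDP in linear power; not actual SCME numbers.
    target = np.array([1.0, 0.5, 0.1])
    rng = np.random.default_rng(1)
    # Rayleigh-fading snapshots whose mean tap powers follow the target PDP.
    snaps = (rng.normal(size=(5000, 3)) + 1j * rng.normal(size=(5000, 3))) \
            * np.sqrt(target / 2.0)
    pdp = power_delay_profile(snaps)
    err_db = 10.0 * np.log10(pdp / target)   # per-tap deviation from target
    ```

    With a few thousand fading snapshots, the estimated tap powers converge to within a fraction of a dB of the target, which is the kind of agreement a channel verification campaign looks for.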

  17. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    This project was successfully executed to provide valuable adsorption data and improve a comprehensive model developed in previous work by the authors. Data obtained were used in an integrated computer program to predict the behavior of adsorption columns. The model is supported by experimental data and has been shown to predict capture of off gas similar to that evolving during the reprocessing of nuclear waste. The computer program structure contains (a) equilibrium models of off-gases with the adsorbate; (b) mass-transfer models to describe off-gas mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle; and (c) incorporation of these models into fixed bed adsorption modeling, which includes advection through the bed. These models are being connected with the MOOSE (Multiphysics Object-Oriented Simulation Environment) software developed at the Idaho National Laboratory through DGOSPREY (Discontinuous Galerkin Off-gas SeParation and REcoverY) computer codes developed in this project. Experiments for iodine and water adsorption have been conducted on reduced silver mordenite (Ag0Z) for single layered particles. Adsorption apparatuses have been constructed to execute these experiments over a useful range of conditions for temperatures ranging from ambient to 250°C and water dew points ranging from -69 to 19°C. Experimental results were analyzed to determine mass transfer and diffusion of these gases into the particles and to determine which models best describe the single and binary component mass transfer and diffusion processes. The experimental results were also used to demonstrate the capabilities of the comprehensive models developed to predict single-particle adsorption and transients of the adsorption-desorption processes in fixed beds. Models for adsorption and mass transfer have been developed to mathematically describe adsorption kinetics and transport via diffusion and advection
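
    The equilibrium component of such an adsorption model is often anchored by an isotherm; in the two-parameter case the GSTA model used in this line of work reduces to a Langmuir isotherm. A minimal sketch (q_max and K are invented, not fitted Ag0Z or MS3A data):

    ```python
    def langmuir(c, q_max, K):
        """Langmuir isotherm q(c) = q_max*K*c / (1 + K*c): the two-parameter
        limit of the GSTA equilibrium model. q_max and K here are invented,
        not fitted sorbent data."""
        return q_max * K * c / (1.0 + K * c)

    # Loading rises with gas-phase concentration and saturates toward q_max.
    loadings = [langmuir(c, q_max=2.0, K=10.0) for c in (0.01, 0.1, 1.0, 10.0)]
    ```

    In a fixed-bed code this equilibrium relation is coupled to the mass-transfer and advection models to predict the column breakthrough behavior.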

  18. Verification of Context-Dependent Channel-Based Service Models

    NARCIS (Netherlands)

    N. Kokash (Natallia); , C. (born Köhler, , C.) Krause (Christian); E.P. de Vink (Erik Peter)

    2010-01-01

    The paradigms of service-oriented computing and model-driven development are becoming of increasing importance in the field of software engineering. According to these paradigms, new systems are composed with added value from existing stand-alone services to support business processes

  19. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Experimental verification of two optimized adaptive digital signal processing algorithms implemented in two preset-time count rate meters was performed according to the appropriate standards. A random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Measurement results for background radiation levels were then obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with the radioisotope for the specified errors of 10% and 5%, agreed well with theoretical predictions.
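
    The statistics behind a preset-time measurement can be sketched with a simple Poisson simulation: pulses are counted over a fixed interval, and the relative standard error of the rate estimate is 1/sqrt(N). This is a textbook illustration, not the adaptive algorithms of the paper; the source rate and measurement time are invented.

    ```python
    import random

    def preset_time_count_rate(true_rate, t_measure, seed=42):
        """Simulate a preset-time measurement: count Poisson-distributed
        pulses for a fixed time and return the rate estimate with its
        1-sigma relative error (1/sqrt(N))."""
        rng = random.Random(seed)
        elapsed, n = 0.0, 0
        while True:
            elapsed += rng.expovariate(true_rate)  # exponential inter-arrival times
            if elapsed > t_measure:
                break
            n += 1
        rate = n / t_measure
        rel_err = 1.0 / n**0.5 if n else float("inf")
        return rate, rel_err

    # Invented source: 100 counts/s measured for 10 s -> roughly 3% relative error.
    rate, rel_err = preset_time_count_rate(true_rate=100.0, t_measure=10.0)
    ```

    Reaching a specified error of 5% thus requires about 400 counts, and 10% about 100 counts, which is why the achievable error bound scales with source strength and measurement time.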

  20. Specification, Model Generation, and Verification of Distributed Applications

    OpenAIRE

    Madelaine, Eric

    2011-01-01

    Since 2001, in the Oasis team, I have developed research on the semantics of applications based on distributed objects, applying my previous research in the field of process algebras in the context of a real language and applications of realistic size. The various aspects of this work naturally include behavioral semantics and the definition of procedures for model generation, taking into account the different concepts of distributed applications, but also, upstream, static code analysis a...

  1. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw the attention to the need for the verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed the paper presents models for internal freeze/thaw damage (internal cracking including...

  2. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  3. Predictions and Verification of an Isotope Marine Boundary Layer Model

    Science.gov (United States)

    Feng, X.; Posmentier, E. S.; Sonder, L. J.; Fan, N.

    2017-12-01

    A one-dimensional (1D), steady-state isotope marine boundary layer (IMBL) model is constructed. The model includes meteorologically important features absent in Craig and Gordon type models, namely height-dependent diffusion/mixing and convergence of subsiding external air. Kinetic isotopic fractionation results from this height-dependent diffusion, which starts as pure molecular diffusion at the air-water interface and increases linearly with height due to turbulent mixing. The convergence permits dry, isotopically depleted air subsiding adjacent to the model column to mix into ambient air. In δD-δ18O space, the model results fill a quadrilateral, of which three sides represent 1) vapor in equilibrium with various sea surface temperatures (SSTs) (the high-δ18O boundary of the quadrilateral); 2) mixtures of vapor in equilibrium with seawater and vapor in the subsiding air (the lower boundary, depleted in both D and 18O); and 3) vapor that has experienced the maximum possible kinetic fractionation (the high-δD upper boundary). The results can also be plotted in d-excess vs. δ18O space, indicating that these processes all cause variations in the d-excess of MBL vapor. In particular, due to the relatively high d-excess of the descending air, mixing of this air into the MBL causes an increase in d-excess, even without kinetic isotope fractionation. The model is tested by comparison with seven datasets of marine vapor isotopic ratios, with excellent correspondence; >95% of observational data fall within the quadrilateral area predicted by the model. The distribution of observations also highlights the significant influence of vapor from the nearby converging descending air on isotopic variations in the MBL. At least three factors may affect the isotopic composition of precipitation. The model can be applied to modern as well as paleoclimate conditions.
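
    The d-excess bookkeeping behind these results is easy to reproduce. The sketch below uses invented end-member compositions and naive linear mixing of delta values (a simplification; proper mixing weights by humidity), but it illustrates the paper's point that admixing subsiding air with high d-excess raises MBL d-excess even without kinetic fractionation.

    ```python
    def d_excess(dD, d18O):
        """Deuterium excess d = deltaD - 8*delta18O (per mil)."""
        return dD - 8.0 * d18O

    def mix(v1, v2, f):
        """Two-member mixture, fraction f of member 1. Linear mixing of delta
        values is a simplification; proper mixing weights by humidity."""
        return tuple(f * a + (1.0 - f) * b for a, b in zip(v1, v2))

    # Invented end members (deltaD, delta18O) in per mil:
    mbl = (-75.0, -10.5)          # near-equilibrium MBL vapor, d-excess ~ 9
    subsiding = (-170.0, -24.0)   # drier, more depleted subsiding air, d-excess ~ 22
    mixed = mix(mbl, subsiding, f=0.7)
    ```

    The mixture falls between the two end members in d-excess, so even modest entrainment of the descending air shifts the MBL vapor toward higher d-excess.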

  4. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects. PMID:26217586

  6. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  7. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    At the Environmental Radioactivity Research Institute, verification and investigation of the wind velocity field forecast model 'EXPRESS-1' have been carried out since 1991. In fiscal year 1994, as part of the general analysis, the validity of the weather observation data, the local features of the wind field, and the validity of the positions of the monitoring stations were investigated. The EXPRESS model, which had used a 500 m mesh, was improved to a 250 m mesh; the resulting gain in forecast accuracy was examined, and a comparison with another wind velocity field forecast model, 'SPEEDI', was carried out. The results show that the correlation with other measurement points is high at some locations and low at others, and that the accuracy of the wind velocity field forecast can be improved by excluding the data of points with low correlation or by installing simplified observation stations and taking in their data. The outline of the investigation, the general analysis of the weather observation data, and the improvements of the wind velocity field forecast model and its forecast accuracy are reported. (K.I.)

  8. Hearing aids in children: the importance of the verification and validation processes.

    Science.gov (United States)

    Rissatto, Mara Renata; Novaes, Beatriz Cavalcanti de Albuquerque Caiuby

    2009-01-01

    During the fitting of hearing aids in children it is important, besides using a verification protocol, to carry out a validation process. The aim was to describe and discuss the use of a protocol for the fitting and verification of hearing aids in children, as well as the impact of adjusting the acoustic characteristics on speech perception tasks. Ten children aged three to eleven years were enrolled in this study. All children presented bilateral sensorineural hearing impairment, were users of hearing aids, and were followed at a public hearing health care service in Bahia. The children were submitted to the following procedures: pure-tone air and bone conduction thresholds; real-ear-to-coupler difference (RECD); verification with real-ear measurement equipment (coupler gain/output and insertion gain); and speech perception tasks: 'The Six-Sound Test' (Ling, 2006) and the 'Word Associations for Syllable Perception' (WASP - Koch, 1999). The programmed electroacoustic characteristics of the hearing aids were compared to those prescribed by the DSL [i/o] v4.1 software. The speech perception tasks were reapplied on three occasions: immediately after the modification of the electroacoustic characteristics, after 30 days, and after 60 days. For more than 50% of the tested children, the programmed electroacoustic characteristics of the hearing aids did not correspond to those suggested by the DSL [i/o] software. Adequate prescription was verified in 70% of the investigated sample; this was also confirmed by the results of the speech perception tasks (p=0.000), where the mean percentage of correct answers increased after the modification of the electroacoustic characteristics. The use of a protocol that verifies and validates the fitting of hearing aids in children is therefore necessary.

  9. Sorption Modeling and Verification for Off-Gas Treatment

    International Nuclear Information System (INIS)

    Tavlarides, Lawrence L.; Lin, Ronghong; Nan, Yue; Yiacoumi, Sotira; Tsouris, Costas; Ladshaw, Austin; Sharma, Ketki; Gabitto, Jorge; DePaoli, David

    2015-01-01

    The project has made progress toward developing a comprehensive modeling capability for the capture of target species in off gas evolved during the reprocessing of nuclear fuel. The effort has integrated experimentation, model development, and computer code development for adsorption and absorption processes. For adsorption, a modeling library has been initiated to include (a) equilibrium models for uptake of off-gas components by adsorbents, (b) mass transfer models to describe mass transfer to a particle, diffusion through the pores of the particle and adsorption on the active sites of the particle, and (c) interconnection of these models to fixed bed adsorption modeling which includes advection through the bed. For single-component equilibria, a Generalized Statistical Thermodynamic Adsorption (GSTA) code was developed to represent experimental data from a broad range of isotherm types; this is equivalent to a Langmuir isotherm in the two-parameter case, and was demonstrated for Kr on INL-engineered sorbent HZ PAN, water sorption on molecular sieve A sorbent material (MS3A), and Kr and Xe capture on metal-organic framework (MOF) materials. The GSTA isotherm was extended to multicomponent systems through application of a modified spreading pressure surface activity model and generalized predictive adsorbed solution theory; the result is the capability to estimate multicomponent adsorption equilibria from single-component isotherms. This advance, which enhances the capability to simulate systems related to off-gas treatment, has been demonstrated for a range of real-gas systems in the literature and is ready for testing with data currently being collected for multicomponent systems of interest, including iodine and water on MS3A. A diffusion kinetic model for sorbent pellets involving pore and surface diffusion as well as external mass transfer has been established, and a methodology was developed for determining unknown diffusivity parameters from transient

  11. The middle range verification of numerical model performance for heavy rainfall in North China

    Science.gov (United States)

    Zhang, Bo; Zhao, Bin; Niu, Ruoyun

    2017-04-01

    Heavy rainfall forecasts for North China are a focus and a difficulty of medium-range numerical weather prediction. Seventy typical heavy precipitation cases in North China in the summers of 2010 to 2016 were selected and divided into vortex-type and west-trough/shear-line-type cases according to the atmospheric circulation. Based on the ECMWF model and the Chinese operational model T639, the spatial verification method MODE was used to evaluate the medium-range forecast ability for heavy summer rain in North China by contrasting differences in centroid distance, axis angle, and aspect ratio. It is found that both the ECMWF and T639 models show weak predictive ability for low-vortex-type heavy rainfall in North China. When the rainfall area is large, the precipitation patterns of the two models are mostly oriented northeast-southwest, consistent with the observations. For large precipitation areas, both models also predict an aspect ratio of less than 1, indicating a long, narrow precipitation area, which again matches the observations. However, for both T639 and ECMWF there are systematic deviations in the precipitation area: the predicted area lies to the southwest of the observed one, and for smaller/larger areas of precipitation the predicted area is larger/smaller than observed. In addition, a sensitivity test for the regional heavy precipitation process in North China (Huanghuai and other regions) from July 18 to 20, 2016 showed that neither model predicted the process successfully. Further research is therefore needed on correcting the systematic biases of numerical models for regional heavy precipitation in medium-range forecasting.
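
    The MODE object attributes compared above (centroid, axis angle, aspect ratio) can be computed from second-order image moments of a binary precipitation object. A minimal sketch on a synthetic long, narrow diagonal band; this is an illustration of the attribute calculation, not the MODE implementation itself.

    ```python
    import numpy as np

    def object_attributes(mask):
        """Centroid, major-axis angle (degrees) and aspect ratio of a binary
        rain object, from second-order image moments -- the kind of object
        attributes MODE compares between forecast and observed rain areas."""
        ys, xs = np.nonzero(mask)
        cy, cx = ys.mean(), xs.mean()
        dy, dx = ys - cy, xs - cx
        cov = np.array([[dx @ dx, dx @ dy],
                        [dx @ dy, dy @ dy]]) / ys.size
        evals, evecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
        angle = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # major axis
        aspect = np.sqrt(evals[0] / evals[1])     # minor/major, <= 1
        return (cy, cx), angle, aspect

    # Synthetic long, narrow diagonal band of "rain" pixels (2 pixels wide).
    mask = np.zeros((50, 50), dtype=bool)
    for i in range(40):
        mask[5 + i, 5 + i] = True
        mask[5 + i, 6 + i] = True
    (cy, cx), angle, aspect = object_attributes(mask)
    ```

    Comparing these attributes between forecast and observed objects quantifies exactly the displacement, orientation, and elongation differences discussed in the abstract.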

  12. A verification of the bakery protocol combining algebraic and model-oriented techniques

    NARCIS (Netherlands)

    C. Brovedani; A.S. Klusener (Steven)

    1996-01-01

    In this paper we give a specification of the so-called Bakery protocol in an extension of the process algebra ACP with abstract datatypes. We prove that this protocol is equal to a Queue, modulo branching bisimulation equivalence. The verification is as follows. First we give a linear

  13. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  14. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    Science.gov (United States)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on several problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code has been performed for two analytical problems: the first a point heat source with exponentially decreasing heat output, the second a linear heat source with similar behavior. Analytical solutions have been obtained by the authors. These problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolutions, and good convergence between the analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste leads to heating of the containers, engineered safety barriers and host rock. The maximum temperatures and the corresponding times of their establishment have been determined.
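
    The first benchmark type mentioned above, a point heat source with exponentially decreasing output in an infinite medium, has an analytical solution built from the heat-conduction Green's function. The sketch below reproduces that style of verification by quadrature and checks the non-decaying limit against the closed form T = q0/(4*pi*k*r)*erfc(r/(2*sqrt(alpha*t))). All material parameters are invented, and this is not the FENIA code.

    ```python
    import math

    def point_source_T(r, t, q0, lam, k, alpha, n=4000):
        """Temperature rise at radius r from a point source whose power decays
        as q0*exp(-lam*tau), by midpoint quadrature over the heat-conduction
        Green's function for an infinite medium."""
        rho_c = k / alpha                        # volumetric heat capacity
        dtau = t / n
        total = 0.0
        for i in range(n):
            tau = (i + 0.5) * dtau
            s = t - tau                          # time elapsed since heat release
            green = math.exp(-r * r / (4.0 * alpha * s)) \
                    / (4.0 * math.pi * alpha * s) ** 1.5
            total += q0 * math.exp(-lam * tau) * green * dtau / rho_c
        return total

    k, alpha, r, t, q0 = 2.0, 1e-6, 1.0, 1e7, 100.0   # invented rock-like values
    T_num = point_source_T(r, t, q0, lam=0.0, k=k, alpha=alpha)
    # Closed form for lam = 0: T = q0/(4*pi*k*r) * erfc(r / (2*sqrt(alpha*t)))
    T_ref = q0 / (4.0 * math.pi * k * r) * math.erfc(r / (2.0 * math.sqrt(alpha * t)))
    T_dec = point_source_T(r, t, q0, lam=1e-6, k=k, alpha=alpha)  # decaying source
    ```

    Agreement between the quadrature and the closed form, plus the lower temperature for the decaying source, is the same kind of convergence check a finite element code is subjected to in such benchmarks.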

  15. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in an appropriate language, these models describe process plants and plant automatics in a unified way to allow verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report investigates how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories. (author)

  16. Sealing of process valves for the HEU downblending verification experiment at Portsmouth

    International Nuclear Information System (INIS)

    Baldwin, G.T.; Bartberger, J.C.; Jenkins, C.D.; Perlinski, A.W.; Schoeneman, J.L.; Gordon, D.M.; Whiting, N.E.; Bonner, T.N.; Castle, J.M.

    1998-01-01

    At the Portsmouth Gaseous Diffusion Plant in Piketon, Ohio, USA, excess inventory of highly enriched uranium (HEU) from US defense programs is being diluted to low-enriched uranium (LEU) for commercial use. The conversion is subject to a Verification Experiment overseen by the International Atomic Energy Agency (IAEA). The Verification Experiment makes use of monitoring technologies developed and installed by several DOE laboratories. One of the measures is a system for sealing valves in the process piping, which secures the path followed by uranium hexafluoride gas (UF6) from cylinders at the feed stations to the blend point, where the HEU is diluted with LEU. The Authenticated Item Monitoring System (AIMS), proposed by Sandia National Laboratories, was the alternative selected by the IAEA. Approximately 30 valves were sealed by the IAEA using AIMS fiber-optic seals (AFOS). The seals employ single-core plastic fiber rated to 125 °C to withstand the high-temperature conditions of the heated piping enclosures at Portsmouth. Each AFOS broadcasts authenticated seal status and state-of-health messages via a tamper-protected radio-frequency transmitter mounted outside of the heated enclosure. The messages are received by two redundantly operated collection stations.

  17. Current status of the verification and processing system GALILÉE-1 for evaluated data

    Science.gov (United States)

    Coste-Delclaux, Mireille; Jouanne, Cédric; Moreau, Frédéric; Mounier, Claude; Visonneau, Thierry; Dolci, Florence

    2017-09-01

    This paper describes the current status of GALILÉE-1, the new verification and processing system for evaluated data developed at CEA. It consists of various components respectively dedicated to reading/writing the evaluated data whatever their format, diagnosing inconsistencies in the evaluated data, and providing continuous-energy and multigroup data as well as probability tables for transport and depletion codes. All these components are written in the C++ language and share the same objects. Cross-comparisons with other processing systems (NJOY, CALENDF or PREPRO) are systematically carried out at each step in order to fully master possible discrepancies. Some results of such comparisons are provided.

  18. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  19. The Parametric Model for PLC Reference Channels and its Verification in Real PLC Environment

    Directory of Open Access Journals (Sweden)

    Rastislav Roka

    2008-01-01

    Full Text Available For the expansion of PLC systems, it is necessary to have detailed knowledge of the properties of the PLC transmission channel. This contribution briefly discusses characteristics of the PLC environment and a classification of PLC transmission channels. The main part focuses on the parametric model for PLC reference channels and its verification in the real PLC environment by means of experimental measurements.
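
    A parametric multipath model widely used for PLC reference channels (the Zimmermann-Dostert formulation) can be sketched as follows; the path weights, path lengths, and attenuation parameters below are illustrative values, not those of the paper.

```python
import cmath
import math

def plc_channel_response(f, paths, a0=0.0, a1=7.8e-10, k=1.0, v_p=1.5e8):
    """Frequency response of the parametric multipath PLC channel model:
    H(f) = sum_i g_i * exp(-(a0 + a1*f^k)*d_i) * exp(-j*2*pi*f*d_i/v_p),
    where g_i is the path weight, d_i the path length in metres, and
    v_p the propagation velocity. a0, a1, k are attenuation parameters."""
    return sum(g * math.exp(-(a0 + a1 * f**k) * d) *
               cmath.exp(-2j * math.pi * f * d / v_p)
               for g, d in paths)

# Illustrative 4-path channel (weights and lengths are made up).
paths = [(0.64, 200.0), (0.38, 222.4), (-0.15, 244.8), (0.05, 267.5)]
freqs = [f * 1e6 for f in range(1, 31)]            # 1..30 MHz
mag_db = [20 * math.log10(abs(plc_channel_response(f, paths))) for f in freqs]
print(mag_db[0], mag_db[-1])  # attenuation grows with frequency
```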

  20. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  1. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  2. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
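
    One standard ingredient of the grid-refinement studies mentioned above is the observed order of accuracy, computed Richardson-style from a quantity of interest evaluated on three systematically refined grids. A minimal sketch with manufactured values (not data from the paper):

```python
import math

def observed_order(f_fine, f_medium, f_coarse, r):
    """Observed order of accuracy p from a quantity of interest computed
    on three grids with constant refinement ratio r (fine -> coarse):
    p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Manufactured example: f(h) = 1.0 + 0.01*h^2, so the exact order is 2.
f1, f2, f3 = 1.0 + 0.01 * 1**2, 1.0 + 0.01 * 2**2, 1.0 + 0.01 * 4**2
p = observed_order(f1, f2, f3, r=2.0)
print(p)  # close to 2.0, confirming second-order behavior
```

    In practice, an observed order that matches the scheme's formal order on representative problems is the evidence of "consistency with grid refinement" that the abstract refers to.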

  3. Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features

    Science.gov (United States)

    Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed

    2012-01-01

    during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.

  4. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still is built in ANSYS CFD for both the evaporation and the condensation process. Simulation results such as water temperature, vapor temperature and distillate output were compared with actual experimental results for the climate conditions of Mehsana (latitude 23° 59′, longitude 72° 38′). The water temperature and distillate output were in good agreement with the actual experimental results. The study shows that ANSYS CFD is a very powerful and efficient tool for the design and comparison of hemispherical solar stills.

  5. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation, called the Land surface Verification Toolkit (LVT), are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
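
    The traditional accuracy-based measures mentioned above can be illustrated with a minimal sketch computing bias, RMSE, and Pearson correlation between a model series and observations; the series below are made-up numbers, not LVT data or its API.

```python
import math

def bias(model, obs):
    """Mean difference, model minus observation."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def correlation(model, obs):
    """Pearson correlation coefficient."""
    n = len(obs)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    vm = math.sqrt(sum((m - mm) ** 2 for m in model))
    vo = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (vm * vo)

model = [0.30, 0.32, 0.28, 0.35, 0.31]   # e.g. simulated soil moisture
obs   = [0.28, 0.30, 0.27, 0.33, 0.30]   # e.g. in-situ measurements
print(bias(model, obs), rmse(model, obs), correlation(model, obs))
```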

  6. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. 
The majority of these models are not currently available

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    Science.gov (United States)

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  8. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting the VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. 
Reporting tools have evolved over the lifetime of

  9. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that have been demolished or changed since the models were constructed, or identifying wrong models, using the images. The models verified are of CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory. Corresponding variables between building models and oblique images are used for deriving mutual information for individual edges, faces or whole buildings, and combined over all perspective images available for the building. The wireframe model edges are projected to the images and verified using low-level image features, the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images that are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of the Microsoft Bing Maps. The results show that nearly all buildings are correctly categorised as existing or demolished. Because we now concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, in particular, can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results per individual edge can be used for improving the 3D building models.
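
    The mutual-information score underlying such verification can be illustrated with a discrete joint distribution between two variables (here a toy 2x2 case; the actual method relates model edge orientations to image gradient directions):

```python
import math

def mutual_information(joint):
    """Mutual information (in bits) of a discrete joint probability table,
    MI = sum_ij p(i,j) * log2( p(i,j) / (p(i)*p(j)) )."""
    px = [sum(row) for row in joint]            # marginal of the first variable
    py = [sum(col) for col in zip(*joint)]      # marginal of the second variable
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Perfectly dependent variables: knowing one determines the other -> 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0

# Independent variables carry no information about each other -> 0 bits.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```

    High mutual information between projected model edges and image gradients supports the hypothesis that the modeled building still exists; low values flag demolished or wrong models.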

  10. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering for automating the verification and validation of on-board satellite software. The process has been applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification conditions are defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumentation points. The constraints are then fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
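
    In the simplest case, feeding constraints with on-target evidence of this kind reduces to checking measured execution times against declared bounds (the paper's full treatment uses timed automata). A minimal sketch, with illustrative task names and numbers that are not from the Solar Orbiter software:

```python
def check_timing(evidence, constraints):
    """Check instrumented (task, execution_time_ms) samples against
    declared worst-case execution time (WCET) bounds. Returns the list
    of violations; an empty list means the analysis assumptions hold."""
    violations = []
    for task, measured in evidence:
        wcet = constraints[task]
        if measured > wcet:
            violations.append((task, measured, wcet))
    return violations

# Declared WCET bounds (ms) assumed by the scheduling analysis.
constraints = {"acquire_sample": 2.0, "process_event": 5.0, "send_telemetry": 8.0}

# Execution-time samples extracted from instrumentation points on target.
evidence = [("acquire_sample", 1.7), ("process_event", 4.9),
            ("send_telemetry", 8.4), ("process_event", 3.2)]

print(check_timing(evidence, constraints))  # one violation: send_telemetry
```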

  11. PLM-based Approach for Design Verification and Validation using Manufacturing Process Knowledge

    Directory of Open Access Journals (Sweden)

    Luis Toussaint

    2010-02-01

    Full Text Available Out of 100 hours of engineering work, only 20 are dedicated to real engineering and 80 are spent on what are considered routine activities. Readjusting the ratio of innovative vs. routine work is a considerable challenge in a product lifecycle management (PLM) strategy. The main objective is therefore to develop an approach to accelerate routine processes in engineering design. The proposed methodology, called FabK, consists of capturing manufacturing knowledge and applying it to the design verification and validation of new engineering designs. The approach is implemented in a Web-based PLM prototype and a Computer Aided Design system. A series of experiments from an industrial case study is introduced to provide significant results.

  12. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, some powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification analysis techniques, it is necessary to develop plant models that describe the plant behavior of those systems. However, developing a plant model implies that the designer must make decisions concerning the granularity and level of abstraction of the models, the modeling approach to adopt (global or modular), and the strategies for the simulation and formal verification tasks. This paper intends to highlight some aspects that can be taken into account when making those decisions. For this purpose, a case study is presented, and the most important aspects of the issues exposed above are illustrated and discussed.

  13. Modelling and Verification of Multiple UAV Mission Using SMV

    Directory of Open Access Journals (Sweden)

    Gopinadh Sirigineedi

    2010-03-01

    Full Text Available Model checking has been used to verify the correctness of digital circuits, security protocols, and communication protocols, as these can be modelled by means of finite state transition models. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the purpose of model checking.

  14. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
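
    The failure-effect propagation an FFM encodes can be sketched as reachability over a directed graph from failure modes toward observation points; the component and signal names below are illustrative, not from the NASA tool.

```python
from collections import deque

def propagate(graph, failure_mode):
    """Return every effect reachable from a failure mode via breadth-first
    traversal of the failure-effect propagation graph."""
    seen, queue = {failure_mode}, deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {failure_mode}

# Illustrative propagation paths: a stuck valve raises line pressure, which
# trips a pressure sensor (an observation point) and degrades pump flow.
graph = {
    "valve_stuck_closed": ["line_pressure_high"],
    "line_pressure_high": ["pressure_sensor_alarm", "pump_flow_low"],
    "pump_flow_low": ["flow_sensor_alarm"],
}
effects = propagate(graph, "valve_stuck_closed")
print(sorted(effects))
```

    Diagnosis runs the same structure in reverse: observed alarms constrain which failure modes could have produced them.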

  15. Game Theory Models for the Verification of the Collective Behaviour of Autonomous Cars

    OpenAIRE

    Varga, László Z.

    2017-01-01

    The collective of autonomous cars is expected to generate almost optimal traffic. In this position paper we discuss the multi-agent models and the verification results of the collective behaviour of autonomous cars. We argue that non-cooperative autonomous adaptation cannot guarantee optimal behaviour. The conjecture is that intention aware adaptation with a constraint on simultaneous decision making has the potential to avoid unwanted behaviour. The online routing game model is expected to b...

  16. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  17. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity. The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED, together with biomechanics models, for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  18. Human factors verification and validation process of the human system interface for Chinese power plants: Lessons learned from new designs - 15415

    International Nuclear Information System (INIS)

    Trueba Alonso, P.; Fernandez Illobre, L.; Fernando Ortega, P.

    2015-01-01

    Tecnatom has been participating in the design and implementation of the Human System Interfaces (HSIs) of several nuclear power plants in China equipped with Pressurized Water Reactors (PWRs). The design process had an associated Human Factors Engineering (HFE) program that included Verification and Validation (V&V) activities. These V&V activities were performed by an independent team, following as the primary reference the Human Factors Engineering Program Review Model (HFEPRM) described in NUREG-0711. The plants under the scope of these V&V activities were Fuqing units 1-4 (PWR, 1000 MW), Fangjiashan units 1 and 2 (PWR, 1000 MW), and Changjiang units 1 and 2 (PWR, 600 MW). These plants are owned by the China National Nuclear Corporation (CNNC), all with the M310+ reactor design (Generation II) and plant engineering supplied by China Nuclear Power Engineering (CNPE). The plants include new advanced HSIs for the main control room and the remote shutdown station, with major use of digital technology and conventional technology retained only as a backup in a specific panel. This paper describes the activities performed and the experience gained during the execution of the HFE V&V activities on the HSIs designed and implemented by Tecnatom. The V&V activities covered the development of implementation plans and methodologies, HSI task support verification (TSV), HSI design verification (DV), partial validations (PV), integrated system validation (ISV), HSI implementation (as-built) verification, and human engineering discrepancy (HED) management. The project was led by Invensys Operations Management (IOM, currently Schneider Electric), the main contractor of the Distributed Control System, with an HFE part where the V&V took place.

  19. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  20. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS® finite element model, a verified MSC.Nastran™ model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran™ model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran™ single-bay model to Abaqus™ is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  1. Verification of aseismic design model by using experimental results

    International Nuclear Information System (INIS)

    Mizuno, N.; Sugiyama, N.; Suzuki, T.; Shibata, Y.; Miura, K.; Miyagawa, N.

    1985-01-01

    A lattice model is applied as the analysis model for the aseismic design of the Hamaoka nuclear reactor building. To verify the validity of this design model, two reinforced concrete blocks were constructed on the ground and forced vibration tests were carried out. The test results are well reproduced by simulation analyses using the lattice model. The damping value of the ground obtained from the tests is more conservative than the design value. (orig.)

  2. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants, updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also being introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement.

  3. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  4. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  5. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

    by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations......An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified...

  6. Electric and mechanical basic parameters to elaborate a process for a technical verification of safety related design modifications

    International Nuclear Information System (INIS)

    Lamuno Fernandez, Mercedes; La Roca Mallofre, GISEL; Bano Azcon, Alberto

    2010-01-01

    This paper presents a systematic process for checking a design against all the requirements that regulations demand. Nuclear engineers must verify that a design complies with the safety requirements, and this paper presents how we have elaborated a process to improve technical project verification. For a faster, better and easier verification process, we summarize here how to select the electric and mechanical basic parameters which ensure the correct verification of safety-related design modifications. This process considers different aspects which guarantee that the design preserves the availability, reliability and functional capability of the Structures, Systems and Components needed to operate the Nuclear Power Station safely. Electric and mechanical reference parameters are identified and discussed, as well as other related parameters which are critical to safety. An implementation procedure for the tasks performed is a requirement in any company that has a quality plan. In the engineering business, personal criteria alone should not be the basis of the technical analysis of a project; nevertheless, it is often the checker's criteria and knowledge that ensure the correct development of a design modification, so the checker's capabilities are the basis of the modification verification. Developing this kind of procedure is not easy, because an engineering project with substantial technical content involves multiple scenarios, but many of them share a common basis. If we can identify the common technical basis of these projects, we can achieve good project verification, although there are many difficulties to overcome along the way. (authors)

  7. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  8. Evaluation and Verification of Thermal Stratification Models for ...

    African Journals Online (AJOL)

    Stratification is a usual phenomenon occurring in waste stabilization ponds which needs to be incorporated in prediction models. The occurrence of stratification in waste stabilization pond (WSP) alters the flow pattern of the pond. Hence, its study is important for complete and accurate modeling of the WSP. In this study, two ...

  9. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP) verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS) and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
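    The Equitable Threat Score and log-odds ratio mentioned in this record are standard contingency-table skill scores; a minimal Python sketch (illustrative, not the authors' code, with hypothetical counts) is:

```python
import math

def ets(hits, misses, false_alarms, correct_negatives):
    """Equitable Threat Score (Gilbert skill score) from a 2x2 contingency table."""
    total = hits + misses + false_alarms + correct_negatives
    # Hits expected by chance, given the marginal totals
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

def log_odds_ratio(hits, misses, false_alarms, correct_negatives):
    """Log of the odds ratio; larger positive values indicate more skill."""
    return math.log((hits * correct_negatives) / (misses * false_alarms))

# Hypothetical daily rain/no-rain forecast verification counts
print(ets(50, 20, 30, 900))             # ≈ 0.470
print(log_odds_ratio(50, 20, 30, 900))  # ≈ 4.317
```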

  10. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    Science.gov (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
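    A statistical check of the kind this record describes, quantifying a load model's accuracy with a confidence interval rather than a visual check, might look like the following sketch (the metrics and data are illustrative assumptions, not the paper's exact statistics):

```python
import math

def validation_stats(measured, simulated):
    """Quantitative fit statistics for a load model against field measurements:
    bias, RMSE, and a rough 95% confidence interval on the bias."""
    n = len(measured)
    residuals = [s - m for m, s in zip(measured, simulated)]
    bias = sum(residuals) / n
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    # Sample variance of residuals -> standard error of the mean bias
    var = sum((r - bias) ** 2 for r in residuals) / (n - 1)
    half_width = 1.96 * math.sqrt(var / n)
    return {"bias": bias, "rmse": rmse,
            "bias_ci95": (bias - half_width, bias + half_width)}

# Hypothetical measured vs simulated load (MW) at five snapshots
measured  = [100.0, 102.0, 98.0, 101.0, 99.0]
simulated = [101.0, 103.5, 97.0, 100.0, 100.5]
print(validation_stats(measured, simulated))
```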

  11. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  12. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units consider sulfate attack on concrete and utilize degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  13. Verification of Occupants’ Behaviour Models in Residential Buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm

    During the last decade, studies about stochastic models of occupants’ behaviour in relation to control of the indoor environment have been published. Often the overall aim of these models is to enable more reliable predictions of building performance using building performance simulations (BPS...... to determine if the event takes place or not. Finally, the simulated window position is compared to the measured ones and the True Positive Rate and False Positive Rate along with other metrics can be calculated and compared. The method evaluates the models' abilities to predict the position of the window...... based on the dataset. In each time step, the probabilities are subtracted from the observed transitions, to find the residuals. Finally, the residuals can be averaged, and compared. The validation by simulation relies on detailed Building Performance Simulations (BPS) using models under evaluation...

  14. Experimental Verification of the Transient Model in an Enrichment Circuit

    International Nuclear Information System (INIS)

    Fernandino, Maria; Brasnarof, Daniel; Delmastro, Dario

    2003-01-01

    In the present work, an experimental closed loop representing a single stage of a uranium gaseous diffusion enrichment cascade is described; the loop is used to experimentally validate an analytical model that describes the dynamics inside such a loop. The conditions established inside the experimental loop after a few working hours were reproduced by the analytical model, leaving the slower thermal phenomena taking place for future studies. Two kinds of perturbations were experimentally introduced: a change in the range of operation of one of the compressors and the addition of mass into the loop. Numerical and experimental results are compared and presented in this work. The analytical model proposed was verified against these two changes, with very good agreement in the time response and measured values. This analytical model allows us to determine the characteristic time response of the system

  15. A verification procedure for MSC/NASTRAN Finite Element Models

    Science.gov (United States)

    Stockwell, Alan E.

    1995-01-01

    Finite Element Models (FEM's) are used in the design and analysis of aircraft to mathematically describe the airframe structure for such diverse tasks as flutter analysis and actively controlled landing gear design. FEM's are used to model the entire airplane as well as airframe components. The purpose of this document is to describe recommended methods for verifying the quality of the FEM's and to specify a step-by-step procedure for implementing the methods.

  16. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  17. Model of PE-CVD Apparatus: Verification and Simulations

    Directory of Open Access Journals (Sweden)

    J. Geiser

    2010-01-01

    In this paper, we present a simulation of chemical vapor deposition with metallic bipolar plates. In chemical vapor deposition, a delicate optimization between temperature, pressure and plasma power is important to obtain homogeneous deposition. The aim is to reduce the number of real-life experiments in a given CVD plasma reactor. Given the large physical parameter space, there is a huge number of possible experiments. A detailed study of the physical experiments in a CVD plasma reactor allows us to reduce the problem to an approximate mathematical model, which is the underlying transport-reaction model. Significant regions of the CVD apparatus are approximated and physical parameters are transferred to the mathematical parameters. Such an approximation reduces the mathematical parameter space to a realistic number of numerical experiments. The numerical results are discussed alongside physical experiments to give a valid model for the assumed growth, allowing us to reduce expensive physical experiments.

  18. Modelling, property verification and behavioural equivalence of lactose operon regulation.

    Science.gov (United States)

    Pinto, Marcelo Cezar; Foss, Luciana; Mombach, José Carlos Merino; Ribeiro, Leila

    2007-02-01

    Understanding biochemical pathways is one of the biggest challenges in the field of molecular biology nowadays. Computer science can contribute in this area by providing formalisms and tools to simulate and analyse pathways. One formalism that is suited for modelling concurrent systems is Milner's Calculus of Communicating Systems (CCS). This paper shows the viability of using CCS to model and reason about biochemical networks. As a case study, we describe the regulation of lactose operon. After describing this operon formally using CCS, we validate our model by automatically checking some known properties for lactose regulation. Moreover, since biological systems tend to be very complex, we propose to use multiple descriptions of the same system at different levels of abstraction. The compatibility of these multiple views can be assured via mathematical proofs of observational equivalence.

  19. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant; i.e., that they are robust. The two areas in which the MMS is being used for this purpose are (1) the design of a reactor power controller with improved reactor temperature response, and (2) the design of a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem

  20. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

    Based on conventional mass transfer models developed for oxygen, the use of the non-linear ASCE method, 2-point method, and one parameter linear-regression method were evaluated for carbon dioxide stripping data. For values of KLaCO2 down at higher values of KLaCO2. How to correct KLaCO2 for gas phase enrichment remains to be determined. The one-parameter linear regression model was used to vary the C*CO2 over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed C*CO2 assumptions.
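    The 2-point method named in this record follows from the first-order mass-transfer model C(t) = C* + (C0 − C*)·exp(−KLa·t); a hedged sketch with hypothetical stripping data (the equilibrium concentration C* and measurements below are assumptions for illustration):

```python
import math

def kla_two_point(c_star, c0, c1, t0, t1):
    """Two-point estimate of the overall mass-transfer coefficient KLa (1/h)
    from C(t) = C* + (C0 - C*) * exp(-KLa * (t - t0)).
    c_star is the assumed equilibrium CO2 concentration."""
    return math.log((c0 - c_star) / (c1 - c_star)) / (t1 - t0)

# Hypothetical data: dissolved CO2 falls from 20 to 8 mg/L over 0.5 h; C* = 2 mg/L
print(kla_two_point(2.0, 20.0, 8.0, 0.0, 0.5))  # ≈ 2.197 1/h
```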

  1. Structure-dynamic model verification calculation of PWR 5 tests

    International Nuclear Information System (INIS)

    Engel, R.

    1980-02-01

    Within reactor safety research project RS 16 B of the German Federal Ministry of Research and Technology (BMFT), blowdown experiments are conducted at Battelle Institut e.V. Frankfurt/Main using a model reactor pressure vessel with a height of 11.2 m and internals corresponding to those in a PWR. In the present report the dynamic loading on the pressure vessel internals (upper perforated plate and barrel suspension) during the DWR 5 experiment is calculated by means of a vertical and horizontal dynamic model using the CESHOCK code. The equations of motion are resolved by direct integration. (orig./RW) [de

  2. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
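    One of the simplest probabilistic verification tools of the kind surveyed in this record is the rank (Talagrand) histogram; a minimal sketch with hypothetical ensembles (an illustration, not the paper's code):

```python
def rank_histogram(ensembles, observations):
    """Rank (Talagrand) histogram: counts where each observation falls within
    its sorted ensemble. A flat histogram suggests reliable ensemble spread;
    a U-shape suggests under-dispersion."""
    n_members = len(ensembles[0])
    counts = [0] * (n_members + 1)
    for members, obs in zip(ensembles, observations):
        rank = sum(1 for m in sorted(members) if m < obs)
        counts[rank] += 1
    return counts

# Hypothetical 3-member streamflow ensembles and observed flows
ens = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [0.5, 1.5, 2.5], [3.0, 5.0, 7.0]]
obs = [2.5, 1.0, 3.0, 6.5]
print(rank_histogram(ens, obs))  # → [1, 0, 2, 1]
```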

  3. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.

  4. Verification of geological/engineering model in waterflood areas

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, B.; Szpakiewicz, M.; Honarpour, M.; Schatzinger, R.A.; Tillman, R.

    1988-12-01

    The construction of a detailed geological/engineering model is the basis for development of the methodology for characterizing reservoir heterogeneity. The NIPER geological/engineering model is the subject of this report. The area selected for geological and production performance studies is a four-section area within the Powder River Basin which includes the Tertiary Incentive Project (TIP) pilot. Log, well test, production, and core data were acquired for construction of the geological model of a barrier island reservoir. In this investigation, emphasis was on the synthesis and quantification of the abundant geological information acquired from the literature and field studies (subsurface and outcrop) by mapping the geological heterogeneities that influence fluid flow. The geological model was verified by comparing it with the exceptionally complete production data available for Bell Creek field. This integration of new and existing information from various geological, geophysical, and engineering disciplines has enabled better definition of the heterogeneities that influence production during different recovery operations. 16 refs., 26 figs., 6 tabs.

  5. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  6. Visual Sample Plan (VSP) Models and Code Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O.; Davidson, James R.; Wilson, John E.; Pulsipher, Brent A.

    2001-03-06

    VSP is an easy to use, visual and graphic software tool being developed to select the right number and location of environmental samples so that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. It is a significant help for implementing the 6th and 7th steps of the Data Quality Objectives (DQO) planning process ("Specify Tolerable Limits on Decision Errors" and "Optimize the Design for Obtaining Data," respectively).
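    DQO steps 6 and 7 typically reduce to a sample-size calculation balancing tolerable false-positive and false-negative decision error rates; a simplified textbook version can be sketched as follows (VSP's own equations include additional correction terms, and the defaults below are illustrative):

```python
import math

def sample_size(sigma, delta, z_alpha=1.645, z_beta=1.282):
    """Approximate number of samples for a one-sample test so that the
    decision error rates stay within tolerable limits (alpha = 0.05,
    beta = 0.10 for the default z values). Simplified textbook formula."""
    n = ((z_alpha + z_beta) ** 2) * sigma ** 2 / delta ** 2
    return math.ceil(n)

# Hypothetical site: sd = 4 units, must detect a 2-unit exceedance
print(sample_size(sigma=4.0, delta=2.0))  # → 35
```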

  7. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification,validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that are continuing to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling so that questions about continuum mechanical theory and mathematical analysis of multiphase flow are still required. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we will present a publically available verification and validation test set for air-water-structure interaction problems as well as computational and physical model results including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats will also be presented in order to facilitate future comparative studies and reproduction of scientific results.

  8. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    coastal ocean sufficiently to have a complete picture of the flow. The analysis will thus consist of comparing these incomplete pictures of the current... 50 cm. This would suggest that tidal flats would exist at synoptic scales but not daily because there are expanses of the lagoon that are < 50 cm... historical daily data from the correct time of year but not from the correct day. This indicates that the model flow is generally correct at synoptic

  9. Design and verification of the 'GURI 01' bundle model

    International Nuclear Information System (INIS)

    Benito, G.D.

    1990-01-01

    This work presents a general description of the 'GURI 01' bundle model, designed by INVAP S.E. under international radioactive material transportation regulations as a Type B(U) bundle for international transportation of up to a maximum of 350000 Ci of Co-60. Moreover, the methodologies used and the results obtained from the structural evaluation of the mechanical tests and from the evaluation of the thermal behaviour under normal or accident conditions are briefly discussed. (Author) [es

  10. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model's validity, both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern's entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As a hypothesized structural sign pattern's entropy decreases, it becomes more subject to Type 1 error and less subject to Type 2 error. Three cases illustrate the approach taken here.
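    The entropy measure this record describes can be illustrated with a crude proxy: the log of the number of reduced-form sign patterns a structural hypothesis admits, out of all (−, 0, +) assignments. This is an illustration of the idea only, not the paper's exact measure, and the counts below are hypothetical:

```python
import math

def sign_pattern_entropy(n_entries, admissible):
    """Proxy entropy of a hypothesized structural sign pattern: log2 of the
    number of reduced-form sign patterns it admits, versus log2 of all
    3**n_entries possibilities (-, 0, +) per entry. Lower entropy means the
    hypothesis carries more a priori information."""
    total = 3 ** n_entries
    return math.log2(admissible), math.log2(total)

# Hypothetical 2x2 structural array (4 entries) whose sign hypothesis
# admits 9 of the 81 possible reduced-form sign patterns
h, h_max = sign_pattern_entropy(4, 9)
print(h, h_max)  # ≈ 3.17 vs 6.34
```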

  11. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1992-01-01

    Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modelling techniques and a knowledge of the incident radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well-chosen measurements are required to confirm the theoretical models. Neutron doses and dose equivalents were measured in a RANDO phantom at specific locations using thermoluminescence dosemeters, etched track dosemeters, and a 1.27 cm (1/2 in) tissue-equivalent proportional counter. The phantom was exposed to a bare and a D2O-moderated 252Cf neutron source at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and to calculate the organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared with the calculations. (author)
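    The weighted sum referred to above can be sketched as follows. The tissue weighting factors are the published ICRP Publication 26 values contemporary with this study; the function itself is an illustrative sketch, not code from the study.

    ```python
    # ICRP 26 tissue weighting factors w_T; "remainder" covers the five
    # other most highly exposed organs.
    WEIGHTS = {"gonads": 0.25, "breast": 0.15, "red_marrow": 0.12,
               "lung": 0.12, "thyroid": 0.03, "bone_surface": 0.03,
               "remainder": 0.30}

    def effective_dose_equivalent(organ_H):
        """H_E = sum_T w_T * H_T, with organ dose equivalents H_T in Sv."""
        return sum(WEIGHTS[t] * h for t, h in organ_H.items())
    ```

    Because the weights sum to one, a uniform whole-body exposure gives an effective dose equivalent equal to the common organ dose equivalent.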

  12. Verification of an effective dose equivalent model for neutrons

    International Nuclear Information System (INIS)

    Tanner, J.E.; Piper, R.K.; Leonowich, J.A.; Faust, L.G.

    1991-10-01

    Since the effective dose equivalent, based on the weighted sum of organ dose equivalents, is not a directly measurable quantity, it must be estimated with the assistance of computer modeling techniques and a knowledge of the radiation field. Although extreme accuracy is not necessary for radiation protection purposes, a few well-chosen measurements are required to confirm the theoretical models. Neutron measurements were performed in a RANDO phantom using thermoluminescent dosemeters, track etch dosemeters, and a 1/2-in. (1.27-cm) tissue equivalent proportional counter in order to estimate neutron doses and dose equivalents within the phantom at specific locations. The phantom was exposed to bare and D2O-moderated 252Cf neutrons at the Pacific Northwest Laboratory's Low Scatter Facility. The Monte Carlo code MCNP with the MIRD-V mathematical phantom was used to model the human body and calculate organ doses and dose equivalents. The experimental methods are described and the results of the measurements are compared to the calculations. 8 refs., 3 figs., 3 tabs

  13. TU Electric reactor physics model verification: Power reactor benchmark

    International Nuclear Information System (INIS)

    Willingham, C.E.; Killgore, M.R.

    1988-01-01

    Power reactor benchmark calculations using the advanced code package CASMO-3/SIMULATE-3 have been performed for six cycles of Prairie Island Unit 1. The reload fuel designs for the selected cycles included gadolinia as a burnable absorber, natural uranium axial blankets and increased water-to-fuel ratio. The calculated results for both startup reactor physics tests (boron endpoints, control rod worths, and isothermal temperature coefficients) and full power depletion results were compared to measured plant data. These comparisons show that the TU Electric reactor physics models accurately predict important measured parameters for power reactors

  14. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially... -process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases are based on Danish municipal wastewater treatment... plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge...

  15. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used ever more actively to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting the river during the passage of a flood. Calculations were performed by means of the domestic software complex «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of a hydrodynamic model based on a comparison of the actual flood zone area, which in turn is determined on the basis of automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the Northern Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower limits of the model, respectively, it is possible to calculate the flooded area by means of the program STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
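    The extent-based comparison described above amounts to overlaying a satellite-derived flood mask with a simulated one and scoring their agreement. A minimal sketch using the critical success index is given below; the choice of metric and the function name are assumptions for illustration, since the abstract does not specify the agreement measure used.

    ```python
    def critical_success_index(observed, simulated):
        """CSI = hits / (hits + misses + false alarms) for two binary
        flood masks given as equal-shaped lists of 0/1 rows."""
        hits = misses = false_alarms = 0
        for obs_row, sim_row in zip(observed, simulated):
            for o, s in zip(obs_row, sim_row):
                if o and s:
                    hits += 1          # flooded in both masks
                elif o:
                    misses += 1        # observed flood the model missed
                elif s:
                    false_alarms += 1  # simulated flood not observed
        return hits / (hits + misses + false_alarms)
    ```

    A score of 1 means the simulated and observed flood extents coincide exactly; dry-in-both cells are deliberately excluded so the score is not inflated by the unflooded majority of the domain.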

  16. Double piezoelectric energy harvesting cell: modeling and experimental verification

    Science.gov (United States)

    Wang, Xianfeng; Shi, Zhifei

    2017-06-01

    In this paper, a novel energy transducer named the double piezoelectric energy harvesting cell (DPEHC), consisting of two flex-compressive piezoelectric energy harvesting cells (F-C PEHCs), is proposed. Initially, two F-C PEHCs, a kind of cymbal-type energy transducer, were assembled together sharing the same end simply so that they could be placed stably. However, in an open-circuit voltage test, additional energy harvesting performance of the DPEHC prototype appeared. Taking the interaction between the two F-C PEHCs into account, a mechanical model for analyzing the DPEHC is established. The electric output of the DPEHC under harmonic excitation is obtained theoretically and verified experimentally, and good agreement is found. In addition, as an inverse problem, a method for identifying the key mechanical parameters of the DPEHC is recommended. Finally, the additional energy harvesting performance of the DPEHC is quantitatively discussed. Numerical results show that the additional energy harvesting performance is correlated with the key mechanical parameters of the DPEHC. For the present DPEHC prototype, the energy harvesting addition is over 400% compared with two independent F-C PEHCs under the same load condition.

  17. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components—a deformation model involving elastic and plastic deformations; a damage model; and a failure model. The model is driven by tabular data that is generated either using laboratory tests or via virtual testing. A unidirectional composite—T800/F3900, commonly used in the aerospace industry—is used in the verification and validation tests. While the failure model is still under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  18. Software verification, model validation, and hydrogeologic modelling aspects in nuclear waste disposal system simulations. A paradigm shift

    International Nuclear Information System (INIS)

    Sheng, G.M.

    1994-01-01

    This work reviewed the current concept of nuclear waste disposal in stable, terrestrial geologic media with a system of natural and man-made multi-barriers. Various aspects of this concept and supporting research were examined with the emphasis on the Canadian Nuclear Fuel Waste Management Program. Several of the crucial issues and challenges facing the current concept were discussed. These include: The difficulties inherent in a concept that centres around lithologic studies; the unsatisfactory state of software quality assurance in the present computer simulation programs; and the lack of a standardized, comprehensive, and systematic procedure to carry out a rigorous process of model validation and assessment of simulation studies. An outline of such an approach was presented and some of the principles, tools and techniques for software verification were introduced and described. A case study involving an evaluation of the Canadian performance assessment computer program is presented. A new paradigm to nuclear waste disposal was advocated to address the challenges facing the existing concept. The RRC (Regional Recharge Concept) was introduced and its many advantages were described and shown through a modelling exercise. (orig./HP)

  19. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  20. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation..., by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction... Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations. By ensuring an [abstracted] alignment between... strategic preference, as part of their business model innovation activity planned.

  1. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  2. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  3. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU Multi-PIE dataset.
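    The representation-building step described above (for each spherical Gaussian component, pick the maximum-likelihood location-augmented descriptor, then concatenate) can be sketched as follows. GMM training and the joint Bayesian adaptation are omitted, and the function signature is illustrative rather than the authors' code.

    ```python
    import numpy as np

    def pep_representation(descriptors, locations, means, var):
        """Build a PEP-style representation: for each spherical-Gaussian
        component mean over location-augmented descriptors, select the
        max-likelihood patch descriptor and concatenate the selections.

        descriptors: (N, d) array of patch descriptors
        locations:   (N, 2) array of patch coordinates
        means:       (K, d + 2) array of component means
        var:         shared spherical variance
        """
        aug = np.hstack([descriptors, locations])              # (N, d + 2)
        parts = []
        for mu in means:
            # spherical-Gaussian log-likelihood up to an additive constant
            log_lik = -np.sum((aug - mu) ** 2, axis=1) / (2.0 * var)
            parts.append(aug[np.argmax(log_lik)])
        return np.concatenate(parts)
    ```

    Because every component uses the same spherical variance, the argmax is simply the nearest augmented descriptor to each component mean, which is what ties each "part" to a consistent face region across images.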

  4. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevent model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  5. Extended verification of the model of dynamic near-surface layer of the atmosphere

    Science.gov (United States)

    Polnikov, V. G.

    2013-07-01

    This paper formulates the most general principles for verifying models of the dynamic near-water layer of the atmosphere (DNWLA) and performs an advanced verification of the model proposed by the author earlier [6]. Based on empirical wave spectra from the studies by Donelan [15], Elfouhaily [14], and Kudryavtsev [13] and well-known empirical laws describing the wave-age dependence of the friction coefficient, we adjusted the original version of the model. It was shown that the improvement of model reliability is most dependent on the adequacy of the parameterization of the tangential portion of the total momentum flux to the wavy surface. Then the new version of the model was verified on the basis of field data from two different groups of authors. It was found that the new version of the model is consistent with empirical data with an error not exceeding the measurement error of near-water layer parameters.

  6. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)

  7. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  8. Verification of a coupled atmosphere-ocean model using satellite observations over the Adriatic Sea

    Directory of Open Access Journals (Sweden)

    V. Djurdjevic

    2008-07-01

    Verification of the EBU-POM regional atmosphere-ocean coupled model (RAOCM) was carried out using satellite observations of SST and surface winds over the Adriatic Sea. The atmospheric component has a horizontal resolution of 0.125 degrees (approximately 10 km) and 32 vertical levels, while the ocean component has a horizontal resolution of approximately 4 km with 21 sigma vertical levels.

    Verification of the forecasted SST was performed for 15 forecasts during 2006, each of them seven days long. These forecasts coincide with the operating cycle of the Adriatic Regional Model (AREG), which provided the initial fields and boundary conditions for the ocean component of EBU-POM. Two sources of data were used for the initial and boundary conditions of the atmosphere: primary data were obtained from the European Centre for Medium-Range Weather Forecasts (ECMWF), while data from the National Centers for Environmental Prediction (NCEP) were used to test the sensitivity to boundary conditions.

    Forecast skill was expressed in terms of BIAS and root mean square error (RMSE). During most of the verification period, the model had a negative BIAS of approximately −0.3°, while RMSE varied between 1.1° and 1.2°. Interestingly, these errors did not increase over time, which means that the forecast skill did not decline during the integrations.

    The 10-m wind verification was conducted for one period of 17 days in February 2007, during a strong bora episode, for which satellite estimates of surface winds were available. During the same period, SST measurements were conducted twice a day, which enabled us to verify diurnal variations of SST simulated by the RAOCM model. Since ECMWF's deterministic forecasts do not cover such a long period, we decided to use the ECMWF analysis, i.e. we ran the model in hindcast mode. The winds simulated in this analysis were weaker than the satellite estimates, with a mean BIAS of −0.8 m/s.
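    The two skill measures used above are simple aggregates of paired forecast-observation errors; a minimal sketch (the function name is illustrative):

    ```python
    import math

    def bias_and_rmse(forecast, observed):
        """Mean error (BIAS) and root-mean-square error of paired values."""
        errors = [f - o for f, o in zip(forecast, observed)]
        bias = sum(errors) / len(errors)
        rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
        return bias, rmse
    ```

    BIAS keeps the sign of the errors, so a systematic cold offset such as the −0.3° reported above survives averaging, while RMSE squares the errors and so measures their overall magnitude regardless of sign.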


  10. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. The seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models across 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period of 12 February to 21 March 2010.
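    The multi-categorical Heidke skill score used above compares the proportion of correct categorical forecasts against the proportion expected by chance. A minimal sketch from the standard definition follows (the function is illustrative, not the WEBIS implementation):

    ```python
    def heidke_skill_score(table):
        """Multi-category Heidke skill score from a contingency table,
        where table[i][j] counts forecasts of category i observed as j."""
        n = sum(sum(row) for row in table)
        pc = sum(table[i][i] for i in range(len(table))) / n          # proportion correct
        row_tot = [sum(row) for row in table]                         # forecast marginals
        col_tot = [sum(col) for col in zip(*table)]                   # observed marginals
        pe = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)   # chance agreement
        return (pc - pe) / (1.0 - pe)
    ```

    The score is 1 for perfect categorical forecasts, 0 for no skill beyond chance, and negative when the forecasts do worse than the chance baseline.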

  11. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  12. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Science.gov (United States)

    Chukbar, B. K.

    2015-12-01

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  13. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    at the various specified incident angles provide model verification for the investigation into causes of ray attenuation and account for rays that escape. Two fourteen-tube modules were tested on Sandia National Laboratory's two-axis tracking (AZTRAK) platform. By adjusting the tracking of the platform... at the corresponding specified incident angles are compared to the Sandia results. A 100 m2 336 Novel ICPC evacuated tube solar collector array has been in continuous operation at a demonstration project in Sacramento, California since 1998. Data from the initial operation of the array are used to further validate...

  14. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure faces many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  15. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of the foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  16. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

    Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the Finite Volume Method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to validate that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside the top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the top wall in the top-wall-driven square cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: results were compared with the numerical results of Ghia et al. (1982), and good agreement was found. Next, CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection. Good agreement was found when compared with the experimental results of Khalili et al. (2008). After the ability of the CFD solver was proven through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
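
    The solution-verification steps mentioned here (observed order of convergence from systematically refined meshes, followed by the Grid Convergence Index) can be sketched with the standard Richardson-extrapolation formulas; the helper names and the default safety factor are illustrative:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of convergence p from a scalar solution quantity on
    three systematically refined grids (f1 finest, f3 coarsest), with a
    constant grid refinement ratio r."""
    return math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

def gci_fine(f1, f2, r, p, Fs=1.25):
    """Grid Convergence Index on the fine grid: a relative error band on
    f1, scaled by a safety factor Fs (1.25 is a common choice for
    three-grid studies)."""
    e21 = abs((f2 - f1) / f1)
    return Fs * e21 / (r**p - 1.0)
```

    For a manufactured second-order sequence such as f(h) = 1 + h^2 sampled at h = 0.1, 0.2, 0.4 (so f1 = 1.01, f2 = 1.04, f3 = 1.16 with r = 2), `observed_order` recovers p = 2 and the GCI gives a small relative error band on the fine-grid value.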

  17. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  18. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain “mapping” between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that some functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes the time component into account when modeling business processes. An example is used throughout the paper to illustrate the proposed method.
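
    To illustrate the kind of reachability analysis performed on the Petri-net translation (ignoring the time semantics of TPN), a plain marking-graph search over an ordinary place/transition net can be sketched as follows; the net encoding and all place/transition names are hypothetical:

```python
from collections import deque

def enabled(marking, pre, t):
    """A transition t is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre[t].items())

def fire(marking, pre, post, t):
    """Fire t: consume pre-tokens, produce post-tokens; drop empty places."""
    m = dict(marking)
    for p, n in pre[t].items():
        m[p] = m.get(p, 0) - n
    for p, n in post[t].items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n > 0}

def reachable_markings(m0, pre, post):
    """Breadth-first enumeration of the reachability set (terminates for
    bounded nets, which is what sound process translations yield)."""
    m0 = {p: n for p, n in m0.items() if n > 0}
    seen = {frozenset(m0.items())}
    queue = deque([m0])
    while queue:
        m = queue.popleft()
        for t in pre:
            if enabled(m, pre, t):
                m2 = fire(m, pre, post, t)
                key = frozenset(m2.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return [dict(k) for k in seen]
```

    For a two-task sequence (start -> t1 -> mid -> t2 -> end) the search finds exactly three markings and confirms that the end place is reachable, which is the reachability-style check the paper automates for BPMN models.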

  19. From model conception to verification and validation, a global approach to multiphase Navier-Stoke models with an emphasis on volcanic explosive phenomenology

    Energy Technology Data Exchange (ETDEWEB)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements; hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds, namely the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomena. Key

  20. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  1. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    Full Text Available This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered as reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The approach is illustrated with an automatic assembly system.

  2. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix, resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
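
    Although not one of the four methodologies above, the Modal Assurance Criterion (MAC) is the standard metric for the kind of test-analysis modal correlation the paper discusses, and a minimal sketch may help fix ideas; the function name and array shapes are illustrative:

```python
import numpy as np

def mac(phi_a, phi_x):
    """Modal Assurance Criterion matrix between analytical mode shapes
    phi_a (n_dof x n_modes_a) and experimental shapes phi_x
    (n_dof x n_modes_x). Entry (i, j) is near 1 when analytical mode i
    and experimental mode j are linearly correlated, near 0 otherwise."""
    num = np.abs(phi_a.conj().T @ phi_x) ** 2
    den = np.outer(np.sum(np.abs(phi_a) ** 2, axis=0),
                   np.sum(np.abs(phi_x) ** 2, axis=0))
    return num / den
```

    Comparing a set of orthonormal mode shapes against itself yields the identity matrix, the ideal correlation pattern one looks for when reconciling FEM- and test-derived modes.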

  3. Managing the Verification Trajectory

    NARCIS (Netherlands)

    Ruys, T.C.; Brinksma, Hendrik

    In this paper we take a closer look at the automated analysis of designs, in particular of verification by model checking. Model checking tools are increasingly being used for the verification of real-life systems in an industrial context. In addition to ongoing research aimed at curbing the

  4. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    Science.gov (United States)

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling-related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, have not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709
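
    For readers unfamiliar with PBPK structure, a deliberately minimal flow-limited sketch (venous blood, a clearing liver, and a lumped rest-of-body compartment, integrated with forward Euler) is shown below; it is not any of the reviewed models, and every parameter value is hypothetical:

```python
def simulate_pbpk(dose_mg=100.0, t_end_h=24.0, dt_h=0.001):
    """Minimal flow-limited PBPK sketch: an IV bolus into venous blood,
    a liver compartment with intrinsic clearance, and a lumped
    'rest of body' compartment. All parameter values are hypothetical."""
    Q_li, Q_rb = 90.0, 210.0           # blood flows, L/h
    V_bl, V_li, V_rb = 5.0, 1.8, 40.0  # compartment volumes, L
    P_li, P_rb = 2.0, 1.0              # tissue:blood partition coefficients
    CL_int = 30.0                      # intrinsic hepatic clearance, L/h

    C_bl, C_li, C_rb = dose_mg / V_bl, 0.0, 0.0
    for _ in range(int(t_end_h / dt_h)):
        C_li_out = C_li / P_li         # venous concentration leaving liver
        C_rb_out = C_rb / P_rb
        dbl = (Q_li * C_li_out + Q_rb * C_rb_out - (Q_li + Q_rb) * C_bl) / V_bl
        dli = (Q_li * (C_bl - C_li_out) - CL_int * C_li_out) / V_li
        drb = Q_rb * (C_bl - C_rb_out) / V_rb
        C_bl += dt_h * dbl
        C_li += dt_h * dli
        C_rb += dt_h * drb
    amount_mg = C_bl * V_bl + C_li * V_li + C_rb * V_rb
    return C_bl, amount_mg
```

    Only the liver eliminates drug here, so the total amount in the body decays monotonically from the dose; real PBPK models add many more organs, absorption routes, and enzyme/transporter detail, which is exactly the heterogeneity in development practice the review documents.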

  5. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Fehnker, Ansgar; Fehnker, Ansgar

    2002-01-01

    We report on the use of model checking techniques for both the verification of a process control program and the derivation of optimal control schedules. Most of this work has been carried out as part of a case study for the EU VHS project (Verification of Hybrid Systems), in which the program for a

  6. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  7. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  8. Verification of the Hydrodynamic and Sediment Transport Hybrid Modeling System for Cumberland Sound and Kings Bay Navigation Channel, Georgia

    Science.gov (United States)

    1989-07-01

    different locations and should not be directly evaluated. These data are provided for [remainder of record garbled: OCR residue of model-verification plates comparing physical model, numerical model, and field water-surface elevations at Stations 843 and 396, 1985 geometry]

  9. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Science.gov (United States)

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949
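
    The area and volume computations underlying the S/V ratio can be read directly off a triangle mesh: surface area is the sum of triangle areas, and enclosed volume follows from the divergence theorem as a sum of signed tetrahedra. The function below is an illustrative sketch, not the software used in the study, and assumes a closed mesh with consistent outward winding:

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh.
    Volume uses the signed-tetrahedron formula sum(a . (b x c)) / 6 over
    faces (i, j, k); consistent winding is assumed, and the absolute
    value guards against a globally inverted orientation."""
    v = np.asarray(vertices, dtype=float)
    area, volume = 0.0, 0.0
    for i, j, k in faces:
        a, b, c = v[i], v[j], v[k]
        area += 0.5 * np.linalg.norm(np.cross(b - a, c - a))
        volume += np.dot(a, np.cross(b, c)) / 6.0
    return area, abs(volume)
```

    For the unit right tetrahedron the routine returns the exact area 1.5 + sqrt(3)/2 and volume 1/6, from which the S/V ratio discussed for cacti and succulents follows by simple division.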

  10. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Full Text Available Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique for routing in dynamic networks. To analyze the correctness of the Q-Routing algorithm mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for the given metrics.
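
    The Q-Routing update is commonly written, following Boyan and Littman's scheme, as Q_x(d, y) <- Q_x(d, y) + alpha * (q + s + min_z Q_y(d, z) - Q_x(d, y)), where q and s are the observed queueing and transmission delays and the min term is neighbor y's best remaining-time estimate. A minimal sketch with a dictionary-based table and hypothetical node names:

```python
def q_routing_update(Q, x, y, d, q_delay, s_delay, alpha=0.5):
    """One Q-Routing update: node x sent a packet bound for d to
    neighbor y, observed queueing delay q_delay and transmission delay
    s_delay, and received y's best remaining-time estimate.
    Q[x][d][y] holds x's estimated delivery time to d via y."""
    t = 0.0 if y == d else min(Q[y][d].values())
    target = q_delay + s_delay + t
    Q[x][d][y] += alpha * (target - Q[x][d][y])
    return Q[x][d][y]

def best_next_hop(Q, x, d):
    """Greedy routing policy: the neighbor with the smallest estimate."""
    return min(Q[x][d], key=Q[x][d].get)
```

    For example, if node A's estimate to C via B is 10.0, B reports a best remaining estimate of 2.0, and the observed delays sum to 2.0, the update moves A's estimate halfway toward the target of 4.0, i.e. to 7.0. Under dynamic link costs these estimates track congestion, which is the adaptation advantage the abstract describes.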

  11. Experimental verification of dynamic radioecological models established after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Voigt, G.; Mueller, H.; Proehl, G.; Stocke, H.; Paretzke, H.G.

    1991-01-01

    The experiments reported were carried out for the verification of existing dynamic radioecological models, especially of the ECOSYS model. The database used for the verification covers the radioactivity concentrations of Cs-134, Cs-137, and I-131 measured after the Chernobyl reactor accident in foodstuffs and environmental samples, and the results of field experiments on radionuclide translocation after foliar uptake or absorption by the roots of edible plants. The measured data were compared with the model predictions for the radionuclides under review. The Cs-134 and Cs-137 translocation factors, which describe the redistribution of these radionuclides in the plant after foliar uptake, were experimentally determined by a single sprinkling with Chernobyl rainwater, and were measured to be the following as a function of sprinkling time: winter wheat, 0.002-0.13; spring wheat, 0.003-0.09; winter rye, 0.002-0.27; barley, 0.002-0.04; potatoes, 0.05-0.35; carrots, 0.02-0.07; bush beans, 0.04-0.3; cabbage, 0.1-0.5. The weathering half-life of the radionuclides in lettuce was determined to be ten days. Transfer factors determined for root absorption of Cs-137 were measured to be an average of 0.002 for grains, 0.002 for potatoes, 0.004 for white cabbage, 0.003 for bush beans and carrots, and 0.007 for lettuce. There was agreement between the ECOSYS model predictions and the measured radioactivity concentrations of the corresponding radionuclides. (orig./HP) [de]

  12. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    International Nuclear Information System (INIS)

    Mao, S P; Rottenberg, X; Rochus, V; Czarnecki, P; Helin, P; Severi, S; Tilmans, H A C; Nauwelaers, B

    2017-01-01

    This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed here for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells. (paper)

  13. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed here for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  14. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

    Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes due to cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (present: Japan Atomic Energy Agency), uses the variable puff trajectory model to calculate atmospheric transport and dispersion of released radionuclides. In order to investigate uncertainties involved with the structure of the atmospheric dispersion/deposition model in OSCAAR, we have introduced the more sophisticated computer codes that include the regional meteorological model RAMS and the atmospheric transport model HYPACT, which were developed by Colorado State University, and comparative analyses between OSCAAR and RAMS/HYPACT have been performed. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long-term atmospheric diffusion experiments carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments showed relatively good agreement, and it was shown that the performance of OSCAAR was comparable to that of RAMS/HYPACT. (author)
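
    The straight-line plume models mentioned at the outset are typically Gaussian plume models; for orientation, the classic ground-reflected form can be sketched as below. This is the textbook formula, not OSCAAR's puff trajectory model, and the example numbers are arbitrary:

```python
import math

def gaussian_plume_conc(Qr, u, y, z, sigma_y, sigma_z, H):
    """Ground-reflected Gaussian plume concentration (e.g. Bq/m^3) at
    crosswind offset y and height z, for release rate Qr (Bq/s), wind
    speed u (m/s), and effective release height H (m). The dispersion
    parameters sigma_y, sigma_z (m) are evaluated at the downwind
    distance of interest from a stability-class correlation."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Qr / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

    The concentration field is symmetric about the plume centerline and peaks there; puff-trajectory models such as OSCAAR's replace this fixed-wind assumption with time-varying transport of discrete puffs.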

  15. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model are described. Model verification at two sites, a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth because surface elevation was removed as a parameter. These models allow computation of time dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  16. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture
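
    The probabilistic fracture calculation combines crack-growth physics with sampled unit-to-unit variability. As a generic illustration (not the NESSUS/NASGRO model described above), a Monte Carlo estimate of the probability that a Paris-law crack reaches a failure length can be sketched as follows; all distributions and parameter values are hypothetical:

```python
import numpy as np

def prob_failure(n_cycles, a_fail_m, n_samples=5000, seed=1):
    """Monte Carlo estimate of P(crack length >= a_fail_m after n_cycles)
    under Paris-law growth da/dN = C * dK**m with dK = dS * sqrt(pi * a).
    Initial crack size and Paris coefficient are sampled per unit;
    every distribution and parameter value here is hypothetical."""
    rng = np.random.default_rng(seed)
    a = rng.lognormal(mean=np.log(2e-4), sigma=0.3, size=n_samples)  # m
    C = rng.lognormal(mean=np.log(5e-12), sigma=0.4, size=n_samples)
    m_exp, dS = 3.0, 120.0     # Paris exponent, stress range (MPa)
    steps = 200
    dN = n_cycles / steps
    for _ in range(steps):
        dK = dS * np.sqrt(np.pi * a)            # MPa * sqrt(m)
        # cap at the failure length so failed samples stop growing
        a = np.minimum(a + C * dK**m_exp * dN, a_fail_m)
    return float(np.mean(a >= a_fail_m))
```

    Because growth accelerates with crack length, the failure probability rises with the number of cycles; comparing such probabilities across usage and inspection scenarios is the relative-risk question the investigation focused on.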

  17. Processing well logging data, for example for verification and calibration of well logs

    International Nuclear Information System (INIS)

    Suau, J.; Boutemy, Y.

    1981-01-01

    A method is described for the machine processing of well logging data derived from borehole exploring devices which investigate earth formations traversed by boreholes. The method can be used for verifying and recalibrating logs, reconstructing missing logs and combining the data to form a statistical model of the traversed earth formations. (U.K.)

  18. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    1.33-kilometer domain model performance for the 2014 warm season (May-September). Verification statistics were computed using the Model Evaluation Tools, which compared the model forecasts to observations. The mean error values were close to 0 and the root mean square error values were less than 1.8 for mean sea-level pressure (millibars), temperature (degrees Kelvin), dewpoint temperature (degrees Kelvin), and wind speed (meters per second), all very small differences between the forecast and observations considering the normal magnitudes of the parameters. The precipitation forecast verification results showed consistent under-forecasting of the precipitation object size. This could be an artifact of calculating the statistics for each hour rather than for the entire 12-hour period. The AMU will continue to generate verification statistics for the 1.33-kilometer WRF-EMS domain as data become available in future cool and warm seasons. More data will produce more robust statistics and reveal a more accurate assessment of model performance. Once the formal task was complete, the AMU conducted additional work to better understand the wind direction results. The results were stratified diurnally and by wind speed to determine what effects the stratifications would have on the model wind direction verification statistics. The results are summarized in the addendum at the end of this report. In addition to verifying the model's performance, the AMU also made the output available in the Advanced Weather Interactive Processing System II (AWIPS II). This allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations AWIPS II client computers and conduct real-time subjective analyses. In the future, the AMU will implement an updated version of the WRF-EMS model that incorporates local data assimilation. This model will also run in real-time and be made available in AWIPS II.
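
    The mean error and root mean square error statistics quoted above are straightforward to compute from paired forecast/observation samples; a minimal sketch:

```python
import numpy as np

def mean_error(forecast, observed):
    """Mean (bias) error: positive values indicate over-forecasting."""
    f, o = np.asarray(forecast, float), np.asarray(observed, float)
    return float(np.mean(f - o))

def rmse(forecast, observed):
    """Root mean square error of forecasts against observations."""
    f, o = np.asarray(forecast, float), np.asarray(observed, float)
    return float(np.sqrt(np.mean((f - o) ** 2)))
```

    An unbiased but imperfect forecast can have a mean error of exactly 0 while the RMSE still reflects the scatter, which is why both statistics are reported together.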

  19. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  20. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  1. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    Science.gov (United States)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  2. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  3. Further optimisations of constant Q cepstral processing for integrated utterance and text-dependent speaker verification

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md

    2016-01-01

    performance, but also provide an ancillary level of security. This can take the form of explicit utterance verification (UV). An integrated UV + ASV system should then verify access attempts which contain not just the expected speaker, but also the expected text content. This paper presents such a system...... and introduces new features which are used for both UV and ASV tasks. Based upon multi-resolution, spectro-temporal analysis and when fused with more traditional parameterisations, the new features not only generally outperform Mel-frequency cepstral coefficients, but also are shown to be complementary when...

  4. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  5. Computer simulation of thermoforming process and its verification using a rapid tooling mould

    Directory of Open Access Journals (Sweden)

    Modławski Michał

    2018-01-01

    Full Text Available The results of computer simulation of the thermoforming process made using ANSYS Polyflow software are presented in this paper. The analysis of the wall thickness distribution across a U-shaped thermoformed product manufactured using a positive mould was made. The simulation results were verified using a mould manufactured in a 3D printing process, which was Fused Deposition Modelling (FDM), and a poly(ethylene terephthalate) formed sheet. It was proven that the computer simulation and a tool made with a Rapid Tooling technology can be useful for predicting the quality of a thermoformed part, particularly to find the problems with thin-walled areas.

  6. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. The review of static analysis discusses the deductive method and model checking, and the pros and cons of each method are emphasized. The article also considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects in code verified using static methods: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, together with the kinds of tools that can be applied when using dynamic analysis methods.
    Based on this work a conclusion is drawn, describing the most relevant problems of the analysis techniques, methods of their solution and

  7. Modelling of hydrodynamics and mercury transport in Lake Velenje. Part 2, Modelling and model verification

    OpenAIRE

    Kotnik, Jože; Žagar, Dušan; Rajar, Rudi; Horvat, Milena

    2004-01-01

    PCFLOW3D - a three-dimensional mathematical model that was developed at the Chair of Fluid Mechanics of the Faculty of Civil and Geodetic Engineering, University of Ljubljana, was used for hydrodynamic and Hg transport simulations in Lake Velenje. The model is fully non-linear and computes three velocity components, water elevation and pressure. Transport-dispersion equations for salinity and heat (and/or any pollutant) are further used to compute the distributions of these par...

  8. Statistical Modeling, Simulation, and Experimental Verification of Wideband Indoor Mobile Radio Channels

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ma

    2018-01-01

    Full Text Available This paper focuses on the modeling, simulation, and experimental verification of wideband single-input single-output (SISO) mobile fading channels for indoor propagation environments. The indoor reference channel model is derived from a geometrical rectangle scattering model, which consists of an infinite number of scatterers. It is assumed that the scatterers are exponentially distributed over the two-dimensional (2D) horizontal plane of a rectangular room. Analytical expressions are derived for the probability density function (PDF) of the angle of arrival (AOA), the PDF of the propagation path length, the power delay profile (PDP), and the frequency correlation function (FCF). An efficient sum-of-cisoids (SOC) channel simulator is derived from the nonrealizable reference model by employing the SOC principle. It is shown that the SOC channel simulator approximates closely the reference model with respect to the FCF. The SOC channel simulator enables the performance evaluation of wideband indoor wireless communication systems with reduced realization expenditure. Moreover, the rationality and usefulness of the derived indoor channel model is confirmed by various measurements at 2.4, 5, and 60 GHz.
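
    The sum-of-cisoids principle mentioned above is straightforward to sketch: the channel gain is a finite sum of complex exponentials with fixed gains, Doppler frequencies, and phases. The snippet below is a minimal generic SOC simulator, not the authors' design; the number of cisoids, the equal-power gains, and the maximum Doppler frequency are all assumptions chosen for illustration.

```python
import cmath
import math
import random

def soc_gain(t, gains, dopplers, phases):
    # SOC channel gain: mu(t) = sum_n c_n * exp(j*(2*pi*f_n*t + theta_n))
    return sum(c * cmath.exp(1j * (2 * math.pi * f * t + th))
               for c, f, th in zip(gains, dopplers, phases))

random.seed(0)
N, f_max = 8, 91.0                      # assumed: 8 cisoids, 91 Hz max Doppler
gains = [math.sqrt(1.0 / N)] * N        # equal-power cisoids, total power 1
dopplers = [f_max * math.cos(math.pi * (2 * n + 1) / (4 * N)) for n in range(N)]
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]

# Sample the gain every millisecond for 10 s and estimate the mean power
samples = [soc_gain(t * 1e-3, gains, dopplers, phases) for t in range(10000)]
mean_power = sum(abs(s) ** 2 for s in samples) / len(samples)  # close to 1
```

    Because the frequencies and phases are fixed, a single realization is deterministic and cheap to evaluate, which is what gives SOC simulators their reduced realization expenditure.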

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE — BAYSAVER TECHNOLOGIES, INC. BAYSAVER SEPARATION SYSTEM, MODEL 10K

    Science.gov (United States)

    Verification testing of the BaySaver Separation System, Model 10K was conducted on a 10 acre drainage basin near downtown Griffin, Georgia. The system consists of two water tight pre-cast concrete manholes and a high-density polyethylene BaySaver Separator Unit. The BaySaver Mod...

  10. Beta irradiation as a method of the static MOS RAM memories processing and circuit design verification

    International Nuclear Information System (INIS)

    Wislowski, J.; Jagusztyn, M.

    1985-01-01

    1K NMOS RAM's in plastic packages were investigated after beta irradiation up to 100 Gy (Si) total dose. The memory samples differed as regards processing details and circuit design. Radioisotope beta sources were used for irradiation as the safest and least expensive option. A new version of the model of radiation-induced functional degradation of MOS RAM's has been proposed. 19 refs., 8 figs., 5 tabs. (author)

  11. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    International Nuclear Information System (INIS)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom

    2011-01-01

    Controllers in safety critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented using programming languages such as C to compile it into particular target hardware. The implementation must have the same behavior as the design, and the behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is a loader software to program the POSAFE-Q PLC (Programmable Logic Controller) and is developed as a part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design the software of the PLC, and generates ANSI-C code to compile it into specific machine code. To verify the equivalence between the FBDs and ANSI-C code, mathematical proof of the code generator or verification tools such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness that it requires high expenditure and repetitive fulfillment whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator. It also has the weakness that the reconstruction of generated code needs additional analysis. This paper introduces the verification process of behavioral consistency between the design and its implementation of the pSET using the HW-CBMC. The HW-CBMC is a formal verification tool, verifying equivalence between hardware and software descriptions. It requires two inputs for checking equivalence, Verilog for hardware and ANSI-C for software. In this approach, FBDs are translated into a semantically equivalent Verilog program, and the HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program which is generated from the FBDs.
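
    The equivalence obligation that HW-CBMC discharges symbolically can be illustrated in miniature: a reference function standing in for the FBD semantics and a second function standing in for the generated C code must agree on every input. The two functions below are invented stand-ins, not the pSET logic; a real check would operate on the Verilog translation and the generated ANSI-C.

```python
from itertools import product

def design_fbd(a, b, c):
    # Stand-in for the FBD design semantics: an AND block feeding an OR block
    return (a and b) or c

def generated_c(a, b, c):
    # Stand-in for the generated ANSI-C implementation, written differently
    return bool((int(a) & int(b)) | int(c))

def equivalent():
    # Miter-style check: demand identical outputs on all Boolean inputs
    return all(design_fbd(a, b, c) == generated_c(a, b, c)
               for a, b, c in product([False, True], repeat=3))
```

    Exhaustive enumeration only works for tiny input spaces; HW-CBMC performs the same comparison symbolically, so it scales to realistic designs.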

  12. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  13. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

    Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibr...... and to incorporate that in the design, operation and control of urban drainage structures. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  14. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
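
    The swarm idea can be sketched as many small, independently seeded searches run in place of one exhaustive search: each member explores a different part of the state space, and the swarm succeeds if any member reaches the target. The toy state space and parameters below are invented for illustration; a real swarm would run diversified model-checking searches in parallel.

```python
import random

def random_search(seed, steps, goal, neighbors, start):
    # One swarm member: a depth-bounded random walk with its own seed,
    # so different members explore different parts of the state space.
    rng = random.Random(seed)
    state = start
    for _ in range(steps):
        if state == goal:
            return True
        state = rng.choice(neighbors(state))
    return state == goal

# Toy state space: integers mod 1000 with moves +1, +3, +7; the "needle"
# is state 613. Twenty members of 2000 steps each stand in for the swarm.
neighbors = lambda s: [(s + d) % 1000 for d in (1, 3, 7)]
found = any(random_search(seed, 2000, 613, neighbors, 0) for seed in range(20))
```

    No member is exhaustive, but the diversified members together cover the haystack far better than one deep search of the same total budget.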

  15. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    provides building blocks for the templates (generic models previously developed); 3) computer aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation. In this work, the integrated use of all three......Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates......, where the experimental effort could be focused.In this contribution a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its tools integration and model import and export capabilities, is presented. Modelling templates...

  16. Weather forecast in north-western Greece: RISKMED warnings and verification of MM5 model

    Directory of Open Access Journals (Sweden)

    A. Bartzokas

    2010-02-01

    Full Text Available The meteorological model MM5 is applied operationally for the area of north-western Greece for a one-year period (1 June 2007–31 May 2008). The model output is used for daily weather forecasting over the area. An early warning system is developed, by dividing the study area into 16 sub-regions and defining specific thresholds for issuing alerts for adverse weather phenomena. The verification of the model is carried out by comparing the model results with observations from three automatic meteorological stations. For air temperature and wind speed, correlation coefficients and biases are calculated, revealing that there is a significant overestimation of the early morning air temperature. For precipitation amount, yes/no contingency tables are constructed for 4 specific thresholds and some categorical statistics are applied, showing that the prediction of precipitation in the area under study is generally satisfactory. Finally, the thunderstorm warnings issued by the system are verified against the observed lightning activity.
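
    The yes/no contingency tables and categorical statistics mentioned above follow a standard pattern: each forecast/observation pair is classified as a hit, miss, false alarm, or correct negative for a given threshold, and scores such as the probability of detection (POD) and false alarm ratio (FAR) are derived from the counts. The sketch below is generic; the precipitation amounts and the 1.0 mm threshold are invented for illustration.

```python
def contingency(forecast, observed, threshold):
    # Classify each pair against the threshold into the four table cells
    hits = misses = false_alarms = correct_negatives = 0
    for f, o in zip(forecast, observed):
        fy, oy = f >= threshold, o >= threshold
        if fy and oy:
            hits += 1
        elif oy:
            misses += 1
        elif fy:
            false_alarms += 1
        else:
            correct_negatives += 1
    return hits, misses, false_alarms, correct_negatives

def pod(hits, misses, false_alarms, correct_negatives):
    # Probability of detection: fraction of observed events that were forecast
    return hits / (hits + misses)

def far(hits, misses, false_alarms, correct_negatives):
    # False alarm ratio: fraction of forecast events that did not occur
    return false_alarms / (hits + false_alarms)

# Hypothetical 24-hour precipitation amounts (mm) at one station
fcst = [0.0, 2.1, 5.0, 0.4, 12.0, 0.0]
obs = [0.0, 3.0, 0.2, 1.5, 9.0, 0.0]
table = contingency(fcst, obs, threshold=1.0)  # (2, 1, 1, 2)
```

    Repeating the classification for each of the 4 thresholds yields one table, and one set of scores, per threshold.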

  17. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
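
    The automated statistical tests described can be pictured as binned comparisons between the vectorized and reference models. The sketch below uses a Pearson chi-square statistic on two invented histograms; it is a generic stand-in for the GeantV test suite, and the bin contents and critical value are illustrative assumptions.

```python
def chi2_statistic(observed, expected):
    # Pearson chi-square statistic between two binned distributions
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

# Hypothetical energy-deposit histograms: vectorized model vs. reference model
reference = [100, 250, 400, 180, 70]
vectorized = [96, 262, 391, 185, 66]
stat = chi2_statistic(vectorized, reference)

# Compare against the chi-square critical value for 4 degrees of freedom at
# the 5% significance level (9.488): below it, the histograms are consistent
consistent = stat < 9.488
```

    Automating such tests over many observables lets a consistency regression surface immediately when a vectorized physics model drifts from its scalar counterpart.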

  18. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  19. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid lid model was developed to predict three dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.) and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest, grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data both for summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  20. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

    The article presents the verification of forward kinematics of Fanuc AM100iB robot. The developed kinematic model of the machine was verified using tests on an actual robot model. The tests consisted in positioning the robot operating in the mode of controlling the values of natural angles in selected points of its workspace and reading the indications of the coordinate values of the TCP point in the robot's global coordinate system on the operator panel. Validation of the model consisted of entering the same values of natural angles that were used for positioning the robot in its inputs and calculating the coordinate values of the TCP of the machine's CAE model, and then comparing the results obtained with the values read. These results are the introduction to the partial verification of the dynamic model of the analysed device.

  1. 3D MODELING WITH PHOTOGRAMMETRY BY UAVS AND MODEL QUALITY VERIFICATION

    Directory of Open Access Journals (Sweden)

    V. Barrile

    2017-11-01

    Full Text Available This paper deals with a test led by the Geomatics laboratory (DICEAM, Mediterranea University of Reggio Calabria), concerning the application of UAV photogrammetry for survey, monitoring and checking. The case study concerns the surroundings of the Department of Agriculture Sciences. In recent years, this area was affected by landslides, and survey activities were carried out to keep the phenomenon under control. For this purpose, a set of digital images were acquired through a UAV equipped with a digital camera and GPS. Subsequently, the processing for the production of a 3D georeferenced model was performed by using the commercial software Agisoft PhotoScan. Similarly, the use of a terrestrial laser scanning technique allowed the production of dense clouds and 3D models of the same area. To assess the accuracy of the UAV-derived 3D models, a comparison between image- and range-based methods was performed.

  2. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code, which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  3. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

    In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
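
    The operator-splitting approach mentioned above alternates a transport solve and a reaction solve within each time step. The one-dimensional sketch below is a toy illustration of the idea, not ParFlow.RT itself; the upwind advection scheme, the first-order decay reaction, and all parameter values are assumptions.

```python
import math

def transport_step(c, velocity, dx, dt):
    # Explicit upwind advection step (assumes velocity > 0 and CFL <= 1);
    # the first cell acts as a fixed-concentration inlet boundary
    cfl = velocity * dt / dx
    return [c[0]] + [c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))]

def reaction_step(c, rate, dt):
    # First-order decay solved exactly over dt (the "geochemical engine")
    return [ci * math.exp(-rate * dt) for ci in c]

# Operator splitting: advance transport, then reaction, each time step
c = [1.0] + [0.0] * 9                   # unit concentration at the inlet
for _ in range(5):
    c = transport_step(c, velocity=1.0, dx=1.0, dt=0.5)   # CFL = 0.5
    c = reaction_step(c, rate=0.1, dt=0.5)
```

    Splitting lets each sub-problem be handled by specialized code, which is what allows a flow code to delegate chemistry to an external engine through an interface like Alquimia.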

  4. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/ MDP/ Mask/ silicon lithography flow. The important potential sources of variation we focus on here originate on the basis of VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood for costly long-loop iterations between OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or by judicious design of VSB shots and dose assignment.

  5. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

    The ATHLET code is currently widely applied to model several WWER-type power plants with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents the nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons to experimental data. The consideration of circulation in the water inventory of the secondary side is shown to be necessary. (orig.)

  6. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue ({alpha}-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  7. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    Linares Rosales, H. M.; Alfonso Laguardia, R.; Lara Mas, E.; Popescu, T.

    2015-01-01

    The analysis of some parameters of interest in radiotherapy medical physics, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator, was performed for 6 and 15 MV photon beams. The simulations were performed using the EGSnrc code. The optimal beam parameter values (energy and FWHM) previously obtained were used as the reference for the simulations. Deposited dose calculations in water phantoms were done for typical complex geometries commonly used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between them were analyzed from calculations in water. Similarly, simulations were performed on phantoms obtained from CT studies of real patients, comparing the dose distribution calculated with EGSnrc against the dose distribution obtained from the computerized treatment planning systems (TPS) used in routine clinical plans. All results showed good agreement with measurements, all of them within tolerance limits. These results support the use of the developed model as a robust verification tool for validating calculations in very complex situations, where the accuracy of the available TPS could be questionable. (Author)

  8. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

    A new analytical model for the prediction of concrete response under uniaxial compression, and its experimental verification, is presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach the key continuum parameters, such as the creep coefficient, Poisson’s ratio and the damage variable, are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of the load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Project of the Ministry of Science of the Republic of Serbia, no. ON 174027: Computational Mechanics in Structural Engineering, and no. TR 36017: Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications]

  9. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater than 10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater than 10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warnings for them at least minutes in advance, based on both the soft X-ray flux and the integral proton flux measured by GOES. The quality of the forecast model was measured through verifications of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
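The event-start definition above (the >10 MeV proton flux reaching 10 pfu) is simple to state in code. A minimal sketch in Python; the flux series, timestamps, and cadence are invented for illustration, not real GOES data:

```python
THRESHOLD_PFU = 10.0  # SEPC event-start threshold for >10 MeV protons

def event_start(times, fluxes, threshold=THRESHOLD_PFU):
    """Return the first time at which the flux equals or exceeds the
    threshold, or None if no event occurs in the series."""
    for t, f in zip(times, fluxes):
        if f >= threshold:
            return t
    return None

# Illustrative 5-minute-cadence flux series (pfu).
times = [0, 5, 10, 15, 20]
fluxes = [0.3, 2.1, 8.7, 12.4, 30.0]
start = event_start(times, fluxes)
```

A forecast model of the kind described would aim to raise an alert before this threshold crossing, using the soft X-ray precursor.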

  10. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results in COMSOL Multiphysics obtained using the “Finer” grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm^-1) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of the load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in the thickness of the load within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal fluctuations and the electrical admittance, from 26.58 to 26.35 kHz and from 0.86 to 0.44 mS, respectively.

  11. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code, we use a propagating underdriven detonation wave in 1-D. This is essentially the only test case for which an accurate solution can be determined from the theoretical structure of the solution. The solution consists of a steady ZND reaction zone profile joined with a scale-invariant rarefaction or Taylor wave, followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid wall boundary condition. For the test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front, and the pressure derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
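The overall-accuracy measure used above, the discrete L2 norm of the difference between the numerical and exact solutions, can be written down directly. A sketch in Python with invented profile values (not PBX 9502 data):

```python
import math

def l2_error(numerical, exact, dx):
    """Discrete L2 norm of (numerical - exact) on a uniform grid of spacing dx."""
    return math.sqrt(dx * sum((n - e) ** 2 for n, e in zip(numerical, exact)))

# Illustrative pressure profiles on a 4-cell grid.
exact = [1.0, 0.8, 0.6, 0.4]
numerical = [1.0, 0.81, 0.59, 0.4]
err = l2_error(numerical, exact, dx=0.1)
```

Tracking this norm under grid refinement is the standard way to check that a scheme converges at its nominal order away from discontinuities.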

  12. Remaining Sites Verification Package for the 100-B-14:1 Process Sewer, Waste Site Reclassification Form 2004-005

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-02-22

    The 100-B-14:1 subsite encompasses the former process sewer main associated with the 105-B Reactor Building, 108-B Chemical Pumphouse and Tritium Separation Facility, 184-B Boiler House and the 100-B water treatment facilities, as well as the feeder lines associated with the 108-B facility, formerly discharging to the 116-B-7 Outfall Structure. The subsite has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  13. GREENSCOPE: Sustainable Process Modeling

    Science.gov (United States)

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  14. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    National Research Council Canada - National Science Library

    Broy, Manfred; Leucker, Martin

    2007-01-01

    The objective of this project is the development of an integrated suite of technologies focusing on end-to-end software development supporting requirements analysis, design, implementation, and verification...

  15. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify predispositions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level triathletes (men) who were included in the Government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allow us to explain mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic and psychological predispositions. Aerobic predispositions showed the highest importance to the assumptions to the general factor (1.00; 0). Running predispositions were measured as a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0

  16. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook...... will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists...

  17. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm...... is performed by analyzing the probabilistic linear temporal logic properties of the system as well as by analyzing the schedulers, in particular the optimal schedulers, induced by the learned models....

  18. Theoretical model and experimental verification on the PID tracking method using liquid crystal optical phased array

    Science.gov (United States)

    Wang, Xiangru; Xu, Jianhua; Huang, Ziqiang; Wu, Liang; Zhang, Tianyi; Wu, Shuanghong; Qiu, Qi

    2017-02-01

    The liquid crystal optical phased array (LC-OPA) is considered to have great potential as a non-mechanical laser deflector, because it is fabricated using photolithographic patterning technology that has been well advanced by the electronics and display industries. As a vital application of the LC-OPA, free space laser communication has demonstrated its merits in communication bandwidth. Before data communication, the ATP (acquisition, tracking and pointing) process takes a relatively long time, creating a bottleneck for free space laser communication. Meanwhile, dynamic, real-time, accurate tracking is essential to keep a stable communication link. The electro-optic medium liquid crystal, with its low driving voltage, can be used as the laser beam deflector. This paper presents a fast tracking method using a liquid crystal optical phased array as the beam deflector and a CCD as the beacon light detector. A PID (proportion-integration-differentiation) loop algorithm is introduced as the control algorithm to generate the corresponding steering angle. Toward the goal of fast and accurate tracking, theoretical analysis and experimental verification demonstrate that the PID closed-loop system can suppress random attitude vibration. Theoretical analysis shows that the tracking accuracy can be less than 6.5 μrad, in reasonable agreement with the experimental result, obtained after 10 adjustments, of a tracking accuracy of less than 12.6 μrad.
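A minimal discrete PID loop of the kind described (proportional, integral, and derivative terms summed each sample period) can be sketched as follows. The gains, sample time, and first-order plant below are assumptions for illustration, not the paper's beam-steering system:

```python
class PID:
    """Textbook discrete PID controller."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """Compute the control output for one sample period."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simple first-order plant toward a target steering angle.
pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.01)
angle, target = 0.0, 1.0
for _ in range(1000):
    angle += pid.update(target, angle) * 0.01  # plant integrates the command
```

In a real tracker the "measured" value would come from the CCD beacon-spot centroid, and the output would drive the LC-OPA phase profile.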

  19. Separating lexical-semantic access from other mnemonic processes in picture-name verification.

    Directory of Open Access Journals (Sweden)

    Jason Fitzgerald Smith

    2013-10-01

    We present a novel paradigm to identify shared and unique brain regions underlying non-semantic, non-phonological, abstract, audio-visual (AV) memory versus naming, using a longitudinal functional magnetic resonance imaging experiment. Participants were trained to associate novel AV stimulus pairs containing hidden linguistic content. Half of the stimulus pairs were distorted images of animals and sine-wave speech versions of the animal’s name. Images and sounds were distorted in such a way as to make their linguistic content easily recognizable only after being made aware of its existence. Memory for the pairings was tested by presenting an AV pair and asking participants to verify whether the two stimuli formed a learned pairing. After memory testing, the hidden linguistic content was revealed and participants were tested again on their recollection of the pairings in this linguistically informed state. Once informed, the AV verification task could be performed by naming the picture. There was substantial overlap between the regions involved in recognition of nonlinguistic sensory memory and naming, suggesting a strong relation between them. Contrasts between sessions identified the left angular gyrus and middle temporal gyrus as key additional players in the naming network. Left inferior frontal regions participated in both naming and nonlinguistic AV memory, suggesting that these regions support AV memory independent of phonological content, contrary to previous proposals. Functional connectivity between the angular gyrus and the left inferior frontal gyrus and left middle temporal gyrus increased when performing the AV task as naming. The results are consistent with the hypothesis that, at the spatial resolution of fMRI, the regions that facilitate nonlinguistic AV associations are a subset of those that facilitate naming, though reorganized into distinct networks.

  20. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  1. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context we study the potential of the agent-based modelling technique for the verification of people trajectories

  2. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time. This is quite understandable given the size and complexity of soil-structure systems. Three important aspects of ESSI modeling are the consistent following of input seismic energy and the energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called ESSI Simulator. Extensive work is being done on a verification and validation suite for ESSI Simulator. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of a full three-dimensional soil-structure-interaction simulation of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns. Damping is applied to the layer of elements outside of the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results for different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models with high efficiency in terms of computational time is developed and implemented in ESSI Simulator

  3. Applying multi-physics requirements and loads in FEM analysis and testing—The JET KL11 endoscope design verification process

    International Nuclear Information System (INIS)

    Zauner, C.; Klammer, J.; Hartl, M.; Kampf, D.; Huber, A.; Mertens, Ph.; Schweer, B.; Terra, A.; Balshaw, N.

    2013-01-01

    Considering multi-physics requirements and loads in the early design phase, as well as during the later experimental verification, is especially important for the design of fusion devices due to the extreme environmental conditions and loads. Typical disciplines in the design of fusion devices are thermodynamics, structural mechanics, electro-magnetics, and optics. The interaction of these disciplines, as well as an efficient approach to implementing this interaction in numerical and experimental simulations, is presented as applied to the new JET KL11 divertor endoscope design and verification process. The endoscope's first pictures already showed the very good performance of the instrument

  4. Verification of the hygienic adequacy of beef carcass cooling processes by microbiological culture and the temperature-function integration technique.

    Science.gov (United States)

    Jericho, K W; O'Laney, G; Kozub, G C

    1998-10-01

    To enhance food safety and keeping quality, beef carcasses are cooled immediately after leaving the slaughter floor. Within hazard analysis and critical control point (HACCP) systems, this cooling process needs to be monitored by the industry and verified by regulatory agencies. This study assessed the usefulness of the temperature-function integration technique (TFIT) for the verification of the hygienic adequacy of two cooling processes for beef carcasses at one abattoir. The cooling process passes carcasses through a spray cooler for at least 17 h and a holding cooler for at least 7 h. The TFIT is faster and cheaper than culture methods. For spray cooler 1, the Escherichia coli generations predicted by TFIT for carcass surfaces (pelvic and shank sites) were compared to estimated E. coli counts from 120 surface excision samples (rump, brisket, and sacrum; 5 by 5 by 0.2 cm) before and after cooling. Counts of aerobic bacteria, coliforms, and E. coli were decreased after spray cooler 1 (P < or = 0.001). The number of E. coli generations (with lag) at the pelvic site calculated by TFIT averaged 0.85 +/- 0.19 and 0.15 +/- 0.04 after emerging from spray coolers 1 and 2, respectively. The TFIT (with lag) was considered convenient and appropriate for the inspection service to verify HACCP systems for carcass cooling processes.
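Temperature-function integration as described above predicts microbial growth by integrating a temperature-dependent growth rate over the recorded surface-temperature history. The square-root (Ratkowsky-type) rate model and its coefficients in this Python sketch are assumptions for illustration, not the calibrated model used in the study:

```python
def growth_rate(temp_c, b=0.023, t_min=5.0):
    """Generations per hour; zero at or below the minimum growth temperature.
    The coefficient b and t_min are illustrative, not study values."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

def predicted_generations(temps_c, dt_hours):
    """Trapezoidal integration of the growth rate over a temperature log."""
    rates = [growth_rate(t) for t in temps_c]
    return sum((rates[i] + rates[i + 1]) / 2.0 * dt_hours
               for i in range(len(rates) - 1))

# Illustrative hourly surface temperatures during carcass cooling (deg C).
temps = [35.0, 25.0, 15.0, 8.0, 4.0]
gens = predicted_generations(temps, dt_hours=1.0)
```

A cooling regime passes verification when the predicted number of E. coli generations at the monitored sites stays below an agreed limit.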

  5. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    Science.gov (United States)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser with a two-external-cavity feedback structure for the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two feedback directions, based on the F-P cavity, is deduced, and numerical simulation and experimental verification were conducted. Experimental results show that the SMI with the two-external-cavity feedback structure under weak optical feedback is similar to the sum of two SMIs.

  6. Pneumatic Muscles Actuated Lower-Limb Orthosis Model Verification with Actual Human Muscle Activation Patterns

    Directory of Open Access Journals (Sweden)

    Dzahir M.A.M

    2017-01-01

    A review study was conducted on existing lower-limb orthosis systems for rehabilitation which implement pneumatic muscle type actuators, with the aim of clarifying the current and on-going research in this field. The implementation of pneumatic artificial muscles will play an important role in the development of advanced robotic systems. In this research, a derivation model for the antagonistic mono- and bi-articular muscles of a lower limb orthosis using pneumatic artificial muscles is verified against actual human muscle activity models. A healthy young male subject (29 years old, height 174 cm, weight 68 kg) served as the test subject. Two mono-articular muscles, Vastus Medialis (VM) and Vastus Lateralis (VL), were selected to verify the mono-articular muscle models and muscle synergy between anterior muscles. Two bi-articular muscles, Rectus Femoris (RF) and Biceps Femoris (BF), were selected to verify the bi-articular muscle models and muscle co-contraction between anterior-posterior muscles. The test was carried out on a treadmill at a speed of 4.0 km/h, approximately 1.25 m/s for completing one cycle of walking motion. Data was collected for about one minute on the treadmill and 20 complete cycles of walking motion were successfully recorded. For the evaluation, the mathematical model obtained from the derivation and the actual human muscle activation patterns obtained using the surface electromyography (sEMG) system were compared and analysed. The results showed that high correlation values, ranging from 0.83 up to 0.93, were obtained between the derivation model and the actual human muscle model for both mono- and bi-articular muscles. In conclusion, based on the verification against the sEMG muscle activity data and the correlation values, the proposed derivation models of the antagonistic mono- and bi-articular muscles are suitable for simulating and controlling the pneumatic muscle actuated lower limb orthosis.
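The model-versus-measurement comparison above reduces to a correlation between two activation time series. A sketch of the Pearson correlation computation in Python, with invented stand-in series rather than the study's data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative normalized activation levels over one gait cycle.
model = [0.1, 0.4, 0.8, 0.6, 0.2]
semg = [0.15, 0.38, 0.75, 0.65, 0.25]
r = pearson(model, semg)
```

Values of r in the 0.8-0.9 range, as reported, indicate that the derived activation pattern closely tracks the measured sEMG envelope.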

  7. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the grounds for classification. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  8. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  9. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective...... in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained understanding the types of modification that are required for process optimization. An effective evaluation...

  10. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
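The mixing step can be illustrated with a toy two-end-member case: choose the fraction of end-member water A that best reproduces a sample across conservative tracers, in a least-squares sense. The end-member compositions, tracer choices, and sample values below are invented for the sketch; the actual M3 code handles multiple end members via principal component analysis:

```python
def mixing_fraction(end_a, end_b, sample):
    """Least-squares fraction f of end-member A such that
    f*A + (1-f)*B best matches the sample across all tracers."""
    num = sum((s - b) * (a - b) for a, b, s in zip(end_a, end_b, sample))
    den = sum((a - b) ** 2 for a, b in zip(end_a, end_b))
    return num / den

# Illustrative end members and sample: [Cl- mg/L, d18O permil].
glacial = [5.0, -21.0]
brine = [45000.0, -10.0]
sample = [22500.0, -15.5]
f = mixing_fraction(glacial, brine, sample)
```

In the M3 philosophy, whatever part of the measured composition this mixing model cannot reproduce is then attributed to reactions (sources and sinks in mg/L).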

  11. Final Report - Independent Verification Survey Activities at the Separations Process Research Unit Sites, Niskayuna, New York


    Energy Technology Data Exchange (ETDEWEB)

    Evan Harpenau

    2011-03-15

    The Separations Process Research Unit (SPRU) complex located on the Knolls Atomic Power Laboratory (KAPL) site in Niskayuna, New York, was constructed in the late 1940s to research the chemical separation of plutonium and uranium (Figure A-1). SPRU operated as a laboratory scale research facility between February 1950 and October 1953. The research activities ceased following the successful development of the reduction oxidation and plutonium/uranium extraction processes. The oxidation and extraction processes were subsequently developed for large scale use by the Hanford and Savannah River sites (aRc 2008a). Decommissioning of the SPRU facilities began in October 1953 and continued through the 1990s.

  12. A systematic approach for model verification: application on seven published activated sludge models.

    Science.gov (United States)

    Hauduc, H; Rieger, L; Takács, I; Héduit, A; Vanrolleghem, P A; Gillot, S

    2010-01-01

    The quality of simulation results can be significantly affected by errors in the published model (typing, inconsistencies, gaps or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to point out the typing errors, inconsistencies and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verify models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Then, stoichiometry and kinetic rate expressions are checked for each model and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks.
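The continuity checks referred to above can be illustrated with a toy Gujer matrix: each process row, multiplied by the composition (conservation) vector, must sum to zero. The component set and the yield value below are hypothetical, ASM1-style placeholders.

```python
import numpy as np

# Illustrative mini Gujer matrix: rows = processes, columns = components.
# Components: S_S (readily biodegradable COD), S_O (oxygen), X_B (biomass COD).
# The heterotrophic yield Y is a hypothetical, ASM1-style value.
Y = 0.67
stoichiometry = np.array([
    # S_S        S_O              X_B
    [-1.0 / Y,  -(1.0 - Y) / Y,   1.0],   # aerobic growth of heterotrophs
])

# Composition vector: COD content of each component (g COD per unit);
# dissolved oxygen counts as negative COD by convention.
cod_composition = np.array([1.0, -1.0, 1.0])

# Continuity check: every process row must conserve COD (sum of
# coefficient * composition equals zero).
balance = stoichiometry @ cod_composition
print(balance)
```

The same check is repeated with a composition vector for nitrogen, phosphorus or charge; any non-zero entry flags a typing error or inconsistency in the published matrix.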

  13. Optimal Calculation of Residuals for ARMAX Models with Applications to Model Verification

    DEFF Research Database (Denmark)

    Knudsen, Torben

    1997-01-01

    Residual tests for sufficient model orders are based on the assumption that prediction errors are white when the model is correct. If an ARMAX system has zeros in the MA part which are close to the unit circle, then the standard predictor can have large transients. Even when the correct model...
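The whiteness assumption underlying such residual tests can be checked with a simple autocorrelation sketch. The residuals here are simulated white noise standing in for real one-step prediction errors; the lag count and confidence band are ordinary textbook choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
residuals = rng.standard_normal(1000)  # stand-in for one-step prediction errors

def autocorr(x, max_lag):
    """Sample autocorrelations r(1)..r(max_lag) of a series x."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

r = autocorr(residuals, max_lag=20)
bound = 1.96 / np.sqrt(len(residuals))   # approximate 95% band for white noise
frac_inside = np.mean(np.abs(r) < bound)
print(frac_inside)
```

If the model order is sufficient, most autocorrelations fall inside the band; systematic excursions indicate unmodelled dynamics.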

  14. The experimental verification of the condition of the magnetic material caused by different technological processes

    CERN Document Server

    Tumanski, S

    2000-01-01

    The changes of electrical steel parameters caused by different technological processes have been tested using the magnetovision method. The effects of cutting into the strips, stamping the shape, laser scribing, annealing, bending and stressing have been investigated.

  15. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  16. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  17. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6, May 2017. Temperature Modeling of Applegate Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification... (report available in the library at http://acwc.sdp.sirsi.net/client/default). The model and the corresponding results from the study provided CENWP with more refined estimates of water temperatures so that more defendable water...

  18. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The considered problem concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller: the behaviour of the actuators is verified against the logic relationships assumed in the control system. The actuators of the considered control system are three linear drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data-exchange tool in the form of an OPC server.

  19. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities, as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are comprehensive and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  20. Are Australian and New Zealand trauma service resources reflective of the Australasian Trauma Verification Model Resource Criteria?

    Science.gov (United States)

    Leonard, Elizabeth; Curtis, Kate

    2014-01-01

    The Australasian Trauma Verification Program was developed in 2000 to improve the quality of care provided at services in Australia and New Zealand. The programme outlines resources required for differing levels of trauma services. This study compares the human resources in Australia and New Zealand trauma services with those recommended by the Australasian College of Surgeons Trauma Verification Program. In September 2011, all trauma nurse coordinators in Australia and New Zealand were invited to participate in an electronic survey endorsed by the Australasian Trauma Society. This study expands on previous bi-national research and aimed to identify demographic and trauma service human resource levels. Fifty-three surveys (78%) were completed and all 27 Level 1 trauma centres were represented. Of the Level 1 trauma centres, a trauma director and fellow were available at 16 (51.8%) and 14 (40.7%) centres, respectively. The majority (93%) had a full-time trauma coordinator, although a trauma case manager was only available at 14 (48.1%) of Level 1 trauma centres. Despite the large amount of data collection and extraction required, trauma services had limited access to a data manager (50.9%) or clerical staff (36.9%). Human resources in Australian and NZ trauma services are not reflective of those recommended by the Australasian Trauma Verification Program. This impacts on the ability to coordinate trauma monitoring and performance improvement. Review of the Australasian Trauma Verification Model Resource Criteria is required. Injury surveillance in Australia and NZ is hampered by insufficient trauma registry resources. © 2014 Royal Australasian College of Surgeons.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - ALBANY INTERNATIONAL CORP. INDUSTRIAL PROCESS TECHNOLOGIES PRIMATEX PLUS I FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  2. Substantiation and verification of the heat exchange crisis model in a rod bundles by means of the KORSAR thermohydraulic code

    International Nuclear Information System (INIS)

    Bobkov, V.P.; Vinogradov, V.N.; Efanov, A.D.; Sergeev, V.V.; Smogalev, I.P.

    2003-01-01

    The results of verifying the model for calculating the heat exchange crisis in uniformly heated rod bundles, realized in the KORSAR best-estimate calculation code, are presented. The model for calculating the critical heat fluxes in this code is based on the tabular method. The experimental data bank of the branch base centre of thermophysical data GNTs RF - FEhI for rod bundles structurally similar to the WWER fuel assemblies was used for the verification within a wide range of parameters: pressure from 0.11 up to 20 MPa and mass velocity from 5 up to 5000 kg/(m²·s).
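At its core, the tabular method for critical heat flux reduces to interpolation in a lookup table over the flow parameters. A minimal bilinear sketch is shown below; the grid points and CHF values are purely illustrative, not taken from the KORSAR table.

```python
import numpy as np

# Hypothetical lookup table of critical heat flux q_chf (kW/m^2) versus
# pressure (MPa) and mass velocity (kg/(m^2*s)); values are illustrative only.
pressures = np.array([0.1, 1.0, 7.0, 16.0, 20.0])
mass_vel = np.array([50.0, 500.0, 1000.0, 3000.0, 5000.0])
chf_table = np.array([
    [900.,  1800., 2500., 3800., 4500.],
    [1100., 2200., 3000., 4600., 5400.],
    [1300., 2600., 3500., 5200., 6000.],
    [900.,  1900., 2600., 4000., 4700.],
    [600.,  1300., 1900., 3000., 3600.],
])

def chf_lookup(p, g):
    """Bilinear interpolation in the (pressure, mass velocity) table."""
    i = int(np.clip(np.searchsorted(pressures, p) - 1, 0, len(pressures) - 2))
    j = int(np.clip(np.searchsorted(mass_vel, g) - 1, 0, len(mass_vel) - 2))
    tp = (p - pressures[i]) / (pressures[i + 1] - pressures[i])
    tg = (g - mass_vel[j]) / (mass_vel[j + 1] - mass_vel[j])
    return ((1 - tp) * (1 - tg) * chf_table[i, j]
            + tp * (1 - tg) * chf_table[i + 1, j]
            + (1 - tp) * tg * chf_table[i, j + 1]
            + tp * tg * chf_table[i + 1, j + 1])

print(chf_lookup(7.0, 1000.0))
```

A production code would add correction factors (bundle geometry, non-uniform heating) on top of the interpolated table value.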

  3. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  4. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    of Iraq was a case of late detection of undeclared activities, the case of DPRK was a case of prompt detection of discrepancies in the initial declaration through implementation of modern detection techniques, such as environmental sampling, and access to information. Access to the Security Council became important in view of the protracted process of non-compliance. The Model Additional Protocol (INFCIRC 540) agreed in 1997 incorporates the results of the efforts to strengthen the safeguards system and as such provides the possibility for more transparency by the States and more access to locations by the inspectors on the basis of information. It does not provide the broad and intrusive access rights as in the case of Iraq, since such rights are unprecedented and the result of a cease-fire arrangement involving the Security Council. But the expectations are that the broad implementation of the Additional Protocol will result in an effective and efficient safeguards verification system for the future. The on-site verification systems on a national, regional or multinational basis that have been put into operation in the past or are being discussed by States for the implementation of disarmament and non-proliferation conventions related to weapons of mass destruction, whether nuclear, chemical or biological, have benefited and will benefit in the future from the guiding experience - both from the strengths and weaknesses - of the IAEA verification system. This is hopefully a legacy for the future of verification

  5. Chemical Process Modeling and Control.

    Science.gov (United States)

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  6. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  7. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    International Nuclear Information System (INIS)

    Locke, C.; Zavgorodni, S.; British Columbia Cancer Agency, Vancouver Island Center, Victoria BC

    2008-01-01

    Monte Carlo (MC) methods provide the most accurate to-date dose calculations in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying radiation beam

  8. ANALYSIS OF EFFICIENCY OF THE DISTANCE TRAINING SYSTEM IN THE PROCESS OF COMPETENCY VERIFICATION

    Directory of Open Access Journals (Sweden)

    Kravtsova L.

    2017-12-01

    The Kherson State Maritime Academy (KSMA) conducts experimental research on the integration of the educational process, on the basis of a competence approach, into the state educational policy of Ukraine. One direction of this work is the creation and implementation of a distance education platform to support the educational process at the Academy. The distance learning system of KSMA is built on the open Moodle platform, which offers a wide range of opportunities to fully support the learning process in a remote environment, namely a variety of ways of presenting the training material, testing knowledge and monitoring progress. The peculiarity of the developed system is that the effectiveness of the training strategy is ensured by taking into account the psychological characteristics of the user contingent, the ultimate goal of training and the motivation of the whole process of education, namely the specifics of the seafarer's profession. One of the main directions of the work was a complete replacement of the classical methodology for conducting the examination session with complex testing, which covers all disciplines from the 1st to the 5th year of study and is conducted on the basis of the distance education platform. The results of the experiment showed that the Academy's own distance learning site is an effective tool for studying the teaching material and for testing the quality of its learning.

  9. Development and Verification of CFD Models for Modeling Wind Conditions on Forested Wind Turbine Sites

    DEFF Research Database (Denmark)

    Andersen, Morten Q.; Mortensen, Kasper; Nielsen, Daniel E.

    2009-01-01

    This paper describes a proposed CFD model to simulate the wind conditions on a forested site. The model introduces porous subdomains representing the forests in the terrain. Obtained simulation values are compared to field measurements in- and outside a forest. Initial results are very promising...

  10. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  11. Verification of an alternative sludge treatment process for pathogen reduction at two wastewater treatment plants in Victoria, Australia.

    Science.gov (United States)

    Irwin, R; Surapaneni, A; Smith, D; Schmidt, J; Rigby, H; Smith, S R

    2017-08-01

    At South East Water wastewater treatment plants (WwTPs) in Victoria, Australia, biosolids are stockpiled for three years in compliance with the State guidelines to achieve the highest pathogen reduction grade (T1), suitable for unrestricted use in agriculture and landscaping. However, extended stockpiling is costly, may increase odour nuisance and greenhouse gas emissions, and reduces the fertiliser value of the biosolids. A verification programme of sampling and analysis for enteric pathogens was conducted at two WwTPs where sludge is treated by aerobic and anaerobic digestion, air drying (in drying pans or solar drying sheds) and stockpiling, to enumerate and, if present, monitor the decay of a range of enteric pathogens and parasites. The sludge treatment processes at both WwTPs achieved T1 grade biosolids with respect to prescribed pathogenic bacterial numbers and a 3 log10 enteric virus reduction after a storage period of one year. No Ascaris eggs were detected in the influent to the WwTPs, confirming previous studies that the prevalence of helminth infections in Victoria is extremely low and that Ascaris is not applicable as a control criterion for the microbiological quality of biosolids in the region.

  12. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
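As a text-based analogue of wiring the decay equation into gain and integrator blocks, the radionuclide decay example can be sketched with a forward Euler loop. The isotope and the step size below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Radionuclide decay dN/dt = -lambda * N, integrated with forward Euler --
# the same structure one would build with a gain and an integrator block.
half_life = 8.02 * 24 * 3600.0        # I-131 half-life (s)
lam = np.log(2.0) / half_life         # decay constant (1/s)
dt = 60.0                             # one-minute time step
n = 1.0e6                             # initial number of atoms
steps = int(round(half_life / dt))    # integrate over one half-life
for _ in range(steps):
    n += dt * (-lam * n)
print(n)  # close to 0.5e6 after one half-life
```

The small residual error against the analytical solution exp(-lambda*t) comes from the first-order Euler scheme; a solver block with a higher-order method shrinks it further.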

  13. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  14. Probabilistic Models and Process Calculi for Mobile Ad Hoc Networks

    DEFF Research Database (Denmark)

    Song, Lei

    the protocols for MANETs are usually more complicated and error-prone. In this thesis we discuss different models and their underlying theories which will facilitate the verification of protocols for MANETs. Process calculi have been used successfully as a formal method to verify and analyze functional...... equivalences and their logical characterizations have been proposed to combat the infamous state space explosion problem of PAs, but unfortunately it is well known that the behavioral equivalences are strictly stronger than the logical equivalences induced by PCTL or PCTL*. We address this problem...

  15. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction to process parameters at the next stage should be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables...

  16. Verification and improvement of the signal processing technique using experimental data

    International Nuclear Information System (INIS)

    Sing, O.P.; Srinivasan, G.S.; Vyjayanthi, R.K.; Prabhakar, R.

    1997-01-01

    International Atomic Energy Agency (IAEA) has organized the Co-ordinated Research Programme (CRP) on Acoustic Signal Processing for the Detection of Boiling or Sodium/Water Reaction in LMFR over the period 1990-1995. This was an extension of an earlier CRP on Signal Processing Techniques for Sodium Boiling Noise Detection which ended in 1988. The objectives of the extended CRP were to evaluate the suitability of signal processing techniques for the detection of boiling noise under very poor signal to noise ratio conditions and then to develop the techniques further to cover leak detection in the Steam Generator Units of Liquid Metal Fast Reactors. The Indian analysis showed that Determinant and Trace features and Adaptive Learning Network techniques were able to detect with very good discrimination boiling signals down to a signal to noise ratio (S/N) of -12 dB. Determinant and Trace features were very sensitive to boiling noise and could detect signals even down to -21 dB S/N, but with a lower reliability of detection. Among the various analysis techniques evaluated by India on the acoustic leak noise data during the period 1991-1994, Determinant and Trace of covariance matrix constructed from auto power spectral density proved to be the most sensitive features for leak noise detection. These features have detected leak signals even down to -24 dB S/N, but with reduced reliability of detection below -16 dB. The data for the final year of the CRP (1995) were generated from the end-of-life acoustic leak experiments on the evaporator unit of the Prototype Fast Reactor at Dounreay, United Kingdom. The Indian analysis on this data pertained to the detection of onset and duration of leak signal, to the identification of injection medium, to the determination of leak noise transmission characteristics and to the detection of leak location. (author). 13 refs, 14 figs, 5 tabs
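The Determinant and Trace features described above can be sketched as follows: segment the signal, compute a banded auto power spectral density per segment, and take the determinant and trace of the resulting feature covariance matrix. The segment and band counts, and the simulated noise, are illustrative stand-ins for real acoustic data.

```python
import numpy as np

def det_trace_features(signal, n_seg=16, n_bands=4):
    """Determinant and trace of the covariance matrix built from banded
    auto power spectral densities of consecutive signal segments."""
    feats = []
    for seg in np.array_split(signal, n_seg):
        psd = np.abs(np.fft.rfft(seg)) ** 2           # auto power spectral density
        bands = np.array_split(psd, n_bands)          # average PSD per band
        feats.append([band.mean() for band in bands])
    cov = np.cov(np.array(feats), rowvar=False)       # band-feature covariance
    return np.linalg.det(cov), np.trace(cov)

rng = np.random.default_rng(1)
det_f, tr_f = det_trace_features(rng.standard_normal(4096))
print(det_f, tr_f)
```

In a detection scheme, the features are tracked over time and an event (boiling or a leak) is declared when they rise significantly above their background distribution.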

  17. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. An important impact of this factor on pre-volitional stages of aggressive behavior was also identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not of negative emotions. These results correspond with the theory of self-regulation, where behavior focused on goal attainment is accompanied by positive emotions (see for example Cacioppo, Gardner & Berntson, 1999; Carver, 2004). The results confirmed not only a sufficient model fit, but also explained 53% of the variance of behavioral desire, 68% of intention and 37% of behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample; some results could be affected by the lower sample size. For future work we recommend using other types of aggressive behavior in verifying the EMGB, as well as a more complex incorporation of inhibition into the model. Finally, this study is correlational in character, so further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  18. Thermal Pollution Mathematical Model. Volume 2; Verification of One-Dimensional Numerical Model at Lake Keowee

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1980-01-01

    A one dimensional model for studying the thermal dynamics of cooling lakes was developed and verified. The model is essentially a set of partial differential equations which are solved by finite difference methods. The model includes the effects of variation of area with depth, surface heating due to solar radiation absorbed at the upper layer, and internal heating due to the transmission of solar radiation to the sub-surface layers. The exchange of mechanical energy between the lake and the atmosphere is included through the coupling of thermal diffusivity and wind speed. The effects of discharge and intake by power plants are also included. The numerical model was calibrated by applying it to Cayuga Lake. The model was then verified through a long term simulation using Lake Keowee data base. The comparison between measured and predicted vertical temperature profiles for the nine years is good. The physical limnology of Lake Keowee is presented through a set of graphical representations of the measured data base.
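The structure of such a one-dimensional finite-difference lake model can be sketched as below. The layer count, diffusivity and surface heating rate are illustrative values only; the actual model adds depth-dependent area, sub-surface radiation absorption, wind-coupled diffusivity and plant discharge/intake terms.

```python
import numpy as np

nz, dz, dt = 20, 1.0, 600.0       # 20 one-metre layers, 10-minute time step
kappa = 1.0e-4                    # effective thermal diffusivity (m^2/s)
heating = 2.0e-4                  # surface heating rate (deg C per s), illustrative
temp = np.full(nz, 10.0)          # initial uniform temperature profile (deg C)

s = kappa * dt / dz**2            # explicit scheme is stable for s <= 0.5
assert s <= 0.5

for _ in range(144):              # one simulated day
    lap = np.empty(nz)
    lap[1:-1] = temp[2:] - 2.0 * temp[1:-1] + temp[:-2]
    lap[0] = temp[1] - temp[0]    # no-flux top and bottom boundaries
    lap[-1] = temp[-2] - temp[-1]
    temp = temp + s * lap
    temp[0] += heating * dt       # solar radiation absorbed in the top layer

print(temp[0], temp[-1])          # stratified: warm surface, cooler bottom
```

With no-flux boundaries, the scheme conserves the heat added at the surface exactly, which is a convenient sanity check when calibrating against measured vertical profiles.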

  19. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can...

  20. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address issues that many NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
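
    Immediate input validation of the kind the record describes can be sketched as a keyword-schema check. The card keywords below (`mat`, `temp`, `ngroup`) are hypothetical placeholders, not NJOY21's actual input schema:

    ```python
    # Hypothetical input schema: keyword -> expected type converter
    SCHEMA = {"mat": int, "temp": float, "ngroup": int}

    def validate(card):
        """Return a list of human-readable errors; an empty list means valid."""
        errors = []
        for key, value in card.items():
            if key not in SCHEMA:
                errors.append(f"unknown keyword '{key}'")
                continue
            try:
                SCHEMA[key](value)  # attempt type conversion of the raw token
            except (TypeError, ValueError):
                errors.append(f"'{key}' expects {SCHEMA[key].__name__}, got {value!r}")
        return errors

    good = validate({"mat": "9237", "temp": "293.6"})
    bad = validate({"matt": "9237", "temp": "hot"})
    ```

    Reporting all errors at once, rather than failing on the first, is what makes this style of validation helpful to a user writing an input file interactively.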

  1. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    The RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need great accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  2. Verification and validation of models: far-field modelling of radionuclide migration

    International Nuclear Information System (INIS)

    Porter, J.D.; Herbert, A.W.; Clarke, D.S.; Roe, P.; Vassilic Melling, D.; Einfeldt, B.; Mackay, R.; Glendinning, R.

    1992-01-01

    The aim of this project was to improve the capability, efficiency and realism of the NAMMU and NAPSAC codes, which simulate groundwater flow and solute transport. Using NAMMU, various solution methods for non-linear problems were investigated. The Broyden method gave a useful reduction in computing time and appeared robust. The relative saving obtained with this method increased with the problem size. This was also the case when parameter stepping was used. The existing empirical sorption models in NAMMU were generalized and a ternary heterogeneous ion exchange model was added. These modifications were tested and gave excellent results. The desirability of coupling NAMMU to an existing geochemical speciation code was assessed
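
    Broyden's method, credited above with reducing computing time on the non-linear problems, is a quasi-Newton scheme that replaces the exact Jacobian with a cheap secant-style update. In one dimension it reduces to the secant method, which gives a compact illustration (the groundwater application is of course multi-dimensional, and the test function below is arbitrary):

    ```python
    def broyden_scalar(f, x0, x1, tol=1e-12, max_iter=50):
        """Secant iteration (1-D Broyden) for a root of f."""
        f0, f1 = f(x0), f(x1)
        for _ in range(max_iter):
            if abs(f1) < tol or x1 == x0:
                break
            b = (f1 - f0) / (x1 - x0)  # secant approximation to f'
            x0, f0 = x1, f1
            x1 = x1 - f1 / b           # quasi-Newton step
            f1 = f(x1)
        return x1

    # Solve x^2 - 2 = 0 starting from the bracket [1, 2]
    root = broyden_scalar(lambda x: x * x - 2.0, 1.0, 2.0)
    ```

    The appeal in large simulations is that each step avoids re-assembling and re-factorizing the Jacobian, which is where the computing-time saving the abstract reports comes from.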

  3. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    sources. Combining these three source types it is possible to model huge machinery in an easy and visually clear way. Traditionally room acoustic simulations have been aimed at auditorium acoustics. The aim of the simulations has been to model the room acoustic measuring setup consisting...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with the ones measured in real rooms. However when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point like, as they can be distributed over a large space...

  4. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is a part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, construction and verification of airport simulation model using Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  5. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1], operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring long-term trending, and satellite retrieved products [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements in a time series and angular dependency. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is a very useful tool for cross-sensor verifications. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between measurements of the two instruments. The CRTM is particularly useful to reduce the difference between instruments for climate studies [4]. In this study, we will carry out the assessment of the Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and data for Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive bands. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high quality radiosondes were launched when Suomi NPP flew over NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data includes air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q
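
    The double difference method mentioned above can be illustrated in a few lines: differencing the observation-minus-simulation residuals of two instruments viewing the same scenes cancels biases common to both simulations, leaving the inter-instrument difference. The numbers below are illustrative, not CRTM output:

    ```python
    def double_difference(obs_a, sim_a, obs_b, sim_b):
        """Mean of (ObsA - SimA) - (ObsB - SimB) over matched scenes."""
        res_a = [o - s for o, s in zip(obs_a, sim_a)]
        res_b = [o - s for o, s in zip(obs_b, sim_b)]
        return sum(ra - rb for ra, rb in zip(res_a, res_b)) / len(res_a)

    # Shared scene truth; a common simulation bias of +0.5 K affects both
    truth = [250.0, 260.0, 270.0]
    sim = [t + 0.5 for t in truth]
    obs_a = [t + 0.3 for t in truth]   # instrument A reads 0.3 K high
    obs_b = [t + 0.1 for t in truth]   # instrument B reads 0.1 K high
    dd = double_difference(obs_a, sim, obs_b, sim)
    ```

    The result isolates the 0.2 K instrument-to-instrument difference; the 0.5 K simulation bias drops out, which is why the technique suits cross-sensor verification.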

  6. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
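
    The essence of what SPIN does for a safety (invariant) property, namely exhaustively exploring the reachable states and checking each, can be sketched on a toy ATM transition system. The states and events below are hypothetical, not the paper's PROMELA model:

    ```python
    from collections import deque

    # Toy ATM transition system: (state, event) -> next state
    transitions = {
        ("idle", "insert_card"): "await_pin",
        ("await_pin", "pin_ok"): "menu",
        ("await_pin", "pin_bad"): "idle",
        ("menu", "withdraw"): "dispensing",
        ("dispensing", "done"): "idle",
    }

    def reachable(start):
        """Breadth-first exploration of all reachable states."""
        seen, frontier = {start}, deque([start])
        while frontier:
            state = frontier.popleft()
            for (src, _event), dst in transitions.items():
                if src == state and dst not in seen:
                    seen.add(dst)
                    frontier.append(dst)
        return seen

    # Safety invariant (a G-type LTL property): cash is only ever
    # dispensed from the authenticated "menu" state
    violations = [src for (src, _e), dst in transitions.items()
                  if dst == "dispensing" and src != "menu"]
    states = reachable("idle")
    ```

    A real SPIN run also handles liveness formulas via Büchi automata; this sketch covers only the exhaustive-reachability core of the technique.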

  7. Experimental determination of temperatures of the inner wall of a boiler combustion chamber for the purpose of verification of a CFD model

    Directory of Open Access Journals (Sweden)

    Petr Trávníček

    2011-01-01

    Full Text Available The paper focuses on a non-destructive method of determination of temperatures in a boiler combustion chamber. This method proves significant mainly as regards CFD (Computational Fluid Dynamics) simulations of combustion processes, for which it is advisable to verify the data calculated by the CFD software against actually measured data. Verification of the method was based on use of reference combustion equipment (130 kW) which burns a mixture of waste sawdust and shavings originating in the production of wooden furniture. Measuring temperatures inside the combustion chamber is highly demanding, mainly owing to the high temperature values, and requires a special type of temperature sensor. Furthermore, in standard operation it is not possible to install such sensors without structural alterations of the boiler. Therefore, for the purpose of determination of these temperatures, a special experimental device was constructed, while a thermal imaging system was used for monitoring the surface temperature of the outer wall of the reference boiler. Temperatures on the wall of the boiler combustion chamber were determined on the basis of data measured by the experimental device as well as data from the thermal imaging system. These values might serve for verification of the respective CFD model of the combustion equipment.

  8. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    developed, all models constructed in each phase are verifiable. This requires that the modelling notations are formally defined and related in order to have tool support developed for the integration of sophisticated checkers, generators and transformations. This paper summarises our research on the method...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  9. Modeling of containment response for Krsko NPP Full Scope Simulator verification

    International Nuclear Information System (INIS)

    Kljenak, I.; Skerlavaj, A.

    2000-01-01

    Containment responses during the first 10000 s of Anticipated Transient Without Scram and Small Break Loss-of-Coolant Accident scenarios in the Krsko two-loop Westinghouse pressurized water reactor nuclear power plant were simulated with the CONTAIN computer code. Sources of coolant were obtained from simulations with the RELAP5 code. The simulations were carried out so that the results could be used for the verification of the Krsko Full Scope Simulator. (author)

  10. Verification and process oriented validation of the MiKlip decadal prediction system

    Directory of Open Access Journals (Sweden)

    Frank Kaspar

    2016-12-01

    Full Text Available Decadal prediction systems are designed to become a valuable tool for decision making in different sectors of economy, administration or politics. Progress in decadal predictions is also expected to improve our scientific understanding of the climate system. The German Federal Ministry for Education and Research (BMBF) therefore funds the German national research project MiKlip (Mittelfristige Klimaprognosen). A network of German research institutions contributes to the development of the system by conducting individual research projects. This special issue presents a collection of papers with results of the evaluation activities within the first phase of MiKlip. They document the improvements of the MiKlip decadal prediction system which were achieved during the first phase. Key aspects are the role of initialization strategies, model resolution and ensemble size. Additional topics are the evaluation of specific weather parameters in selected regions and the use of specific observational datasets for the evaluation.

  11. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    International Nuclear Information System (INIS)

    Click, D.; Edwards, T.; Ajo, H.

    2008-01-01

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS

  12. Mathematical modeling of a fluidized bed rice husk gasifier: Part 3 -- Model verification

    Energy Technology Data Exchange (ETDEWEB)

    Mansaray, K.G.; Ghaly, A.E.; Al-Taweel, A.M.; Ugursal, V.I.; Hamdullahpur, F.

    2000-04-01

    The validity of the two-compartment model developed for fluidized bed gasification of biomass was tested using experimental data obtained from a dual-distributor-type fluidized bed gasifier. The fluidized bed was operated on rice husks at various bed heights (19.5, 25.5, and 31.5 cm), fluidization velocities (0.22, 0.28, and 0.33 m/s), and equivalence ratios (0.25, 0.30, and 0.35). The model gave reasonable predictions of the core, annulus, and exit temperatures as well as the mole fractions of the combustible gas components and product gas higher heating value, except for the overall carbon conversion, which was overestimated. This could be attributed to uncertainties in the sampling procedure.

  13. Mathematical modeling of a fluidized bed rice husk gasifier: Part 3 - Model verification

    Energy Technology Data Exchange (ETDEWEB)

    Mansaray, K.G.; Ghaly, A.E.; Al-Taweel, A.M.; Ugursal, V.I.; Hamdullahpur, F.

    2000-03-01

    The validity of the two-compartment model developed for fluidized bed gasification of biomass was tested using experimental data obtained from a dual-distributor-type fluidized bed gasifier. The fluidized bed was operated on rice husks at various bed heights (19.5, 25.5, and 31.5 cm), fluidization velocities (0.22, 0.28, and 0.33 m/s), and equivalence ratios (0.25, 0.30, and 0.35). The model gave reasonable predictions of the core, annulus, and exit temperatures as well as the mole fractions of the combustible gas components and product gas higher heating value, except for the overall carbon conversion, which was overestimated. This could be attributed to uncertainties in the sampling procedure. (Author)

  14. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  15. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
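
    A Markov decision process is solved for its optimal value function by value iteration. A toy two-state example shows the mechanics (the transition probabilities and rewards are illustrative only, not the paper's measurement model):

    ```python
    # P[state][action] = list of (probability, next_state, reward)
    P = {
        0: {0: [(1.0, 0, 0.0)],
            1: [(0.8, 1, 1.0), (0.2, 0, 0.0)]},
        1: {0: [(1.0, 0, 0.0)],
            1: [(1.0, 1, 0.5)]},
    }
    gamma = 0.9  # discount factor
    V = {0: 0.0, 1: 0.0}
    # Bellman update V(s) = max_a sum_s' P(s'|s,a) * (r + gamma * V(s'));
    # a contraction mapping, so it converges geometrically
    for _ in range(200):
        V = {s: max(sum(p * (r + gamma * V[nxt]) for p, nxt, r in outcomes)
                    for outcomes in actions.values())
             for s, actions in P.items()}
    ```

    Once the value function is known, the greedy policy it induces is the optimal decision rule, which is the object a measurement model of this kind relates to latent traits.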

  16. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  17. Modeling radon entry into houses with basements: Model description and verification

    International Nuclear Information System (INIS)

    Revzan, K.L.; Fisk, W.J.; Gadgil, A.J.

    1991-01-01

    We model radon entry into basements using a previously developed three-dimensional steady-state finite difference model that has been modified in the following ways: first, cylindrical coordinates are used to take advantage of the symmetry of the problem in the horizontal plane; second, the configuration of the basement has been made more realistic by incorporating the concrete footer; third, a quadratic relationship between the pressure and flow in the L-shaped gap between slab, footer, and wall has been employed; fourth, the natural convection of the soil gas which follows from the heating of the basement in winter has been taken into account. The temperature field in the soil is determined from the equation of energy conservation, using the basement, surface, and deep-soil temperatures as boundary conditions. The pressure field is determined from Darcy's law and the equation of mass conservation (continuity), assuming that there is no flow across any boundary except the soil surface (atmospheric pressure) and the opening in the basement shell (fixed pressure). After the pressure and temperature fields have been obtained, the velocity field is found from Darcy's law. Finally, the radon concentration field is found from the equation of mass transport. The convective radon entry rate through the opening or openings is then calculated. In this paper we describe the modified model, compare the predicted radon entry rates with and without the consideration of thermal convection, and compare the predicted rates with those determined from data from 7 houses in the Spokane River valley of Washington and Idaho. Although the predicted rate is much lower than the mean of the rates determined from measurements, errors in the measurement of soil permeability and variations in the permeability of the area immediately under the basement slab, which has a significant influence on the pressure field, can account for the range of entry rates inferred from the data. 25 refs., 8 figs
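
    The velocity-from-pressure step described above (Darcy's law applied to the computed pressure field) can be sketched in one dimension with the central differences a finite-difference model of this kind would use. The permeability and viscosity figures below are generic soil/air values, not the paper's parameters:

    ```python
    def darcy_velocity(p, dx, k, mu):
        """Central-difference Darcy velocities v = -(k/mu) dP/dx at interior nodes."""
        return [-(k / mu) * (p[i + 1] - p[i - 1]) / (2.0 * dx)
                for i in range(1, len(p) - 1)]

    # Linear 10 Pa drop over 1 m; k = 1e-12 m^2, air viscosity 1.8e-5 Pa s
    pressure = [100.0 - 1.0 * i for i in range(11)]  # Pa, node spacing 0.1 m
    v = darcy_velocity(pressure, 0.1, 1e-12, 1.8e-5)
    ```

    For the linear profile, every interior node sees the same gradient, so the velocity field is uniform; in the actual model the gradient, and hence the convective entry rate, concentrates near the entry opening.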

  18. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays the procedure on the computer screen in the form of a flow chart, and displays plant operating information along with procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error would lead to operator misjudgement and inadequate control. In this paper we present a modeling approach for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri nets when both are applied to an Emergency Operating Computerized Procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. The formal verification and validation methods of CP with STPN increase the safety of a nuclear power plant and provide the digital quality assurance means that are needed as the role and function of the CPS increase.
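
    The basic place/transition mechanics underlying any Petri-net formalism such as STPN, where a transition fires when all of its input places hold tokens, can be sketched as follows (the place and transition names are illustrative, not the paper's procedure model):

    ```python
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)  # place -> token count
            self.transitions = {}         # name -> (input places, output places)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def fire(self, name):
            """Consume one token from each input place, produce one in each output."""
            inputs, outputs = self.transitions[name]
            if not self.enabled(name):
                raise ValueError(f"{name} is not enabled")
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # A single procedure step advancing from one state to the next
    net = PetriNet({"step1": 1, "step2": 0})
    net.add_transition("advance", ["step1"], ["step2"])
    net.fire("advance")
    ```

    Formal verification then amounts to exploring the markings reachable through such firings and checking properties over them; STPN extends this basic token game to represent operator interruptions of the procedure flow.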

  19. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw-material supply for processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials is developed. The model is distinguished by its orientation toward achieving a cumulative effect for the integrated structure, which acts as the criterion function; its maximization is achieved by optimizing capacities, volumes of raw-material deliveries and their quality characteristics, the costs of industrial processing of raw materials, and the demand for dairy products.

  20. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. 
This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
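
    Verification statistics of the kind the MET scripts compute reduce, in the simplest continuous case, to bias and RMSE between matched forecast and observed values. A minimal sketch with made-up 2-m temperature values:

    ```python
    import math

    def bias(forecast, observed):
        """Mean forecast-minus-observation error."""
        return sum(f - o for f, o in zip(forecast, observed)) / len(observed)

    def rmse(forecast, observed):
        """Root-mean-square forecast error."""
        return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed))
                         / len(observed))

    fcst = [21.0, 23.5, 19.0, 25.0]  # model 2-m temperatures, deg C
    obs = [20.0, 24.0, 18.5, 24.0]   # matched station observations
    model_bias = bias(fcst, obs)
    model_rmse = rmse(fcst, obs)
    ```

    A positive bias with a larger RMSE indicates both a systematic warm offset and scatter about it, which is the kind of diagnosis the scripts make routine for local EMS runs.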

  1. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  2. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative...
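
    The Laplace-transform relationship both records refer to can be written explicitly: with default intensity $\lambda_t$ and cumulative hazard $\Lambda_t$, the survival probability is

    ```latex
    S(t) = \mathbb{E}\!\left[e^{-\Lambda_t}\right],
    \qquad \Lambda_t = \int_0^t \lambda_s \,\mathrm{d}s .
    ```

    The tendency in the recent literature the abstracts mention is to specify the cumulative hazard $\Lambda_t$ (e.g. via a Sato process) directly, rather than modeling the intensity $\lambda_t$ and integrating it.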

  3. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a Cyber-Physical System (CPS) depends on its behavior, and safety is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described by an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the HP to KeYmaera. The advantage of the approach is that it models a CPS intuitively and verifies its safety rigorously while avoiding state space explosion.
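
    A hybrid system couples continuous dynamics with discrete mode switches. As a purely simulation-based illustration of the kind of safety property at stake (KeYmaera, by contrast, proves safety deductively over all behaviors, not just one trajectory), consider a toy on/off tank controller with a level-safety invariant; the system and bounds are invented for illustration:

    ```python
    def simulate(steps, dt=0.1):
        """Simulate the hybrid tank and return the level trace."""
        level, mode = 5.0, "fill"          # continuous state, discrete mode
        trace = []
        for _ in range(steps):
            rate = 1.0 if mode == "fill" else -1.5
            level += rate * dt             # continuous evolution
            if mode == "fill" and level >= 8.0:
                mode = "drain"             # discrete jump
            elif mode == "drain" and level <= 2.0:
                mode = "fill"
            trace.append(level)
        return trace

    trace = simulate(1000)
    # Safety invariant: the level never leaves the safe band [1.0, 9.0]
    safe = all(1.0 <= x <= 9.0 for x in trace)
    ```

    A single passing trace only provides testing evidence; the value of the deductive approach the paper uses is that it certifies the invariant for every trajectory of the hybrid program.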

  4. Tracer experiment data sets for the verification of local and meso-scale atmospheric dispersion models including topographic effects

    International Nuclear Information System (INIS)

    Sartori, E.; Schuler, W.

    1992-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. More specifically, this article proposes a scheme for acquiring, storing and distributing atmospheric tracer experiment (ATE) data required for the verification of atmospheric dispersion models, especially the most advanced ones, which include topographic effects and are specific to the local and meso-scale. These well documented data sets will form a valuable complement to the set of atmospheric dispersion computer codes distributed internationally. Modellers will be able to gain confidence in the predictive power of their models or to verify their modelling skills. (au)
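
    The baseline against which tracer data sets are often compared is the flat-terrain Gaussian plume; the advanced topographic models the article targets go well beyond it, but the textbook ground-level centreline form is a useful reference point (the parameter values below are arbitrary):

    ```python
    import math

    def plume_glc(q, u, sigma_y, sigma_z, h_eff):
        """Ground-level centreline concentration, Gaussian plume with ground
        reflection: C = Q/(pi*u*sigma_y*sigma_z) * exp(-H^2/(2*sigma_z^2))."""
        return (q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-h_eff ** 2 / (2.0 * sigma_z ** 2)))

    # Unit emission rate, 5 m/s wind, dispersion parameters 50 m and 25 m
    ground_release = plume_glc(1.0, 5.0, 50.0, 25.0, 0.0)   # surface source
    stack_release = plume_glc(1.0, 5.0, 50.0, 25.0, 50.0)   # 50 m effective height
    ```

    Comparing such an analytical baseline against tracer measurements over complex terrain is precisely where the flat-terrain assumption breaks down and the topographic models the data sets are meant to verify become necessary.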

  5. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  6. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
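
    A code verification benchmark based on a manufactured (here, exact analytical) solution, as recommended in both records, checks the observed order of accuracy of a discretization. A minimal sketch for a 1-D Poisson problem: choose u(x) = sin(x) on [0, pi] with u(0) = u(pi) = 0, so the forcing is f(x) = sin(x), then confirm the 3-point scheme exhibits its theoretical second order:

    ```python
    import math

    def solve_poisson(n):
        """Max nodal error of the 3-point scheme for -u'' = sin(x), n interior nodes."""
        h = math.pi / (n + 1)
        x = [(i + 1) * h for i in range(n)]
        d = [math.sin(xi) * h * h for xi in x]  # right-hand side h^2 * f
        # Thomas algorithm for the tridiagonal system -u[i-1]+2u[i]-u[i+1] = d[i]
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = -0.5, d[0] / 2.0
        for i in range(1, n):
            m = 2.0 + cp[i - 1]
            cp[i] = -1.0 / m
            dp[i] = (d[i] + dp[i - 1]) / m
        u = [0.0] * n
        u[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            u[i] = dp[i] - cp[i] * u[i + 1]
        return max(abs(ui - math.sin(xi)) for ui, xi in zip(u, x))

    e1, e2 = solve_poisson(20), solve_poisson(40)
    # Observed order p from error ~ C*h^p, with h1/h2 = 41/21
    order = math.log(e1 / e2) / math.log(41.0 / 21.0)
    ```

    An observed order matching the theoretical one is the acceptance criterion such a benchmark records; a shortfall signals a coding or discretization error even when individual results "look reasonable".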

  7. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  8. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology‐related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  9. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising approximately 2.5-3 wt % of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of ICP-AES analysis of Th. Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to U

  10. The Brookhaven Process Optimization Models

    Energy Technology Data Exchange (ETDEWEB)

    Pilati, D. A.; Sparrow, F. T.

    1979-01-01

The Brookhaven National Laboratory Industry Model Program (IMP) has undertaken the development of a set of industry-specific process-optimization models. These models are to be used for energy-use projections, energy-policy analyses, and process technology assessments. Applications of the models currently under development show that system-wide energy impacts may be very different from engineering estimates, that selected investment tax credits for cogeneration (or other conservation strategies) may have the perverse effect of increasing industrial energy use, and that a proper combination of energy taxes and investment tax credits is more socially desirable than either policy alone. A section is included describing possible extensions of these models to answer other questions or address other systems (e.g., a single plant instead of an entire industry).

  11. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  12. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the ...

  13. Contaminant transport in groundwater in the presence of colloids and bacteria: model development and verification.

    Science.gov (United States)

    Bekhit, Hesham M; El-Kordy, Mohamed A; Hassan, Ahmed E

    2009-09-01

Colloids and bacteria (microorganisms) naturally exist in groundwater aquifers and can significantly impact contaminant migration rates. A conceptual model is first developed to account for the different physicochemical and biological processes, reaction kinetics, and different transport mechanisms of the combined system (contaminant-colloids-bacteria). All three constituents are assumed to be reactive, with the reactions taking place between each constituent and the porous medium and also among the different constituents. A general linear kinetic reaction model is assumed for all reactive processes considered. The mathematical model is represented by fourteen coupled partial differential equations describing mass balance and reaction processes. Two of these equations describe colloid movement and reactions with the porous medium, four equations describe bacterial movement and reactions with colloids and the porous medium, and the remaining eight equations describe contaminant movement and its reactions with bacteria, colloids, and the porous medium. The mass balance equations are numerically solved for two-dimensional groundwater systems using a third-order, total variation diminishing (TVD) scheme for the advection terms. Due to the complex coupling of the equations, they are solved iteratively at each time step until a convergence criterion is met. The model is tested against experimental data and the results are favorable.
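The TVD scheme mentioned above keeps the advection step from creating spurious new extrema. As a rough illustration of the idea (not the paper's actual third-order scheme, which sits inside fourteen coupled equations), here is a minimal sketch of a flux-limited upwind advection step with a minmod limiter; the grid, velocity, and time step are hypothetical illustration values:

```python
# Minimal 1-D TVD advection sketch: minmod-limited second-order upwind
# update for dc/dt + u dc/dx = 0 with u > 0 and periodic boundaries.

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, smaller slope otherwise."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def tvd_advect(c, u, dx, dt):
    """One explicit step; requires Courant number u*dt/dx <= 1."""
    n = len(c)
    nu = u * dt / dx
    flux = [0.0] * n                      # flux[i] = flux through face i-1/2
    for i in range(n):
        im1, im2 = (i - 1) % n, (i - 2) % n
        # limited second-order upwind reconstruction at the cell face
        slope = minmod(c[im1] - c[im2], c[i] - c[im1])
        flux[i] = u * (c[im1] + 0.5 * (1.0 - nu) * slope)
    return [c[i] - (dt / dx) * (flux[(i + 1) % n] - flux[i]) for i in range(n)]
```

One step applied to a square pulse conserves total mass and, because the limiter enforces the TVD property, produces no overshoot above the initial maximum.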

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: UTC FUEL CELLS' PC25C POWER PLANT - GAS PROCESSING UNIT PERFORMANCE FOR ANAEROBIC DIGESTER GAS

    Science.gov (United States)

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system based on the UTC Fuel Cell's PC25C Fuel Cell Power Plant was evaluated. The...

  15. Quality assurance of sterilized products: verification of a model relating frequency of contaminated items and increasing radiation dose

    International Nuclear Information System (INIS)

    Khan, A.A.; Tallentire, A.; Dwyer, J.

    1977-01-01

    Values of the γ-radiation resistance parameters (k and n of the 'multi-hit' expression) for Bacillus pumilus E601 spores and Serratia marcescens cells have been determined and the constancy of values for a given test condition demonstrated. These organisms, differing by a factor of about 50 in k value, have been included in a laboratory test system for use in verification of a model describing the dependence of the proportion of contaminated items in a population of items on radiation dose. The proportions of contaminated units of the test system at various γ-radiation doses have been measured for different initial numbers and types of organisms present in units either singly or together. Using the model, the probabilities of contaminated units for corresponding sets of conditions have been evaluated together with associated variances. Measured proportions and predicted probabilities agree well, showing that the model holds in a laboratory contrived situation. (author)
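A hedged sketch of how such a model can relate dose to the proportion of contaminated items: below, a multi-hit survival curve S(D) = 1 - (1 - e^(-kD))^n is assumed for a single organism, and the N0 organisms initially present in an item are assumed to be inactivated independently. Both the exact functional form and the parameter values are illustrative assumptions, not the paper's fitted values.

```python
import math

def multihit_survival(dose, k, n):
    """Assumed multi-hit single-organism survival probability:
    S(D) = 1 - (1 - exp(-k*D))**n, so S(0) = 1 and S -> 0 at high dose."""
    return 1.0 - (1.0 - math.exp(-k * dose)) ** n

def p_contaminated(dose, k, n, n0):
    """Probability an item carrying n0 organisms still has at least one
    survivor after dose D, assuming independent inactivation."""
    s = multihit_survival(dose, k, n)
    return 1.0 - (1.0 - s) ** n0
```

The abstract's two test organisms differ by a factor of about 50 in k, so in this framing the same dose leaves very different contamination probabilities for items seeded with each organism.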

  16. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
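The per-requirement structure described above can be pictured as a small data model. The sketch below is illustrative only: the class and field names mirror the abstract's terminology, while the example element names and values are hypothetical and not taken from the actual LSST SysML model.

```python
from dataclasses import dataclass, field

@dataclass
class VerificationActivity:
    name: str
    method: str            # e.g. "Test", "Analysis", "Inspection", "Demonstration"

@dataclass
class VerificationPlan:
    """One plan per requirement, as described in the abstract."""
    requirement: str
    verification_requirement: str
    success_criteria: str
    level: str             # e.g. "Subsystem", "System"
    owner: str
    activities: list = field(default_factory=list)

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: list = field(default_factory=list)

# Hypothetical example values:
plan = VerificationPlan(
    requirement="REQ-0042 (hypothetical)",
    verification_requirement="Verify the image quality budget is met",
    success_criteria="Measured PSF within budget",
    level="System",
    owner="Systems Engineering",
)
plan.activities.append(VerificationActivity("On-sky PSF measurement", "Test"))
event = VerificationEvent("Commissioning run 1", activities=list(plan.activities))
```

Mapping each activity onto a scheduled task (as the paper does via Primavera P6) then gives the requirement-to-schedule traceability the abstract describes.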

  17. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    OpenAIRE

    M. A. Lukin

    2014-01-01

The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulas to Spin format, and presentation of counterexamples in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maxi...
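Spin performs this kind of exhaustive interleaving analysis at scale from a Promela model. As a toy illustration of the underlying idea only, the sketch below runs an explicit-state breadth-first search over a hypothetical two-thread automaton and checks a mutual-exclusion safety property; the automaton and property are invented for illustration and are unrelated to the paper's programs.

```python
from collections import deque

def successors(state):
    """Each thread cycles idle -> trying -> critical -> idle, but may only
    enter 'critical' when the other thread is not already there."""
    step = {"idle": "trying", "trying": "critical", "critical": "idle"}
    for tid in (0, 1):
        nxt = step[state[tid]]
        if nxt == "critical" and state[1 - tid] == "critical":
            continue  # guarded transition: the other thread holds the lock
        yield tuple(nxt if i == tid else state[i] for i in (0, 1))

def check_safety(initial, is_bad):
    """Breadth-first search over the reachable state space."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            return False, state      # counterexample state found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None                # property holds on all reachable states

ok, cex = check_safety(("idle", "idle"),
                       lambda s: s == ("critical", "critical"))
```

Because the transition into the critical state is guarded, the bad state is unreachable and the search reports the property as holding.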

  18. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  19. Modeling and experimental verification of thermally induced residual stress in RF-MEMS

    International Nuclear Information System (INIS)

    Somà, Aurelio; Saleem, Muhammad Mubasher

    2015-01-01

    Electrostatically actuated radio frequency microelectromechanical systems (RF-MEMS) generally consist of microcantilevers and clamped–clamped microbeams. The presence of residual stress in these microstructures affects the static and dynamic behavior of the device. In this study, nonlinear finite element method (FEM) modeling and the experimental validation of residual stress induced in the clamped–clamped microbeams and the symmetric toggle RF-MEMS switch (STS) is presented. The formation of residual stress due to plastic deformation during the thermal loading-unloading cycle in the plasma etching step of the microfabrication process is explained and modeled using the Bauschinger effect. The difference between the designed and the measured natural frequency and pull-in voltage values for the clamped–clamped microbeams is explained by the presence of the nonhomogenous tensile residual stress. For the STS switch specimens, three-dimensional (3D) FEM models are developed and the initial deflection at zero bias voltage, observed during the optical profile measurements, is explained by the residual stress developed during the plasma etching step. The simulated residual stress due to the plastic deformation is included in the STS models to obtain the switch pull-in voltage. At the end of the simulation process, a good correspondence is obtained between the FEM model results and the experimental measurements for both the clamped–clamped microbeams and the STS switch specimens. (paper)

  20. DISCRETE DYNAMIC MODEL OF BEVEL GEAR – VERIFICATION THE PROGRAM SOURCE CODE FOR NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-06-01

Full Text Available In the article, a new physical and mathematical model of a bevel gear is presented to study the influence of design parameters and operating factors on the dynamic state of the gear transmission. The process of verifying proper operation of the author's calculation program used to determine the solutions of the dynamic model of the bevel gear is discussed. A block diagram of the computing algorithm that was used to create the program for numerical simulation is presented. The program source code is written in MATLAB, an interactive environment for scientific and engineering calculations.

  1. Modelling and verification of single slope solar still using ANSYS-CFX

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [Research Scholar, Kadi Sarvavishwavidyalaya University, Gandhinagar (India); Shah, P.K. [Principal, Silver Oak College of Engineering and Technology, Ahmedabad (India)

    2011-07-01

Solar distillation is an easy, small-scale and cost-effective technique for providing safe water. It requires an energy input as heat, and solar radiation can be the source of energy. A solar still is a device which uses the process of solar distillation. Here, a two-phase, three-dimensional model was made for the evaporation as well as condensation process in a solar still, using ANSYS CFX to simulate the present model. Simulation results of the solar still were compared with actual experimental data of a single-basin solar still at the climate conditions of Mehsana (23°12' N, 72°30'). There is good agreement between the experimental and simulation results for distillate output, water temperature and heat transfer coefficients. Overall, the study shows that ANSYS CFX is a powerful tool for diagnosis as well as analysis of solar stills.

  2. MARATHON Verification (MARV)

    Science.gov (United States)

    2017-08-01

the verification and "lessons learned" from the semantic and technical issues we discovered as we implemented the approach. SUBJECT TERMS: ...any programming language in use at CAA for modeling or other data analysis applications, to include R, Python, Scheme, Common Lisp, Julia, Mathematica

  3. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  4. Distinct neural correlates for pragmatic and semantic meaning processing: An event-related potential investigation of scalar implicature processing using picture-sentence verification

    Science.gov (United States)

    Politzer-Ahles, Stephen; Fiorentino, Robert; Jiang, Xiaoming; Zhou, Xiaolin

    2014-01-01

    The present study examines the brain-level representation and composition of meaning in scalar quantifiers (e.g., some), which have both a semantic meaning (at least one) and a pragmatic meaning (not all). We adopted a picture-sentence verification design to examine event-related potential (ERP) effects of reading infelicitous quantifiers for which the semantic meaning was correct with respect to the context but the pragmatic meaning was not, compared to quantifiers for which the semantic meaning was inconsistent with the context and no additional pragmatic meaning is available. In the first experiment, only pragmatically inconsistent quantifiers, not semantically inconsistent quantifiers, elicited a sustained posterior negative component. This late negativity contrasts with the N400 effect typically elicited by nouns that are incongruent with their context, suggesting that the recognition of scalar implicature errors elicits a qualitatively different ERP signature than the recognition of lexico-semantic errors. We hypothesize that the sustained negativity reflects cancellation of the pragmatic inference and retrieval of the semantic meaning. In our second experiment, we found that the process of re-interpreting the quantifier was independent from lexico-semantic processing: the N400 elicited by lexico-semantic violations was not modulated by the presence of a pragmatic inconsistency. These findings suggest that inferential pragmatic aspects of meaning are processed using different mechanisms than lexical or combinatorial semantic aspects of meaning, that inferential pragmatic meaning can be realized rapidly, and that the computation of meaning involves continuous negotiation between different aspects of meaning. PMID:23103410

  5. Adsorption and biodegradation of 2-chlorophenol by mixed culture using activated carbon as a supporting medium-reactor performance and model verification

    Science.gov (United States)

    Lin, Yen-Hui

    2017-11-01

A non-steady-state mathematical model for the kinetics of adsorption and biodegradation of 2-chlorophenol (2-CP) by attached and suspended biomass in an activated carbon process was derived. The mechanisms in the model included 2-CP adsorption by activated carbon, 2-CP mass-transport diffusion in the biofilm, and biodegradation by attached and suspended biomass. Batch kinetic tests were performed to determine the surface diffusivity of 2-CP, adsorption parameters for 2-CP, and biokinetic parameters of the biomass. Experiments were conducted using a biological activated carbon (BAC) reactor system with a high recycle rate to approximate a completely mixed flow reactor for model verification. Concentration profiles of 2-CP from model predictions indicated that the biofilm bioregenerated the activated carbon by lowering the 2-CP concentration at the biofilm-activated carbon interface as the biofilm grew thicker. The removal efficiency of 2-CP by biomass was approximately 98.5% when the 2-CP concentration in the influent was around 190.5 mg L-1 at steady state. The concentration of suspended biomass reached up to about 25.3 mg L-1, while the thickness of attached biomass was estimated by model prediction to be 636 μm at steady state. The experimental results agree closely with the model predictions.
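As a hedged sketch of one ingredient of such a model, the code below integrates batch Monod kinetics for the suspended biomass with explicit Euler time stepping. All parameter values are hypothetical illustrations; the paper's full model additionally couples adsorption and biofilm diffusion, which are omitted here.

```python
def simulate(s0, x0, mu_max, ks, yield_coeff, dt, steps):
    """Batch Monod kinetics: dS/dt = -(mu/Y)*X, dX/dt = mu*X,
    with specific growth rate mu = mu_max * S / (Ks + S)."""
    s, x = s0, x0
    for _ in range(steps):
        mu = mu_max * s / (ks + s)
        s = max(s - (mu / yield_coeff) * x * dt, 0.0)  # substrate consumed
        x = x + mu * x * dt                            # biomass grows
    return s, x

# Hypothetical parameters; s0 echoes the ~190.5 mg/L influent in the abstract.
s_end, x_end = simulate(s0=190.5, x0=5.0, mu_max=0.2, ks=20.0,
                        yield_coeff=0.6, dt=0.01, steps=2000)
```

Over the simulated interval the substrate concentration falls while biomass grows, the qualitative behavior the reactor experiments exhibit.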

  6. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

Full Text Available The article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to calculate financial operations with the help of the built-in functions.

  7. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  8. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

Since it is difficult to measure absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Elemental inks used as the 3D printer material were selected to correspond to mouse tissue. To represent lung, the selected material was partially used with an air layer. To verify material equivalence, a super-flex bolus was used for a simple comparison of photon attenuation characteristics. In the case of lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies

  9. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

Since it is difficult to measure absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Elemental inks used as the 3D printer material were selected to correspond to mouse tissue. To represent lung, the selected material was partially used with an air layer. To verify material equivalence, a super-flex bolus was used for a simple comparison of photon attenuation characteristics. In the case of lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies.

  10. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  11. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  12. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  13. Verification and Validation of Encapsulation Flow Models in GOMA, Version 1.1; TOPICAL

    International Nuclear Information System (INIS)

    MONDY, LISA ANN; RAO, REKHA R.; SCHUNK, P. RANDALL; SACKINGER, PHILIP A.; ADOLF, DOUGLAS B.

    2001-01-01

    Encapsulation is a common process used in manufacturing most non-nuclear components including: firing sets, neutron generators, trajectory sensing signal generators (TSSGs), arming, fusing and firing devices (AF and Fs), radars, programmers, connectors, and batteries. Encapsulation is used to contain high voltage, to mitigate stress and vibration and to protect against moisture. The purpose of the ASCI Encapsulation project is to develop a simulation capability that will allow us to aid in the encapsulation design process, especially for neutron generators. The introduction of an encapsulant poses many problems because of the need to balance ease of processing and properties necessary to achieve the design benefits such as tailored encapsulant properties, optimized cure schedule and reduced failure rates. Encapsulants can fail through fracture or delamination as a result of cure shrinkage, thermally induced residual stresses, voids or incomplete component embedding and particle gradients. Manufacturing design requirements include (1) maintaining uniform composition of particles in order to maintain the desired thermal coefficient of expansion (CTE) and density, (2) mitigating void formation during mold fill, (3) mitigating cure and thermally induced stresses during cure and cool down, and (4) eliminating delamination and fracture due to cure shrinkage/thermal strains. The first two require modeling of the fluid phase, and it is proposed to use the finite element code GOMA to accomplish this. The latter two require modeling of the solid state; however, ideally the effects of particle distribution would be included in the calculations, and thus initial conditions would be set from GOMA predictions. These models, once they are verified and validated, will be transitioned into the SIERRA framework and the ARIA code. This will facilitate exchange of data with the solid mechanics calculations in SIERRA/ADAGIO

  14. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) project and the IEA Wind Task 30 Offshore Code Comparison Collaboration Continuation (OC4) project. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.

  15. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  16. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    International Nuclear Information System (INIS)

    Stephens, Michael B.; Simeonov, Assen; Isaksson, Hans

    2008-12-01

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  17. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

    Background: The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods: We surveyed the literature, including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals, in order to determine the value of nonhuman animal models when studying conserved processes. Results: Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion: We conclude that even the presence of conserved processes is

  18. Animal models and conserved processes.

    Science.gov (United States)

    Greek, Ray; Rice, Mark J

    2012-09-10

    The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? We surveyed the literature, including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals, in order to determine the value of nonhuman animal models when studying conserved processes. Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response

  19. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs, because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches for the various stages of verification: register-transfer level (RTL), gate level, and place and route. Writing test benches is considerably time consuming and requires a great deal of effort to achieve satisfactory results, and performing verification at each stage separately is a major bottleneck demanding many activities and much time. Verification is arguably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP, which verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models. We introduce and discuss how this integrated verification technique facilitates the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  20. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs, because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches for the various stages of verification: register-transfer level (RTL), gate level, and place and route. Writing test benches is considerably time consuming and requires a great deal of effort to achieve satisfactory results, and performing verification at each stage separately is a major bottleneck demanding many activities and much time. Verification is arguably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP, which verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models. We introduce and discuss how this integrated verification technique facilitates the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  1. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  2. Mathematical framework for optimizing chest image processing; application of super-resolution (SR) techniques to fluence maps for dose verification of IMRT treatments

    Energy Technology Data Exchange (ETDEWEB)

    Cortes Rodicio, J.; Jimenez Albericio, F. J.; Laliena Bielsa, V.; Canellas Anoz, M.; Millan Cebrian, E.; Hernandez Vitoria, A.; Garcia Romero, A.; Ortega Pardina, P.

    2011-07-01

    Super-resolution (SR) techniques have been applied in many fields of image processing, such as remote sensing, security cameras and medical imaging, exploiting the information content of a set of images of the same scene in order to increase the effective sampling frequency and obtain a high-resolution (HR) image. This requires that the capture point be different in each image, so that sub-pixel displacements exist between them. Using these techniques, the resolution provided by a two-dimensional array of detectors (Matrixx) in the verification of fluence maps for IMRT treatments has been improved.
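The sub-pixel displacement requirement can be illustrated with a minimal 1-D "shift-and-add" sketch (illustrative only; the record's actual SR method and detector data are not reproduced here):

```python
def shift_and_add_1d(low_res_images, offsets, factor):
    """Place each low-resolution sampling onto a grid `factor` times finer,
    using its known sub-pixel offset, and average overlapping contributions."""
    n = len(low_res_images[0]) * factor
    acc = [0.0] * n
    counts = [0] * n
    for image, offset in zip(low_res_images, offsets):
        for i, value in enumerate(image):
            j = i * factor + offset  # position on the fine grid
            acc[j] += value
            counts[j] += 1
    # average where several images contributed; leave unsampled positions at 0
    return [a / c if c else 0.0 for a, c in zip(acc, counts)]

# Two samplings of the fine signal [0..5], displaced by one fine pixel,
# recover the full fine signal when interleaved
lr_a = [0, 2, 4]  # even fine-grid positions
lr_b = [1, 3, 5]  # odd fine-grid positions
hr = shift_and_add_1d([lr_a, lr_b], [0, 1], 2)  # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
```

With identical offsets the two samplings would carry redundant information, which is why the capture point must differ between acquisitions.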

  3. [Verification of the VEF photon beam model for dose calculations by the Voxel-Monte-Carlo-Algorithm].

    Science.gov (United States)

    Kriesen, Stephan; Fippel, Matthias

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning.

  4. Verification of the VEF photon beam model for dose calculations by the voxel-Monte-Carlo-algorithm

    International Nuclear Information System (INIS)

    Kriesen, S.; Fippel, M.

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tuebingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning. (orig.)

  5. Migration of 90Sr, 137Cs and Pu in soils. Verification of a computer model on the behaviour of these radiocontaminants in soils of Western Europe

    International Nuclear Information System (INIS)

    Frissel, M.J.; Poelstra, P.; Klugt, N. van der.

    1980-01-01

    The main emphasis in 1979 was on the 239,240Pu model for simulating translocations in soil. The verification was hampered because data for 239Pu were available from only two locations. A comparison between the observed and predicted Pu distributions, however, indicated the possibility of using the available simulation approach for 239,240Pu. (Auth.)

  6. Simulating and Predicting Cereal Crop Yields in Ethiopia: Model Calibration and Verification

    Science.gov (United States)

    Yang, M.; Wang, G.; Ahmed, K. F.; Eggen, M.; Adugna, B.; Anagnostou, E. N.

    2017-12-01

    Agriculture in developing countries is extremely vulnerable to climate variability and change. In East Africa, most people live in rural areas with outdated agricultural techniques and infrastructure. Smallholder agriculture continues to play a key role in this area, and the rate of irrigation is among the lowest in the world. As a result, seasonal and inter-annual weather patterns play an important role in the spatiotemporal variability of crop yields. This study investigates how various climate variables (e.g., temperature, precipitation, sunshine) and agricultural practices (e.g., fertilization, irrigation, planting date) influence cereal crop yields using a process-based model (DSSAT) and statistical analysis, focusing on the Blue Nile Basin of Ethiopia. The DSSAT model is driven with meteorological forcing from ECMWF's latest reanalysis product, which covers the past 35 years; the statistical model is developed by linking the same meteorological reanalysis data with harvest data at the woreda level from the Ethiopian national dataset. Results from this study will set the stage for the development of a seasonal prediction system for weather and crop yields in Ethiopia, which will serve multiple sectors in coping with the agricultural impact of climate variability.

  7. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both aspects of hardware and software; however there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are being suggested every year by industry and academic researchers which claim that they can improve accuracy of measurements [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way for evaluation of these newly developed systems and algorithms is through hardware implementation which can be both expensive and time consuming. This paper presents an accurate Simulink based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as productivity enhancement tool during their research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.
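As background to what such a behavioural model must reproduce, the classic "ratio of ratios" relation from the pulse-oximetry literature can be sketched as follows (the linear calibration constants below are typical textbook values, not those of the paper's Simulink model):

```python
def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """R = (AC_red/DC_red) / (AC_ir/DC_ir) from the red and infrared
    photoplethysmogram channels."""
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2_estimate(r):
    """Common textbook linear calibration, SpO2 ~ 110 - 25*R; real oximeters
    use empirically fitted, device-specific calibration curves."""
    return 110.0 - 25.0 * r

# Illustrative channel amplitudes giving R = 0.5
r = ratio_of_ratios(0.01, 1.0, 0.02, 1.0)
```

A behavioural model lets such relations be exercised with synthetic signals before committing to hardware.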

  8. THE FLOOD RISK IN THE LOWER GIANH RIVER: MODELLING AND FIELD VERIFICATION

    Directory of Open Access Journals (Sweden)

    NGUYEN H. D.

    2016-03-01

    Problems associated with flood risk represent a highly topical issue in Vietnam. The case of the lower Gianh River in the central area of Vietnam, with a watershed area of 353 km², is particularly interesting. In this area, periodically subject to flood risk, the scientific question is strongly linked to risk management. Flood risk is the consequence of the hydrological hazard of an event and the damage related to this event. For this reason, our approach is based on hydrodynamic modelling using Mike Flood to simulate the runoff during a flood event, although the data available in the studied area are quite limited. Our computation of the flood risk is based on a three-step modelling process, using rainfall data from 8 stations, cross sections, the topographic map and the land-use map. The first step consists of creating a 1-D model using Mike 11 to simulate the runoff in the minor river bed. In the second step, we use Mike 21 to create a 2-D model to simulate the runoff in the flood plain. The last step couples the two models in order to precisely describe the variables for the hazard analysis in the flood plain (the water level, the speed, and the extent of the flooding). Moreover, the model is calibrated and verified using observational data of the water level at hydrologic stations and field control data (flood height measurements on the one hand, and interviews with the community and local councillors on the other). We then generate GIS maps to improve flood hazard management, creating flood hazard maps by coupling the flood plain map and the runoff speed map. Our results show that the flood peak caused by typhoon Nari reached more than 6 m on October 16th 2013 at 4 p.m. (the flooded area extended over 149 km²), and that the typhoon constitutes an extreme flood hazard for 11.39% of the area, a very high hazard for 10.60%, a high hazard for 30.79%, a medium hazard for 31.91% and a light flood hazard for 15

  9. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    International Nuclear Information System (INIS)

    Amna, S; Samreen, N; Khalid, B; Shamim, A

    2013-01-01

    Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events having a significant impact on humans, including all socioeconomic activities and health issues, which change according to the climatic conditions of the area. Climate forecasting is of prime importance for awareness of future climatic changes, in order to mitigate them. The study used an Ensemble Prediction System (EPS) for the purpose of modelling a seasonal weather hind-cast of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data incorporated constituted statistical temperature records of 32 years for the months of June, July and August. This study was based on the EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts by using the standard probabilistic measures of Brier Score, Brier Skill Score, Cross Validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models gave significant results by omitting particular initial conditions.
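The Brier Score and Brier Skill Score used for verification have simple closed forms; a minimal sketch with made-up forecast probabilities (not the study's data):

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes;
    lower is better, 0 is a perfect forecast."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to a climatological forecast of the observed base rate;
    values above 0 beat climatology."""
    base_rate = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([base_rate] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / bs_ref

# Hypothetical ensemble-derived heat-wave probabilities vs. observed events
probs = [0.9, 0.7, 0.2, 0.1, 0.8]
events = [1, 1, 0, 0, 1]
bs = brier_score(probs, events)
bss = brier_skill_score(probs, events)
```

Comparing each model's scores over the hind-cast period is what allows one model to be ranked most suitable for a given station.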

  10. Implementation and verification of a coupled fire model as a thermal boundary condition within P3/THERMAL

    International Nuclear Information System (INIS)

    Hensinger, D.M.; Gritzo, L.A.; Koski, J.A.

    1996-01-01

    A user-defined boundary condition subroutine has been implemented within P3/THERMAL to represent the heat flux between a noncombusting object and an engulfing fire. The heat flux calculation includes a simple 2D fire model in which energy and radiative heat transport equations are solved to produce estimates of the heat fluxes at the fire-object interface. These estimates reflect the radiative coupling between a cold object and the flow of hot combustion gases that has been observed in fire experiments. The model uses a database of experimental pool fire measurements for far-field boundary conditions and volumetric heat release rates. Taking into account the coupling between a structure and the fire is an improvement over the σT⁴ approximation frequently used as a boundary condition for engineered system response, and is a preliminary step in the development of a fire model with predictive capability. This paper describes the implementation of the fire model as a P3/THERMAL boundary condition and presents the results of a verification calculation carried out using the model.
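The σT⁴ boundary condition that the coupled model improves upon can be written down directly; a minimal sketch with an assumed emissivity and fire temperature (the coupled 2D fire model itself is not reproduced here):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def sigma_t4_flux(t_fire_k, t_obj_k, emissivity=0.9):
    """Uncoupled sigma-T^4 fire boundary condition: net radiative flux into
    the object. This ignores the fire/object coupling that the P3/THERMAL
    subroutine described above accounts for."""
    return emissivity * SIGMA * (t_fire_k ** 4 - t_obj_k ** 4)

# An assumed ~1273 K engulfing fire onto a 300 K object
flux = sigma_t4_flux(1273.0, 300.0)  # roughly 1.3e5 W/m^2
```

Because the flux depends only on the two temperatures, this approximation cannot reflect how a cold object locally cools the adjacent combustion gases, which is the effect the coupled model captures.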

  11. A contribution to the electron-beam surface-melting process of metallic materials. Numerical simulation and experimental verification

    International Nuclear Information System (INIS)

    Bruckner, A.

    1996-08-01

    For the optimization of the surface-melting process it is necessary to carry out many different experiments; simulation of the process therefore plays a major role in its optimization. Most simulations developed for laser surface melting are not usable for electron-beam surface melting, because of the different energy input and the possibility of high-frequency movement of the electron beam. In this thesis, a calculation model for electron-beam surface melting is presented. For this numerical simulation a variable volume source is used, which moves in the axial direction into the material with the same velocity as the vapor cavity. With this calculation model the high-frequency movement of the electron beam can also be taken into account. The electron-beam diameter is measured by drilling holes in thin foils with short electron-beam pulses. The diameter of the holes depends on the pulse length and reaches a maximum value, which is used as the diameter of the volume source in the calculation. The crack formation seen in many treated surfaces is examined with acoustic-emission testing. The possibilities of the electron-beam surface-melting process are demonstrated with experiments addressing different requirements on treated surfaces, such as increasing hardness, reducing the porosity of a sintered material, and alloying tin into an aluminium-silicon surface. (author)

  12. Application of Steinberg vibration fatigue model for structural verification of space instruments

    Science.gov (United States)

    García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo

    2018-01-01

    Electronic components in spaceships are subjected to vibration loads during the ascent phase of the launcher. It is important to verify by tests and analysis that all parts can survive in the most severe load cases. The purpose of this paper is to present the methodology and results of the application of the Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on the use of the FEM of the entire instrument to calculate the relative displacement RDSD and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with the Steinberg's fatigue damage equation and the Miner's cumulative fatigue index. The estimations are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and the functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.
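Steinberg's three-band technique combined with Miner's rule can be sketched as follows (the sigma-level time fractions are the standard Gaussian values; the S-N curve form and its parameters are placeholders, not the EPT-HET values):

```python
def cycles_to_failure(stress, s_ref, n_ref, b):
    """Basquin-type S-N curve: N = n_ref * (s_ref / S)^b (placeholder parameters)."""
    return n_ref * (s_ref / stress) ** b

def steinberg_miner_damage(rms_stress, fn_hz, duration_s, s_ref, n_ref, b):
    """Miner cumulative damage index via Steinberg's three-band method:
    a Gaussian random response spends ~68.3 / 27.1 / 4.33 % of the time
    at the 1/2/3-sigma levels."""
    total_cycles = fn_hz * duration_s  # cycles accumulated at the natural frequency
    bands = [(1.0, 0.683), (2.0, 0.271), (3.0, 0.0433)]
    damage = 0.0
    for sigma_level, time_fraction in bands:
        applied = time_fraction * total_cycles
        damage += applied / cycles_to_failure(sigma_level * rms_stress, s_ref, n_ref, b)
    return damage  # Steinberg suggests keeping this index below ~0.7

# Illustrative run: 50 MPa RMS response at 100 Hz for a 120 s random vibration test
d = steinberg_miner_damage(50.0, 100.0, 120.0, s_ref=400.0, n_ref=1000.0, b=6.4)
```

The damage index is linear in test duration, which is why a short qualification test can be scaled to bound the fatigue consumed over the full launch environment.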

  13. Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling

    Science.gov (United States)

    Virozub, Alexander; Rasin, Igal G.; Brandon, Simon

    2008-12-01

    Methods for estimating growth angle (θgr) values, based on the a posteriori analysis of directionally solidified material (e.g. drops), often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assuming these values to be constant, we use them as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θgr=0∘, yields excellent agreement with experimental profiles using our estimated values for silicon (θgr=10∘) and germanium (θgr=14.3∘) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.

  14. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existing theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed. (ABSTRACT TRUNCATED AT 250 WORDS)

  15. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary and partial differential equations; a background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use the codes in order to test biological hypotheses.

  16. Modeling and verification of the diffraction-limited visible light telescope aboard the solar observing satellite HINODE

    Science.gov (United States)

    Katsukawa, Y.; Suematsu, Y.; Tsuneta, S.; Ichimoto, K.; Shimizu, T.

    2011-09-01

    HINODE, Japanese for "sunrise", is a spacecraft dedicated to observations of the Sun; it was launched in 2006 to study the Sun's magnetic fields and how their explosive energies propagate through the different atmospheric layers. The spacecraft carries the Solar Optical Telescope (SOT), which has a 50 cm diameter clear aperture and provides a continuous series of diffraction-limited visible light images from space. The telescope was developed through international collaboration between Japan and the US. In order to achieve the diffraction-limited performance, thermal and structural modeling of the telescope was used extensively in its development phase to predict how the optical performance changes depending on the thermal conditions in orbit. Beyond the modeling, we devoted considerable effort to verifying the optical performance in ground tests before the launch. The verification in the ground tests helped us to find many issues, such as temperature-dependent focus shifts, which could not have been identified through the thermal-structural modeling alone. Another critical issue was micro-vibrations induced by internal disturbances of the mechanical gyroscopes and momentum wheels used for attitude control of the spacecraft. Because the structural modeling was not accurate enough to predict how much the image quality would be degraded by the micro-vibrations, we measured their transmission in a spacecraft-level test.

  17. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  18. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  20. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling of the pellet impact drilling process, which creates the scientific and methodological basis for the engineering design of drilling operations under different geo-technical conditions.

  1. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve ... that satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights to the solution of IPDC problems in chemical engineering practice.

  2. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H.

    2006-09-01

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid and entrained liquid, and all three are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer as well as momentum transfer. Fluid/structure interactions generally include both heat and momentum transfer; assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. The basic pilot code has been verified using 10 conceptual problems. The results of the verification are summarized below. It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly and that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability; a further study would be required to enhance the code's capability in this regard.

  3. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using the equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides (H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238) were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. The other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code began with a check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel for all the major portions of the code; in some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-Particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  4. Remaining Sites Verification Package for the 100-C-9:1 Main Process Sewer Collection Line, Waste Site Reclassification Form 2004-012

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-06-11

    The 100-C-9:1 main process sewer pipeline, also known as the twin box culvert, was a dual reinforced process sewer that collected process effluent from the 183-C and 190-C water treatment facilities, discharging at the 132-C-2 Outfall. For remedial action purposes, the 100-C-9:1 waste site was subdivided into northern and southern sections. The 100-C-9:1 subsite has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.

  5. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming, and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  6. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling.

  7. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from remote sources, for example via the vertical convection of boundary-layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, and freedom from chemical production or loss and from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.
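
    The tracer argument in this record can be demonstrated with a toy column model. The layer count, mixing fraction and source strength below are arbitrary assumptions; only the Rn222 half-life (about 3.8 days) and the source/decay budget logic reflect the physics described above. Because radon is chemically inert and not scavenged, its steady-state column burden is fixed entirely by the surface source and radioactive decay, no matter how the transport scheme redistributes it, which is what makes it a clean check on a model's transport component.

```python
import math

HALF_LIFE_S = 3.8 * 86400                  # approximate Rn-222 half-life
decay = math.log(2) / HALF_LIFE_S          # decay constant (1/s)

n_layers, dt = 20, 600.0                   # vertical layers, time step (s); assumed
source = 1.0                               # tracer added to layer 0 per step; assumed
mixing = 0.25                              # exchange fraction between layers; assumed
a = math.exp(-decay * dt)                  # per-step decay factor
conc = [0.0] * n_layers

for _ in range(20000):                     # integrate to near steady state
    # simple diffusive exchange between neighbouring layers (conserves mass)
    flux = [mixing * (conc[i] - conc[i + 1]) for i in range(n_layers - 1)]
    for i, f in enumerate(flux):
        conc[i] -= f
        conc[i + 1] += f
    conc[0] += source                      # surface source
    conc = [c * a for c in conc]           # radioactive decay

burden = sum(conc)
expected = source * a / (1 - a)            # analytic steady-state budget
print(abs(burden - expected) / expected)
```

    Whatever mixing scheme is used, the total burden must converge to the same source/decay balance; a transport scheme that fails this check is not conserving the tracer.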

  8. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we briefly mention some of the recent investigations in this direction, while concentrating mainly on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices are discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing a toy model in detail, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  9. Verification SEBAL and Hargreaves –Samani Models to Estimate Evapotranspiration by Lysimeter Data

    Directory of Open Access Journals (Sweden)

    Ali Morshedi

    2017-02-01

    .272 mm/day and 0.700 for the d index, respectively. Similar indices for the Hargreaves-Samani model were 1.003, 0.580 and 0.290 mm/day, and 0.917 for the d index. For the HS model, the results show that the RMSE, MAE and MBE values were 0.813, 0.477 and 0.206 mm/day, and 0.930 for the d index, over the entire growing period (185 days). Conclusion: Overall, the results demonstrated the efficiency and reliability of the SEBAL model, which processes the satellite visible, near-infrared and thermal-infrared bands. Irrigation water requirements and ET estimates vary during the growth of different plants, so the complete time series of satellite imagery is required to estimate total and annual evapotranspiration.

  10. Dynamic CT myocardial perfusion imaging: detection of ischemia in a porcine model with FFR verification

    Science.gov (United States)

    Fahmi, Rachid; Eck, Brendan L.; Vembar, Mani; Bezerra, Hiram G.; Wilson, David L.

    2014-03-01

    Dynamic cardiac CT perfusion (CTP) is a high-resolution, non-invasive technique for assessing myocardial blood flow (MBF), which in concert with coronary CT angiography enables CT to provide a unique, comprehensive, fast analysis of both coronary anatomy and functional flow. We assessed perfusion in a porcine model with and without coronary occlusion. To induce occlusion, each animal underwent left anterior descending (LAD) stent implantation and angioplasty balloon insertion. The normal flow condition was obtained with the balloon completely deflated. Partial occlusion was induced by balloon inflation against the stent, with FFR used to assess the extent of occlusion. Prospective ECG-triggered partial scan images were acquired at end systole (45% R-R) using a multi-detector CT (MDCT) scanner. Images were reconstructed using FBP and a hybrid iterative reconstruction (iDose4, Philips Healthcare). Processing included: beam hardening (BH) correction, registration of image volumes using 3D cubic B-spline normalized mutual information, and spatio-temporal bilateral filtering to reduce partial scan artifacts and noise variation. Absolute blood flow was calculated with a deconvolution-based approach using singular value decomposition (SVD). The arterial input function was estimated from the left ventricle (LV) cavity. Regions of interest (ROIs) were identified in healthy and ischemic myocardium and compared in normal and occluded conditions. Under-perfusion was detected in the correct LAD territory and the flow reduction agreed well with FFR measurements. Flow was reduced, on average, in LAD territories by 54%.
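
    The deconvolution step can be illustrated with synthetic curves. This is a hedged sketch of truncated-SVD deconvolution in the spirit of the approach named above, not the authors' pipeline: the gamma-variate input function, exponential residue model, flow value and truncation threshold are all illustrative assumptions (noisy clinical data typically needs a much larger threshold, around 20% of the largest singular value).

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, threshold=0.2):
    """Recover k(t) = MBF * R(t) from tissue = dt * conv(aif, k) via truncated SVD."""
    n = len(aif)
    # causal (lower-triangular) convolution matrix built from the AIF samples
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > threshold * s.max(), 1.0 / s, 0.0)  # truncate small modes
    return Vt.T @ (s_inv * (U.T @ tissue))

dt = 0.5                                      # s per sample
t = np.arange(0, 30, dt)
aif = t ** 3 * np.exp(-t / 1.5)               # synthetic arterial input function
true_flow, mtt = 0.02, 4.0                    # assumed flow and mean transit time
tissue = dt * np.convolve(aif, true_flow * np.exp(-t / mtt))[: len(t)]

# noiseless synthetic data: keep nearly all singular components
k = svd_deconvolve(aif, tissue, dt, threshold=0.001)
flow_est = k.max()                            # flow estimate = peak of k(t)
print(f"estimated flow {flow_est:.4f} vs true {true_flow}")
```

    The peak of the recovered curve gives the flow estimate, while its time-integral gives the blood volume; the integral is much less sensitive to the truncation level than the peak.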

  11. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

    Full Text Available The simulation of robot systems is becoming very popular, especially with the lowering of the cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to find the solution to the inverse kinematics problem. This angle can be optimized with respect to a given key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle in the task space). Modeling and simulation of robots can be achieved using any of the following models: the geometric model (positions, postures), the kinematic model and the dynamic model. To do so, the modelling of a 2-R robot type is implemented. Our main tasks are comparing two robot postures following the same trajectory (path) over the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. This could easily be generalized to a 3-R robot and therefore, potentially, to any serial robot (SCARA, PUMA, etc.). The verification of the results obtained in both packages allows us to qualitatively evaluate and underline the validity of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two packages is certainly obtained.
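
    The two postures compared in such studies correspond to the two closed-form inverse-kinematics branches of a planar 2-R arm ("elbow-up" and "elbow-down"). The sketch below, with assumed link lengths and target, checks both branches by round-tripping through the forward model, a much simpler analogue of the cross-checking done between SolidWorks and MATLAB/Simulink.

```python
import math

def forward(l1, l2, q1, q2):
    """End-effector position for joint angles q1, q2 (radians)."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

def inverse(l1, l2, x, y, elbow_up=True):
    """Closed-form planar 2-R inverse kinematics; raises if unreachable."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target outside workspace")
    s2 = math.sqrt(1 - c2 * c2) * (1 if elbow_up else -1)   # branch choice
    q2 = math.atan2(s2, c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return q1, q2

l1, l2 = 1.0, 0.8          # assumed link lengths
target = (1.2, 0.9)        # assumed target point
for posture in (True, False):          # the two postures
    q1, q2 = inverse(l1, l2, *target, elbow_up=posture)
    x, y = forward(l1, l2, q1, q2)
    assert abs(x - target[0]) < 1e-9 and abs(y - target[1]) < 1e-9
print("both postures reach the target")
```

    Both joint-angle solutions place the end-effector on the same point; they differ in the sign of the elbow angle, which is exactly what makes the two postures comparable along one trajectory.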

  12. Preliminary verification results of the DWD limited area model LME and evaluation of its storm forecasting skill over the area of Cyprus

    Directory of Open Access Journals (Sweden)

    A. Orphanou

    2006-01-01

    Full Text Available A preliminary verification and evaluation of the forecast fields of the non-hydrostatic limited-area model LME of the German Weather Service (DWD) is made for a recent three-month period. For this purpose, observations from two synoptic stations in Cyprus are utilized. In addition, days with depressions over the area were selected in order to evaluate the model's skill in storm forecasting.

  13. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases becomes increasingly important. The conventional Petri net approach, studied recently for knowledge base verification, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
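
    The reachability analysis mentioned above can be illustrated on an ordinary place/transition net; the enhanced colored nets of the paper add typed tokens and guards on top of this basic scheme. The toy net below (a single rule firing on two premises) is an illustrative assumption, not the paper's alarm-processing knowledge base.

```python
from collections import deque

def fire(marking, pre, post):
    """Yield the successor marking of every enabled transition."""
    for t, inputs in pre.items():
        if all(marking[p] >= w for p, w in inputs.items()):   # enabled?
            m = dict(marking)
            for p, w in inputs.items():
                m[p] -= w          # consume input tokens
            for p, w in post[t].items():
                m[p] += w          # produce output tokens
            yield m

def reachable(initial, pre, post):
    """Breadth-first exploration of the reachability set."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for nxt in fire(m, pre, post):
            key = tuple(sorted(nxt.items()))
            if key not in seen:
                seen.add(key)
                queue.append(nxt)
    return seen

# Toy knowledge-base rule "alarm AND sensor -> diagnosis" as one transition
pre = {"t1": {"alarm": 1, "sensor": 1}}
post = {"t1": {"diagnosis": 1}}
initial = {"alarm": 1, "sensor": 1, "diagnosis": 0}
marks = reachable(initial, pre, post)
print(len(marks))  # 2 reachable markings: initial and after firing t1
```

    In knowledge base verification, unreachable markings flag rules that can never fire; the enumeration above is what makes such dead rules, and conflicting conclusions, detectable.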

  14. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  15. Presentation and verification of a simple mathematical model for identification of the areas behind a noise barrier with the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

    Full Text Available Background and aims   Traffic noise barriers are the most important measure for controlling environmental noise pollution. Diffraction over the top edge of a noise barrier is the most important path by which indirect sound waves reach the receiver; therefore, most studies are focused on improvements of this kind.   Methods   T-shape profile barriers are among the most successful of the many different profiles. This investigation uses the destructive interference between the wave diffracted from the real edge of the barrier and the wave diffracted from the edge of the barrier's image, with a phase difference of π radians. First, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss, based on the destructive effect of the indirect path via the barrier image, is introduced; two different profile barriers, reflective and absorptive, are then used for verification of the introduced model.   Results   The results are compared with the results of a verified two-dimensional boundary element method at 1/3-octave-band frequencies in a wide field behind those barriers. A very good agreement between the results has been achieved. In this method an effective height is used for barriers of any profile.   Conclusion   The introduced model is very simple, flexible and fast, and can be used for choosing the best location of rigid and absorptive profile barriers to achieve the highest performance.
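
    The destructive-interference idea can be made concrete with a small geometry calculation: two diffracted paths (via the real barrier edge and via its image) cancel when their phase difference is an odd multiple of π, i.e. when the path-length difference is an odd number of half-wavelengths. The source, receiver and edge coordinates below are illustrative assumptions, not data from the paper.

```python
import math

c = 343.0                        # speed of sound in air, m/s
src, rcv = (-5.0, 0.5), (10.0, 1.0)   # assumed source and receiver positions
edge_real = (0.0, 3.0)           # top edge of the barrier (assumed height)
edge_image = (0.0, -3.0)         # edge of the mirrored (image) barrier

# path lengths source -> edge -> receiver for the two diffracted waves
d_real = math.dist(src, edge_real) + math.dist(edge_real, rcv)
d_image = math.dist(src, edge_image) + math.dist(edge_image, rcv)
delta = abs(d_image - d_real)    # path-length difference

# cancellation when delta = (2m + 1) * lambda / 2  ->  f = (2m + 1) * c / (2 * delta)
cancel_freqs = [(2 * m + 1) * c / (2 * delta) for m in range(3)]
print([round(f, 1) for f in cancel_freqs])
```

    Scanning receiver positions with this relation identifies the zones where the two contributions arrive π out of phase over a band of interest, which is the map the simple model above provides.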

  16. Kinematic Modeling and Simulation of a 2-R Robot by Using Solid Works and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

    Full Text Available The simulation of robot systems, which is getting very popular, especially with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Modelling a scene with a robot involves, whether for the object or the robot, the following models: the geometric, the kinematic and the dynamic model. To do so, the modelling of a 2-R robot type is implemented. The main tasks are comparing two robot postures with the same trajectory (path) over the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. The SolidWorks and MATLAB/Simulink software packages are used to check the theory and simulate the robot motion. The verification of the results obtained in both packages allows us to qualitatively evaluate and underline the rightness of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and agreement between the two packages is obtained.

  17. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impact of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic-model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of this assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use, because assessment scenarios can be set up through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than the results of the developed software; however, this is still acceptable, since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features and new models should be added to improve the capability of the software. (author)
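
SRS-19-style screening assessments reduce, at their simplest, to multiplying an environmental concentration by an intake rate and a dose coefficient. A minimal sketch of that pattern (the nuclide, intake rate and dose coefficient below are illustrative assumptions, not values taken from the record):

```python
def annual_ingestion_dose(conc_bq_per_l, intake_l_per_y, dcf_sv_per_bq):
    """Annual committed effective dose (Sv/y) from drinking water:
    concentration (Bq/L) * intake (L/y) * ingestion dose coefficient (Sv/Bq)."""
    return conc_bq_per_l * intake_l_per_y * dcf_sv_per_bq

# Illustrative screening case: 1 Bq/L of Cs-137 in drinking water,
# 600 L/y adult intake, representative ingestion coefficient ~1.3e-8 Sv/Bq.
dose = annual_ingestion_dose(1.0, 600.0, 1.3e-8)
```

The generic models in SRS-19 add dispersion and transfer factors in front of a chain of such multiplications, which is why a diagram-driven tool can assemble a scenario from simple building blocks.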

  18. Storage and growth of denitrifiers in aerobic granules: part II. model calibration and verification.

    Science.gov (United States)

    Ni, Bing-Jie; Yu, Han-Qing; Xie, Wen-Ming

    2008-02-01

    A mathematical model to describe the simultaneous storage and growth activities of denitrifiers in aerobic granules under anoxic conditions has been developed in an accompanying article. The sensitivity of the nitrate uptake rate (NUR) toward the stoichiometric and kinetic coefficients is analyzed in this article. The model parameter values are estimated by minimizing the sum of squares of the deviations between the measured and model-predicted values. The model is successfully calibrated, and a set of stoichiometric and kinetic parameters for the anoxic storage and growth of the denitrifiers is obtained. Thereafter, the established model is verified with three sets of experimental data. Comparison of the established model with the ASM1 and ASM3 models shows that the present model is appropriate for simulating and predicting the performance of a granule-based denitrification system. (c) 2007 Wiley Periodicals, Inc.
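
The calibration step described above, minimizing the sum of squared deviations between measured and model-predicted values, can be sketched with a toy one-parameter uptake model. The exponential decay curve and all numbers below are hypothetical stand-ins for the paper's multi-parameter model:

```python
import math

def sse(k, t_obs, y_obs, model):
    """Sum of squared deviations between measurements and model predictions."""
    return sum((model(t, k) - y) ** 2 for t, y in zip(t_obs, y_obs))

# Hypothetical first-order nitrate curve: N(t) = N0 * exp(-k*t);
# k is the single parameter to calibrate.
N0 = 30.0
model = lambda t, k: N0 * math.exp(-k * t)

t_obs = [0.0, 1.0, 2.0, 4.0, 8.0]
y_obs = [N0 * math.exp(-0.35 * t) for t in t_obs]   # synthetic "measurements"

# Brute-force least squares over a parameter grid (a real calibration would
# use a gradient-based or simplex optimizer over many parameters).
grid = [i / 100.0 for i in range(1, 101)]
k_hat = min(grid, key=lambda k: sse(k, t_obs, y_obs, model))
```

With noise-free synthetic data the grid search recovers the generating value k = 0.35 exactly, which is the basic sanity check before fitting real NUR measurements.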

  19. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  20. Gas Chromatographic Verification of a Mathematical Model: Product Distribution Following Methanolysis Reactions.

    Science.gov (United States)

    Lam, R. B.; And Others

    1983-01-01

    Investigated application of binomial statistics to equilibrium distribution of ester systems by employing gas chromatography to verify the mathematical model used. Discusses model development and experimental techniques, indicating the model enables a straightforward extension to symmetrical polyfunctional esters and presents a mathematical basis…
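
The binomial model referred to above assigns each ester site an independent probability of exchange; the product distribution then follows directly. A minimal sketch (the site count and probability are illustrative, not data from the study):

```python
from math import comb

def product_distribution(n_sites, p):
    """Binomial fractions of ester species with k of n acyl sites exchanged,
    assuming each site reacts independently with probability p at equilibrium."""
    return [comb(n_sites, k) * p**k * (1 - p)**(n_sites - k)
            for k in range(n_sites + 1)]

# A symmetric diester with equal exchange probability at both sites
# gives the familiar 1:2:1 pattern of unreacted / mono / di products.
dist = product_distribution(2, 0.5)
```

The straightforward extension to symmetrical polyfunctional esters mentioned in the record amounts to increasing `n_sites` while keeping the same independence assumption.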

  1. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin

    2012-01-01

    In this study, we proposed a continuous-time Markov chain model for the availability of n-node clusters of a Distributed Rendering System. Since the chain is an infinite one, we formalized it and, based on the model, implemented a software tool which can automatically build the model in the PRISM language. With the tool, whe...
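
The PRISM model itself is not shown in the record. As a sketch of the kind of CTMC availability calculation involved, the following solves a finite birth-death chain for an n-node cluster with a single repair facility (the rates are illustrative assumptions, and the paper's chain is infinite rather than truncated):

```python
def birth_death_steady_state(n, lam, mu):
    """Steady-state distribution of the number of failed nodes in an n-node
    cluster: state i has failure rate (n - i)*lam and repair rate mu
    (one repair facility). Solved via detailed balance, then normalized."""
    pi = [1.0]
    for i in range(n):
        pi.append(pi[-1] * (n - i) * lam / mu)
    z = sum(pi)
    return [p / z for p in pi]

def availability(n, lam, mu, min_up=1):
    """Probability that at least min_up nodes are operational."""
    pi = birth_death_steady_state(n, lam, mu)
    return sum(pi[i] for i in range(n - min_up + 1))
```

For a single node with failure rate 1 and repair rate 9, the chain reduces to the textbook result mu/(lam + mu) = 0.9, which is a convenient cross-check against PRISM's steady-state query.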

  2. Experimental Verification of Some Simple Equilibrium Models of Masonry Shear Walls

    Science.gov (United States)

    Radosław, Jasiński

    2017-10-01

    This paper presents the theoretical fundamentals of strut-and-tie models used for unreinforced horizontal shear walls. Depending on the support conditions and wall loading, we can distinguish models with discrete bars, when a point load is applied to the wall (type I model), or with continuous bars (type II model), when the load is uniformly distributed along the wall boundary. The main part of the paper compares calculated results with the authors' own tests on horizontal shear walls made of solid brick, silicate elements and autoclaved aerated concrete. The tests were performed in Poland. The model required some modifications due to the specific load and static diagram.

  3. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties...... and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes....

  4. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among the existing open-source model verification engines, the model verification component produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  5. Mathematical modeling and microbiological verification of ohmic heating of a multicomponent mixture of particles in a continuous flow ohmic heater system with electric field parallel to flow.

    Science.gov (United States)

    Kamonpatana, Pitiya; Mohamed, Hussein M H; Shynkaryk, Mykola; Heskitt, Brian; Yousef, Ahmed E; Sastry, Sudhir K

    2013-11-01

    To accomplish continuous flow ohmic heating of a low-acid food product, sufficient heat treatment needs to be delivered to the slowest-heating particle at the outlet of the holding section. This research was aimed at developing mathematical models for sterilization of a multicomponent food in a pilot-scale ohmic heater with electric-field-oriented parallel to the flow and validating microbial inactivation by inoculated particle methods. The model involved 2 sets of simulations, one for determination of fluid temperatures, and a second for evaluating the worst-case scenario. A residence time distribution study was conducted using radio frequency identification methodology to determine the residence time of the fastest-moving particle from a sample of at least 300 particles. Thermal verification of the mathematical model showed good agreement between calculated and experimental fluid temperatures (P > 0.05) at heater and holding tube exits, with a maximum error of 0.6 °C. To achieve a specified target lethal effect at the cold spot of the slowest-heating particle, the length of holding tube required was predicted to be 22 m for a 139.6 °C process temperature with volumetric flow rate of 1.0 × 10⁻⁴ m³/s and 0.05 m in diameter. To verify the model, a microbiological validation test was conducted using at least 299 chicken-alginate particles inoculated with Clostridium sporogenes spores per run. The inoculated pack study indicated the absence of viable microorganisms at the target treatment and its presence for a subtarget treatment, thereby verifying model predictions. © 2013 Institute of Food Technologists®
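
The target lethal effect mentioned above is conventionally expressed as an F-value, the time integral of lethality relative to a reference temperature. A minimal sketch of that integral, using the standard low-acid sterilization conventions (Tref = 121.1 °C, z = 10 °C) rather than values from the paper:

```python
def lethality(times, temps, t_ref=121.1, z=10.0):
    """F-value (equivalent minutes at t_ref) from a time-temperature history:
    trapezoidal integral of 10**((T - t_ref)/z) over time (minutes)."""
    f_value = 0.0
    for (t0, T0), (t1, T1) in zip(zip(times, temps), zip(times[1:], temps[1:])):
        L0 = 10.0 ** ((T0 - t_ref) / z)
        L1 = 10.0 ** ((T1 - t_ref) / z)
        f_value += 0.5 * (L0 + L1) * (t1 - t0)
    return f_value
```

Holding at the reference temperature for 3 min delivers exactly F = 3; raising the cold-spot temperature by one z-value multiplies the lethal rate tenfold, which is why holding-tube length is so sensitive to process temperature.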

  6. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing of a "living" homepage to establish an online resource for transmission planners.

  7. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Energy Technology Data Exchange (ETDEWEB)

    Ohno, M., E-mail: ohno@hep01.hepl.hiroshima-u.ac.jp [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y. [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K. [Department of Physics, University of Tokyo, Tokyo 113-0033 (Japan); and others

    2016-09-21

    The Hard X-ray Imager (HXI) and Soft Gamma-ray Detector (SGD) onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5–80 keV) and soft gamma-rays (60–600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system of the avalanche photodiodes in the BGO active shields and have demonstrated its effectiveness after assembly in the flight models of the HXI/SGD sensors and after integration into the satellite. The energy threshold achieved is about 150 keV, and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector. - Highlights: • Details of the development of the signal processing system for ASTRO-H are presented. • A digital filter in an FPGA is applied instead of a discrete analog circuit. • The expected performance is verified after integration into the satellite.

  8. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical properties inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR in high resolution (1 nm, 0.05°, 15 min) that can be used for spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) in a temporal range varying from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), 5-6 % underestimation for the integrated NN and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 and -20 to 20 W m⁻², for the 15 min and monthly mean global horizontal irradiance (GHI) averages, respectively, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10 % as compared to the ground-based measurements. The proposed system aims to be utilized through studies and real-time applications which are related to solar energy production planning and use.

  9. Experimental verification of computational model for wind turbine blade geometry design

    Directory of Open Access Journals (Sweden)

    Štorch Vít

    2015-01-01

    Full Text Available A 3D potential flow solver with unsteady force free wake model intended for optimization of blade shape for wind power generation is applied on a test case scenario formed by a wind turbine with vertical axis of rotation. The calculation is sensitive to correct modelling of wake and its interaction with blades. The validity of the flow solver is verified by comparing experimentally obtained performance data of model rotor with numerical results.

  10. Induction Heating of Carbon-Fiber Composites: Experimental Verification of Models

    National Research Council Canada - National Science Library

    Fink, Bruce

    2000-01-01

    .... The validity of the global thermal generation model is established through an experimental test matrix in which various specimen configurations are evaluated and compared with theoretical predictions...

  11. Use of the Long Duration Exposure Facility's thermal measurement system for the verification of thermal models

    Science.gov (United States)

    Berrios, William M.

    1992-01-01

    The Long Duration Exposure Facility (LDEF) postflight thermal model predicted temperatures were matched to flight temperature data recorded by the Thermal Measurement System (THERM), LDEF experiment P0003. Flight temperatures, recorded at intervals of approximately 112 minutes for the first 390 days of LDEF's 2105 day mission, were compared with predictions using the thermal mathematical model (TMM). This model was unverified prior to flight. The postflight analysis has reduced the thermal model uncertainty at the temperature sensor locations from ±40 °F to ±18 °F. The improved temperature predictions will be used by the LDEF principal investigators to calculate improved flight temperatures experienced by 57 experiments located on 86 trays of the facility.

  12. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in an optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to enable very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  13. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as result a state transition system modelling...

  14. Towards a phase field model of the microstructural evolution of duplex steel with experimental verification

    DEFF Research Database (Denmark)

    Poulsen, Stefan Othmar; Voorhees, P.W.; Lauridsen, Erik Mejdal

    2012-01-01

    A phase field model to study the microstructural evolution of a polycrystalline dual-phase material with conserved phase fraction has been implemented, and 2D simulations have been performed. For 2D simulations, the model predicts the cubic growth well-known for diffusion-controlled systems. Some...

  15. Verification of back stress in a constitutive model for cyclic plasticity

    International Nuclear Information System (INIS)

    Ishikawa, H.

    1991-01-01

    Aiming at a formulation of a unified constitutive model of cyclic plasticity and creep, the concept of effective stress, defined as the stress measured from the current center of the yield surface, is employed to explain the intermittent creep periods after cyclic prestraining. The experimental results show a remarkable regularity which should make it easy to construct the unified constitutive model. (author)

  16. Erythrocyte lysis in isotonic solution of ammonium chloride: Theoretical modelling and experimental verification

    NARCIS (Netherlands)

    Chernyshev, A.V.; Tarasov, P.A.; Semianov, K.A.; Nekrasov, V.M.; Hoekstra, A.G.; Maltsev, V.P.

    2008-01-01

    A mathematical model of erythrocyte lysis in an isotonic solution of ammonium chloride is presented within the framework of a statistical approach. The model is used to evaluate several parameters of mature erythrocytes (volume, surface area, hemoglobin concentration, number of anionic exchangers on the membrane,

  17. Motion/posture modeling and simulation verification of physically handicapped in manufacturing system design

    Science.gov (United States)

    Fu, Yan; Li, Shiqi; Chen, Gwen-guo

    2013-03-01

    Non-obstacle design is critical to accommodating physically handicapped workers in a manufacturing system. Simultaneous consideration of the variability in physically disabled users, machines and the environment of the manufacturing system is extremely complex, and generally requires modeling of the physically handicapped worker's interaction with the system. Most current modeling concentrates either on task results or on functional disability. The integration of physical constraints with task constraints is far more complex because of functional disability and its extended influence on adjacent body parts. A framework is proposed to integrate the two kinds of constraints and thus model the specific behavior of the physically handicapped in a virtual environment generated from product specifications. Within the framework, a simplified model of the physically disabled body is constructed, and body motion is generated based on three levels of constraints (effector constraints, kinematic constraints and physical constraints). The kinematic and dynamic calculations are made and optimized based on weighting manipulated by the kinematic and dynamic constraints. With an object-transfer task as an example, the model is validated in Jack 6.0. Modelled task motion elements, except for squatting and overreaching, matched well with captured motion elements. The proposed modeling method can capture the complex behavior of the physically handicapped by integrating both task and physical disability constraints.

  18. TARDEC FIXED HEEL POINT (FHP): DRIVER CAD ACCOMMODATION MODEL VERIFICATION REPORT

    Science.gov (United States)

    2017-11-09

    Distribution Statement A. Approved for public release; distribution is unlimited. ... the CAD model was compared to the outputs of the UMTRI Soldier Driver Accommodation (2017) model spreadsheet; and boundary manikin hip and eye

  19. Car-following model with relative-velocity effect and its experimental verification.

    Science.gov (United States)

    Shamoto, Daisuke; Tomoeda, Akiyasu; Nishi, Ryosuke; Nishinari, Katsuhiro

    2011-04-01

    In driving a vehicle, drivers respond to the changes of both the headway and the relative velocity to the vehicle in front. In this paper a new car-following model including these maneuvers is proposed. The acceleration of the model becomes infinite (has a singularity) when the distance between two vehicles is zero, and the asymmetry between the acceleration and the deceleration is incorporated in a nonlinear way. The model is simple but contains enough features of driving for reproducing real vehicle traffic. From the linear stability analysis, we confirm that the model shows the metastable homogeneous flow around the critical density, beyond which a traffic jam emerges. Moreover, we perform experiments to verify this model. From the data it is shown that the acceleration of a vehicle has a positive correlation with the relative velocity.
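
The abstract does not give the model's exact functional form. The sketch below is an illustrative car-following law with the qualitative features stated above: an acceleration that diverges as the headway goes to zero, and a relative-velocity response term (all coefficients are hypothetical):

```python
def acceleration(headway, dv, a_max=1.0, d_min=2.0, k=0.5):
    """Illustrative car-following law.

    headway : bumper-to-bumper distance to the vehicle in front (m), > 0
    dv      : relative velocity (leader speed minus follower speed, m/s)

    The (d_min/headway)**2 term makes deceleration unbounded as headway -> 0,
    giving the singularity and the accel/decel asymmetry in a nonlinear way;
    the k*dv/headway term models the response to relative velocity.
    """
    return a_max * (1.0 - (d_min / headway) ** 2) + k * dv / headway
```

At the equilibrium headway (here 2 m with zero relative velocity) the acceleration vanishes; a short headway produces a very strong braking response, while a closing leader (dv < 0) adds further deceleration.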

  20. Predicted and actual indoor environmental quality: Verification of occupants' behaviour models in residential buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm; Fabi, Valentina; Corgnati, Stefano P.

    2016-01-01

    Occupants' interactions with the building envelope and building systems can have a large impact on the indoor environment and energy consumption in a building. As a consequence, any realistic forecast of building performance must include realistic models of the occupants' interactions...... with the building controls (windows, thermostats, solar shading etc.). During the last decade, studies about stochastic models of occupants' behaviour in relation to control of the indoor environment have been published. Often the overall aim of these models is to enable more reliable predictions of building...... performance using building energy performance simulations (BEPS). However, the validity of these models has only been sparsely tested. In this paper, stochastic models of occupants' behaviour from literature were tested against measurements in five apartments. In a monitoring campaign, measurements of indoor...

  1. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and
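
As an example of one of the verification items listed above, carryover is commonly estimated from three consecutive high-concentration samples followed by three low-concentration samples. A sketch of that calculation (a common protocol pattern; exact acceptance criteria and sample counts vary between guidelines):

```python
def carryover_percent(high_runs, low_runs):
    """Carryover estimate from a 3-high / 3-low sample protocol:
    ((first low - third low) / (third high - third low)) * 100.
    high_runs and low_runs are each three consecutive results."""
    return (low_runs[0] - low_runs[2]) / (high_runs[2] - low_runs[2]) * 100.0

# Example: WBC counts (10^9/L) where the first low result is slightly
# elevated by residue from the preceding high samples.
c = carryover_percent([100.0, 100.0, 100.0], [1.0, 0.5, 0.5])
```

A result near zero indicates negligible sample-to-sample contamination; laboratories typically compare it against a fixed acceptance limit from the applicable guideline.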

  2. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  3. Verification of some numerical models for operationally predicting mesoscale winds aloft

    International Nuclear Information System (INIS)

    Cornett, J.S.; Randerson, D.

    1977-01-01

    Four numerical models are described for predicting mesoscale winds aloft for a 6 h period. These models are all tested statistically against persistence as the control forecast and against predictions made by operational forecasters. Mesoscale winds aloft data were used to initialize the models and to verify the predictions on an hourly basis. The model yielding the smallest root-mean-square vector errors (RMSVE's) was the one based on the most physics which included advection, ageostrophic acceleration, vertical mixing and friction. Horizontal advection was found to be the most important term in reducing the RMSVE's followed by ageostrophic acceleration, vertical advection, surface friction and vertical mixing. From a comparison of the mean absolute errors based on up to 72 independent wind-profile predictions made by operational forecasters, by the most complete model, and by persistence, we conclude that the model is the best wind predictor in the free air. In the boundary layer, the results tend to favor the forecaster for direction predictions. The speed predictions showed no overall superiority in any of these three models
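
The root-mean-square vector error used above to score the wind predictions can be computed directly from predicted and observed (u, v) wind components. A short sketch, including the persistence control forecast:

```python
import math

def rmsve(pred, obs):
    """Root-mean-square vector error between lists of (u, v) wind pairs."""
    se = sum((pu - ou) ** 2 + (pv - ov) ** 2
             for (pu, pv), (ou, ov) in zip(pred, obs))
    return math.sqrt(se / len(pred))

def persistence_forecast(initial_wind, n_steps):
    """Control forecast: the initial wind is repeated at every lead time."""
    return [initial_wind] * n_steps
```

A dynamical model "beats persistence" when its RMSVE against the verifying winds is smaller than that of the persistence forecast built from the initialization time.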

  4. Implementation and verification of interface constitutive model in FLAC3D

    Directory of Open Access Journals (Sweden)

    Hai-min Wu

    2011-09-01

    Full Text Available Due to the complexity of soil-structure interaction, simple constitutive models typically used for interface elements in general computer programs cannot satisfy the requirements of discontinuous deformation analysis of structures that contain different interfaces. In order to simulate the strain-softening characteristics of interfaces, a nonlinear strain-softening interface constitutive model was incorporated into fast Lagrange analysis of continua in three dimensions (FLAC3D through a user-defined program in the FISH environment. A numerical simulation of a direct shear test for geosynthetic interfaces was conducted to verify that the interface model was implemented correctly. Results of the numerical tests show good agreement with the results obtained from theoretical calculations, indicating that the model incorporated into FLAC3D can simulate the nonlinear strain-softening behavior of interfaces involving geosynthetic materials. The results confirmed the validity and reliability of the improved interface model. The procedure and method of implementing an interface constitutive model into a commercial computer program also provide a reference for implementation of a new interface constitutive model in FLAC3D.
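
A strain-softening interface law of the general kind described, elastic up to a peak strength and then softening linearly to a residual strength, can be sketched as a piecewise function. All parameter values below are illustrative, not those of the authors' FISH implementation:

```python
def interface_shear(disp, k=1000.0, tau_peak=50.0, tau_res=20.0, d_soft=0.05):
    """Piecewise strain-softening interface law.

    disp     : relative shear displacement (m)
    k        : elastic shear stiffness (kPa/m)
    tau_peak : peak shear strength (kPa)
    tau_res  : residual shear strength (kPa)
    d_soft   : post-peak displacement over which softening occurs (m)
    """
    d_peak = tau_peak / k
    if disp <= d_peak:                      # elastic branch
        return k * disp
    if disp <= d_peak + d_soft:             # linear softening branch
        frac = (disp - d_peak) / d_soft
        return tau_peak + frac * (tau_res - tau_peak)
    return tau_res                          # residual plateau
```

In a direct shear simulation this reproduces the characteristic geosynthetic response: stress rises to a peak, drops during continued sliding, and levels off at the residual strength.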

  5. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  6. Simulating the Water Use of Thermoelectric Power Plants in the United States: Model Development and Verification

    Science.gov (United States)

    Betrie, G.; Yan, E.; Clark, C.

    2016-12-01

    Thermoelectric power plants are the second-largest users of freshwater after the agriculture sector. However, there is a scarcity of information characterizing the freshwater use of these plants in the United States. This can be attributed to the lack of the models and data that are required to conduct analyses and gain insights. Competition for freshwater among sectors will increase in the future as the amount of available freshwater becomes limited due to climate change and population growth. A model that makes use of less data is urgently needed to conduct analyses and identify adaptation strategies. The objectives of this study are to develop a model and simulate the water use of thermoelectric power plants in the United States. The developed model has heat-balance, climate, cooling-system, and optimization modules. It computes the amount of heat rejected to the environment, estimates the quantity of heat exchanged with the environment through latent and sensible heat, and computes the amount of water required per unit generation of electricity. To verify the model, we simulated a total of 876 fossil-fired, nuclear and gas-turbine power plants with different cooling systems (CS) using 2010-2014 data obtained from the Energy Information Administration. The CS include once-through with cooling pond, once-through without cooling pond, recirculating with induced draft, and recirculating with natural draft. The results show that the model reproduced the observed water use per unit generation of electricity for most of the power plants. It is also noticed that the model slightly overestimates the water use during the summer period, when the input water temperatures are higher. We are investigating the possible reasons for the overestimation and will address them in future work. The model can be used individually or coupled to regional models to analyze various adaptation strategies and improve the water use efficiency of thermoelectric power plants.
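
The heat-balance reasoning above can be illustrated with a back-of-the-envelope evaporative water-use estimate for a recirculating-cooled plant. The efficiency, latent fraction and latent heat below are assumed round numbers, not the model's calibrated values:

```python
def water_use_intensity(eta, latent_frac=0.85, h_fg=2.45e6):
    """Evaporative water use (litres per kWh of electricity) for a
    recirculating-cooled plant.

    eta         : net thermal efficiency of the plant (fraction)
    latent_frac : fraction of rejected heat removed by evaporation
    h_fg        : latent heat of vaporization of water (J/kg); 1 kg ~ 1 L
    """
    q_reject = (1.0 / eta - 1.0) * 3.6e6   # J of heat rejected per kWh_e
    return q_reject * latent_frac / h_fg
```

For a plant at 33 % efficiency this gives roughly 2.5 L/kWh, and the monotonic dependence on efficiency shows why more efficient plants (and cooler intake water) consume less water per unit of electricity.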

  7. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. 
Displacements, together with along-strike and across-strike internal deformation in all

  8. Helicopter stability and control modeling improvements and verification on two helicopters

    Science.gov (United States)

    Schrage, D. P.; Peters, D. A.; Prasad, J. V. R.; Stumpf, W. F.; He, Chengjian

    1988-01-01

    A linearized model of helicopter flight dynamics is developed which includes the flapping, lead-lag, and dynamic inflow degrees of freedom (DOF). The model is a combination of analytical terms and numerically determined stability derivatives, and is used to investigate the importance of the rotor DOF to stability and control modeling. The results show that the rotor DOF can have a significant impact on some of the natural modes in a linear model. The flap and dynamic inflow DOF show the greatest influence. Flapping exhibits strong coupling to the body, dynamic inflow, and to lead-lag to a lesser extent. Dynamic inflow tends to damp the high-frequency flapping modes, and reduces the damping on coupled body-flap motion. Dynamic inflow also couples to the flapping motion to produce complex roots. With body-flap and lag regressing modes as exceptions, the results show essentially similar behavior for most modes of articulated and hingeless rotor helicopters.

  9. Analysis of Changes in Auditory Nerve Signals Following Simulated Tinnitus for the Verification of Cochlear Model

    National Research Council Canada - National Science Library

    Yi, Y

    2001-01-01

    For an interpretation of the tinnitus phenomenon, reticular lamina which transmits energy in a cochlear was assumed as a mass and the components for the stiffness and control were added to the model...

  10. The Bilevel Design Problem for Communication Networks on Trains: Model, Algorithm, and Verification

    Directory of Open Access Journals (Sweden)

    Yin Tian

    2014-01-01

    Full Text Available This paper proposes a novel method to solve the problem of train communication network design. Firstly, we put forward a general description of the problem. Then, taking advantage of bilevel programming theory, we create the cost-reliability-delay (CRD) model, which consists of two parts: the physical topology part aims at obtaining the networks with the maximum reliability under constrained cost, while the logical topology part focuses on the communication paths yielding minimum delay based on the physical topology delivered from the upper level. We also suggest a method to solve the CRD model, which combines the genetic algorithm and the Floyd-Warshall algorithm. Finally, we use a practical example to verify the accuracy and the effectiveness of the CRD model and further apply the novel method on a train with six carriages.
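The lower-level delay evaluation that the authors solve with the Floyd-Warshall algorithm can be sketched as follows; the 4-node link-delay matrix is an invented example, not the six-carriage network from the paper:

```python
INF = float("inf")

def floyd_warshall(dist):
    """All-pairs minimum path delays; dist[i][j] is the link delay
    between nodes i and j (INF where no direct link exists)."""
    n = len(dist)
    d = [row[:] for row in dist]  # work on a copy
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# Hypothetical 4-node topology with symmetric link delays:
links = [
    [0,   3,   INF, 10],
    [3,   0,   2,   INF],
    [INF, 2,   0,   2],
    [10,  INF, 2,   0],
]
delays = floyd_warshall(links)  # delays[0][3] routes via nodes 1 and 2
```

In the full CRD approach this computation would be repeated for each candidate physical topology produced by the genetic algorithm at the upper level.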

  11. Nascap-2k Spacecraft-Plasma Environment Interactions Modeling: New Capabilities and Verification

    National Research Council Canada - National Science Library

    Davis, V. A; Mandell, M. J; Cooke, D. L; Ferguson, D. C

    2007-01-01

    .... Here we examine the accuracy and limitations of two new capabilities of Nascap-2k: modeling of plasma plumes such as generated by electric thrusters and enhanced PIC computational capabilities...

  12. Modeling and experimental verification of infusion speed of liquids in photonic crystal fibers

    DEFF Research Database (Denmark)

    Sørensen, Thorkild; Noordegraaf, Danny; Nielsen, Kristian

    A theoretical method for predicting infusion time of liquids in microcapillaries is formulated. Through a microscopical, a fluorescent, and, finally, through a reflectometric measurement method, the model is successfully verified in real photonic crystal fibers.

  13. Predictive Simulation of Material Failure Using Peridynamics-Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    system solver utilities • PyTrilinos - a python interface providing Python wrappers for many Trilinos packages, and offering compatibility between...parallelization is achieved using Epetra data structures for distributed variables. Model force evaluations are coded in Python, making extensive use...analytical solutions. For comparison, equivalent models are created and analyzed in Abaqus 6.12 to verify simple cases.

  14. Experimental verification of mathematical model of the heat transfer in exhaust system

    OpenAIRE

    Petković Snežana; Pešić Radivoje; Lukić Jovanka

    2011-01-01

    A catalyst converter has maximal efficiency when it reaches its working temperature. In the cold-start phase, the efficiency of the catalyst is low and the exhaust contains high levels of air pollutants. Optimizing the exhaust system to decrease the time needed to reach the catalyst working temperature reduces the total vehicle emission. The implementation of mathematical models in the development of exhaust systems decreases total costs and reduces development time. The mathematical model has to be...

  15. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    Science.gov (United States)

    2017-01-23

    including the effects of body size, vehicle layout, and Soldier protective equipment and gear. The boundaries defined include the required space and...90% of Soldier population, 85% male). The model can guide vehicle designers in creating an optimized work space for the occupant. The CAD...posture metrics observed in vehicles. The results of the validation testing may lead to the FHP accommodation model being adjusted to address

  16. Verification of Nine-phase PMSM Model in d-q Coordinates with Mutual Couplings

    OpenAIRE

    Kozovský, Matúš; Blaha, Petr; Václavek, Pavel

    2016-01-01

    Electric motors with more than three phases have many advantages compared to an ordinary three-phase motor. For this reason it is natural to pay attention to them and to work on advanced control methods. Developing control algorithms requires a model of the motor. This paper presents a modeling concept for a nine-phase permanent magnet synchronous motor (PMSM) in a three-times-three-phase arrangement fed by a nine-phase voltage source inverter (VSI). Magnetic interaction between...

  17. Design, analysis and verification of a knee joint oncological prosthesis finite element model.

    Science.gov (United States)

    Zach, Lukáš; Kunčická, Lenka; Růžička, Pavel; Kocich, Radim

    2014-11-01

    The aim of this paper was to design a finite element model for a hinged PROSPON oncological knee endoprosthesis and to verify the model by comparing its ankle flexion angle with knee-bending experimental data obtained previously. Visible Human Project CT scans were used to create a general lower extremity bones model and to compose a 3D CAD knee joint model to which muscles and ligaments were added. The designed finite element PROSPON prosthesis model was integrated into the assembly, and an analysis focused on the stress state of the PEEK-OPTIMA hinge pin bushing was carried out. To confirm the stress state analysis results, contact pressure was investigated. The analysis was performed in the knee-bending position within the 15.4-69.4° hip joint flexion range. The results showed that the maximum stress achieved during the analysis (46.6 MPa) did not exceed the yield strength of the material (90 MPa); the condition of plastic stability was therefore met. The stress state analysis results were confirmed by the distribution of contact pressure during knee-bending. The applicability of our designed finite element model for predicting real implant behaviour was proven on the basis of good correlation between the analytical and experimental ankle flexion angle data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is educt and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those
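The three-compartment balance described above (blood plasma, interstitial fluid, intracellular fluid) can be sketched as a first-order exchange model. All volumes, target fractions, and rates below are illustrative assumptions, not V-HAB parameters:

```python
def simulate(hours, intake=0.1, urine=0.1):
    """Forward-Euler water balance in litres. Intake and loss act on
    plasma; first-order exchange pushes each compartment toward a
    fixed (assumed) fraction of total body water."""
    plasma, interstitial, intracellular = 3.0, 11.0, 26.0
    target = {"p": 0.075, "i": 0.275, "ic": 0.65}  # fractions sum to 1
    k = 0.5   # 1/h, assumed equilibration rate
    dt = 0.1  # h, integration step
    for _ in range(int(hours / dt)):
        plasma += (intake - urine) * dt          # net external balance
        total = plasma + interstitial + intracellular
        plasma += k * (target["p"] * total - plasma) * dt
        interstitial += k * (target["i"] * total - interstitial) * dt
        intracellular += k * (target["ic"] * total - intracellular) * dt
    return plasma, interstitial, intracellular

p, i, ic = simulate(24.0)  # one day with intake matching loss
```

Because the target fractions sum to one, the exchange terms conserve total body water; with intake equal to loss, the total stays at its initial 40 L, which matches the roughly 60% of body mass cited above for a 67 kg subject.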

  19. Hydraulic pitch control system for wind turbines: Advanced modeling and verification of an hydraulic accumulator

    DEFF Research Database (Denmark)

    Irizar, Victor; Andreasen, Casper Schousboe

    2017-01-01

    Hydraulic pitch systems provide robust and reliable control of power and speed of modern wind turbines. During emergency stops, where the pitch of the blades has to be taken to a full stop position to avoid over speed situations, hydraulic accumulators play a crucial role. Their efficiency...... and capability of providing enough energy to rotate the blades is affected by thermal processes due to the compression and decompression of the gas chamber. This paper presents an in depth study of the thermodynamical processes involved in an hydraulic accumulator during operation, and how they affect the energy...

  20. From business value model to coordination process model

    NARCIS (Netherlands)

    Fatemi, Hassan; Wieringa, Roelf J.; Poler, R.; van Sinderen, Marten J.; Sanchis, R.

    2009-01-01

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary

  1. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the process approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”), using RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. A possible extension of the process analysis method is the application of a process warehouse and process mining methods.

  2. Model-based reasoning and the control of process plants

    International Nuclear Information System (INIS)

    Vaelisuo, Heikki

    1993-02-01

    In addition to feedback control, safe and economic operation of industrial process plants requires discrete-event type logic control, such as automatic control sequences, interlocks, etc. A lot of complex routine reasoning is involved in the design and verification and validation (V&V) of such automatics. Similar reasoning tasks are encountered during plant operation in action planning and fault diagnosis. The low-level part of the required problem solving is so straightforward that it could be accomplished by a computer if only there were plant models which allow versatile mechanised reasoning. Such plant models and corresponding inference algorithms are the main subject of this report. Deep knowledge and qualitative modelling play an essential role in this work. Deep knowledge refers to mechanised reasoning based on the first principles of the phenomena in the problem domain. Qualitative modelling refers to a knowledge representation formalism and related reasoning methods which allow solving problems on a higher abstraction level than, for example, traditional simulation and optimisation. Prolog is a commonly used platform for artificial intelligence (AI) applications. Constraint logic languages like CLP(R) and Prolog-III extend the scope of logic programming to numeric problem solving. In addition, they allow a programming style which often reduces the computational complexity significantly. An approach to model-based reasoning implemented in the constraint logic programming language CLP(R) is presented. The approach is based on some of the principles of QSIM, an algorithm for qualitative simulation. It is discussed how model-based reasoning can be applied in the design and V&V of plant automatics and in action planning during plant operation. A prototype tool called ISIR is discussed and some initial results obtained during the development of the tool are presented. The results presented originate from preliminary test results of the prototype obtained

  3. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
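A generic sketch of the kind of correlation analysis described above, applied to hypothetical assessment scores. The practice names and score data are invented for illustration; the paper's actual model is built on CMMI structure and empirical improvement data, not raw score correlation:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical capability scores of three practices across five projects:
scores = {
    "REQM": [2, 3, 3, 4, 5],
    "PP":   [2, 3, 4, 4, 5],
    "CM":   [5, 4, 3, 3, 2],
}
r = pearson(scores["REQM"], scores["PP"])  # strongly positive
```

A pair of strongly correlated practices would then be flagged so that an improvement plan addresses them together rather than in isolation.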

  4. Verification of a Monte Carlo model of the Missouri S&T reactor

    Science.gov (United States)

    Richardson, Brad Paul

    The purpose of this research is to ensure that an MCNP model of the Missouri S&T reactor produces accurate results so that it may be used to predict the effects of some desired upgrades to the reactor. The desired upgrades are an increase in licensed power from 200 kW to 400 kW and the installation of a secondary cooling system to prevent heating of the pool. This was performed by comparing simulations performed using the model with experiments performed using the reactor. The experiments performed were: prediction of the critical control rod height by the approach-to-criticality method, measurement of the axial flux profile, and measurement of the moderator temperature and void coefficients of reactivity. The results of these experiments and results from the simulation show that the model produces a similar axial flux profile and that it models the void and temperature coefficients of reactivity well. The model does, however, over-predict the criticality of the core, such that it predicts a lower critical rod height and a keff greater than one when simulating conditions in which the reactor was at a stable power. It is assumed that this is due to the model using fuel compositions from when the fuel was new, while in reality the reactor has been operating with this fuel for nearly 20 years. It has therefore been concluded that the fuel composition should be updated by performing a burnup analysis, and that an accurate heat transfer and fluid flow analysis be performed to better represent the temperature profile before the model is used to simulate the effects of the desired upgrades.
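The approach-to-criticality experiment mentioned above predicts the critical rod height by extrapolating the inverse multiplication (1/M) to zero. A minimal sketch, with invented count data (the rod heights and detector counts are hypothetical, not measurements from the Missouri S&T reactor):

```python
def extrapolate_critical_height(heights, counts):
    """Least-squares line through (height, 1/M); returns the height at
    which 1/M crosses zero, i.e. the predicted critical rod height."""
    inv_m = [counts[0] / c for c in counts]  # 1/M relative to baseline
    n = len(heights)
    mean_h = sum(heights) / n
    mean_y = sum(inv_m) / n
    slope = sum((h - mean_h) * (y - mean_y)
                for h, y in zip(heights, inv_m)) \
        / sum((h - mean_h) ** 2 for h in heights)
    intercept = mean_y - slope * mean_h
    return -intercept / slope

heights = [0.0, 10.0, 20.0, 30.0]      # cm withdrawn (hypothetical)
counts = [100.0, 133.0, 200.0, 400.0]  # detector counts (hypothetical)
h_crit = extrapolate_critical_height(heights, counts)
```

The same extrapolation applied to simulated counts from the MCNP model can then be compared against the measured critical height, which is how the over-prediction of criticality described above shows up.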

  5. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  6. Future directions of nuclear verification

    International Nuclear Information System (INIS)

    Blix, H.

    1997-01-01

    Future directions of nuclear verification are discussed including the following topics: verification of non-proliferation commitments; practicalities of strengthening safeguards; new tasks in nuclear verification

  7. A fission gas release model for MOX fuel and its verification

    International Nuclear Information System (INIS)

    Koo, Y.H.; Sohn, D.S.; Strijov, P.

    2000-01-01

    A fission gas release model for MOX fuel has been developed based on a model for UO2 fuel. Using the concept of an equivalent cell, the model considers the uneven distribution of Pu within the fuel matrix and a number of Pu-rich particles that could lead to a non-uniform fission rate and fission gas distribution across the fuel pellet. The model has been incorporated into a code, COSMOS, and some parametric studies were made to analyze the effect of the size and Pu content of Pu-rich agglomerates. The model was then applied to the experimental data obtained from the FIGARO program, which consisted of the base irradiation of MOX fuels in the BEZNAU-1 PWR and the subsequent irradiation of four refabricated fuel segments in the Halden reactor. The calculated gas releases show good agreement with the measured ones. In addition, the present analysis indicates that the microstructure of the MOX fuel used in the FIGARO program is such that it has produced little difference in terms of gas release compared with UO2 fuel. (author)

  8. Experimental Verification of Modeled Thermal Distribution Produced by a Piston Source in Physiotherapy Ultrasound

    Directory of Open Access Journals (Sweden)

    M. I. Gutierrez

    2016-01-01

    Full Text Available Objectives. To present a quantitative comparison of thermal patterns produced by the piston-in-a-baffle approach with those generated by a physiotherapy ultrasonic device, and to show the dependency between thermal patterns and acoustic intensity distributions. Methods. The finite element (FE) method was used to model an ideal acoustic field and the produced thermal pattern, to be compared with the experimental acoustic and temperature distributions produced by a real ultrasonic applicator. A thermal model using the measured acoustic profile as input is also presented for comparison. Temperature measurements were carried out with thermocouples inserted in a muscle phantom. The insertion place of the thermocouples was monitored with ultrasound imaging. Results. Modeled and measured thermal profiles were compared within the first 10 cm of depth. The ideal acoustic field did not adequately represent the measured field, yielding different temperature profiles (errors of 10% to 20%). The experimental field was concentrated near the transducer, producing a region with higher temperatures, while the modeled ideal temperature was linearly distributed along the depth. The error was reduced to 7% when introducing the measured acoustic field as the input variable in the FE temperature modeling. Conclusions. Temperature distributions are strongly related to the acoustic field distributions.

  9. The Verification of Structural Decision-Making Model for Evaluating Education on Facebook

    Directory of Open Access Journals (Sweden)

    Kozel Roman

    2013-09-01

    Full Text Available The aim of this paper is to present the work of a research team that constructed a model exploring students' general opinions about education on Facebook, as well as their opinions about education on the social page for the course E-marketing, using a structural equation model. Facebook is already present at universities, as students use it as a primary source of information about news in courses, duties, and so on. The research team carried out an experiment in the course E-marketing at FE of VŠB – TUO, in which Facebook was used as a tool for communication between students and teachers. The research on students' attitudes towards education on Facebook was conducted by questioning using predefined variables. The first form of the model was designed by factor analysis with the Varimax method, in which six groups of factors that affect respondents' opinions about education were defined. A structural equation model was used to verify the validity of the model. According to the testing performed, four groups of factors mainly affect respondents' attitudes to this type of education: Engagement, Information and Modern Technologies, Lecturers and Scores, and Education on Facebook. The research team also determined the statistically most significant variables within these factors, i.e., those that most affect the opinions of students about education.

  10. Model-Based Interpretation and Experimental Verification of ECT Signals of Steam Generator Tubes

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young Hwan; Kim, Eui Lae; Yim, Chang Jae; Lee, Jin Ho

    2004-01-01

    Model-based inversion tools for eddy current signals have been developed by combining neural networks and finite element modeling for quantitative flaw characterization in steam generator tubes. In the present work, interpretation of experimental eddy current signals was carried out in order to validate the developed inversion tools. A database was constructed using the synthetic flaw signals generated by the finite element model. The hybrid neural networks, composed of a PNN classifier and BPNN size estimators, were trained using the synthetic signals. Experimental eddy current signals were obtained from axisymmetric artificial flaws. Interpretation of flaw signals was conducted by feeding the experimental signals into the neural networks. The interpretation was excellent, which shows that the developed inversion tools would be applicable to the interpretation of real eddy current signals.

  11. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model and experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al are compared. All data are predicted inside the ±10% error band, with the mean error being below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  12. Model and verification of electrokinetic flow and transport in a micro-electrophoresis device.

    Science.gov (United States)

    Barz, Dominik P J; Ehrhard, Peter

    2005-09-01

    We investigate the electrokinetic flow and transport within a micro-electrophoresis device. A mathematical model is set up, which allows us to perform two-dimensional, time-dependent finite-element simulations. The model reflects the dominant features of the system, namely electroosmosis, electrophoresis, externally applied electrical potentials, and equilibrium chemistry. For the solution of the model equations we rely on numerical simulations of the core region, while the immediate wall region is treated analytically at leading order. This avoids extreme refinements of the numerical grid within the EDL. An asymptotic matching of both solutions and subsequent superposition, nevertheless, provides an approximation for the solution in the entire domain. The results of the simulations are verified against experimental observations and show good agreement.

  13. Experimental verification of a radiofrequency power model for Wi-Fi technology.

    Science.gov (United States)

    Fang, Minyu; Malone, David

    2010-04-01

    When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors: less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
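The duty-cycle observation above can be turned into a back-of-the-envelope estimate: average emitted power is the transmit power weighted by the fraction of time the radio is actually on the air. This is a simplification of the paper's model (which accounts for contention and acknowledgements), and the frame parameters below are assumptions for illustration:

```python
def average_emitted_power_mw(tx_power_mw, frames_per_s,
                             bits_per_frame, phy_rate_bps):
    """Mean radiated power given traffic level and PHY rate."""
    airtime_per_frame = bits_per_frame / phy_rate_bps  # seconds on air
    duty_cycle = min(1.0, frames_per_s * airtime_per_frame)
    return tx_power_mw * duty_cycle

# 100 mW transmit power, 100 frames/s of 12000-bit frames at 54 Mbit/s:
p = average_emitted_power_mw(100.0, 100.0, 12000.0, 54e6)
```

Saturating the network (duty cycle capped at one) recovers the full transmit power, consistent with the finding above that the greatest power is emitted under saturated traffic.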

  14. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  15. Development of a GoldSim Biosphere Model, Evaluation, and Its Verification

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Hwang, Yong Soo

    2009-12-01

    For the purpose of evaluating the dose rate to an individual due to long-term release of nuclides from a repository for HLW or pyroprocessing waste, a biosphere assessment model and an implementing program based on the BIOMASS methodology have been developed utilizing GoldSim, a general model-development tool. To show its practicability and usability, as well as to see the sensitivity of parametric and scenario variations on the annual exposure, some probabilistic calculations are made and investigated. For cases in which the exposure groups and associated GBIs are changed and selected input values are varied, all of which seem important for the biosphere evaluation, the dose rate per unit nuclide release rate is probabilistically calculated and analyzed. A series of comparison studies with JAEA, Japan, has also been carried out to verify the model

  16. A Modified Approach to Modeling of Diffusive Transformation Kinetics from Nonisothermal Data and Experimental Verification

    Science.gov (United States)

    Chen, Xiangjun; Xiao, Namin; Cai, Minghui; Li, Dianzhong; Li, Guangyao; Sun, Guangyong; Rolfe, Bernard F.

    2016-09-01

    An inverse model is proposed to construct the mathematical relationship between continuous cooling transformation (CCT) kinetics at constant cooling rates and isothermal kinetics. The kinetic parameters of the JMAK equations for isothermal kinetics can be deduced from the experimental CCT kinetics. Furthermore, a generalized model with a new additivity rule is developed for predicting the kinetics of nucleation and growth during diffusional phase transformations with arbitrary cooling paths, based only on the CCT curve. A generalized contribution coefficient is introduced into the new additivity rule to describe the influence of the current temperature and cooling rate on the incubation time of nuclei. Finally, the reliability of the proposed model is validated using dilatometry experiments on a microalloyed steel with a fully bainitic microstructure over various cooling routes.
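
For orientation, the classical (unmodified) Scheil additivity rule combined with JMAK isothermal kinetics, X(t, T) = 1 - exp(-k(T) t^n), can be sketched as follows. The rate function k(T), the Avrami exponent, and the cooling path are invented for illustration, and the paper's generalized contribution coefficient for the incubation period is not reproduced here.

```python
import math

def k_of_T(T):
    # hypothetical JMAK rate constant peaking at an assumed "nose" at 600 C
    return 1e-3 * math.exp(-((T - 600.0) / 80.0) ** 2)

N = 2.5  # assumed Avrami exponent

def transformed_fraction(path, dt=0.1):
    """path: callable t -> temperature (C). Integrates the transformed
    fraction along the cooling path using the additivity rule: at each
    step, find the fictitious isothermal time t* that would give the
    current X at the current T, then advance by dt isothermally."""
    X, t = 0.0, 0.0
    while X < 0.99 and t < 5000.0:
        k = k_of_T(path(t))
        if k > 1e-12:
            t_star = (-math.log(1.0 - X) / k) ** (1.0 / N)
            X = 1.0 - math.exp(-k * (t_star + dt) ** N)
        t += dt
    return X, t

# constant cooling from 900 C at 1 C/s
X, t_end = transformed_fraction(lambda t: 900.0 - 1.0 * t)
print(f"X = {X:.3f} after {t_end:.1f} s")
```

The paper's modification replaces the uniform weighting implicit in this construction with a contribution coefficient that depends on the current temperature and cooling rate, which chiefly changes when incubation is deemed complete.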

  17. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes hardly manageable, taking the size of taxonomies into account.

  18. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
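
The arithmetic at the heart of such verification analysis, comparing code output against an exact benchmark solution on successively refined grids to extract an observed order of accuracy, can be illustrated with a deliberately simple "code": a trapezoidal integrator whose formal order is known to be two.

```python
import math

def solve(n):
    """Stand-in 'simulation code': trapezoidal rule for the integral
    of sin(x) on [0, pi], whose exact benchmark value is 2."""
    h = math.pi / n
    return h * (0.5 * math.sin(0.0) + 0.5 * math.sin(math.pi)
                + sum(math.sin(i * h) for i in range(1, n)))

exact = 2.0
errors = {n: abs(solve(n) - exact) for n in (20, 40, 80)}

# observed order p from the error ratio under grid halving:
# p = log2(error_2h / error_h); should approach the formal order (2)
p_coarse = math.log2(errors[20] / errors[40])
p_fine = math.log2(errors[40] / errors[80])
print(f"observed orders: {p_coarse:.2f}, {p_fine:.2f}")
```

A "strong sense" benchmark as defined above supplies exactly what this sketch needs: an unambiguous exact solution, a prescribed error norm, and an acceptance criterion on the observed order.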

  19. Verification of RBMK-1500 reactor main circulation circuit model with Cathare V1.3L

    International Nuclear Information System (INIS)

    Jasiulevicius, A.

    2001-01-01

    Among other computer codes, the French code CATHARE has also been applied to RBMK reactor calculations. In this paper, results of such an application to the Ignalina NPP reactor (RBMK-1500 type) main circulation circuit are presented. Three transient calculations were performed: trip of all main circulation pumps (MCPs), trip of one main circulation pump, and trip of one main circulation pump without closure of the check valve on the pump line. Calculation results were compared to data from the Ignalina NPP, where these transients were recorded in 1986, 1996, and 1998. The presented studies prove the capability of the CATHARE code to treat thermal-hydraulic transients with a reactor scram in the RBMK in the case of single or multiple pump trips. However, the presented model needs further improvement in order to simulate loss-of-coolant accidents; for this reason, the emergency core cooling system should be included in the model. Additional model improvement is also needed to obtain more independent pressure behavior in the two loops. Flow rates through the reactor channels should also be modeled by dividing the channels into several groups according to channel power (in an RBMK, the power produced in channels located in different parts of the core is not the same). The point-neutron-kinetics model of the CATHARE code is not suitable for predicting transients when the reactor is operating at nominal power; such transients would require a 3D neutron kinetics model to properly describe the strong space-time effects on the power distribution in the reactor core.
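
For reference, the point-kinetics model whose limitations the abstract notes is a 0-D description of reactor power: it tracks a single neutron population and delayed-neutron precursor concentration, with no spatial information. A minimal one-delayed-group sketch (with illustrative parameter values, not RBMK data) looks like this:

```python
BETA = 0.0065     # delayed neutron fraction (assumed)
LAMBDA = 0.08     # precursor decay constant, 1/s (assumed)
GEN_TIME = 1e-3   # neutron generation time, s (assumed)

def point_kinetics(rho, t_end, dt=1e-4):
    """Integrate the one-group point-kinetics equations
        dn/dt = ((rho - beta)/L) * n + lambda * c
        dc/dt = (beta/L) * n - lambda * c
    with explicit Euler, starting from equilibrium at n = 1."""
    n = 1.0
    c = BETA / (LAMBDA * GEN_TIME)  # equilibrium precursor level
    t = 0.0
    while t < t_end:
        dn = ((rho - BETA) / GEN_TIME) * n + LAMBDA * c
        dc = (BETA / GEN_TIME) * n - LAMBDA * c
        n, c = n + dt * dn, c + dt * dc
        t += dt
    return n

# a small negative reactivity step: relative power drops below 1
print(point_kinetics(rho=-0.001, t_end=5.0))
```

Because n is a single scalar, any transient that redistributes power across the large RBMK core, rather than merely scaling it, is outside what this model can represent, which is why a 3D kinetics model is called for.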

  20. Field verification of linear and nonlinear hybrid wave models for offshore tower response prediction

    Energy Technology Data Exchange (ETDEWEB)

    Couch, A.T. [Hudson Engineering, Houston, TX (United States). Offshore Structural Div.; Conte, J.P. [Rice Univ., Houston, TX (United States). Dept. of Civil Engineering

    1996-12-31

    The accuracy of predictions of the dynamic response of deepwater fixed offshore platforms to irregular sea waves depends very much on the theory used to determine water kinematics from the mudline to the free surface. A common industry practice consists of using linear wave theory, which assumes infinitesimal wave steepness, in conjunction with empirical wave stretching techniques to provide a more realistic representation of near-surface kinematics. The current velocity field is then added to the wave-induced fluid velocity field, and the wave-and-current forces acting on the structure are computed via Morison's equation. The first objective of this study is to compare the predicted responses of Cognac, a deepwater fixed platform, obtained from various empirically stretched linear wave models with the response of Cognac predicted by the Hybrid Wave Model. The latter is a recently developed higher-order, and therefore more accurate, wave model which satisfies, up to second order in wave steepness, local mass conservation and the free-surface boundary conditions up to the free surface. The second objective of this study is to compare the various analytical response predictions with the measured response of the Cognac platform. The availability of a set of oceanographic and structural vibration data for Cognac provides a unique opportunity to evaluate the predictive ability of traditional analytical models used in designing such structures. The results of this study indicate that (1) the use of the Hybrid Wave Model provides a predicted platform response which is in closer agreement with the measured response than the predictions based on the various stretched linear wave models; and (2) the Wheeler stretching technique produces platform response results which are more accurate than those obtained using the other stretching schemes considered here.
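
The Wheeler stretching technique singled out above can be sketched directly: the linear (Airy) horizontal velocity profile is evaluated on a vertical coordinate stretched so that the profile computed between the mudline and the still water level is mapped onto the instantaneous free surface. The wave height, period, and water depth below are illustrative, not Cognac measurements.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_number(T, d, iters=50):
    """Solve the linear dispersion relation w^2 = g*k*tanh(k*d)
    by fixed-point iteration from the deep-water guess."""
    w = 2.0 * math.pi / T
    k = w * w / G
    for _ in range(iters):
        k = w * w / (G * math.tanh(k * d))
    return k

def airy_velocity(z, H, T, d):
    """Linear-theory horizontal velocity amplitude at elevation z
    (z = 0 at still water level, z = -d at the mudline)."""
    k = wave_number(T, d)
    w = 2.0 * math.pi / T
    return (H / 2.0) * w * math.cosh(k * (z + d)) / math.sinh(k * d)

def wheeler_velocity(z, eta, H, T, d):
    """Wheeler stretching: map z in [-d, eta] onto z' in [-d, 0],
    so the surface value is the linear-theory value at z' = 0
    instead of an extrapolation of cosh above the still water level."""
    z_stretched = d * (z - eta) / (d + eta)  # z=eta -> 0, z=-d -> -d
    return airy_velocity(z_stretched, H, T, d)

# velocity at the crest (z = eta = H/2) for an illustrative deepwater case
H, T, d = 10.0, 12.0, 300.0  # wave height (m), period (s), depth (m)
print(wheeler_velocity(z=5.0, eta=5.0, H=H, T=T, d=d))
```

Unstretched linear theory would instead evaluate cosh(k(z + d)) above z = 0, exaggerating crest kinematics; stretching schemes differ mainly in how they remap the vertical coordinate, which is why the choice measurably affects predicted platform response.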