WorldWideScience

Sample records for model verification studies

  1. Verification modeling study for the influential factors of secondary clarifier

    OpenAIRE

    Gao, Haiwen

    2016-01-01

    A numerical quasi-3-D model of a secondary clarifier is applied to verify data obtained from the literature and to analyze the influential factors for secondary clarifiers. The data from the papers provide the input parameters for the model. In this study, several influential factors (density waterfall; surface overflow rate; solids loading rate; solids-settling characteristics; mixed liquor suspended solids; clarifier geometry) are tested. The results show that there are some differences...

  2. Second order closure integrated puff (SCIPUFF) model verification and evaluation study. Technical memo

    Energy Technology Data Exchange (ETDEWEB)

    Nappo, C.J.; Eckman, R.M.; Rao, K.S.; Herwehe, J.A.; Gunter, L.

    1998-06-01

    This report summarizes a verification of the SCIPUFF model as described in the draft report PC-SCIPUFF Version 0.2 Technical Documentation by Sykes et al. The verification included a scientific review of the model physics and parameterizations described in the report, and checks of their internal usage and consistency with current practices in atmospheric dispersion modeling. This work is intended to examine the scientific basis and defensibility of the model for the intended application. A related task is an assessment of the model's general capabilities and limitations. A line-by-line verification of the computer source code was not possible; however, the code was checked with a commercial code analyzer. About 47 potential coding inconsistencies were identified.

  3. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    Science.gov (United States)

    Jonathan L. Case; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using

  4. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. First, the testing and reference samples are normalized, re-sampled and smoothed in a pre-processing stage. In the verification stage, the difference between the reference and testing signatures is calculated using the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
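
    As an illustration of the FRR/FAR figures quoted above, the sketch below computes both rates from dissimilarity scores at a fixed threshold. The scores, threshold and function names are hypothetical, not the paper's data or method.

```python
# Illustrative sketch, not the paper's exact method: computing the False
# Rejection Rate (FRR) and False Acceptance Rate (FAR) from dissimilarity
# scores at a fixed threshold. All names and scores here are hypothetical.

def error_rates(genuine_scores, forgery_scores, threshold):
    """A signature is accepted when its dissimilarity score is <= threshold."""
    false_rejects = sum(1 for s in genuine_scores if s > threshold)
    false_accepts = sum(1 for s in forgery_scores if s <= threshold)
    frr = false_rejects / len(genuine_scores)  # genuine signatures rejected
    far = false_accepts / len(forgery_scores)  # forgeries accepted
    return frr, far

genuine = [0.2, 0.4, 0.5, 0.9, 1.1]  # scores for authentic signatures
forgery = [0.7, 1.3, 1.5, 1.8, 2.0]  # scores for forgeries
print(error_rates(genuine, forgery, 0.8))  # → (0.4, 0.2)
```

    In a real system the threshold would be tuned on enrolment data to trade FRR against FAR.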

  5. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Directory of Open Access Journals (Sweden)

    Paweł Drapikowski

    2016-06-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, and volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.

  6. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification...

  7. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  9. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, code input, and calculation results.

  10. Calibration and verification of environmental models

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  11. ENROLMENT MODEL STABILITY IN STATIC SIGNATURE VERIFICATION

    NARCIS (Netherlands)

    Allgrove, C.; Fairhurst, M.C.

    2004-01-01

    The stability of enrolment models used in a static verification system is assessed, in order to provide an enhanced characterisation of signatures through the validation of the enrolment process. A number of static features are used to illustrate the effect of the variation in enrolment model size on

  12. Performance study of a heat pump dryer system for speciality crops - Pt. 2: model verification

    Energy Technology Data Exchange (ETDEWEB)

    Adapa, P.K.; Schoenau, G.J.; Sokhansanj, S. [University of Saskatchewan (Canada). College of Engineering

    2002-07-01

    The experimental and predicted performance data of a heat pump dryer system are reported. Chopped alfalfa was dried in a cabinet dryer in batches and also by emulating continuous bed drying using two heat pumps operating in parallel. Results showed that alfalfa was dried from an initial moisture content of 70% (wb) to a final moisture content of 10% (wb). The batch drying took about 4.5 h while continuous bed drying took 4 h to dry the same amount of material. The average air velocity inside the dryer was 0.36 m/s. Low temperatures (30-45 °C) for safe drying of specialty crops were achieved experimentally. The heat pump drying system used in this study was about 50% more efficient in recovering the latent heat from the dryer exhaust compared to conventional dryers. The specific moisture extraction rate (SMER) was maximum when relative humidity stayed above 40%. The dryer was shown to be capable of an SMER of between 0.5 and 1.02 kg kW⁻¹ h⁻¹. It was concluded that continuous bed drying is potentially a better option than batch drying because high process air humidity ratios at the entrance of the evaporator and constant moisture extraction rate and specific moisture extraction rate values can be maintained. An uncertainty analysis confirmed the accuracy of the model. (author)
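
    The SMER figures above follow from simple mass and energy balances. The sketch below computes the water removed between two wet-basis moisture contents and the resulting SMER; the input figures (100 kg batch, 80 kWh) are hypothetical, not the paper's measurements.

```python
# Hedged sketch of the Specific Moisture Extraction Rate (SMER): kilograms of
# water removed per kilowatt-hour of energy input. The drying figures below
# are hypothetical, not the paper's measurements.

def water_removed(mass_wet_kg, mc_initial, mc_final):
    """Water evaporated when drying between two wet-basis moisture contents."""
    dry_matter = mass_wet_kg * (1 - mc_initial)  # kg of dry solids (constant)
    mass_final = dry_matter / (1 - mc_final)     # wet mass after drying
    return mass_wet_kg - mass_final              # kg of water removed

def smer(mass_wet_kg, mc_initial, mc_final, energy_kwh):
    return water_removed(mass_wet_kg, mc_initial, mc_final) / energy_kwh

# Drying 100 kg of wet alfalfa from 70% to 10% moisture (wet basis)
# with a hypothetical 80 kWh of energy input:
print(round(water_removed(100, 0.70, 0.10), 1))  # → 66.7 (kg of water)
print(round(smer(100, 0.70, 0.10, 80), 2))       # → 0.83 (kg/kWh)
```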

  13. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, Angelika H.; Wupper, H.; Boon, Mieke

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, we

  14. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to understand their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  15. Correction, improvement and model verification of CARE 3, version 3

    Science.gov (United States)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document Correction, Improvement, and Model Verification of CARE 3, Version 3 was written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  16. A global verification study of a quasi-static knee model with multi-bundle ligaments

    NARCIS (Netherlands)

    Mommersteeg, TJA; Blankevoort, L; Kooloos, JGM; Kauer, JMG; Maathuis, PGM

    1996-01-01

    The ligaments of the knee consist of fiber bundles with variable orientations, lengths and mechanical properties. Conceptually, however, these structures have too often been treated as homogeneous structures which are either stretched or slack during knee motions. In previous studies, we proposed a new struc

  17. VARTM Model Development and Verification

    Science.gov (United States)

    Cano, Roberto J. (Technical Monitor); Dowling, Norman E.

    2004-01-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform.

  18. Verification and Validation of the Coastal Modeling System. Report 3: CMS-Flow: Hydrodynamics

    Science.gov (United States)

    2011-12-01

    ERDC/CHL TR-11-10, December 2011. Verification and Validation of the Coastal Modeling System, Report 3, CMS-Flow: Hydrodynamics. Alejandro Sánchez, Weiming Wu... This is one of four reports toward the Verification and Validation (V&V) of the Coastal Modeling System (CMS). The details of the V&V study specific to the

  19. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    Science.gov (United States)

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2014-05-26

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.
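
    Partial verification bias, the problem the paper corrects for, can be demonstrated with a short simulation. The sketch below (assumptions mine, not the authors' model) verifies test-positive subjects more often than test-negatives and shows that the naive verified-only estimate of sensitivity is inflated.

```python
# Short simulation of partial verification bias (assumptions mine, not the
# authors' hierarchical model): test-positive subjects are verified by the
# reference test far more often than test-negatives, which inflates the
# naive sensitivity estimate computed from verified subjects only.
import random

random.seed(0)
n = 200_000
prevalence, true_sens, spec = 0.2, 0.8, 0.9

verified_diseased = verified_diseased_pos = 0
for _ in range(n):
    diseased = random.random() < prevalence
    test_pos = random.random() < (true_sens if diseased else 1 - spec)
    # Verification depends on the index test result: 90% of positives but
    # only 10% of negatives receive the reference test.
    verified = random.random() < (0.9 if test_pos else 0.1)
    if verified and diseased:
        verified_diseased += 1
        verified_diseased_pos += test_pos

naive_sens = verified_diseased_pos / verified_diseased
print(round(naive_sens, 2))  # well above the true sensitivity of 0.8
```

    Correcting the bias requires modeling the verification mechanism itself, which is what the proposed hybrid Bayesian model does jointly across studies.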

  20. VERIFICATION OF GEAR DYNAMIC MODEL IN DIFFERENT OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Grzegorz PERUŃ

    2014-09-01

    The article presents the results of verification of a dynamic model of a drive system with gears. Tests were carried out on the real object in different operating conditions. Simulation studies were also carried out for the same assumed conditions. Comparison of the results obtained from these two series of tests helped determine the suitability of the model and verify the possibility of replacing experimental research by simulations using the dynamic model.

  1. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
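
    The hydrostatic-column idea can be sketched as a simple pressure balance: wellhead pressure is the pressure at a reference depth minus the weight of the fluid columns above it. The densities, heights and pressures below are hypothetical, and the sketch is a simplification, not the SPR HCM implementation.

```python
# Minimal sketch of a hydrostatic column (my simplification, not the SPR HCM
# implementation): wellhead pressure equals the pressure at a reference depth
# minus the hydrostatic head of the fluid columns above it. All densities,
# heights and pressures below are hypothetical.

G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_at_depth_pa, columns):
    """columns: list of (density kg/m^3, column height m), wellhead downward."""
    hydrostatic = sum(rho * G * h for rho, h in columns)
    return p_at_depth_pa - hydrostatic

# Hypothetical well: 300 m of pressurized nitrogen over 700 m of crude oil,
# with 12 MPa measured at 1000 m depth.
p_wh = wellhead_pressure(12e6, [(90.0, 300.0), (850.0, 700.0)])
print(round(p_wh / 1e6, 2), "MPa")  # → 5.9 MPa
```

    A leak or an interface movement changes the column heights, and hence the predicted wellhead pressure, which is the signal the model uses.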

  2. Verification of pneumatic railway brake models

    Science.gov (United States)

    Piechowiak, Tadeusz

    2010-03-01

    The article presents a survey of diverse methods for validation of pneumatic train brake modelling. Various experimental measurements of railway pneumatic brakes were made chiefly on a test stand at Poznań University of Technology; other test stands and some results have been taken from the literature. The measurements, some of them unconventional, were performed on separate pneumatic elements, brake devices, the brake pipe and fragments thereof. Mechanical devices were also included. The experimental measurement results were used for the verification of numerical models and for the determination of parameters. The latter was partially performed using an optimisation method.

  3. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
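
    One way an integrated monitor can combine per-technology detection probabilities is the independence rule: the system detects an event if at least one technology does. The abstract does not state IVSEM's actual combination logic, so the sketch below is an assumption for illustration only, with hypothetical probabilities.

```python
# Assumed combination rule for illustration only (the abstract does not state
# IVSEM's actual logic): with independent technologies, the integrated system
# detects an event if at least one technology does.

def combined_detection(p_by_technology):
    """P(at least one detection) under the independence assumption."""
    p_miss = 1.0
    for p in p_by_technology.values():
        p_miss *= 1.0 - p  # event missed by every technology so far
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for one event:
probs = {"seismic": 0.9, "infrasound": 0.5,
         "radionuclide": 0.6, "hydroacoustic": 0.3}
print(round(combined_detection(probs), 3))  # → 0.986
```

    The synergy mentioned in the abstract shows up here directly: the integrated probability exceeds that of any single technology.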

  5. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    The Multifunction Vehicle Bus (MVB) is a critical component in the Train Communication Network (TCN), which is widely used in most modern train technologies in the transportation system. How to ensure the security of the MVB has become an important issue, as traditional testing cannot ensure system correctness. This paper is concerned with modeling and verification of the MVB system. Petri net and model checking methods are used to verify the MVB system. A Hierarchical Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of the MVB. Synchronous and asynchronous methods are proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.

  6. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often not adequately done, or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga Reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed that the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
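
    Two verification measures of the kind discussed above, the mean error and the standard error of the residuals, can be computed as in the sketch below; the observation/prediction pairs are hypothetical, not the Chickamauga Reservoir data.

```python
# Sketch of two verification measures of the kind discussed above, computed
# for a model-vs-observation comparison. The paired values are hypothetical,
# not the Chickamauga Reservoir results.
import math

def mean_error(predicted, observed):
    """Average bias of the predictions (positive = overprediction)."""
    return sum(p - o for p, o in zip(predicted, observed)) / len(observed)

def standard_error_of_residuals(predicted, observed):
    """Sample standard deviation of the prediction residuals."""
    residuals = [p - o for p, o in zip(predicted, observed)]
    mean_r = sum(residuals) / len(residuals)
    return math.sqrt(sum((r - mean_r) ** 2 for r in residuals)
                     / (len(residuals) - 1))

obs = [6.1, 5.8, 7.2, 6.5, 5.9]   # e.g. observed dissolved oxygen, mg/L
pred = [6.0, 6.1, 7.0, 6.8, 5.7]  # model predictions at the same stations
print(round(mean_error(pred, obs), 3))                   # → 0.02
print(round(standard_error_of_residuals(pred, obs), 3))  # → 0.259
```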

  7. USER CONTEXT MODELS : A FRAMEWORK TO EASE SOFTWARE FORMAL VERIFICATIONS

    OpenAIRE

    2010-01-01

    This article has been accepted to appear in the ICEIS 2010 proceedings; International audience; Several works emphasize the difficulties of software verification applied to embedded systems. In past years, formal verification techniques and tools were widely developed and used by the research community. However, the use of formal verification at industrial scale remains difficult, expensive and requires a lot of time. This is due to the size and the complexity of the manipulated models, but also to the impo...

  8. Control and verification of industrial hybrid systems using models specified with the formalism χ

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor belt and of a biochemical plant for the production of ethanol are specified in the formalism χ. A verification of the closed-loop systems for those examples,

  9. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  10. Columbia River Estuary Hybrid Model Studies. Report 1. Verification of Hybrid Modeling of the Columbia River Mouth.

    Science.gov (United States)

    1983-09-01

    King, J.M., and Carlson, P.R. 1966. "Seismic Reflection Studies of Buried Channels off the Columbia River," Ore Bin, Vol. 28, Aug. 12. ... Foster, R.F. 1972. The History of Hanford and Its Contribution of Radionuclides ... Hanson, P.J., and Forster, W.O. 1969. "Measurement of Columbia River Flow Time from Hanford Reactors to Astoria, Oregon - Summer

  11. Hybrid Enrichment Verification Array: Module Characterization Studies

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  12. Comparison of particulate verification techniques study

    Science.gov (United States)

    Rivera, Rachel

    2006-08-01

    The efficacy of five particulate verification techniques on four types of materials was studied. Statistical Analysis Software/JMP 6.0 was used to create a statistically valid design of experiments. In doing so, 35 witness coupons, consisting of the four types of materials being studied, were intentionally contaminated with particulate fallout. Image analysis was used to characterize the extent of particulate fallout on the coupons and to establish a baseline, or basis of comparison, for the five techniques studied. The five particulate verification techniques were the tape lift, the particulate solvent rinse, the GelPak lift, an in-line vacuum filtration probe, and the Infinity Focusing Microscope (IFM). The four types of materials were magnesium fluoride (MgF2) coated mirrors, composite coated silver aluminum (CCAg), Z93 and NS43G coated aluminum, and silicon (Si) wafers. The vacuum probe was determined to be most effective for Z93, the tape lift or vacuum probe for MgF2, and the GelPak lift for CCAg and Si substrates. A margin of error for each technique, based on experimental data from two experiments on Si wafer substrates, was as follows: tape lift, 67%; solvent rinse, 58%; GelPak, 26%; vacuum probe, 93%; IFM, to be determined.

  13. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  14. Continuous Verification of Large Embedded Software using SMT-Based Bounded Model Checking

    CERN Document Server

    Cordeiro, Lucas; Marques-Silva, Joao

    2009-01-01

The complexity of software in embedded systems has increased significantly in recent years, so that software verification now plays an important role in ensuring overall product quality. In this context, SAT-based bounded model checking has been successfully applied to discover subtle errors, but for larger applications it often suffers from the state-space explosion problem. This paper describes a new approach called continuous verification to detect design errors as quickly as possible by looking at the Software Configuration Management (SCM) system and by combining dynamic and static verification to reduce the state space to be explored. We also give a set of encodings that provide accurate support for program verification and use different background theories in order to improve scalability and precision in a completely automatic way. A case study from the telecommunications domain shows that the proposed approach improves the error-detection capability and reduces the overall verification time by...
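The core idea of bounded model checking can be illustrated without an SMT or SAT solver: unroll the transition relation up to a bound k and look for a property violation at each depth. Below is a minimal pure-Python sketch on a toy counter system; the system and property are invented for illustration and are not the authors' tool or benchmarks.

```python
# Toy transition system: a counter modulo 8 that nondeterministically
# advances by 1 or 2 each step; safety property: "x never equals 6".
INIT = {0}

def step(x):
    return {(x + 1) % 8, (x + 2) % 8}

def bmc(bound, bad):
    """Unroll the system up to `bound` steps; return the depth of the
    shallowest property violation, or None if none is found in bound."""
    frontier = set(INIT)
    for depth in range(bound + 1):
        if frontier & bad:
            return depth
        frontier = {s for x in frontier for s in step(x)}
    return None

print(bmc(10, bad={6}))   # 3: shortest counterexample is 0 -> 2 -> 4 -> 6
```

A real BMC engine encodes the same unrolling symbolically as a SAT/SMT formula instead of enumerating states, which is what makes it scale past toy examples.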

  15. Verification strategies for fluid-based plasma simulation models

    Science.gov (United States)

    Mahadevan, Shankar

    2012-10-01

Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by the authors of these programs and not openly discussed. Several professional research bodies, including the IEEE, AIAA, ASME, and others, have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. The steps in the verification process (consistency checks; examination of iterative, spatial, and temporal convergence; and comparison with exact solutions) are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced. An example of the application of MMS to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
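The convergence-check step mentioned in this abstract can be made concrete with a minimal MMS-style example: manufacture the exact solution u(x) = sin(pi*x) for -u'' = f on (0,1), solve the second-order finite-difference system at two resolutions, and confirm that the error drops by a factor of four when the mesh spacing halves. This is a generic sketch of the workflow, not the plasma fluid model from the talk.

```python
import math

def solve_poisson(n):
    """Second-order finite differences for -u'' = f on (0,1) with
    u(0) = u(1) = 0, where f is manufactured from the chosen exact
    solution u(x) = sin(pi*x); returns the max nodal error."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
    # Thomas algorithm for the tridiagonal system (-1, 2, -1) u = h^2 f
    m = n - 1                       # number of interior unknowns
    cp, dp = [0.0] * m, [0.0] * m
    cp[0] = -1.0 / 2.0
    dp[0] = h * h * f[1] / 2.0
    for i in range(1, m):
        denom = 2.0 + cp[i - 1]     # b - a*cp, with a = c = -1, b = 2
        cp[i] = -1.0 / denom
        dp[i] = (h * h * f[i + 1] + dp[i - 1]) / denom
    u = [0.0] * (n + 1)
    u[m] = dp[m - 1]
    for i in range(m - 2, -1, -1):
        u[i + 1] = dp[i] - cp[i] * u[i + 2]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e_coarse, e_fine = solve_poisson(32), solve_poisson(64)
print(e_coarse / e_fine)   # ~4: second-order scheme, h halved
```

The observed order of accuracy (here, the error ratio approaching 4) is exactly the quantity an MMS study compares against the scheme's formal order.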

  16. Weather model verification using Sodankylä mast measurements

    Directory of Open Access Journals (Sweden)

    M. Kangas

    2015-12-01

Sodankylä, in the heart of the Arctic Research Centre of the Finnish Meteorological Institute (FMI ARC) in northern Finland, is an ideal site for atmospheric and environmental research in the boreal and sub-arctic zone. With temperatures ranging from −50 to +30 °C, it provides a challenging testing ground for numerical weather prediction (NWP) models as well as for weather forecasting in general. An extensive set of measurements has been carried out in Sodankylä for more than 100 years. In 2000, a 48 m high micrometeorological mast was erected in the area. In this article, the use of Sodankylä mast measurements in NWP model verification is described. Started in 2000 with the NWP model HIRLAM and Sodankylä measurements, the verification system has now been expanded to include comparisons between 12 NWP models and seven measurement masts. A case study comparing forecasted and observed radiation fluxes is also presented. It was found that three different radiation schemes, applicable in the NWP model HARMONIE-AROME, produced somewhat different downwelling long-wave radiation fluxes during cloudy days, which however did not change the overall cold bias of the predicted screen-level temperature.
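The basic point-verification scores behind such a forecast-versus-mast comparison reduce to a few lines of arithmetic; a negative bias corresponds to the cold bias mentioned in the abstract. The temperatures below are invented illustration values, not FMI mast data.

```python
import math

def verify_scores(forecast, observed):
    """Point-verification scores against mast observations: mean error
    (bias) and root-mean-square error (RMSE)."""
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Hypothetical screen-level temperatures in deg C, not FMI mast data.
fc  = [-12.0, -11.0, -8.0, -8.0]
obs = [-10.0, -10.0, -8.5, -8.5]
bias, rmse = verify_scores(fc, obs)
print(bias, round(rmse, 2))   # -0.5 1.17
```

Operational verification systems add more scores (mean absolute error, skill scores, stratification by lead time), but bias and RMSE are the usual starting point.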

  17. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  18. Interactive verification of Markov chains: Two distributed protocol case studies

    Directory of Open Access Journals (Sweden)

    Johannes Hölzl

    2012-12-01

Probabilistic model checkers like PRISM only check probabilistic systems of a fixed size. To guarantee the desired properties for an arbitrary size, mathematical analysis is necessary. We show for two case studies how this can be done in the interactive proof assistant Isabelle/HOL. The first case study is a detailed description of how we verified properties of the ZeroConf protocol, a decentralized address allocation protocol. The second case study shows the more involved verification of anonymity properties of the Crowds protocol, an anonymizing protocol.
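The flavor of the ZeroConf analysis can be conveyed with a textbook simplification: the host wrongly keeps an occupied address only if the address is already in use and every conflict probe is lost. The sketch below computes that probability both in closed form and by walking the chain; the parameters are illustrative and this is not the paper's Isabelle/HOL formalization.

```python
def zeroconf_error_prob(q, p, n):
    """Closed form: the address is already in use (probability q) and
    all n conflict probes are lost (each with probability p)."""
    return q * p ** n

def chain_error_prob(q, p, n):
    """Same quantity obtained by walking the Markov chain: state k
    means 'address occupied and k probes lost so far'."""
    prob = q                # enter the conflict branch
    for _ in range(n):
        prob *= p           # each probe must be lost to continue
    return prob

# Example: half the address space occupied, 10% probe loss, 4 probes.
print(round(zeroconf_error_prob(0.5, 0.1, 4), 9))   # 5e-05
```

The point of the interactive-proof approach is that such results are established symbolically for all parameter values and network sizes, not just for one numeric instance as here.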

  19. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or to identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NuSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks.

  20. Verification of Embedded Memory Systems using Efficient Memory Modeling

    CERN Document Server

    Ganai, Malay K; Ashar, Pranav

    2011-01-01

We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port, single memory system to more commonly occurring systems with multiple memories having multiple read and write ports. More importantly, we augment such EMM to provide correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and b) modeling arbitrary initial memory state precisely, thereby providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...

  1. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that by combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements on data collection and analysis, improved prediction of damage information will be possible, e.g. based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is necessary to improve inundation modelling and economic assessments for urban drainage...

  2. Verification of flood damage modelling using insurance data.

    Science.gov (United States)

    Zhou, Q; Panduro, T E; Thorsen, B J; Arnbjerg-Nielsen, K

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however, feasible for modelling the overall cost per day. The study also shows that in combining the insurance and regional data it is possible to establish clear relationships between occurrences of claims and hazard maps. In particular, the results indicate that with improvements to data collection and analysis, improved prediction of damage costs will be possible, for example based also on socioeconomic variables. Furthermore, the paper concludes that more collaboration between scientific research and insurance agencies is needed to improve inundation modelling and economic assessments for urban drainage designs.
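Modelling overall cost per day from simple daily rainfall statistics, as described above, amounts to fitting a regression of daily damage cost on daily rainfall. Below is a minimal ordinary-least-squares sketch; the numbers are invented illustration values, not the Danish insurance data.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; a minimal stand-in for
    regressing daily damage cost on daily rainfall."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

rain = [0.0, 5.0, 10.0, 20.0, 40.0]   # daily rainfall (mm), invented
cost = [2.0, 12.0, 22.0, 42.0, 82.0]  # daily claims cost, invented
a, b = fit_line(rain, cost)
print(a, b)   # 2.0 2.0: these invented data lie exactly on cost = 2 + 2*rain
```

A real analysis would of course use many more covariates (the paper points to socioeconomic variables) and check residuals, but the structure is the same.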

  3. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  4. How reliable are satellite precipitation estimates for driving hydrological models: a verification study over the Mediterranean area

    Science.gov (United States)

    Camici, Stefania; Ciabatta, Luca; Massari, Christian; Brocca, Luca

    2017-04-01

Floods are among the most common and dangerous natural hazards, causing thousands of casualties and extensive damage worldwide every year. The main tool for assessing flood risk and reducing damages is represented by hydrologic early warning systems, which forecast flood events using real-time data obtained through ground monitoring networks (e.g., raingauges and radars). However, the use of such data, mainly rainfall, presents some issues, firstly related to the network density and to the limited spatial representativeness of local measurements. A way to overcome these issues may be the use of satellite-based rainfall products (SRPs), which nowadays are available on a global scale at ever-increasing spatial/temporal resolution and accuracy. However, despite the large availability and increased accuracy of SRPs (e.g., the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA), the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF), and the recent Global Precipitation Measurement (GPM) mission), remotely sensed rainfall data are scarcely used in hydrological modeling, and only a small number of studies have been carried out to outline guidelines for using satellite data as input for hydrological modelling. Reasons may be related to: 1) the large bias characterizing satellite precipitation estimates, which depends on rainfall intensity and season; 2) the spatial/temporal resolution; 3) the timeliness, which is often insufficient for operational purposes; and 4) a general (often not justified) skepticism in the hydrological community toward the use of satellite products for land applications. The objective of this study is to explore the feasibility of using SRPs in a lumped hydrologic model (MISDc, "Modello Idrologico Semi-Distribuito in continuo", Masseroni et al., 2017) over 10 basins in the Mediterranean area with different sizes and physiographic characteristics. Specifically

  5. Measurement and modeling of exposure to selected air toxics for health effects studies and verification by biomarkers.

    Science.gov (United States)

    Harrison, Roy M; Delgado-Saborit, Juana Maria; Baker, Stephen J; Aquilina, Noel; Meddings, Claire; Harrad, Stuart; Matthews, Ian; Vardoulakis, Sotiris; Anderson, H Ross

    2009-06-01

    The overall aim of our investigation was to quantify the magnitude and range of individual personal exposures to a variety of air toxics and to develop models for exposure prediction on the basis of time-activity diaries. The specific research goals were (1) to use personal monitoring of non-smokers at a range of residential locations and exposures to non-traffic sources to assess daily exposures to a range of air toxics, especially volatile organic compounds (VOCs) including 1,3-butadiene and particulate polycyclic aromatic hydrocarbons (PAHs); (2) to determine microenvironmental concentrations of the same air toxics, taking account of spatial and temporal variations and hot spots; (3) to optimize a model of personal exposure using microenvironmental concentration data and time-activity diaries and to compare modeled exposures with exposures independently estimated from personal monitoring data; (4) to determine the relationships of urinary biomarkers with the environmental exposures to the corresponding air toxic. Personal exposure measurements were made using an actively pumped personal sampler enclosed in a briefcase. Five 24-hour integrated personal samples were collected from 100 volunteers with a range of exposure patterns for analysis of VOCs and 1,3-butadiene concentrations of ambient air. One 24-hour integrated PAH personal exposure sample was collected by each subject concurrently with 24 hours of the personal sampling for VOCs. During the period when personal exposures were being measured, workplace and home concentrations of the same air toxics were being measured simultaneously, as were seasonal levels in other microenvironments that the subjects visit during their daily activities, including street microenvironments, transport microenvironments, indoor environments, and other home environments. Information about subjects' lifestyles and daily activities were recorded by means of questionnaires and activity diaries. 
VOCs were collected in tubes packed
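The exposure model described in this record, predicting personal exposure from microenvironmental concentrations weighted by time-activity diary entries, reduces to a time-weighted average. The sketch below uses illustrative concentration and duration values, not measurements from the study.

```python
def personal_exposure(diary):
    """Time-weighted-average exposure from microenvironmental
    concentrations and a time-activity diary:
    E = sum(C_i * t_i) / sum(t_i)."""
    total_ct = sum(c * t for c, t in diary)
    total_t = sum(t for _, t in diary)
    return total_ct / total_t

# (concentration in ug/m3, hours spent): home, commute, work, outdoors.
# Illustrative values only, not measurements from the study.
diary = [(2.0, 14.0), (10.0, 1.0), (4.0, 8.0), (6.0, 1.0)]
print(round(personal_exposure(diary), 2))   # 3.17
```

Comparing such modeled averages against the 24-hour integrated personal samples is exactly the model-versus-measurement check the study describes.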

  6. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J., Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that traditionally have been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  7. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...

  8. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  9. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs, and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book discusses the entire life cycle process of IP cores, from specification to production; introduces Verilog from both the implementation and the verification points of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; and describes a new ver...

  10. Demonstration of Design Verification Model of Rubidium Frequency Standard

    CERN Document Server

    Ghosal, Bikash; Nandanwar, Satish; Banik, Alak; Dasgupta, K S; Saxena, G M

    2011-01-01

In this paper we report the development of the design verification model (DVM) of a Rb atomic frequency standard. The Rb atomic frequency standard, or clock, has two distinct parts: the physics package, in which hyperfine transitions produce the clock signal in the integrated filter cell configuration, and the electronic circuits, which generate the resonant microwave hyperfine frequency and implement the phase modulator and phase-sensitive detector. In this paper the details of the Rb physics package and the electronic circuits are given. The effect of placing the photodetector inside the microwave cavity is studied and reported, along with its effect on the resonance signal profile. The Rb clock frequency stability measurements are also discussed.

  11. Sensor Fusion and Model Verification for a Mobile Robot

    OpenAIRE

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck; Bendtsen, Jan Dimon; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practi...
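The core UKF step, propagating a mean and covariance through a nonlinearity via deterministically chosen sigma points, can be sketched in one dimension. This is a generic illustration of the unscented transform only, not the robot model or filter tuning from the paper.

```python
import math

def unscented_transform(mean, var, f, kappa=2.0):
    """1-D unscented transform: propagate a mean/variance pair through
    a nonlinearity f using 3 sigma points. A sketch of the UKF core
    step only, not the robot model from the paper."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    sigma = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)]
    y = [f(s) for s in sigma]
    y_mean = sum(w * yi for w, yi in zip(weights, y))
    y_var = sum(w * (yi - y_mean) ** 2 for w, yi in zip(weights, y))
    return y_mean, y_var

# Sanity check: for a linear map y = 2x + 1 the transform is exact.
m, v = unscented_transform(1.0, 0.25, lambda x: 2.0 * x + 1.0)
print(round(m, 6), round(v, 6))   # 3.0 1.0
```

In the full filter this transform runs once for the process model (prediction) and once for the measurement model (update), with matrix square roots replacing the scalar one.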

  12. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  13. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...

  14. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the open issues in the field of verification/validation of model transformations.

  15. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

An anomaly in legislation is the absence of completeness, consistency, and other desirable properties, caused by different semantic, syntactic, or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies, and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of the model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  16. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas, or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions: the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect, and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
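The per-object fusion step described here, combining evidence from independent road models over the classes correct, incorrect, and unknown, follows Dempster's rule of combination. Below is a generic sketch of that rule over a two-element frame, with "unknown" as mass on the whole frame; the masses are illustrative and this is not the paper's exact mapping.

```python
def combine(m1, m2):
    """Dempster's rule over the frame {correct, incorrect}, treating
    'unknown' as mass assigned to the whole frame. Each argument is a
    (correct, incorrect, unknown) mass triple summing to 1."""
    c1, i1, u1 = m1
    c2, i2, u2 = m2
    conflict = c1 * i2 + i1 * c2          # contradictory evidence
    k = 1.0 - conflict                    # normalization constant
    correct = (c1 * c2 + c1 * u2 + u1 * c2) / k
    incorrect = (i1 * i2 + i1 * u2 + u1 * i2) / k
    unknown = (u1 * u2) / k
    return correct, incorrect, unknown

# One fairly confident module fused with one mostly uncommitted module.
fused = combine((0.6, 0.1, 0.3), (0.3, 0.1, 0.6))
print(tuple(round(x, 3) for x in fused))   # (0.692, 0.11, 0.198)
```

Note how the uncommitted module's "unknown" mass lets the confident module dominate, while residual mass on "unknown" preserves the option of deferring to a human operator, which is what enables the semi-automatic workflow.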

  17. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  18. High temperature furnace modeling and performance verifications

    Science.gov (United States)

    Smith, James E., Jr.

    1992-01-01

Analytical, numerical, and experimental studies were performed on two classes of high-temperature materials-processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high-temperature furnace using a zirconia ceramic tube as the heating element and an arc furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct the parts needed to successfully perform experiments. The second objective was to evaluate the zirconia furnace's performance as a directional solidification furnace element. The third objective was to establish a database on the materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. One-dimensional and two-dimensional spectral radiation heat transfer models were developed for comparison with standard modeling techniques and were used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the arc furnace, which was used to design sample holders and to estimate cooling-media temperatures for steady-state operation of the furnace. The fifth objective addressed the initial performance evaluation of the arc furnace and associated equipment for directional solidification. Results for these objectives are presented.

  19. Formal Specifications and Verification of a Secure Communication Protocol Model

    Institute of Scientific and Technical Information of China (English)

    夏阳; 陆余良; 蒋凡

    2003-01-01

    This paper presents a secure communication protocol model, EABM, by which network security communication can be realized easily and efficiently. First, the paper gives a thorough analysis of the protocol system, systematic construction, and state transitions of EABM. Then it describes the channels and the process of state transition of EABM in terms of ESTELLE. Finally, it verifies the correctness of the EABM model.

  20. Assessment of Galileo modal test results for mathematical model verification

    Science.gov (United States)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  1. Verification of A Numerical Harbour Wave Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A numerical model for wave propagation in a harbour is verified by use of physical models. The extended time-dependent mild slope equation is employed as the governing equation, and the model is solved by the ADI method with a relaxation factor. First, the reflection coefficient of waves in front of rubble-mound breakwaters under oblique incident waves is determined through physical model tests, and it is regarded as the basis for simulating partial reflection boundaries in the numerical model. Then model tests on refraction, diffraction and reflection of waves in a harbour are performed to measure the wave height distribution. Comparisons between physical and numerical model tests show that the present numerical model can satisfactorily simulate the propagation of regular and irregular waves in a harbour with complex topography and boundary conditions.

  2. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multiphysics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available for heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and numerical benchmarks on selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.
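
Numerical benchmarks of the kind used to verify a heat transfer model typically compare a discretized solution against a closed-form one. The sketch below (a generic illustration, not AGREE's solver) verifies a finite-difference conduction solution for a slab with uniform heat generation against the exact quadratic profile; the geometry and material values are hypothetical.

```python
import numpy as np

def conduction_slab_numeric(n, length, k, q_gen, t_surf):
    """Finite-difference solution of -k T'' = q_gen on (0, L) with
    T(0) = T(L) = t_surf; returns the grid x and temperatures T."""
    x = np.linspace(0.0, length, n)
    h = x[1] - x[0]
    a = np.zeros((n, n))
    b = np.full(n, q_gen * h * h / k)      # rhs of -T[i-1]+2T[i]-T[i+1] = q h^2/k
    for i in range(1, n - 1):
        a[i, i - 1], a[i, i], a[i, i + 1] = -1.0, 2.0, -1.0
    a[0, 0] = a[-1, -1] = 1.0              # Dirichlet boundary rows
    b[0] = b[-1] = t_surf
    return x, np.linalg.solve(a, b)

def conduction_slab_exact(x, length, k, q_gen, t_surf):
    """Analytic solution of the same problem: a parabola peaking mid-slab."""
    return t_surf + q_gen * x * (length - x) / (2.0 * k)
```

Because the exact solution is quadratic, the second-order scheme reproduces it at the nodes to round-off, which is exactly the kind of code-verification evidence a benchmark problem is meant to supply.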

  3. Spatial Error Metrics for Oceanographic Model Verification

    Science.gov (United States)

    2012-02-01

    …quantitatively and qualitatively for this oceanographic data and successfully separates the model error into displacement and intensity components. This … oceanographic models as well, though one would likely need to make special modifications to handle the often-used nonuniform spacing between depth layers.

  4. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement.

    Science.gov (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L

    2015-02-01

    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.
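
Two of the practices recommended above, quantifying simulation-versus-experiment error and probing parameter sensitivity, can be sketched generically. The helpers below are illustrative only; the `model` callable and its parameters are hypothetical stand-ins for an NMS simulation.

```python
import math

def rmse(sim, ref):
    """Root-mean-square error between a simulated and an experimental series."""
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(sim, ref)) / len(ref))

def normalized_sensitivity(model, params, name, delta=0.01):
    """Central-difference sensitivity of the model output to one parameter,
    normalized so 1.0 means a 1% parameter change gives a 1% output change.
    Assumes the baseline output is nonzero."""
    up, down = dict(params), dict(params)
    up[name] *= 1.0 + delta
    down[name] *= 1.0 - delta
    base = model(params)
    return (model(up) - model(down)) / (2.0 * delta * base)
```

Reporting such normalized sensitivities alongside validation errors lets a reader judge which modeling choices the study's conclusions actually depend on.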

  5. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    Energy Technology Data Exchange (ETDEWEB)

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  6. Superelement Verification in Complex Structural Models

    Directory of Open Access Journals (Sweden)

    B. Dupont

    2008-01-01

    The objective of this article is to propose decision indicators to guide the analyst in the optimal definition of an ensemble of superelements in a complex structural assembly. These indicators are constructed based on comparisons between the unreduced physical model and the approximate solution provided by a nominally reduced superelement model. First, the low contribution substructure slave modes are filtered. Then, the minimum dynamical residual expansion is used to localize the superelements which are the most responsible for the response prediction errors. Moreover, it is shown that static residual vectors, which are a natural result of these calculations, can be included to represent the contribution of important truncated slave modes and consequently correct the deficient superelements. The proposed methodology is illustrated on a subassembly of an aeroengine model.
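
Superelement reduction is commonly built on static (Guyan) condensation of slave degrees of freedom onto master ones; the truncation this introduces is what residual-based indicators such as the ones proposed here try to diagnose. The sketch below shows that core condensation step; it is a generic illustration, not the article's indicator computation.

```python
import numpy as np

def guyan_reduce(k, m, masters):
    """Statically condense stiffness K and mass M onto the master DOFs
    using the Guyan transformation T = [I; -Kss^-1 Ksm] (rows kept in
    the original DOF order)."""
    n = k.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    kss = k[np.ix_(slaves, slaves)]
    ksm = k[np.ix_(slaves, masters)]
    t = np.zeros((n, len(masters)))
    t[masters, np.arange(len(masters))] = 1.0            # identity on masters
    t[np.ix_(slaves, np.arange(len(masters)))] = -np.linalg.solve(kss, ksm)
    return t.T @ k @ t, t.T @ m @ t
```

For a two-spring chain condensed onto its tip, the reduced stiffness recovers the exact series stiffness, while the reduced mass is only approximate, which is precisely why dynamic residual checks are needed.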

  7. Formal Verification of Full-Wave Rectifier: A Case Study

    CERN Document Server

    Lata, Kusum

    2009-01-01

    We present a case study of the formal verification of a full-wave rectifier for analog and mixed-signal designs. We used the Checkmate tool from CMU [1], a public-domain formal verification tool for hybrid systems. Restrictions imposed by Checkmate made it necessary to modify the Checkmate implementation to handle this complex, non-linear system. The full-wave rectifier was implemented using Checkmate custom blocks and Simulink blocks from MathWorks MATLAB. After establishing the required changes in the Checkmate implementation, we are able to efficiently verify the safety properties of the full-wave rectifier.

  8. Verification of the Chesapeake Bay Model.

    Science.gov (United States)

    1981-12-01

    …line of the five cups was about 0.045 ft above the bottom of the meter frame; … about 0.1 ft in the model, represented a horizontal width of about 100 ft in the prototype. The height of the meter cups, about 0.04 ft, represented … the entire bay. Although station-to-station wind magnitude comparisons cannot be made due to variations in anemometer height and exposure, the wind field…

  9. Modeling and Verification of the Bitcoin Protocol

    Directory of Open Access Journals (Sweden)

    Kaylash Chaudhary

    2015-11-01

    Bitcoin is a popular digital currency for online payments, realized as a decentralized peer-to-peer electronic cash system. Bitcoin keeps a ledger of all transactions; the majority of the participants decides on the correct ledger. Since there is no trusted third party to guard against double spending, and inspired by its popularity, we would like to investigate the correctness of the Bitcoin protocol. Double spending is an important threat to electronic payment systems. Double spending would happen if one user could force a majority to believe that a ledger without his previous payment is the correct one. We are interested in the probability of success of such a double spending attack, which is linked to the computational power of the attacker. This paper examines the Bitcoin protocol and provides its formalization as an UPPAAL model. The model will be used to show how double spending can be done if the parties in the Bitcoin protocol behave maliciously, and with what probability double spending occurs.
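
The probability that a double-spending attacker eventually overtakes the honest chain has a closed form in the original Bitcoin whitepaper, which gives a reference point for any model-based analysis. A direct sketch:

```python
import math

def double_spend_probability(q, z):
    """Probability that an attacker controlling fraction q of the hash
    power eventually overtakes the honest chain after z confirmations
    (Nakamoto's formula from the Bitcoin whitepaper)."""
    p = 1.0 - q
    if q >= p:
        return 1.0                      # a majority attacker always wins
    lam = z * q / p                     # expected attacker progress
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob
```

With zero confirmations the attack always succeeds, and the success probability drops off geometrically as the number of confirmations grows.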

  10. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb{sub 3}Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb{sub 3}Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb{sub 3}Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  11. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  12. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, a precise, mathematically based method for automatically checking formalized requirements against the system.
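
The core of such a transformation can be illustrated with a toy example (not the UNICOS modules or the actual tool-chain): a PLC scan cycle becomes a transition function over the program variables, and an explicit-state exploration checks a formalized requirement over every reachable state and input combination.

```python
from itertools import product

def scan_cycle(motor, start, stop):
    """One PLC scan of a start/stop motor latch (an illustrative program)."""
    return (motor or start) and not stop

def check_invariant():
    """Explicit-state model checking of the requirement: whenever the
    'stop' input is pressed, the motor is off after the scan. Explores
    every reachable motor state under every input combination."""
    reachable, frontier = set(), [False]      # motor initially off
    while frontier:
        motor = frontier.pop()
        if motor in reachable:
            continue
        reachable.add(motor)
        for start, stop in product([False, True], repeat=2):
            nxt = scan_cycle(motor, start, stop)
            if stop and nxt:
                return False                  # counterexample found
            frontier.append(nxt)
    return True
```

Real PLC programs have far larger state spaces, which is why the paper relies on dedicated model checkers rather than hand-rolled exploration, but the translation principle is the same.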

  13. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects of systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the Functional Mock-up Interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.
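
The essence of statistical model checking is to replace exhaustive exploration with Monte Carlo simulation: run the stochastic model N times, with N chosen from a Chernoff-Hoeffding bound for the desired precision and confidence. The sketch below is generic; the toy SoS run and its reliability numbers are hypothetical, and this is not the PLASMA/DESYRE tool-chain.

```python
import math
import random

def smc_estimate(simulate_once, epsilon=0.01, delta=0.05, rng=random):
    """Statistical model checking: estimate the probability that a property
    holds, within +/- epsilon with confidence 1 - delta, using the
    Chernoff-Hoeffding sample size N >= ln(2/delta) / (2 epsilon^2)."""
    n = math.ceil(math.log(2.0 / delta) / (2.0 * epsilon**2))
    successes = sum(simulate_once(rng) for _ in range(n))
    return successes / n, n

def toy_sos_run(rng):
    """Hypothetical SoS run: the mission succeeds if at least two of three
    unreliable subsystems (each 90% reliable) respond."""
    up = sum(rng.random() < 0.9 for _ in range(3))
    return up >= 2
```

The trade-off is explicit: halving epsilon quadruples the number of simulations, which is why SMC scales to models where exhaustive checking cannot.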

  14. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...... sequential release – a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....
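
Bounded model checking searches for an invariant violation within a fixed number of steps of the transition relation. The papers' verification is SMT-based, but the idea can be sketched with explicit unrolling on a toy interlocking (one shared track section, two trains; all names hypothetical).

```python
from itertools import product

def bounded_check(initial, step, inputs, invariant, bound):
    """Bounded model checking by explicit unrolling: explore every input
    sequence up to `bound` steps; return a counterexample trace if the
    invariant can be violated, else None. (Exponential in `bound`; real
    tools encode the same unrolling as a SAT/SMT problem.)"""
    frontier = [(initial, [initial])]
    for _ in range(bound):
        successors = []
        for state, trace in frontier:
            for inp in inputs:
                nxt = step(state, inp)
                if not invariant(nxt):
                    return trace + [nxt]
                successors.append((nxt, trace + [nxt]))
        frontier = successors
    return None

def interlock_step(state, requests):
    """Toy interlocking: a train enters the shared section only if it
    requests it and the section is free; train A has priority."""
    a_on, b_on = state
    req_a, req_b = requests
    new_a = req_a and not b_on
    new_b = req_b and not a_on and not new_a
    return (new_a, new_b)
```

Here the safety property is a state invariant (never both trains on the section), matching the paper's formulation of high-level safety properties as state invariants over a Kripke structure.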

  15. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates...... sequential release - a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....

  16. A case study in pathway knowledgebase verification

    Directory of Open Access Journals (Sweden)

    Shah Nigam H

    2006-04-01

    Background: Biological databases and pathway knowledgebases are proliferating rapidly. We are developing software tools for computer-aided hypothesis design and evaluation, and we would like our tools to take advantage of the information stored in these repositories. But before we can reliably use a pathway knowledgebase as a data source, we need to proofread it to ensure that it can fully support computer-aided information integration and inference.

    Results: We design a series of logical tests to detect potential problems we might encounter using a particular knowledgebase, the Reactome database, with a particular computer-aided hypothesis evaluation tool, HyBrow. We develop an explicit formal language from the language implicit in the Reactome data format and specify a logic to evaluate models expressed using this language. We use the formalism of finite model theory in this work. We then use this logic to formulate tests for desirable properties (such as completeness, consistency, and well-formedness) for pathways stored in Reactome. We apply these tests to the publicly available Reactome releases (releases 10 through 14) and compare the results, which highlight Reactome's steady improvement in terms of decreasing inconsistencies. We also investigate and discuss Reactome's potential for supporting computer-aided inference tools.

    Conclusion: The case study described in this work demonstrates that it is possible to use our model-theory-based approach to identify problems one might encounter using a knowledgebase to support hypothesis evaluation tools. The methodology we use is general and is in no way restricted to the specific knowledgebase employed in this case study. Future application of this methodology will enable us to compare pathway resources with respect to the generic properties such resources will need to possess if they are to support automated reasoning.
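
The style of logical test described here can be illustrated on a toy record format (hypothetical, not Reactome's actual schema): completeness means every reaction participant is a declared entity, and a simple consistency test flags species listed as both input and output unchanged.

```python
def check_pathway(entities, reactions):
    """Logical well-formedness tests over a toy pathway knowledgebase.
    Returns a list of human-readable problem descriptions; an empty list
    means every test passed."""
    problems = []
    for name, rxn in reactions.items():
        # completeness: every participant must be a declared entity
        for role in ("inputs", "outputs"):
            for species in rxn[role]:
                if species not in entities:
                    problems.append(f"{name}: undeclared {role[:-1]} {species!r}")
        # consistency: a species should not appear unchanged on both sides
        overlap = set(rxn["inputs"]) & set(rxn["outputs"])
        if overlap:
            problems.append(f"{name}: species unchanged by reaction: {sorted(overlap)}")
    return problems
```

Running such checks on successive releases of a knowledgebase gives exactly the kind of release-over-release comparison the abstract reports for Reactome.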

  17. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01


  18. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  19. Issues to be considered on obtaining plant models for formal verification purposes

    Science.gov (United States)

    Pacheco, R.; Gonzalez, L.; Intriago, M.; Machado, J.; Prisacaru, G.; Olaru, D.

    2016-08-01

    The development of dependable software for mechatronic systems can be a very complex and hard task. To facilitate obtaining dependable software for industrial controllers, powerful software tools and analysis techniques can be used. In particular, when using simulation and formal verification techniques, it is necessary to develop plant models that describe the plant behavior of those systems. However, developing a plant model implies that the designer makes decisions concerning the granularity and level of abstraction of the models; the modeling approach (global or modular); and the strategies for simulation and formal verification tasks. This paper highlights some aspects that can inform those decisions. For this purpose, a case study is presented, and several important aspects concerning the issues exposed above are illustrated and discussed.

  20. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model … and implementation. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the Websocket protocol which is currently under development by the Internet Engineering Task Force (IETF), and we show …

  1. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the three post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the three deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly to or somewhat better than precipitation forecasts from the two lower-resolution models, at least in the Netherlands.
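
The Brier skill scores used to compare the post-processed models follow from two short formulas: the Brier score is the mean squared difference between forecast probability and binary outcome, and skill is measured against a climatological reference. A minimal sketch:

```python
def brier_score(probs, outcomes):
    """Brier score: mean squared difference between forecast probabilities
    and binary outcomes (0 = perfect)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to a climatological reference that always forecasts
    the sample base rate; 1 = perfect, 0 = no skill over climatology.
    Assumes the outcomes are not all identical (nonzero reference score)."""
    base_rate = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([base_rate] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / bs_ref
```

In practice the reference climatology is computed from an independent training sample rather than the verification sample itself, but the structure of the score is the same.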

  2. Extension and validation of an analytical model for in vivo PET verification of proton therapy--a phantom and clinical study

    NARCIS (Netherlands)

    Attanasi, F; Knopf, A; Parodi, K.; Paganetti, Harald; Bortfeld, Thomas; Rosso, V; Del Guerra, Alberto

    2011-01-01

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner

  3. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  4. Verification of tropical cyclone using the KIAPS Integration Model (KIM)

    Science.gov (United States)

    Lim, S.; Seol, K. H.

    2015-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) is a government-funded non-profit research and development institute located in Seoul, South Korea. KIAPS is developing a global model, the backbone of the next-generation operational global numerical weather prediction (NWP) system, under a three-phase plan: Establishment and R&D Planning (2011-2013), Test Model Development (2014-2016), and Operational Model Development (2017-2019). In the second phase, we have a beta version of the KIAPS Integration Model (KIM) that can produce reasonable global forecasts. Using the KIM model, we evaluate tropical cyclone forecasts in the global model. To objectively provide a best estimate of the storm's central position, we use the Geophysical Fluid Dynamics Laboratory (GFDL) vortex tracker, a widely used tracking algorithm. It gives the track and intensity of the storm throughout the duration of the forecast. As a verification tool, we use the Model Evaluation Tools - Tropical Cyclone (MET-TC), which produces statistical evaluations. We expect these results to indicate the current capability of the KIM model for tropical cyclone forecasting.
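
The GFDL tracker combines several fields (sea-level pressure minimum, vorticity maximum, and others) to fix the storm center. As a simplified, hypothetical sketch of just one of those criteria, the function below takes the gridpoint MSLP minimum within a search radius of a first-guess position:

```python
import numpy as np

def find_storm_center(mslp, lats, lons, guess_lat, guess_lon, radius_deg=3.0):
    """Locate a tropical cyclone center as the minimum of sea-level
    pressure within radius_deg of a first-guess position (a simplified
    version of one criterion used by vortex trackers; distances are
    approximated in degrees for brevity)."""
    lat2d, lon2d = np.meshgrid(lats, lons, indexing="ij")
    dist = np.hypot(lat2d - guess_lat, lon2d - guess_lon)
    masked = np.where(dist <= radius_deg, mslp, np.inf)
    i, j = np.unravel_index(np.argmin(masked), masked.shape)
    return lats[i], lons[j], mslp[i, j]
```

Repeating the fix at each forecast hour, using the previous fix as the next first guess, yields the track that verification tools such as MET-TC then score against best-track data.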

  5. Pneumatic Adaptive Absorber: Mathematical Modelling with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Grzegorz Mikułowski

    2016-01-01

    Many of the mechanical energy absorbers utilized in engineering structures are hydraulic dampers, since they are simple, highly efficient, and have a favourable volume-to-load-capacity ratio. However, there are fields of application where any risk of toxic contamination by the hydraulic fluid must be avoided, for example, the food or pharmacy industries. A solution here can be a Pneumatic Adaptive Absorber (PAA), which is characterized by high dissipation efficiency and an inactive medium. In order to properly analyse the characteristics of a PAA, an adequate mathematical model is required. This paper proposes a concept for the mathematical modelling of a PAA with experimental verification. The PAA is considered as a piston-cylinder device with a controllable valve incorporated inside the piston. The objective of this paper is to describe a thermodynamic model of a double-chamber cylinder with gas migration between the inner volumes of the device. The specific situation considered here is that the process cannot be described as polytropic with thermodynamic coefficients that are constant in time. Instead, the coefficients of the proposed model are updated during the analysis. The results of the experimental research reveal that the proposed mathematical model is able to accurately reflect the physical behaviour of the fabricated demonstrator of the shock absorber.
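
As a deliberately simplified illustration of gas migration between the two chambers (not the paper's model, which updates its thermodynamic coefficients in time), the sketch below integrates an isothermal two-chamber system in which the valve passes mass at a rate proportional to the pressure difference; all values are hypothetical.

```python
def simulate_two_chambers(p1, p2, v1, v2, c_valve, dt, steps):
    """Minimal isothermal two-chamber model of a pneumatic absorber at a
    fixed piston position: mass (hence pressure) migrates through the
    valve at a rate proportional to the pressure difference. This is a
    deliberate simplification; a real model uses the compressible
    orifice-flow equation and an energy balance per chamber."""
    history = [(p1, p2)]
    for _ in range(steps):
        # isothermal ideal gas: dp = (R T / V) dm, with R T folded into c_valve
        flow = c_valve * (p1 - p2)          # surrogate valve mass flow
        p1 -= flow * dt / v1
        p2 += flow * dt / v2
        history.append((p1, p2))
    return history
```

Even this crude model reproduces the qualitative behaviour an adaptive valve exploits: the rate at which the chambers equalize, and hence the damping force history, is set entirely by the valve coefficient.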

  6. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the ground subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced, since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  7. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
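
    The fine-grained testing advocated here can be as small as checking one numerical kernel against a known analytic answer, instead of diffing the output of a full model run. A minimal sketch (the kernel and tolerance are illustrative, not from any climate model):

    ```python
    import math

    # A fine-grained "unit" test of a single numerical kernel: a centered
    # finite difference verified against an analytic derivative. Illustrative
    # only; real climate-model kernels are far larger, but the idea scales.

    def ddx_centered(f, x, h=1e-5):
        """Second-order centered difference approximation of df/dx."""
        return (f(x + h) - f(x - h)) / (2.0 * h)

    def test_ddx_centered():
        # d/dx sin(x) = cos(x); the scheme's truncation error is O(h^2)
        for x in (0.0, 0.3, 1.0, 2.5):
            approx = ddx_centered(math.sin, x)
            assert abs(approx - math.cos(x)) < 1e-8, (x, approx)

    test_ddx_centered()
    ```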

  8. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On the one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work proposes a systematic model with a holistic view addressing all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
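
    The hash-then-sign-with-timestamp core of such a scheme can be sketched as follows. PKIDEV itself uses public-key signatures and secure time-stamping; since Python's standard library has no RSA, an HMAC stands in for the signature step here, and all field names are hypothetical:

    ```python
    import hashlib, hmac, json, time

    # Illustrative sketch only: a real PKI-based scheme signs with an
    # asymmetric key and uses a trusted time-stamping authority. The HMAC
    # below is a stdlib stand-in for the signature, and the record layout
    # is invented for this example.

    def seal_evidence(raw: bytes, key: bytes) -> dict:
        """Hash the evidence, attach a timestamp, and 'sign' the pair."""
        digest = hashlib.sha256(raw).hexdigest()
        record = {"sha256": digest, "timestamp": time.time()}
        payload = json.dumps(record, sort_keys=True).encode()
        record["sig"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
        return record

    def verify_evidence(raw: bytes, record: dict, key: bytes) -> bool:
        """Recompute hash and signature; any tampering breaks one of them."""
        if hashlib.sha256(raw).hexdigest() != record["sha256"]:
            return False
        payload = json.dumps({k: record[k] for k in ("sha256", "timestamp")},
                             sort_keys=True).encode()
        return hmac.compare_digest(
            hmac.new(key, payload, hashlib.sha256).hexdigest(), record["sig"])

    rec = seal_evidence(b"disk image bytes", b"secret")
    ok = verify_evidence(b"disk image bytes", rec, b"secret")
    tampered = verify_evidence(b"tampered bytes", rec, b"secret")
    ```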

  9. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Full Text Available Usually dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models will be described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
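
    As a concrete instance of this style of verification, one can derive a stationary point analytically and then check that simulation converges to it. The rule dw/dt = eta*x1*x2*(1 - w) - zeta*w used below is a common Hebbian form; the paper's exact equations may differ:

    ```python
    # Verification in the spirit described in the abstract: derive a
    # stationary point analytically, then confirm that simulation converges
    # to it. The Hebbian rule and all parameter values are illustrative.

    eta, zeta = 0.8, 0.1
    x1, x2 = 0.9, 0.7                  # persistent activation of the two nodes

    c = eta * x1 * x2                  # combined learning drive
    w_star = c / (c + zeta)            # equilibrium: eta*x1*x2*(1-w) = zeta*w

    w, dt = 0.0, 0.01                  # Euler simulation from w = 0
    for _ in range(20000):
        w += dt * (c * (1.0 - w) - zeta * w)

    assert abs(w - w_star) < 1e-6      # simulation agrees with the analysis
    ```

    If the simulated value failed to match the analytic equilibrium, that would point to an implementation error, exactly the use of mathematical analysis for verification that the abstract describes.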

  10. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used more and more actively to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed by means of the domestic software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of a hydrodynamic model based on a comparison of the actual flooded area, which in turn is determined on the basis of automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the Northern Dvina River, and the Amur River near Blagoveshchensk. We used satellite images acquired by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower boundaries of the model, respectively, it is possible to calculate the flooded area by means of the program STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
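
    The classification step can be illustrated with a tiny 1-D two-cluster K-means on synthetic pixel intensities; real use would operate on an actual SPOT-5 or Radarsat-2 scene, and all numbers below are invented:

    ```python
    import numpy as np

    # Sketch of the verification idea: classify an image into water/land with
    # a 1-D two-cluster K-means, then convert the water pixel count into an
    # area to compare against the modeled flood zone. Synthetic data only.

    rng = np.random.default_rng(0)
    water = rng.normal(20.0, 3.0, size=4000)    # dark water pixels
    land = rng.normal(80.0, 10.0, size=6000)    # brighter land pixels
    pixels = np.concatenate([water, land])

    c = np.array([pixels.min(), pixels.max()])  # init centers at the extremes
    for _ in range(20):                         # Lloyd iterations
        labels = np.abs(pixels[:, None] - c[None, :]).argmin(axis=1)
        c = np.array([pixels[labels == k].mean() for k in (0, 1)])

    cell_area_m2 = 10.0 * 10.0                  # e.g., 10 m optical pixels
    flooded_km2 = (labels == 0).sum() * cell_area_m2 / 1e6
    ```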

  11. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. 
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
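
    The automation the authors propose, scripts that turn header meta-data into a hydro-code input file, might look like the following sketch. The header keys and input-deck format are hypothetical, not HED's actual schema:

    ```python
    # Hypothetical sketch: read key experimental parameters from a header of
    # '#'-prefixed "key = value" lines and emit a hydro-code input deck.
    # Both the header keys and the deck syntax are invented for illustration.

    def read_header(lines):
        """Collect meta-data from comment lines like '# explosive = PBX-9502'."""
        meta = {}
        for line in lines:
            if line.startswith("#") and "=" in line:
                key, value = line[1:].split("=", 1)
                meta[key.strip()] = value.strip()
        return meta

    def make_input_deck(meta):
        """Build a minimal input deck from the parsed meta-data."""
        return "\n".join([
            f"material  {meta['explosive']}",
            f"density   {meta['density_g_cc']} g/cc",
            f"pressure  {meta['shock_pressure_GPa']} GPa",
            "run shock_initiation",
        ])

    datafile = [
        "# explosive = PBX-9502",
        "# density_g_cc = 1.890",
        "# shock_pressure_GPa = 12.0",
        "0.0  0.00",          # data rows (time, position) are passed over
        "0.5  0.13",
    ]
    deck = make_input_deck(read_header(datafile))
    ```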

  12. Verification of the karst flow model under laboratory controlled conditions

    Science.gov (United States)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They have an extremely complex structure, defined by a slow, laminar porous medium and small fissures, and usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure, which improves our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of knowledge of the heterogeneity (especially the conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. A special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs, and other supply equipment. The model is filled with fine sand (3-D porous matrix) and drainage plastic pipes (1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as the conduit locations and geometry. Moreover, we know the geometry of the conduit perforations, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge flow rates from both phases can be measured very accurately. These possibilities are not present at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  13. Quantum position verification in bounded-attack-frequency model

    Science.gov (United States)

    Gao, Fei; Liu, Bin; Wen, QiaoYan

    2016-11-01

    In 2011, Buhrman et al. proved that it is impossible to design an unconditionally secure quantum position verification (QPV) protocol if the adversaries are allowed to pre-share unlimited entanglement. Afterwards, people started to design secure QPV protocols in practical settings, e.g. the bounded-storage model, where the adversaries' pre-shared entangled resources are supposed to be limited. Here we focus on another practical factor: it is very difficult for the adversaries to perform attack operations with unlimitedly high frequency. Concretely, we present a new kind of QPV protocol, called non-simultaneous QPV. And we prove the security of a specific non-simultaneous QPV protocol under the assumption that the frequency of the adversaries' attack operations is bounded, but with no assumptions on their pre-shared entanglement or quantum storage. Actually, in our non-simultaneous protocol, the information of whether a signal arrives at the present time is itself a piece of the command. It renders the adversaries "blind", that is, they have to execute attack operations with unlimitedly high frequency no matter whether a signal arrives, which implies that the non-simultaneous QPV is also secure in the bounded-storage model.

  14. CFD Modeling & Verification in an Aircraft Paint Hangar

    Science.gov (United States)

    2011-05-01

    Slide fragments (recoverable content): Collaboration with the Navy Bureau of Medicine and Surgery (BUMED), IH Division, which assists CNO with the health and safety of Navy aircraft artisans through quarterly monitoring of exposure levels, including the handling of paint particulates and vapors. Verification pitfalls: artisans changed the process in the weeks between the baseline and verification measurements, e.g. a fabric blanket was added in front of the filter to save the filter bank from blocking exhaust airflow during sanding.

  15. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    Science.gov (United States)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes a comparison of the performance of the different optimized parameters and the DA framework, as well as an assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
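
    Two of the deterministic scores named above can be written in a few lines; the streamflow values below are synthetic stand-ins, not NFARB data:

    ```python
    import math

    # Deterministic verification scores in minimal form: root mean square
    # error and Nash-Sutcliffe efficiency (1 = perfect; <= 0 means the
    # simulation is no better than predicting the observed mean).

    def rmse(obs, sim):
        return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

    def nash_sutcliffe(obs, sim):
        mean_obs = sum(obs) / len(obs)
        sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
        ssv = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - sse / ssv

    obs = [10.0, 14.0, 30.0, 55.0, 42.0, 25.0]   # observed flows [m^3/s]
    sim = [12.0, 13.0, 28.0, 50.0, 45.0, 22.0]   # simulated flows [m^3/s]

    nse = nash_sutcliffe(obs, sim)   # close to 1 for this good hindcast
    err = rmse(obs, sim)             # typical error magnitude in m^3/s
    ```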

  16. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  17. Verification of Quantum Cryptography Protocols by Model Checking

    Directory of Open Access Journals (Sweden)

    Mohamed Elboukhari

    2010-10-01

    Full Text Available Unlike classical cryptography, which is based on mathematical functions, Quantum Cryptography or Quantum Key Distribution (QKD) exploits the laws of quantum physics to offer unconditionally secure communication. The progress of research in this field allows the anticipation that QKD will be available outside of laboratories within the next few years, and efforts are being made to improve the performance and reliability of the implemented technologies. But despite this big progress, several challenges remain. For example, the task of how to test QKD devices has not yet received enough attention. These apparatuses are becoming heterogeneous and complex, and so demand a big verification effort. In this paper we propose to study quantum cryptography protocols by applying the technique of probabilistic model checking. Using the PRISM tool, we analyze the security of the BB84 protocol, focusing on the specific security property of the eavesdropper's information gain on the key derived from the implementation of this protocol. We show that this property is affected by the parameters of the eavesdropper's power and the quantum channel.
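
    The textbook property behind this kind of analysis can be computed directly: under a full intercept-resend attack, each compared check bit flips with probability 1/4, so the eavesdropper escapes detection on n check bits with probability (3/4)^n. The sketch below is this standard calculation, not PRISM output:

    ```python
    # Back-of-the-envelope BB84 detection calculation. intercept_fraction
    # parameterizes the eavesdropper's "power" (how many qubits she touches),
    # echoing the parameter dependence the paper explores with model checking.

    def escape_probability(n_check_bits: int, intercept_fraction: float = 1.0) -> float:
        """Probability that an intercept-resend attacker evades all check bits."""
        per_bit_escape = 1.0 - 0.25 * intercept_fraction
        return per_bit_escape ** n_check_bits

    p20 = escape_probability(20)     # about 0.003: 20 check bits almost surely expose Eve
    p72 = escape_probability(72)     # about 1e-9
    ```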

  18. Adjusting for differential-verification bias in diagnostic-accuracy studies: a Bayesian approach.

    Science.gov (United States)

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Bossuyt, Patrick M M; Moons, Karel G M

    2011-03-01

    In studies of diagnostic accuracy, the performance of an index test is assessed by verifying its results against those of a reference standard. If verification of index-test results by the preferred reference standard can be performed only in a subset of subjects, an alternative reference test could be given to the remainder. The drawback of this so-called differential-verification design is that the second reference test is often of lesser quality, or defines the target condition in a different way. Incorrectly treating results of the 2 reference standards as equivalent will lead to differential-verification bias. The Bayesian methods presented in this paper use a single model to (1) acknowledge the different nature of the 2 reference standards and (2) make simultaneous inferences about the population prevalence and the sensitivity, specificity, and predictive values of the index test with respect to both reference tests, in relation to latent disease status. We illustrate this approach using data from a study on the accuracy of the elbow extension test for diagnosis of elbow fractures in patients with elbow injury, using either radiography or follow-up as reference standards.
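
    One ingredient of such a model can be shown in isolation: with a Beta(1, 1) prior, the posterior for the index test's sensitivity against a single reference standard is Beta(1 + TP, 1 + FN). The counts below are invented, and the paper's full model additionally ties the two reference standards together through latent disease status:

    ```python
    import random

    # Toy Bayesian update for one accuracy parameter. A conjugate Beta prior
    # plus binomial data gives a closed-form posterior; a Monte Carlo draw
    # from it yields a credible interval. Counts are illustrative only.

    tp, fn = 45, 5                      # index test vs. the preferred reference
    alpha, beta = 1 + tp, 1 + fn        # Beta(1,1) prior -> Beta(46, 6) posterior

    post_mean = alpha / (alpha + beta)  # posterior mean sensitivity

    random.seed(0)
    draws = sorted(random.betavariate(alpha, beta) for _ in range(10000))
    ci_low, ci_high = draws[250], draws[9749]    # central 95% credible interval
    ```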

  19. Linear models to perform treaty verification tasks for enhanced information security

    Science.gov (United States)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
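
    The Hotelling observer itself is compact: the template is w = S^{-1}(mu1 - mu0), and the test statistic is w·g for binned data g. The sketch below uses synthetic Gaussian data in place of the GEANT4-simulated list-mode measurements:

    ```python
    import numpy as np

    # Minimal Hotelling observer: optimal linear template for discriminating
    # two Gaussian classes with a shared, known covariance. Synthetic data.

    rng = np.random.default_rng(1)
    d = 16                                   # number of detector bins
    mu0 = np.zeros(d)                        # "not treaty accountable" mean
    mu1 = np.full(d, 0.5)                    # "treaty accountable" mean
    S = np.eye(d)                            # shared covariance (assumed known)

    w = np.linalg.solve(S, mu1 - mu0)        # Hotelling template

    g0 = rng.multivariate_normal(mu0, S, 2000)
    g1 = rng.multivariate_normal(mu1, S, 2000)
    t0, t1 = g0 @ w, g1 @ w                  # test statistics to be thresholded

    # Empirical AUC: probability a class-1 statistic exceeds a class-0 one
    auc = (t1[:, None] > t0[None, :]).mean()
    ```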

  20. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  1. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. 
    Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin, where the high-resolution bathymetric data collected by CRESP
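
    The backward-then-forward propagation described here rests on a Metropolis-type sampler. A bare-bones version targeting a toy 1-D Gaussian posterior (not the Amchitka model) looks like:

    ```python
    import math, random

    # Bare-bones Metropolis sampler: the posterior is prior x likelihood,
    # and accepted proposals converge to samples from it. Toy target only.

    def log_post(theta, data, sigma=1.0):
        lp = -0.5 * theta ** 2                          # standard normal prior
        lp += sum(-0.5 * ((x - theta) / sigma) ** 2 for x in data)
        return lp

    random.seed(0)
    data = [1.8, 2.1, 2.4, 1.9, 2.3]                    # "observations"
    theta, chain = 0.0, []
    for _ in range(20000):
        prop = theta + random.gauss(0.0, 0.5)           # symmetric random walk
        if math.log(random.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop                                # accept; else keep theta
        chain.append(theta)

    post_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
    # Conjugate check: posterior mean is n*ybar/(n+1) = 10.5/6 = 1.75
    ```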

  2. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

    In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters, and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to the appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered to be a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five different methods presented, one method, Gaussian mixture reduction as proposed by Runnalls, easily outperformed the others. This method of Gaussian reduction iteratively reduces the size of a GMM by successively merging pairs of component densities. Pairs are selected for merger by using a Kullback-Leibler-based metric. Using Runnalls' method of reduction, we
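
    The merge step can be sketched in one dimension: a moment-preserving combination of two weighted Gaussians, with Runnalls' KL-based upper bound as the cost for choosing which pair to merge. Multivariate GMM-UBM components work the same way, with covariance matrices in place of scalar variances:

    ```python
    import math

    # 1-D sketch of Gaussian mixture reduction: moment-preserving merge of
    # two weighted components, and a Runnalls-style KL upper-bound cost for
    # selecting the cheapest pair. Illustrative values, not a UBM.

    def merge(w1, m1, v1, w2, m2, v2):
        """Weight, mean, variance of the moment-preserving merged component."""
        w = w1 + w2
        m = (w1 * m1 + w2 * m2) / w
        v = (w1 * (v1 + m1 ** 2) + w2 * (v2 + m2 ** 2)) / w - m ** 2
        return w, m, v

    def merge_cost(w1, m1, v1, w2, m2, v2):
        """KL-based upper bound on the discrepancy incurred by merging."""
        w, _, v = merge(w1, m1, v1, w2, m2, v2)
        return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

    # Nearby components are cheap to merge; distant ones are expensive.
    near = merge_cost(0.5, 0.0, 1.0, 0.5, 0.1, 1.0)
    far = merge_cost(0.5, 0.0, 1.0, 0.5, 5.0, 1.0)
    ```

    Iterating this merge over the cheapest pair until the target size is reached yields the lower-resolution hash GMM described above.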

  3. SYSTEM-COGNITIVE MODEL OF FORECASTING THE DEVELOPMENT OF DIVERSIFIED AGRO-INDUSTRIAL CORPORATIONS. PART II. SYNTHESIS AND MODEL VERIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-11-01

    Full Text Available In this article, in accordance with the methodology of Automated System-Cognitive analysis (ASC-analysis), we examine the implementation of the 3rd stage of ASC-analysis: synthesis and verification of forecasting models of the development of diversified agro-industrial corporations. In this step, we synthesize and verify 3 statistical and 7 system-cognitive models: ABS – matrix of absolute frequencies; PRC1 and PRC2 – matrices of the conditional and unconditional distributions; INF1 and INF2 – private criterion: the amount of knowledge according to A. Kharkevich; INF3 – private criterion: the chi-square test, i.e. the difference between the actual and the theoretically expected absolute frequencies; INF4 and INF5 – private criterion: ROI (Return On Investment); INF6 and INF7 – private criterion: the difference between conditional and unconditional probabilities (coefficient of relationship). The reliability of the created models was assessed in accordance with a proposed metric similar to the known F-test, but one that does not require assumptions of normal distribution, linearity of the modeled object, or independence and additivity of the acting factors. The accuracy of the obtained models was high enough to address the subsequent problems of identification, forecasting and decision making, as well as studies of the modeled object through its model, scheduled for consideration in future articles

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  5. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for the verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods themselves pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  6. LithoScope: Simulation Based Mask Layout Verification with Physical Resist Model

    Science.gov (United States)

    Qian, Qi-De

    2002-12-01

    Simulation-based mask layout verification and optimization is a cost-effective way to ensure high mask performance in wafer lithography. Because mask layout verification serves as a gateway to the expensive manufacturing process, the model used for verification must have superior accuracy compared with models used upstream. In this paper, we demonstrate, for the first time, a software system for mask layout verification and optical proximity correction that employs a physical resist development model. The new system, LithoScope, predicts wafer patterning by solving optical and resist processing equations on a scale that was until recently considered impractical. Leveraging the predictive capability of the physical model, LithoScope can perform mask layout verification and optical proximity correction under a wide range of processing conditions and for any reticle enhancement technology without the need for multiple model development. We show the ability of the physical resist model to change iso-focal bias by optimizing resist parameters, which is critical for matching the experimental process window. We present line width variation statistics and chip-level process window predictions using a practical cell layout. We show that the LithoScope model can accurately describe resist-intensive poly gate layer patterning. This system can be used to pre-screen mask data problems before manufacturing to reduce the overall cost of the mask and the product.

  7. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Karen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garner, James R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Branney, Sean [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Todd, Lindsay C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nordquist, Heather [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stewart, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-31

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument

  8. Code and Solution Verification of 3D Numerical Modeling of Flow in the Gust Erosion Chamber

    Science.gov (United States)

    Yuen, A.; Bombardelli, F. A.

    2014-12-01

    Erosion microcosms are devices commonly used to investigate the erosion and transport characteristics of sediments at the bed of rivers, lakes, or estuaries. In order to understand the results these devices provide, the bed shear stress and flow field need to be accurately described. In this research, the UMCES Gust Erosion Microcosm System (U-GEMS) is numerically modeled using the finite volume method. The primary aims are to simulate the bed shear stress distribution at the surface of the sediment core/bottom of the microcosm, and to validate that the U-GEMS produces uniform bed shear stress at the bottom of the microcosm. The mathematical model equations are solved on a Cartesian non-uniform grid. Multiple numerical runs were developed with different input conditions and configurations. Prior to developing the U-GEMS model, the General Moving Objects (GMO) model and different momentum algorithms in the code were verified. Code verification of these solvers was done by simulating the flow inside a top-wall-driven square cavity on different mesh sizes to obtain the order of convergence. The GMO model was used to simulate the top wall in the driven square cavity as well as the rotating disk in the U-GEMS. Components simulated with the GMO model were rigid bodies that could have any type of motion. In addition, cross-verification was conducted: the results were compared with the numerical results of Ghia et al. (1982), and good agreement was found. Next, the CFD results were validated by simulating the flow within the conventional microcosm system without suction and injection; good agreement was found with the experimental results of Khalili et al. (2008). After the ability of the CFD solver was proved through the above code verification steps, the model was utilized to simulate the U-GEMS. The solution was verified via a classic mesh-convergence study on four consecutive mesh sizes; in addition, the Grid Convergence Index (GCI) was calculated and based on
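
The solution-verification procedure named in this record (mesh-convergence study plus Grid Convergence Index) follows standard three-mesh formulas; a minimal sketch, in which the refinement ratio, safety factor, and bed-shear-stress values are illustrative assumptions, not data from the study:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of convergence p from solutions on three
    systematically refined meshes with constant refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def gci(f_fine, f_medium, r, p, fs=1.25):
    """Grid Convergence Index on the fine mesh (Roache's formula).
    fs is the safety factor, conventionally 1.25 for three-mesh studies."""
    e = abs((f_medium - f_fine) / f_fine)   # relative error between meshes
    return fs * e / (r**p - 1.0)

# Hypothetical bed-shear-stress values (Pa) on coarse/medium/fine meshes:
f3, f2, f1 = 0.412, 0.405, 0.402
r = 2.0
p = observed_order(f3, f2, f1, r)
print(round(p, 3), round(100 * gci(f1, f2, r, p), 3))   # p and GCI in percent
```

As the mesh is refined, p should approach the formal order of the scheme; a small GCI indicates the fine-mesh solution is in the asymptotic range.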

  9. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  10. Vibratory response modeling and verification of a high precision optical positioning system.

    Energy Technology Data Exchange (ETDEWEB)

    Barraza, J.; Kuzay, T.; Royston, T. J.; Shu, D.

    1999-06-18

    A generic vibratory-response modeling program has been developed as a tool for designing high-precision optical positioning systems. Based on multibody dynamics theory, the system is modeled as rigid-body structures connected by linear elastic elements, such as complex actuators and bearings. The full dynamic properties of each element are determined experimentally or theoretically, then integrated into the program as inertial and stiffness matrices. Utilizing this program, the theoretical and experimental verification of the vibratory behavior of a double-multilayer monochromator support and positioning system is presented. Results of parametric design studies that investigate the influence of support floor dynamics and highlight important design issues are also presented. Overall, good matches between theory and experiment demonstrate the effectiveness of the program as a dynamic modeling tool.
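
The record's approach, assembling inertia and stiffness matrices for bodies joined by linear elastic elements and extracting vibratory behavior, can be illustrated with a toy two-body chain. The masses and stiffnesses below are invented for illustration and are not values from the monochromator system:

```python
import math

# Hypothetical two-stage support: stage masses (kg) and spring
# stiffnesses (N/m) for the chain  ground--k1--m1--k2--m2.
m1, m2 = 50.0, 20.0
k1, k2 = 2.0e6, 8.0e5

# Mass and stiffness matrices for  M x'' + K x = 0:
M = [[m1, 0.0], [0.0, m2]]
K = [[k1 + k2, -k2], [-k2, k2]]

# Natural frequencies from det(K - w^2 M) = 0, a quadratic in w^2:
# (m1*m2) w^4 - (m1*k2 + m2*(k1 + k2)) w^2 + k1*k2 = 0
a = m1 * m2
b = -(m1 * k2 + m2 * (k1 + k2))
c = k1 * k2
disc = math.sqrt(b * b - 4 * a * c)
w2 = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
freqs_hz = [math.sqrt(w) / (2 * math.pi) for w in w2]
print([round(f, 1) for f in freqs_hz])   # the two natural frequencies, Hz
```

A design study like the one described would sweep these matrix entries (e.g. floor stiffness) and track how the modal frequencies move.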

  11. Verification of a Probabilistic Model for A Distribution System with Integration of Dispersed Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte;

    2008-01-01

    In order to assess the present and predict the future distribution system performance using a probabilistic model, verification of the model is crucial. This paper illustrates the error caused by using traditional Monte Carlo (MC) based probabilistic load flow (PLF) when involving tap...... obtained from the developed probabilistic model....

  12. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
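
A minimal sketch of the kind of bit-for-bit regression check described above, with a diagnostic fallback that locates where differences occur. The function names and the JSON/SHA-256 mechanics are assumptions for illustration; the real toolkit operates on ice sheet model output files, not Python lists:

```python
import hashlib
import json

def bit_for_bit(test_values, ref_values):
    """True only if the two result sets are byte-identical
    (a stand-in for LIVVkit-style bit-for-bit evaluation)."""
    enc = lambda v: json.dumps(v, sort_keys=True).encode()
    return (hashlib.sha256(enc(test_values)).hexdigest()
            == hashlib.sha256(enc(ref_values)).hexdigest())

def max_abs_diff(test_values, ref_values):
    """Fallback diagnostic when bit-for-bit fails: the largest
    deviation, so a report can indicate where differences occur."""
    return max(abs(a - b) for a, b in zip(test_values, ref_values))

ref = [273.15, 271.80, 268.42]   # hypothetical reference output
new = [273.15, 271.80, 268.43]   # regression run with one perturbed value
print(bit_for_bit(ref, ref), bit_for_bit(new, ref), round(max_abs_diff(new, ref), 2))
```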

  13. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines.

  14. 3D MODELING FOR UNDERWATER ARCHAEOLOGICAL DOCUMENTATION: METRIC VERIFICATIONS

    Directory of Open Access Journals (Sweden)

    S. D’Amelio

    2015-04-01

    Surveying in an underwater environment has always presented considerable difficulties, both operative and technical, and this has sometimes made it difficult to apply the survey techniques commonly used for the documentation of Cultural Heritage in a dry environment. This study evaluates, in terms of capability and accuracy, the Autodesk 123D Catch software for the reconstruction of a three-dimensional model of an object in an underwater context. The subjects of the study are models generated from sets of photographs and from sets of frames extracted from video sequences. The study is based on a comparative method, using a reference model obtained with the laser scanner technique.

  15. On the need for data for the verification of service life models for frost damage

    DEFF Research Database (Denmark)

    Geiker, Mette Rica; Engelund, Sven

    1999-01-01

    The purpose of this paper is to draw attention to the need for verification of service life models for frost attack on concrete and the collection of relevant data. To illustrate the type of data needed, the paper presents models for internal freeze/thaw damage (internal cracking including...

  16. Verification, Validation & Accreditation of Legacy Simulations using the Business Process Modeling Notation

    NARCIS (Netherlands)

    Gianoulis, C.; Roza, M.; Kabilan, V.

    2008-01-01

    Verification, Validation and Accreditation (VV&A) is an important part of the Modeling and Simulation domain. This paper focuses on legacy simulations and examines two VV&A approaches coming from different defence communities. We use the Business Process Modeling Notation (BPMN) to describe both approaches

  17. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate them through the model, so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification

  18. Gaia challenging performances verification: combination of spacecraft models and test results

    Science.gov (United States)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. As far as possible, performances are verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists that can reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping testing feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module thermal vacuum test to performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  19. A solenoid-based active hydraulic engine mount: modelling, analysis, and verification

    OpenAIRE

    Hosseini, Ali

    2010-01-01

    The focus of this thesis is the design, modelling, identification, simulation, and experimental verification of a low-cost solenoid-based active hydraulic engine mount. To build an active engine mount, a commercial on-off solenoid is modified to be used as an actuator and is embedded inside a hydraulic engine mount. The hydraulic engine mount is modelled and tested, the solenoid actuator is modelled and identified, and finally the models are integrated to obtain the analytical model of the...

  20. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to surface wave heights that are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to surface wave heights that are small compared to the depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  1. Development and verification of a screening model for surface spreading of petroleum

    Science.gov (United States)

    Hussein, Maged; Jin, Minghui; Weaver, James W.

    2002-08-01

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transport of these chemicals is required for environmental risk assessment and for remedial measure design. The present paper discusses the formulation and application of the Oil Surface Flow Screening Model (OILSFSM) for predicting the surface flow of oil by taking into account infiltration and evaporation. Surface flow is simulated using a semi-analytical model based on the lubrication theory approximation of viscous flow. Infiltration is simulated using a version of the Green and Ampt infiltration model, which is modified to account for oil properties. Evaporation of volatile compounds is simulated using a compositional model that accounts for the changes in the fraction of each compound in the spilled oil. The coupling between surface flow, infiltration and evaporation is achieved by incorporating the infiltration and evaporation fluxes into the global continuity equation of the spilled oil. The model was verified against numerical models for infiltration and analytical models for surface flow. The verification study demonstrates the applicability of the model.
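
The Green and Ampt component mentioned above can be sketched with the classic water-based relation, solved by fixed-point iteration; the soil parameters below are illustrative textbook-style values, and the paper's oil-property modifications are not reproduced:

```python
import math

def green_ampt_F(K, psi, dtheta, t, tol=1e-10):
    """Cumulative infiltration depth F (cm) at time t (h) from the
    classic Green-Ampt relation  K*t = F - psi*dtheta*ln(1 + F/(psi*dtheta)),
    solved by fixed-point iteration. Water-based form only; the paper's
    oil-property modifications are not reproduced here."""
    pd = psi * dtheta
    F = K * t if K * t > 0 else tol   # initial guess
    while True:
        F_new = K * t + pd * math.log(1.0 + F / pd)
        if abs(F_new - F) < tol:
            return F_new
        F = F_new

# Illustrative parameters: K = 0.65 cm/h, wetting-front suction
# psi = 16.7 cm, moisture deficit dtheta = 0.34, after t = 1 h.
F1 = green_ampt_F(0.65, 16.7, 0.34, 1.0)
f1 = 0.65 * (1.0 + 16.7 * 0.34 / F1)   # infiltration rate f = K*(1 + psi*dtheta/F)
print(round(F1, 3), round(f1, 3))
```

The fixed-point map converges here because its derivative, psi*dtheta/(psi*dtheta + F), is below one; stiff parameter sets would call for a Newton solve instead.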

  2. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    Science.gov (United States)

    Hillen, F.; Höfle, B.; Ehlers, M.; Reinartz, P.

    2014-02-01

    In this paper, the potential of smartphone sensor data for the verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context, we study the potential of the agent-based modelling technique for the verification of people trajectories.

  3. A New Speaker Verification Method with Global Speaker Model and Likelihood Score Normalization

    Institute of Scientific and Technical Information of China (English)

    张怡颖; 朱小燕; 张钹

    2000-01-01

    In this paper, a new text-independent speaker verification method, GSMSV, is proposed based on likelihood score normalization. In this novel method, a global speaker model is established to represent the universal features of speech and to normalize the likelihood score. Statistical analysis demonstrates that this normalization can remove common factors of speech and bring the differences between speakers into prominence. As a result, the equal error rate is decreased significantly, the verification procedure is accelerated, and the system's adaptability to speaking speed is improved.
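
The normalization idea in this record, subtracting the global-speaker-model log-likelihood from the claimed-speaker log-likelihood so that common speech factors cancel, can be sketched with 1-D Gaussians. Real systems model cepstral features with Gaussian mixtures, so everything below is a toy illustration with invented parameters:

```python
import math

def gauss_loglik(xs, mu, sigma):
    """Total log-likelihood of samples xs under a 1-D Gaussian."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu)**2 / (2 * sigma**2) for x in xs)

def normalized_score(xs, spk, glob):
    """Likelihood score normalized by a global speaker model:
    log L(X|speaker) - log L(X|global). Models are (mu, sigma) tuples."""
    return gauss_loglik(xs, *spk) - gauss_loglik(xs, *glob)

claimed = (1.0, 0.5)     # hypothetical target-speaker model
universal = (0.0, 1.0)   # hypothetical global speaker model
genuine = [0.9, 1.1, 1.2, 0.8]
impostor = [-0.4, 0.2, -1.1, 0.5]

s_gen = normalized_score(genuine, claimed, universal)
s_imp = normalized_score(impostor, claimed, universal)
print(s_gen > 0.0 > s_imp)   # accept genuine, reject impostor at threshold 0
```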

  4. How to test maximal oxygen uptake: a study on timing and testing procedure of a supramaximal verification test.

    Science.gov (United States)

    Scharhag-Rosenberger, Friederike; Carlsohn, Anja; Cassel, Michael; Mayer, Frank; Scharhag, Jürgen

    2011-02-01

    Verification tests are becoming increasingly common for confirming maximal oxygen uptake (VO2 max) attainment. Yet, timing and testing procedures vary between working groups. The aims of this study were to investigate whether verification tests can be performed after an incremental test or should be performed on a separate day, and whether VO2 max can still be determined within the first testing session in subjects not satisfying the verification criterion. Forty subjects (age, 24 ± 4 years; VO2 max, 50 ± 7 mL·min-1·kg-1) performed a maximal incremental treadmill test and, 10 min afterwards, a verification test (VerifDay1) at 110% of maximal velocity (vmax). The verification criterion was a VerifDay1 peak oxygen uptake (VO2 peak) ≤5.5% higher than the incremental test value. Subjects not achieving the verification criterion performed another verification test at 115% vmax (VerifDay1') 10 min later, trying to confirm the VerifDay1 VO2 peak as VO2 max. All other subjects repeated VerifDay1 on a separate day (VerifDay2). Of the 40 subjects, 6 did not satisfy the verification criterion. In 4 of them, attainment of VO2 max was confirmed by VerifDay1'. VO2 peak was equivalent between VerifDay1 and VerifDay2 (3722 ± 991 mL·min-1 vs. 3752 ± 995 mL·min-1, p = 0.56), whereas time to exhaustion was significantly longer in VerifDay2 (2:06 ± 0:22 min:s vs. 2:42 ± 0:38 min:s). The verification test VO2 peak does not seem to be affected by a preceding maximal incremental test. Incremental and verification tests can therefore be performed within the same testing session. In individuals not achieving the verification criterion, VO2 max can be determined by means of a subsequent, more intense verification test in most but not all cases.
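
The study's verification criterion (verification-test VO2 peak no more than 5.5% above the incremental-test value) reduces to a one-line check; a sketch with hypothetical VO2 peak values, not study data:

```python
def verification_passed(vo2peak_incremental, vo2peak_verification, threshold=0.055):
    """Criterion from the study: the verification-test VO2 peak must not
    exceed the incremental-test value by more than 5.5%."""
    return vo2peak_verification <= vo2peak_incremental * (1.0 + threshold)

# Hypothetical subjects (VO2 peak in mL/min):
print(verification_passed(3700, 3750))   # +1.4%: VO2 max confirmed
print(verification_passed(3700, 3950))   # +6.8%: retest, e.g. at 115% vmax
```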

  5. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-20

    processes. These models were built on a numerical framework for solving conservation law problems in one-dimensional geometries such as spheres, cylinders, and lines. Coupled with the framework are specific models for adsorption in commercial adsorbents, such as zeolites and mordenites. Utilizing this modeling approach, the authors were able to accurately describe and predict adsorption kinetic data obtained from experiments at a variety of different temperatures and gas phase concentrations. A demonstration of how these models, and framework, can be used to simulate adsorption in fixed- bed columns is provided. The CO2 absorption work involved modeling with supportive experimental information. A dynamic model was developed to simulate CO2 absorption using high alkaline content water solutions. The model is based upon transient mass and energy balances for chemical species commonly present in CO2 absorption. A computer code was developed to implement CO2 absorption with a chemical reaction model. Experiments were conducted in a laboratory scale column to determine the model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. Continuing work could employ the model to control column operation and predict the absorption behavior under various input conditions and other prescribed experimental perturbations. The value of the validated models and numerical frameworks developed in this project is that they can be used to predict the sorption behavior of off-gas evolved during the reprocessing of nuclear waste and thus reduce the cost of the experiments. They can also be used to design sorption processes based on concentration limits and flow-rates determined at the plant level.
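
The transient mass-balance modeling described for CO2 absorption can be sketched as a minimal well-mixed alkaline stage with explicit Euler time stepping. The rate constant, solubility, and fast-reaction simplification (dissolved CO2 held near zero while hydroxide remains) are assumptions for illustration, not the project's model:

```python
# CO2 + 2 OH- -> CO3^2- + H2O; absorption flux N = kLa * c_sat while
# hydroxide remains. All parameter values are illustrative assumptions.
kLa = 0.02          # volumetric mass-transfer coefficient, 1/s
c_sat = 0.03        # interfacial CO2 solubility, mol/L
oh = 0.5            # initial hydroxide concentration, mol/L
dt, t_end = 1.0, 600.0

absorbed = 0.0      # cumulative CO2 taken up, mol/L
t = 0.0
while t < t_end and oh > 0.0:
    n = kLa * c_sat              # mol CO2 absorbed per litre per second
    n = min(n, oh / (2 * dt))    # cannot consume more OH- than remains
    oh -= 2 * n * dt             # stoichiometric hydroxide depletion
    absorbed += n * dt
    t += dt
print(round(absorbed, 4), round(oh, 4))
```

Stoichiometry provides a built-in check: total CO2 absorbed at hydroxide exhaustion must equal half the initial hydroxide.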

  6. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Trough an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California institute of Technology) where the research is being carried out.

  8. The Parametric Model for PLC Reference Channels and its Verification in Real PLC Environment

    OpenAIRE

    2008-01-01

    For the expansion of PLC systems, it is necessary to have detailed knowledge of the PLC transmission channel properties. This contribution briefly discusses the characteristics of the PLC environment and a classification of PLC transmission channels. The main part is focused on the parametric model for PLC reference channels and its verification in the real PLC environment by means of experimental measurements.

  9. Towards a Generic Information Data Model for Verification, Validation & Accreditation VV&A

    NARCIS (Netherlands)

    Roza, Z.C.; Voogd, J.M.; Giannoulis, C.

    2008-01-01

    The Generic Methodology for Verification, Validation and Acceptance (GM-VV) is intended to provide a common generic framework for making formal and well-balanced acceptance decisions on a specific usage of models, simulations and data. GM-VV will provide the international M&S community with a Verifica

  10. A Verification and Analysis of the USAF/DoD Fatigue Model and Fatigue Management Technology

    Science.gov (United States)

    2005-11-01

    We Nap: Evolution, Chronobiology, and Functions of Polyphasic and Ultrashort Sleep. Stampi, C. (ed.), Birkhäuser, Boston. Defense Acquisition...Windows® software application of the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) applied model. The application, the Fatigue Avoidance...Scheduling Tool (FAST™), was re-engineered as a clone from the SAFTE specification. The verification considered nine sleep/wake schedules that were

  11. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  12. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  13. Verification of a Quality Management Theory: Using a Delphi Study

    OpenAIRE

    Ali Mohammad Mosadeghrad

    2013-01-01

    Background: A model of quality management called the Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from ...

  14. Image Smearing Modeling and Verification for Strapdown Star Sensor

    Institute of Scientific and Technical Information of China (English)

    WANG Haiyong; ZHOU Wenrui; CHENG Xuan; LIN Haoyu

    2012-01-01

    To extend the study of celestial attitude determination with a strapdown star sensor from the static into the dynamic field, one prerequisite is to generate precise dynamic simulated star maps. First, a neat analytical solution of the smearing trajectory caused by spacecraft attitude maneuver is deduced, whose parameters cover the geometric size of the optics, the three-axis angular velocities, and the CCD integration time. Then, for the first time, the mathematical law and method are established for synthesizing the smearing-trajectory formulae with the static Gaussian distribution function (GDF) model; the key is a line integral of the static GDF, attenuated by a factor 1/Ls (Ls is the arc length of the smearing trajectory), taken along the smearing trajectory. The dynamic smearing model is thus obtained, also in analytical form. After that, three sets of typical simulated maps and data are generated from this dynamic model, manifesting the expected smearing effects and remaining compatible with the linear model as its special case of no boresight rotation. Finally, model validity tests on a rate turntable are carried out, which yield a mean correlation coefficient of 0.9200 between the camera images and the corresponding model-simulated images with the same parameters. This similarity verifies the validity of the dynamic smearing model. After parameter calibration, the model can serve as a front-end loop of the ground semi-physical simulation system for celestial attitude determination with a strapdown star sensor.
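
The core of the dynamic smearing model, a line integral of the static GDF along the smearing trajectory attenuated by 1/Ls, can be approximated numerically. The trajectory and parameters below are illustrative, and piecewise-linear midpoint quadrature stands in for the paper's analytical form:

```python
import math

def static_gdf(dx, dy, sigma=1.0):
    """Static Gaussian distribution function (GDF) of a star spot."""
    return math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma)) / (2 * math.pi * sigma * sigma)

def smeared_gdf(px, py, trajectory, sigma=1.0):
    """Dynamic (smeared) intensity at pixel (px, py): the line integral
    of the static GDF along the smearing trajectory, divided by the
    trajectory arc length Ls. The trajectory is a polyline of (x, y)
    points; midpoint quadrature approximates each segment's integral."""
    total, Ls = 0.0, 0.0
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        dl = math.hypot(x1 - x0, y1 - y0)
        xm, ym = 0.5 * (x0 + x1), 0.5 * (y0 + y1)
        total += static_gdf(px - xm, py - ym, sigma) * dl
        Ls += dl
    return total / Ls

# Straight smear of length 4 pixels along x, sampled finely:
n = 400
traj = [(4.0 * i / n, 0.0) for i in range(n + 1)]
centre = smeared_gdf(2.0, 0.0, traj)   # pixel at mid-trajectory
tail = smeared_gdf(6.0, 0.0, traj)     # pixel beyond the trajectory end
print(centre > tail)                   # intensity concentrates along the smear
```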

  15. Sorption Modeling and Verification for Off-Gas Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Tavlarides, Lawrence L. [Syracuse Univ., NY (United States); Lin, Ronghong [Syracuse Univ., NY (United States); Nan, Yue [Syracuse Univ., NY (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Georgia Inst. of Technology, Atlanta, GA (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Sharma, Ketki [Georgia Inst. of Technology, Atlanta, GA (United States); Gabitto, Jorge [Prairie View A & M Univ., Prairie View, TX (United States); DePaoli, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-04-29

    uptake data. Two parallel approaches have been explored for integrating the kernels described above into a mass-transport model for adsorption in fixed beds. In one, the GSTA isotherm kernel has been incorporated into the MOOSE framework; in the other approach, a focused finite-difference framework and PDE kernels have been developed. Issues, including oscillatory behavior in MOOSE solutions to advection-diffusion problems, and opportunities have been identified for each approach, and a path forward has been identified toward developing a stronger modeling platform. Experimental systems were established for collection of microscopic kinetics and equilibria data for single and multicomponent uptake of gaseous species on solid sorbents. The systems, which can operate at ambient temperature to 250°C and dew points from -69 to 17°C, are useful for collecting data needed for modeling performance of sorbents of interest. Experiments were conducted to determine applicable models and parameters for isotherms and mass transfer for water and/or iodine adsorption on MS3A. Validation experiments were also conducted for water adsorption on fixed beds of MS3A. For absorption, work involved modeling with supportive experimentation. A dynamic model was developed to simulate CO2 absorption with chemical reaction using high alkaline content water solutions. A computer code was developed to implement the model based upon transient mass and energy balances. Experiments were conducted in a laboratory-scale column to determine model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. This project has resulted in 7 publications, with 3 manuscripts in preparation. Also, 15 presentations were given at national meetings of ANS and AIChE and at Material Recovery and Waste Forms Campaign Working Group meetings.

  16. Dynamic grey model of verification cycle and lifecycle of measuring instrument and its application

    Institute of Scientific and Technical Information of China (English)

    SU Hai-tao; YANG Shi-yuan; DONG Hua; SHEN Mao-hu

    2005-01-01

    Two dynamic grey models, DGM(1,1), for the verification cycle and the lifecycle of a measuring instrument, based on a time sequence and a frequency sequence respectively, were set up according to the statistical features of examination data and a weighting method. Through a specific case, the vernier caliper, it is shown that the fit and forecast precision of the models is high, that the cycles differ markedly under different working conditions, and that the forecast of the frequency-sequence model is better than that of the time-sequence model. Combining the dynamic grey model with an auto-manufacturing case, the controlling and information subsystems for the verification cycle and the lifecycle, based on information integration, multi-sensor control and management control, are given. The models can be used in the production process to help enterprises reduce errors, costs and flaws.
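For context, the DGM(1,1) models above build on the classical GM(1,1) grey model, which fits an exponential trend to a short data sequence via the accumulated generating operation (AGO). A minimal sketch of GM(1,1) fitting and forecasting (not the authors' code; variable names are illustrative):

```python
import numpy as np

def gm11_fit(x0):
    """Estimate GM(1,1) development coefficient a and grey input b."""
    x1 = np.cumsum(x0)                      # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])           # background values (mean of AGO)
    B = np.column_stack((-z1, np.ones(len(z1))))
    Y = x0[1:]                              # grey equation: x0(k) + a*z1(k) = b
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]
    return a, b

def gm11_forecast(x0, a, b, k):
    """Time-response function, then inverse AGO to recover the series."""
    x1_hat = (x0[0] - b / a) * np.exp(-a * np.arange(k)) + b / a
    return np.append(x1_hat[0], np.diff(x1_hat))
```

On a near-exponential sequence (such as verification-cycle lengths drifting under constant wear), the fitted values track the data closely, which is the sense in which the abstract reports high fit and forecast precision.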

  17. Range verification methods in particle therapy: underlying physics and Monte Carlo modelling

    Directory of Open Access Journals (Sweden)

    Aafke Christine Kraan

    2015-07-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as a function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including beta+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modelling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modelling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  18. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β (+) emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  19. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  20. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-02-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  1. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    Institute of Scientific and Technical Information of China (English)

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for activity-diagram-based business process modeling, a formal definition of activity diagrams is introduced, and the basic requirements for activity-diagram-based business process models are proposed. Furthermore, the standardized transformation technique between business process models and basic Petri nets is presented, and the analysis method for the soundness and well-structuredness properties of business processes is introduced.
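Once a business process model has been transformed into a Petri net, properties such as soundness can be checked by exploring the net's reachability graph: every reachable marking must be able to reach the final marking, and no unintended deadlocks may exist. A toy sketch of reachability-graph exploration for a bounded net (illustrative only; the paper's actual transformation and analysis are not shown):

```python
from collections import deque

def reachable_markings(m0, transitions):
    """BFS over the reachability graph of a bounded Petri net.
    A marking is a tuple of token counts over a fixed place list;
    each transition is a (pre, post) pair of vectors over the same places."""
    seen = {m0}
    queue = deque([m0])
    while queue:
        m = queue.popleft()
        for pre, post in transitions:
            if all(t >= p for t, p in zip(m, pre)):        # enabled?
                m2 = tuple(t - p + q for t, p, q in zip(m, pre, post))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

def deadlocks(markings, transitions):
    """Markings in which no transition is enabled."""
    return {m for m in markings
            if not any(all(t >= p for t, p in zip(m, pre))
                       for pre, _ in transitions)}
```

For a linear process over places (start, running, done) with transitions start→running and running→done, the only deadlock is the intended final marking, which is what a soundness check looks for.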

  3. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    Energy Technology Data Exchange (ETDEWEB)

    Hautamaeki, J.; Tiitta, A. [VTT Chemical Technology, Espoo (Finland)

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as smooth as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g. criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  4. Ecological dynamic model of grassland and its practical verification

    Institute of Scientific and Technical Information of China (English)

    ZENG; Xiaodong

    2005-01-01

    Based on physico-biophysical considerations, mathematical analysis and some approximate formulations generally adopted in meteorology and ecology, an ecological dynamic model of grassland is developed. The model consists of three interactive variables, i.e. the biomass of living grass, the biomass of wilted grass, and the soil wetness. The major biophysical processes are represented by parameterization formulas, and the model parameters can be determined inversely from observational climatological and ecological data. Some major parameters are adjusted by this method to fit the (incomplete) data from the Inner Mongolia grassland, and other secondary parameters are estimated through sensitivity studies. The model results agree well with reality, e.g., (i) the maintenance of grassland requires a minimum amount of annual precipitation (approximately 300 mm); (ii) there is a significant relationship between the annual precipitation and the biomass of living grass; and (iii) overgrazing will eventually result in desertification. A specific emphasis is put on the shading effect of the wilted grass accumulated on the soil surface: it effectively reduces the soil surface temperature and the evaporation, and hence benefits the maintenance of grassland and the reduction of water loss from the soil.

  5. Computational reverse shoulder prosthesis model: Experimental data and verification.

    Science.gov (United States)

    Martins, A; Quental, C; Folgado, J; Ambrósio, J; Monteiro, J; Sarmento, M

    2015-09-18

    The reverse shoulder prosthesis aims to restore the stability and function of pathological shoulders, but the biomechanical aspects of the geometrical changes induced by the implant are yet to be fully understood. Considering a large-scale musculoskeletal model of the upper limb, the aim of this study is to evaluate how the Delta reverse shoulder prosthesis influences the biomechanical behavior of the shoulder joint. In this study, the kinematic data of an unloaded abduction in the frontal plane and an unloaded forward flexion in the sagittal plane were experimentally acquired through video-imaging for a control group, composed of 10 healthy shoulders, and a reverse shoulder group, composed of 3 reverse shoulders. Synchronously, the EMG data of 7 superficial muscles were also collected. The muscle force sharing problem was solved through the minimization of the metabolic energy consumption. The evaluation of the shoulder kinematics shows an increase in the lateral rotation of the scapula in the reverse shoulder group, and an increase in the contribution of the scapulothoracic joint to the shoulder joint. Regarding the muscle force sharing problem, the musculoskeletal model estimates an increased activity of the deltoid, teres minor, clavicular fibers of the pectoralis major, and coracobrachialis muscles in the reverse shoulder group. The comparison between the muscle forces predicted and the EMG data acquired revealed a good correlation, which provides further confidence in the model. Overall, the shoulder joint reaction force was lower in the reverse shoulder group than in the control group.

  6. ENSO Forecasts in the North American Multi-Model Ensemble: Composite Analysis and Verification

    Science.gov (United States)

    Chen, L. C.

    2015-12-01

    In this study, we examine precipitation and temperature forecasts during El Nino/Southern Oscillation (ENSO) events in six models in the North American Multi-Model Ensemble (NMME), including the CFSv2, CanCM3, CanCM4, FLOR, GEOS5, and CCSM4 models, by comparing the model-based ENSO composites to the observed. The composite analysis is conducted using the 1982-2010 hindcasts for each of the six models with selected ENSO episodes based on the seasonal Ocean Nino Index (ONI) just prior to the date the forecasts were initiated. Two sets of composites are constructed over the North American continent: one based on precipitation and temperature anomalies, the other based on their probability of occurrence in a tercile-based system. The composites apply to monthly mean conditions in November, December, January, February, and March, respectively, as well as to the five-month aggregates representing the winter conditions. For the anomaly composites, we use the anomaly correlation coefficient and root-mean-square error against the observed composites for evaluation. For the probability composites, unlike conventional probabilistic forecast verification assuming binary outcomes to the observations, both model and observed composites are expressed in probability terms. Performance metrics for such validation are limited. Therefore, we develop a probability anomaly correlation measure and a probability score for assessment, so the results are comparable to the anomaly composite evaluation. We found that all NMME models predict ENSO precipitation patterns well during wintertime; however, some models have large discrepancies between the model temperature composites and the observed. The skill is higher for the multi-model ensemble, as well as the five-month aggregates. Comparing to the anomaly composites, the probability composites have superior skill in predicting ENSO temperature patterns and are less sensitive to the sample used to construct the composites, suggesting that
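The anomaly composites are scored against observations with the anomaly correlation coefficient and root-mean-square error. These standard verification measures can be sketched as follows (a generic sketch, not the study's code; the study's probability anomaly correlation and probability score are not reproduced here):

```python
import numpy as np

def anomaly_correlation(model_anom, obs_anom):
    """Centered pattern correlation between model and observed anomaly fields."""
    m = model_anom - model_anom.mean()
    o = obs_anom - obs_anom.mean()
    return float((m * o).sum() / np.sqrt((m**2).sum() * (o**2).sum()))

def rmse(model_anom, obs_anom):
    """Root-mean-square error over the grid."""
    return float(np.sqrt(np.mean((model_anom - obs_anom) ** 2)))
```

A model composite identical to the observed composite scores an anomaly correlation of 1 and an RMSE of 0; a spatially inverted pattern scores -1, which is how discrepancies between model and observed temperature composites would show up.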

  7. Modelling and Verification of Multiple UAV Mission Using SMV

    CERN Document Server

    Sirigineedi, Gopinadh; White, Brian A; Zbikowski, Rafal

    2010-01-01

    Model checking has been used to verify the correctness of digital circuits, security protocols and communication protocols, since these can be modelled by means of a finite state transition model. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the model checking.
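Checking CTL properties over a Kripke model reduces to fixpoint computations on the transition relation. A toy sketch of the EF (reachability) and AG (invariance) operators, independent of SMV (all names and structures are illustrative):

```python
def check_EF(states, trans, phi):
    """EF phi: least fixpoint -- states from which some path reaches phi."""
    sat = set(phi)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s not in sat and trans.get(s, set()) & sat:
                sat.add(s)                       # some successor satisfies EF phi
                changed = True
    return sat

def check_AG(states, trans, phi):
    """AG phi: greatest fixpoint -- phi holds along every path."""
    sat = set(phi)
    changed = True
    while changed:
        changed = False
        for s in list(sat):
            if not trans.get(s, set()) <= sat:   # a successor escapes phi
                sat.discard(s)
                changed = True
    return sat
```

For a mission model, EF would express "the UAV can eventually reach the search area" and AG would express safety invariants such as "the UAVs never occupy the same waypoint"; SMV performs the analogous computations symbolically.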

  8. Modelling and Verification of Multiple UAV Mission Using SMV

    Directory of Open Access Journals (Sweden)

    Gopinadh Sirigineedi

    2010-03-01

    Model checking has been used to verify the correctness of digital circuits, security protocols and communication protocols, since these can be modelled by means of a finite state transition model. However, modelling the behaviour of hybrid systems like UAVs in a Kripke model is challenging. This work is aimed at capturing the behaviour of a UAV performing a cooperative search mission in a Kripke model, so as to verify it against temporal properties expressed in Computation Tree Logic (CTL). The SMV model checker is used for the model checking.

  9. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED, together with biomechanics models, for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  10. Modeling and verification of hemispherical solar still using ANSYS CFD

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [KSV University, Gujarat Power Engineering and Research Institute, Mehsana (India); Shah, P.K. [Silver Oak College of Engineering and Technology, Ahmedabad, Gujarat (India)

    2013-07-01

    In every efficient solar still design, the water temperature, vapor temperature, distillate output, and the difference between the water temperature and the inner glass cover temperature are very important. Here, a two-dimensional, three-phase model of a hemispherical solar still is built in ANSYS CFD for both the evaporation and the condensation process. Simulation results such as water temperature, vapor temperature and distillate output are compared with actual experimental results for a hemispherical solar still under the climate conditions of Mehsana (latitude 23° 59′, longitude 72° 38′). The water temperature and distillate output were in good agreement with the experimental results. The study shows that ANSYS CFD is a powerful and efficient tool for the design and comparison of hemispherical solar stills.

  11. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach; the formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is the standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.

  12. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    Science.gov (United States)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS® finite element model, a verified MSC.Nastran™ model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features for the two structural analysis tools to produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran™ model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran™ single-bay model to Abaqus™ is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  13. Verification of five pharmacogenomics-based warfarin administration models

    Directory of Open Access Journals (Sweden)

    Meiqin Lin

    2016-01-01

    Conclusions: Since none of the models ranked high for all the three criteria considered, the impact of various factors should be thoroughly considered before selecting the most appropriate model for the region's population.

  14. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    Energy Technology Data Exchange (ETDEWEB)

    Betsill, J.D. [Sandia National Labs., Albuquerque, NM (United States); Gruebel, R.D. [Tech Reps., Inc., Albuquerque, NM (United States)

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible-systems approach, reinforcing the need to utilize compatible components in user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search covered present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums.

  15. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a d...... could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements....

  16. A verification strategy for web services composition using enhanced stacked automata model.

    Science.gov (United States)

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent only limited business functionality; by composing individual services from different providers, however, a composite service describing the complete business process of an enterprise can be built. Many standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services: formal verification methods are needed to ensure the correctness of composed services. A few research works in the literature have addressed verification of web services for deterministic systems, but the existing models do not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated in terms of dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results show better performance in finding dead transitions and deadlock in contrast to the
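The deadlock and dead-transition checks that the record delegates to SPIN can be illustrated with a minimal explicit-state search. This is a toy labelled transition system, not the paper's ESAM or Promela encoding; the workflow states and labels are invented.

```python
from collections import deque

# Toy illustration: explicit-state reachability over a labelled transition
# system, flagging deadlock states (no enabled move from a non-final state)
# and dead transitions (edges that can never fire because their source
# state is unreachable) -- two of the properties the record checks.
def analyse(initial, transitions, finals):
    # transitions: dict state -> list of (label, next_state)
    seen, frontier = {initial}, deque([initial])
    fired, deadlocks = set(), set()
    while frontier:
        s = frontier.popleft()
        moves = transitions.get(s, [])
        if not moves and s not in finals:
            deadlocks.add(s)
        for label, t in moves:
            fired.add((s, label, t))
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    all_edges = {(s, l, t) for s, ms in transitions.items() for l, t in ms}
    dead_transitions = all_edges - fired   # edges out of unreachable states
    return seen, deadlocks, dead_transitions

# Hypothetical composed-service workflow with an unreachable branch.
lts = {
    "start": [("invoke", "wait")],
    "wait": [("reply", "done"), ("fault", "stuck")],
    "orphan": [("invoke", "done")],   # unreachable -> its edge is dead
}
reachable, deadlocks, dead = analyse("start", lts, finals={"done"})
```

Here "stuck" is reported as a deadlock (no outgoing move, not a final state), and the edge out of the unreachable "orphan" state is reported as a dead transition.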

  17. Virtual reality verification of workplace design guidelines: a follow-up study

    Energy Technology Data Exchange (ETDEWEB)

    Nystad, Espen; Helgar, Stein; Droeivoldsmo, Asgeir

    2002-08-15

    Early identification of potential human factors guideline violations and corrective input into the design process are desired for efficient and cost-effective control room design. Virtual reality (VR) technology makes it possible to perform evaluation of the design of the control room at an early stage of the design process, but can we trust the results from such evaluations? This report describes the second phase of an experimental validation of a VR model with virtual verification tools against the real world in a guideline verification task. Results from the previous phase indicated that guideline verification in the VR model could be done with satisfactory accuracy for a number of evaluations. However, some guideline categories required further development of measurement tools and use of a model with higher resolution. To correct the shortcomings and utilize the knowledge from the previous experiment, a number of new features were implemented into the virtual environment used for testing. Results from the test indicate that the detailed modelling of operating panels was helpful. The addition of tools for measuring font sizes and direct illumination also proved useful. However, further refinement is needed in the usability of some of the tools. It is recommended to continue the fine-tuning of these tools, and that a test is made with a physical mockup in a real refurbishment project to get a more authentic test of the virtual verification tools.

  18. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using the decision procedure which...... is based on the Stålmarck algorithm. While some requirements are easily proved, others are virtually impossible to manage due to a very large potential state space. We present what has been done in order to get, at least, an idea of whether or not such difficult requirements are fulfilled, and we...... express thoughts on what is needed in order to be able to successfully verify large real-life systems....

  19. Numerical Modelling of Wind Waves. Problems, Solutions, Verifications, and Applications

    CERN Document Server

    Polnikov, Vladislav

    2011-01-01

    The time-space evolution of the field is described by the transport equation for the 2-dimensional wave energy spectrum density, S(x,t), spread in the space, x, and time, t. This equation has the forcing named the source function, F, depending on both the wave spectrum, S, and the external wave-making factors: local wind, W(x, t), and local current, U(x, t). The source function contains certain physical mechanisms responsible for a wave spectrum evolution. It is used to distinguish three terms in function F: the wind-wave energy exchange mechanism, In; the energy conservative mechanism of nonlinear wave-wave interactions, Nl; and the wave energy loss mechanism, Dis. Differences in mathematical representation of the source function terms determine general differences between wave models. The problem is to derive analytical representations for the source function terms said above from the fundamental wave equations. Basing on publications of numerous authors and on the last two decades studies of the author, th...
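In standard wind-wave modelling notation, the transport equation summarized in the record has the form below; the group-velocity advection term with \(\mathbf{C}_g\) is the conventional deep-water form and is added here for completeness, not quoted from the record.

```latex
\frac{\partial S(\mathbf{x},t)}{\partial t}
  + \mathbf{C}_g \cdot \nabla_{\mathbf{x}} S
  = F\bigl[S;\,\mathbf{W}(\mathbf{x},t),\,\mathbf{U}(\mathbf{x},t)\bigr]
  = In + Nl + Dis
```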

  20. On the verification of PGD reduced-order models

    OpenAIRE

    Pled, Florent; Chamoin, Ludovic; Ladevèze, Pierre

    2014-01-01

    International audience; In current computational mechanics practice, multidimensional as well as multiscale or parametric models encountered in a wide variety of scientific and engineering fields often require either the resolution of significantly large complexity problems or the direct calculation of very numerous solutions of such complex models. In this framework, the use of model order reduction allows to dramatically reduce the computational requirements engendered by the increasing mod...

  1. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified...... by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations......

  2. Verification of a fully coupled FE model for tunneling under compressed air

    Energy Technology Data Exchange (ETDEWEB)

    Oettl, G.; Stark, R.F.; Hofstetter, G. [Innsbruck Univ. (Austria). Inst. for Structural Analysis and Strength of Materials

    2001-07-01

    This paper deals with the verification of a fully coupled finite element model for tunneling under compressed air. The formulation is based on mixture theory treating the soil as a three-phase medium with the constituents: deformable porous soil skeleton, water and air. Starting with a brief outline of the governing equations results of numerical simulations of different laboratory tests and of a large-scale in-situ test are presented and compared with experimental data. (orig.)

  3. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2016-01-01

Using bounded model checking (BMC) and inductive reasoning, it is verified that the generated model instance satisfies the generated safety properties. Using this method, we are able to verify the safety properties for model instances corresponding to railway networks of industrial size. Experiments show that BMC is also...
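The bounded model checking idea the record relies on can be sketched generically: explore every execution of a transition system up to a depth bound k and report any reachable state violating a safety property. This is a toy two-section locking example invented for illustration, not the generated railway interlocking model.

```python
# Schematic bounded model checking (BMC): breadth-first unrolling of the
# transition relation up to depth k; returns the depth and state of the
# first safety violation found, or None if the system is safe up to k.
def bmc(initial, step, safe, k):
    layer = {initial}
    visited = set(layer)
    for depth in range(k + 1):
        for s in layer:
            if not safe(s):
                return depth, s          # counterexample found
        nxt = set()
        for s in layer:
            for t in step(s):
                if t not in visited:
                    visited.add(t)
                    nxt.add(t)
        layer = nxt
    return None                          # no violation within bound k

# Hypothetical (deliberately buggy) interlocking fragment: each step may
# toggle either section's lock, so both can end up locked at once.
def step(state):
    a, b = state
    yield (not a, b)
    yield (a, not b)

# Safety property: the two sections are never both locked.
result = bmc((False, False), step, safe=lambda s: not (s[0] and s[1]), k=5)
```

For this toy system BMC finds the violating state (True, True) at depth 2, demonstrating the counterexample search; a correct interlocking model would return None.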

  4. Kinetic model of ductile iron solidification with experimental verification

    Directory of Open Access Journals (Sweden)

    W. Kapturkiewicz

    2009-10-01

Full Text Available A solidification model for ductile iron, including a Weibull formula for nodule count, has been presented. From this model, the following can be determined: cooling curves, kinetics of austenite and eutectic nucleation, austenite and eutectic growth velocity, volume fraction, and the distribution of Si and P in both the austenite and the eutectic grain, with their distribution over the casting section. In the developed model of nodular graphite iron casting solidification, the correctness of the mathematical model has been experimentally verified in the range of the most significant factors, which include the temperature field, the value of maximum undercooling, and the graphite nodule count interrelated with the casting cross-section. The literature offers practically no data on a process model and simulation program confronted with experiment in this way.

  5. Mask synthesis and verification based on geometric model for surface micro-machined MEMS

    Institute of Scientific and Technical Information of China (English)

    LI Jian-hua; LIU Yu-sheng; GAO Shu-ming

    2005-01-01

Traditional MEMS (microelectromechanical system) design methodology is not a structured method and has become an obstacle to creative MEMS design. In this paper, a novel method of mask synthesis and verification for surface micro-machined MEMS is proposed, which is based on the geometric model of a MEMS device. The emphasis is on synthesizing the masks on the basis of the layer model generated from the geometric model of the MEMS device. The method comprises several steps: correction of the layer model, generation of initial masks and final masks including multi-layer etch masks, and mask simulation. Finally, some test results are given.

  6. Multiple verification in computational modeling of bone pathologies

    CERN Document Server

    Liò, Pietro; Paoletti, Nicola; 10.4204/EPTCS.67.8

    2011-01-01

We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as a defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows us to derive three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alarm the patient and monitor him/her more closely to detect the insurgence of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could address the...
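The record's second estimator (flagging rapid negative changes in bone mineral density from the clinical record) can be caricatured with a sliding-window check. The series, window and threshold below are invented for illustration and are not outputs of the PRISM model.

```python
# Toy stand-in for a "rapid decline" estimator: return the first month
# index at which the fractional BMD loss over a sliding window exceeds
# the threshold, or None if no such drop occurs.
def rapid_decline(bmd, window, threshold):
    for i in range(window, len(bmd)):
        loss = (bmd[i - window] - bmd[i]) / bmd[i - window]
        if loss > threshold:
            return i
    return None

# Hypothetical monthly bone mineral densities (g/cm^2).
series = [1.00, 0.99, 0.99, 0.93, 0.90, 0.89]
alarm = rapid_decline(series, window=3, threshold=0.05)
```

For this series the 7% drop between months 0 and 3 trips the 5% threshold, so the alarm fires at month index 3.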

  7. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

...accuracy of model forecasts of currents in coastal areas. The MVV module is implemented as part of the Geospatial Analysis and Model Evaluation Software...

  8. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernable from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  9. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
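Two of the ensemble-verification measures this record borrows from the forecast-verification literature are easy to state for a plain list of ensemble members: the empirical continuous ranked probability score (CRPS) and the observation's rank for a rank (Talagrand) histogram. The member values below are invented; this is an illustration of the standard formulas, not code from the paper.

```python
# Empirical CRPS for an ensemble {x_i} and observation y:
#   CRPS = mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|
def crps(ensemble, obs):
    n = len(ensemble)
    term1 = sum(abs(x - obs) for x in ensemble) / n
    term2 = sum(abs(a - b) for a in ensemble for b in ensemble) / (2 * n * n)
    return term1 - term2

# Rank of the observation among the sorted members (0..n), the quantity
# binned to build a rank histogram over many forecast cases.
def rank(ensemble, obs):
    return sum(1 for x in ensemble if x < obs)

members = [9.0, 10.0, 11.0, 12.0]   # hypothetical streamflow ensemble
score = crps(members, 10.5)          # 0.375 for this toy ensemble
position = rank(members, 10.5)       # observation falls in the middle: rank 2
```

Lower CRPS is better, and a flat rank histogram over many cases indicates a statistically reliable ensemble spread.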

  10. Local model for magnet-superconductor mechanical interaction: Experimental verification

    Science.gov (United States)

    Diez-Jimenez, Efren; Perez-Diaz, Jose-Luis; Garcia-Prada, Juan Carlos

    2011-03-01

    Several models exist for calculating superconducting repulsion forces in the Meissner state that are based on the method of images. The method of images, however, is limited to a small number of geometrical configurations that can be solved exactly, and the physical interpretation of the method is under discussion. A general local model based on the London equations and Maxwell's equations has been developed to describe the mechanics of the superconductor-permanent magnet system. Due to its differential form, this expression can be easily implemented in a finite elements analysis and, consequently, is easily applicable to any shape of superconductor in the Meissner state. It can solve both forces and torques. This paper reports different experiments undertaken in order to test the model's validity. The vertical forces and the angle of equilibrium between a magnet and a superconductor were measured, and a positive agreement between the experiments and theoretical calculations was found.

  11. CFD modeling of pharmaceutical isolators with experimental verification of airflow.

    Science.gov (United States)

    Nayan, N; Akay, H U; Walsh, M R; Bell, W V; Troyer, G L; Dukes, R E; Mohan, P

    2007-01-01

    Computational fluid dynamics (CFD) models have been developed to predict the airflow in a transfer isolator using a commercial CFD code. In order to assess the ability of the CFD approach in predicting the flow inside an isolator, hot wire anemometry measurements and a novel experimental flow visualization technique consisting of helium-filled glycerin bubbles were used. The results obtained have been shown to agree well with the experiments and show that CFD can be used to model barrier systems and isolators with practical fidelity. This indicates that CFD can and should be used to support the design, testing, and operation of barrier systems and isolators.

  12. Vacuum assisted resin transfer molding (VARTM): Model development and verification

    Science.gov (United States)

    Song, Xiaolan

    2003-06-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform. Flow of resin through the preform is modeled as flow through porous media. Darcy's law combined with the continuity equation for an incompressible Newtonian fluid forms the basis of the flow model. During the infiltration process, it is well accepted that the total pressure is shared by the resin pressure and the pressure supported by the fiber network. With the progression of the resin, the net pressure applied to the preform decreases as a result of increasing local resin pressure. This leads to the springback of the preform, and is called the springback mechanism. On the other side, the lubrication effect of the resin causes the rearrangement of the fiber network and an increase in the preform compaction. This is called the wetting compaction mechanism. The thickness change of the preform is determined by the relative magnitude of the springback and wetting deformation mechanisms. In the compaction model, the transverse equilibrium equation is used to calculate the net compaction pressure applied to the preform, and the compaction test results are fitted to give the compressive constitutive law of the preform. The Finite Element/Control Volume (FE/CV) method is adopted to find the flow front location and the fluid pressure. The code features the ability of simultaneous integration of 1-D, 2-D and 3-D element types in a single simulation, and thus enables efficient modeling of the flow in complex mold
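The flow model in this record rests on Darcy's law; for the idealized 1-D case with constant injection pressure and constant preform properties, the flow-front position has a classical closed form, x_f(t) = sqrt(2 K ΔP t / (μ φ)). The sketch below uses that textbook result with invented material values; the thesis model itself couples this to compaction and cure kinetics and is solved by FE/CV.

```python
import math

# 1-D Darcy infiltration under constant pressure difference dP:
#   flow-front position x_f(t) = sqrt(2 K dP t / (mu * phi))
# and the corresponding fill time for a part of length L.
def flow_front(K, dP, mu, phi, t):
    return math.sqrt(2.0 * K * dP * t / (mu * phi))

def fill_time(K, dP, mu, phi, L):
    return mu * phi * L * L / (2.0 * K * dP)

# Hypothetical values: permeability (m^2), vacuum pressure (Pa),
# resin viscosity (Pa*s), preform porosity, part length (m).
K, dP, mu, phi = 1e-10, 1.0e5, 0.2, 0.5
t_fill = fill_time(K, dP, mu, phi, L=0.5)   # about 1250 s for these numbers
```

The quadratic growth of fill time with part length L is the practical reason distribution media and multiple injection gates are used in VARTM.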

  13. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  14. Very fast road database verification using textured 3D city models obtained from airborne imagery

    Science.gov (United States)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. 
Depending on the time issue and availability of a geo-database for buildings, the urban terrain reconstruction procedure has semantic models
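The fusion step described in the record above (combining the road-state and model-applicability distributions into a single {correct, incorrect, unknown} distribution) follows Dempster's rule of combination. In the sketch below, mass on the whole frame plays the role of "unknown"; the mass values are invented and the mapping is schematic, not the paper's exact construction.

```python
from itertools import product

# Dempster's rule of combination over the frame {correct, incorrect}.
# Mass functions assign weight to non-empty subsets of the frame; mass
# on the full frame expresses ignorance ("unknown").
FRAME = frozenset({"correct", "incorrect"})

def combine(m1, m2):
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass on disjoint focal sets
    norm = 1.0 - conflict                # renormalize away the conflict
    return {s: w / norm for s, w in fused.items()}

# Hypothetical evidence: one source mostly supports "correct", the other
# is less certain about the applicability of its road model.
road_state = {frozenset({"correct"}): 0.6, FRAME: 0.4}
model_fit = {frozenset({"correct"}): 0.5,
             frozenset({"incorrect"}): 0.2, FRAME: 0.3}
fused = combine(road_state, model_fit)
```

The fused masses on {correct}, {incorrect} and the full frame correspond to the record's three output states correct, incorrect and unknown.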

  15. Verification of the Naval Oceanic Vertical Aerosol Model During Fire

    NARCIS (Netherlands)

    Davidson, K.L.; Leeuw, G. de; Gathman, S.G.; Jensen, D.R.

    1990-01-01

The Naval Oceanic Vertical Aerosol Model (NOVAM) has been formulated to estimate the vertical structure of the optical and infrared extinction coefficients in the marine atmospheric boundary layer (MABL), for wavelengths between 0.2 and 40 µm. NOVAM was designed to predict, utilizing a set of routin

  16. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  17. Methods for the Update and Verification of Forest Surface Model

    Science.gov (United States)

    Rybansky, M.; Brenova, M.; Zerzan, P.; Simon, J.; Mikita, T.

    2016-06-01

The digital terrain model (DTM) represents the bare-ground earth's surface without any objects such as vegetation and buildings. In contrast, the digital surface model (DSM) represents the earth's surface including all objects on it. The DTM mostly does not change as frequently as the DSM; the most important changes of the DSM occur in forest areas due to vegetation growth. Using LIDAR technology, the canopy height model (CHM) is obtained by subtracting the DTM from the corresponding DSM. The DSM is calculated from the first-pulse echo data and the DTM from the last-pulse echo data. The main problem with using DSM and CHM data is the currency of the airborne laser scanning. This paper describes a method for calculating changes in the CHM and DSM data using the relations between canopy height and tree age. To obtain an up-to-date reference model of the canopy height, photogrammetric and trigonometric measurements of single trees were used. By comparing the heights of corresponding trees on aerial photographs of various ages, statistical sets of tree growth rates were obtained. These statistical data and the LIDAR data were compared with the growth curve of spruce forest corresponding to a similar natural environment (soil quality, climate characteristics, geographic location, etc.) to derive the updating characteristics.
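The canopy height model in this record is simply the cell-wise difference CHM = DSM − DTM, with small negative residuals (sensor noise where the two surfaces nearly coincide) clamped to zero. A dependency-free sketch on tiny invented grids:

```python
# Cell-wise canopy height: CHM = max(DSM - DTM, 0).
def canopy_height(dsm, dtm):
    return [[max(s - t, 0.0) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# Hypothetical 2x2 elevation grids (metres above sea level):
dsm = [[212.0, 215.5], [210.2, 209.9]]   # first-pulse surface
dtm = [[200.0, 201.0], [202.0, 210.1]]   # last-pulse bare earth
chm = canopy_height(dsm, dtm)
```

The last cell illustrates the clamping: the DSM sits slightly below the DTM there (a noise artifact), so the canopy height is reported as 0 rather than a negative value.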

  18. Verification-Driven Slicing of UML/OCL Models

    DEFF Research Database (Denmark)

    Shaikh, Asadullah; Clarisó Viladrosa, Robert; Wiil, Uffe Kock;

    2010-01-01

    computational complexity can limit their scalability. In this paper, we consider a specific static model (UML class diagrams annotated with unrestricted OCL constraints) and a specific property to verify (satisfiability, i.e., “is it possible to create objects without violating any constraint?”). Current...

  19. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

Full Text Available The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under working conditions of radiant heating systems. Comparison of the mathematically modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of formation of temperature fields in premises with gas infrared heater systems.

  20. Numerical Verification of the Weak Turbulent Model for Swell Evolution

    CERN Document Server

    Korotkevich, A O; Resio, D; Zakharov, V E

    2007-01-01

We performed numerical simulation of an ensemble of nonlinearly interacting free gravity waves (swell) by two different methods: solution of the primordial dynamical equations describing potential flow of an ideal fluid with a free surface, and solution of the kinetic Hasselmann equation describing the wave ensemble in the framework of the theory of weak turbulence. Comparison of the results demonstrates the applicability of the weak turbulent approach. In both cases we observed effects predicted by this theory: frequency downshift, angular spreading and formation of the Zakharov-Filonenko spectrum $I_{\omega} \sim \omega^{-4}$. One result of our article is that physical processes in finite-size laboratory wave tanks and in the ocean are quite different, and the results of such laboratory experiments should be applied to modeling of ocean phenomena with extra care. We also present an estimate of the minimum size of a laboratory installation that would allow modeling of open ocean surface wave dynami...

  1. Carbon dioxide stripping in aquaculture -- part III: model verification

    Science.gov (United States)

    Colt, John; Watten, Barnaby; Pfeiffer, Tim

    2012-01-01

Based on conventional mass transfer models developed for oxygen, the use of the non-linear ASCE method, the 2-point method, and the one-parameter linear-regression method were evaluated for carbon dioxide stripping data. For values of KLaCO2 < approximately 1.5/h, the 2-point and ASCE methods fit the experimental data well, but the fit breaks down at higher values of KLaCO2. How to correct KLaCO2 for gas phase enrichment remains to be determined. The one-parameter linear regression model was used to vary C*CO2 over the test, but it did not result in a better fit to the experimental data when compared to the ASCE or fixed C*CO2 assumptions.
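The 2-point method evaluated in this record follows from the standard first-order gas-transfer model C(t) = C* − (C* − C0) e^(−KLa·t): two samples (t1, C1) and (t2, C2) yield KLa = ln[(C* − C1)/(C* − C2)]/(t2 − t1). The data below are synthetic, generated from the model itself to show that the formula recovers the coefficient; real stripping data would scatter about it.

```python
import math

# Two-point estimate of the overall transfer coefficient KLa (1/h)
# from concentrations C1, C2 at times t1, t2 and saturation value C*.
def kla_two_point(c_star, c1, t1, c2, t2):
    return math.log((c_star - c1) / (c_star - c2)) / (t2 - t1)

# Synthetic CO2-stripping run generated with KLa = 1.2 1/h; the initial
# concentration exceeds saturation, so C(t) decays toward C*.
c_star, c0, kla_true = 0.5, 30.0, 1.2            # mg/L, mg/L, 1/h
conc = lambda t: c_star - (c_star - c0) * math.exp(-kla_true * t)
kla_est = kla_two_point(c_star, conc(0.25), 0.25, conc(1.0), 1.0)
```

For noise-free model-generated data the estimate reproduces the true KLa exactly, which is why the record's interesting finding is where the fit breaks down on real data.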

  2. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As an hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.

  3. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

Full Text Available With the semiconductor industry trend of "smaller the better", taking an idea to a final product while delivering more innovation in the product portfolio and remaining competitive and profitable creates pressure for continuous innovation in the CAD flow, process management and the project execution cycle. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment with wider test vectors, without waiting for RTL to become available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow, addressing four major activities of verification: 1. early creation of an executable specification; 2. early creation of the verification environment; 3. early development of test vectors; and 4. better and increased re-use of blocks. Although this paper focuses on early development of a UVM-based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image signal processing designs.

  4. Pyrolysis of biomass briquettes, modelling and experimental verification

    NARCIS (Netherlands)

    van der Aa, B; Lammers, G; Beenackers, AACM; Kopetz, H; Weber, T; Palz, W; Chartier, P; Ferrero, GL

    1998-01-01

    Carbonisation of biomass briquettes was studied using a dedicated single briquette carbonisation reactor. The reactor enabled continuous measurement of the briquette mass and continuous measurement of the radial temperature profile in the briquette. Furthermore pyrolysis gas production and compositi

  5. Introducing uncertainty of radar-rainfall estimates to the verification of mesoscale model precipitation forecasts

    Directory of Open Access Journals (Sweden)

    M. P. Mittermaier

    2008-05-01

    Full Text Available A simple measure of the uncertainty associated with using radar-derived rainfall estimates as "truth" has been introduced to the Numerical Weather Prediction (NWP verification process to assess the effect on forecast skill and errors. Deterministic precipitation forecasts from the mesoscale version of the UK Met Office Unified Model for a two-day high-impact event and for a month were verified at the daily and six-hourly time scale using a spatially-based intensity-scale method and various traditional skill scores such as the Equitable Threat Score (ETS and log-odds ratio. Radar-rainfall accumulations from the UK Nimrod radar-composite were used.

    The results show that the inclusion of uncertainty has some effect, shifting the forecast errors and skill. The study also allowed for the comparison of results from the intensity-scale method and traditional skill scores. It showed that the two methods complement each other, one detailing the scale and rainfall accumulation thresholds where the errors occur, the other showing how skillful the forecast is. It was also found that for the six-hourly forecasts the error distributions remain similar with forecast lead time but skill decreases. This highlights the difference between forecast error and forecast skill, and that they are not necessarily the same.
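The Equitable Threat Score used in this record (and record 6 below) is computed from the 2×2 contingency table of forecast versus observed rain occurrence; the "random hits" term removes credit for hits expected by chance. The counts below are invented for illustration.

```python
# Equitable Threat Score from a 2x2 contingency table:
#   a = hits, b = false alarms, c = misses, d = correct negatives
#   a_r = (a + c)(a + b) / n   (hits expected by chance)
#   ETS = (a - a_r) / (a + b + c - a_r)
def ets(a, b, c, d):
    n = a + b + c + d
    a_random = (a + c) * (a + b) / n
    return (a - a_random) / (a + b + c - a_random)

score = ets(a=50, b=20, c=30, d=900)   # hypothetical verification counts
```

ETS ranges from −1/3 to 1: a perfect forecast scores 1, and a random forecast scores 0, which is what makes the score "equitable" across climates with different event frequencies.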

  7. A New Integrated Weighted Model in SNOW-V10: Verification of Continuous Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results of nowcasts of four continuous variables generated from an integrated weighted model and underlying Numerical Weather Prediction (NWP) models. Real-time monitoring of fast changing weather conditions and the provision of short term forecasts, or nowcasts, in complex terrain within coastal regions is challenging to do with sufficient accuracy. A recently developed weighting, evaluation, bias correction and integration system was used in the Science of Nowcasting Olympic Weather for Vancouver 2010 project to generate integrated weighted forecasts (INTW) out to 6 h. INTW forecasts were generated with in situ observation data and background gridded forecasting data from the Canadian high-resolution deterministic NWP system with three nested grids at 15-, 2.5- and 1-km horizontal grid-spacing configurations. In this paper, the four variables of temperature, relative humidity, wind speed and wind gust are treated as continuous variables for verifying the INTW forecasts. Fifteen sites were selected for the comparison of the model performances. The results of the study show that integrating surface observation data with the NWP forecasts produces better statistical scores than using either the NWP forecasts or an objective analysis of observed data alone. Overall, integrated observation and NWP forecasts improved forecast accuracy for the four continuous variables. The mean absolute errors from the INTW forecasts for the entire test period (12 February to 21 March 2010) are smaller than those from NWP forecasts with the three configurations. INTW is the best and most consistent performer among all models regardless of location and variable analyzed.
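
    The abstract does not give the INTW weighting formula, so the following is only a generic sketch of the idea it describes: bias-correct each NWP configuration against recent observations and weight it by its inverse recent error. The weighting rule and all numbers are assumptions for illustration.

```python
def integrate_forecasts(forecasts, recent_mae):
    """Blend bias-corrected model forecasts, weighting by inverse recent MAE.

    forecasts:  {model: (raw_value, recent_bias)}
    recent_mae: {model: mean absolute error over a recent window}
    """
    weights = {m: 1.0 / recent_mae[m] for m in forecasts}
    total = sum(weights.values())
    return sum((weights[m] / total) * (value - bias)
               for m, (value, bias) in forecasts.items())

# Hypothetical temperature forecasts (deg C) from the three grid spacings
forecasts = {"15km": (4.0, 1.0), "2.5km": (3.2, 0.4), "1km": (2.9, 0.1)}
recent_mae = {"15km": 2.0, "2.5km": 1.0, "1km": 0.8}
print(integrate_forecasts(forecasts, recent_mae))
```

The recently more accurate configurations dominate the blend, which is the qualitative behaviour an integration system of this kind aims for.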

  8. Determinants of Business Success – Theoretical Model and Empirical Verification

    Directory of Open Access Journals (Sweden)

    Kozielski Robert

    2016-12-01

    Market knowledge, market orientation, learning competencies, and business performance were the key issues of the research project conducted in the 2006 study. The main findings identified significant relationships between the independent variables (market knowledge, market orientation, learning competencies) and the dependent variable (business success). A partial correlation analysis indicated that business success primarily relies on organisational learning competencies. Organisational learning competencies, to a large extent (almost 60%), may be explained by the level of corporate market knowledge and market orientation. The aim of the paper is to evaluate to what extent the relationships between the variables are still valid. The research was based on primary and secondary data sources. The major part of the research was carried out in the form of quantitative studies. The results of the 2014 study are consistent with the previous (2006) results.
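
    The partial-correlation step can be illustrated with the standard first-order formula; the correlation values below are invented for illustration and are not the study's data.

```python
import math

def partial_corr(r_xy, r_xz, r_yz):
    """Correlation of x and y after controlling for z (first-order formula)."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

# Hypothetical zero-order correlations: success~knowledge, success~learning,
# and learning~knowledge
print(partial_corr(0.60, 0.45, 0.77))
```

Controlling for a strongly shared third variable can substantially shrink a raw correlation, which is how such an analysis isolates the direct contribution of learning competencies.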

  9. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time, which is understandable given the size and complexity of such systems. Three important aspects of earthquake-soil-structure-interaction (ESSI) modeling are the consistent following of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator, and to an extensive verification and validation suite for it. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of fully three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns, applied in the layer of elements outside the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is performed on the dynamic soil-structure interaction of a complex system, and results for different cases of soil strength and foundation embedment are compared. A set of constitutive models with high computational efficiency is developed and implemented in the ESSI Simulator.

  10. Verification and Validation of Numerical Models for Air/Water Flow on Coastal and Navigation Fluid-Structure Interaction Applications

    Science.gov (United States)

    Kees, C. E.; Farthing, M.; Dimakopoulos, A.; DeLataillade, T.

    2015-12-01

    Performance analysis and optimization of coastal and navigation structures is becoming feasible due to recent improvements in numerical methods for multiphase flows and the steady increase in capacity and availability of high performance computing resources. Now that the concept of fully three-dimensional air/water flow modelling for real-world engineering analysis is achieving acceptance by the wider engineering community, it is critical to expand careful comparative studies on verification, validation, benchmarking, and uncertainty quantification for the variety of competing numerical methods that continue to evolve. Furthermore, uncertainty still remains about the relevance of secondary processes such as surface tension, air compressibility, air entrainment, and solid phase (structure) modelling, so that questions about the continuum mechanical theory and mathematical analysis of multiphase flow still need to be addressed. Two of the most popular and practical numerical approaches for large-scale engineering analysis are the Volume-Of-Fluid (VOF) and Level Set (LS) approaches. In this work we present a publicly available verification and validation test set for air-water-structure interaction problems, as well as computational and physical model results including a hybrid VOF-LS method, traditional VOF methods, and Smoothed Particle Hydrodynamics (SPH) results. The test set repository and test problem formats are also presented in order to facilitate future comparative studies and reproduction of scientific results.

  11. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth

    2017-03-01

    A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components: a deformation model involving elastic and plastic deformations, a damage model, and a failure model. The model is driven by tabular data that is generated either using laboratory tests or via virtual testing. A unidirectional composite, T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is still under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust, and accurate.

  12. Modelling and verification of melanin concentration on human skin type

    CSIR Research Space (South Africa)

    Karsten, AE

    2012-03-01

    Melanin occurs in two forms: eumelanin (black-brown colour) and pheomelanin (yellow-reddish colour) (3,4). Melanin is synthesized within melanosomes inside melanocytes located in the basal layer of the epidermis, and the mature melanosomes are transferred via dendrites to the keratinocytes in the epidermis, where they are responsible for skin photoprotection (3). It is well documented that the absorption and scattering of light through skin tissue depend on the skin's optical properties (see for example studies by Tuchin (5...

  13. Model-based mask verification on critical 45nm logic masks

    Science.gov (United States)

    Sundermann, F.; Foussadier, F.; Takigawa, T.; Wiley, J.; Vacca, A.; Depre, L.; Chen, G.; Bai, S.; Wang, J.-S.; Howell, R.; Arnoux, V.; Hayano, K.; Narukawa, S.; Kawashima, S.; Mohri, H.; Hayashi, N.; Miyashita, H.; Trouiller, Y.; Robert, F.; Vautrin, F.; Kerrien, G.; Planchot, J.; Martinelli, C.; Di-Maria, J. L.; Farys, V.; Vandewalle, B.; Perraud, L.; Le Denmat, J. C.; Villaret, A.; Gardin, C.; Yesilada, E.; Saied, M.

    2008-05-01

    In the continuous battle to improve critical dimension (CD) uniformity, especially for 45-nanometer (nm) logic advanced products, one important recent advance is the ability to accurately predict the mask CD uniformity contribution to the overall global wafer CD error budget. In most wafer process simulation models, mask error contribution is embedded in the optical and/or resist models. We have separated the mask effects, however, by creating a short-range mask process model (MPM) for each unique mask process and a long-range CD uniformity mask bias map (MBM) for each individual mask. By establishing a mask bias map, we are able to incorporate the mask CD uniformity signature into our modelling simulations and measure the effects on global wafer CD uniformity and hotspots. We also have examined several ways of proving the efficiency of this approach, including the analysis of OPC hot spot signatures with and without the mask bias map (see Figure 1) and by comparing the precision of the model contour prediction to wafer SEM images. In this paper we will show the different steps of mask bias map generation and use for advanced 45nm logic node layers, along with the current results of this new dynamic application to improve hot spot verification through Brion Technologies' model-based mask verification loop.

  14. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

    This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into its equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  15. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevent model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

  16. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
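
    As a toy illustration of the automata-based intermediate view (this is not the paper's Xtext tool, and the two-input motor program is invented), a PLC scan cycle can be treated as a transition function whose reachable state set is what a model checker such as NuSMV then explores exhaustively:

```python
def plc_cycle(state, inputs):
    """One scan cycle of a toy ST program:
    IF start AND NOT stop THEN motor := TRUE; END_IF
    IF stop THEN motor := FALSE; END_IF
    """
    s = dict(state)
    if inputs["start"] and not inputs["stop"]:
        s["motor"] = True
    if inputs["stop"]:
        s["motor"] = False
    return tuple(sorted(s.items()))

def reachable(init, input_space):
    """Enumerate all reachable PLC states under every input combination."""
    seen, frontier = {init}, [init]
    while frontier:
        state = frontier.pop()
        for inputs in input_space:
            nxt = plc_cycle(state, inputs)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

inputs = [{"start": a, "stop": b} for a in (False, True) for b in (False, True)]
states = reachable((("motor", False),), inputs)
```

Exhaustive exploration of every transition, rather than testing selected runs, is what makes safety and liveness properties checkable at all.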

  17. Verification of precipitation forecasts by the DWD limited area model LME over Cyprus

    Directory of Open Access Journals (Sweden)

    K. Savvidou

    2007-01-01

    A comparison is made between the precipitation forecasts by the non-hydrostatic limited area model LME of the German Weather Service (DWD) and observations from a network of rain gauges in Cyprus. This is a first attempt to carry out a preliminary verification and evaluation of the LME precipitation forecasts over the area of Cyprus. For the verification, model forecasts and observations were used covering an eleven-month period, from 1 February 2005 to 31 December 2005. The observations were made by three Automatic Weather Observing Systems (AWOS) located at Larnaka and Paphos airports and at Athalassa synoptic station, as well as at 6, 6 and 8 rain gauges within a radius of about 30 km around these stations, respectively. The observations were compared with the model outputs, separately for each of the three forecast days. The "probability of detection" (POD) of a precipitation event and the "false alarm rate" (FAR) were calculated. From the selected cases of the forecast precipitation events, the average forecast precipitation amounts in the area around the three stations were compared with the measured ones. An attempt was also made to evaluate the model's skill in predicting the spatial distribution of precipitation and, in this respect, the geographical position of the maximum forecast precipitation amount was contrasted with the position of the corresponding observed maximum. Maps with monthly precipitation totals observed by a local network of 150 rain gauges were compared with the corresponding forecast precipitation maps.
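
    Both scores reduce to simple ratios over event counts. One caveat: the sketch below computes FAR as the false alarm ratio, false alarms / (hits + false alarms); some texts use "false alarm rate" for the different quantity false alarms / (false alarms + correct negatives), and the abstract does not say which definition the authors adopt. The counts are illustrative.

```python
def pod_far(hits, misses, false_alarms):
    """Probability of detection and false alarm ratio for event forecasts."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# Illustrative counts for precipitation-day forecasts at one station
pod, far = pod_far(hits=42, misses=18, false_alarms=14)
print(pod, far)
```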

  18. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    Science.gov (United States)

    1979-01-01

    Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  19. Formal Verification of a Secure Model for Building E-Learning Systems

    Directory of Open Access Journals (Sweden)

    Farhan M Al Obisat

    2016-06-01

    The Internet is the common medium in E-learning for connecting the several parties involved (instructors and students), as they are supposed to be far away from each other. Both wired and wireless networks are used in this learning environment to facilitate mobile access to educational systems. This learning environment requires a secure connection and secure data exchange. An E-learning model was implemented and evaluated by conducting student experiments. Before the approach is deployed in the real world, a formal verification of the model is completed, which shows that no unreachability case exists. The model in this paper, which concentrates on the security of e-content, was successfully validated using the SPIN model checker, where no errors were found.

  20. Truth in Complex Adaptive Systems Models Should BE Based on Proof by Constructive Verification

    Science.gov (United States)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. 'Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  1. Multi-dimensional boron transport modeling in subchannel approach: Part I. Model selection, implementation and verification of COBRA-TF boron tracking model

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, Ozkan Emre, E-mail: ozdemir@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Avramova, Maria N., E-mail: mna109@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Sato, Kenya, E-mail: kenya_sato@mhi.co.jp [Mitsubishi Heavy Industries (MHI), Kobe (Japan)

    2014-10-15

    Highlights: ► Implementation of a multidimensional boron transport model in a subchannel approach. ► Studies on cross flow mechanism, heat transfer and lateral pressure drop effects. ► Verification of the implemented model via code-to-code comparison with a CFD code. - Abstract: The risk of reflux condensation, especially during a Small Break Loss Of Coolant Accident (SB-LOCA), and the complications of tracking the boron concentration experimentally inside the primary coolant system have made boron tracking simulation in nuclear reactors the focus of many computational studies. This paper presents the development and implementation of a multidimensional boron transport model with a Modified Godunov Scheme within a thermal-hydraulic code based on a subchannel approach. The cross flow mechanism in multiple-subchannel rod bundle geometry as well as the heat transfer and lateral pressure drop effects are considered in the performed studies on simulations of deboration and boration cases. The Pennsylvania State University (PSU) version of the COBRA-TF (CTF) code was chosen for the implementation of three different boron tracking models: a First Order Accurate Upwind Difference Scheme, a Second Order Accurate Godunov Scheme, and a Modified Godunov Scheme. Based on the performed nodalization sensitivity studies, the Modified Godunov Scheme approach with a physical diffusion term was determined to provide the best solution in terms of precision and accuracy. As a part of the verification and validation activities, a code-to-code comparison was carried out with the STAR-CD computational fluid dynamics (CFD) code and is presented here. The objective of this study was two-fold: (1) to verify the accuracy of the newly developed CTF boron tracking model against CFD calculations; and (2) to investigate its numerical advantages as compared to other thermal-hydraulics codes.
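
    The trade-off that motivates the Modified Godunov Scheme can be seen in miniature with the simplest of the three tracking models, a first-order upwind step for 1-D scalar transport; the grid, velocity, and concentrations below are illustrative, not CTF data:

```python
def upwind_step(c, u, dt, dx):
    """Explicit first-order upwind update for concentration c (valid for u > 0)."""
    cfl = u * dt / dx  # must satisfy cfl <= 1 for stability
    return [c[0]] + [c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))]

# Borated slug (ppm) entering a 20-cell channel at constant velocity
conc = [1500.0] * 5 + [0.0] * 15
for _ in range(20):
    conc = upwind_step(conc, u=1.0, dt=0.4, dx=1.0)  # CFL = 0.4
```

The scheme is stable and monotone but smears the boron front; this numerical diffusion is what higher-order Godunov-type schemes, with a controlled physical diffusion term as in the paper, are designed to reduce.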

  2. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    OpenAIRE

    Vickers Andrew J; Cronin Angel M

    2008-01-01

    Background: A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods: A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the...
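
    One standard correction in this setting is that of Begg and Greenes, which re-weights disease rates estimated in the verified subset by the full test-result totals, assuming verification depends only on the test result. The sketch and counts below are illustrative and are not taken from the paper's simulations.

```python
def begg_greenes(n_pos, n_neg, v_pos, v_neg, d_pos, d_neg):
    """Verification-bias-corrected sensitivity and specificity.

    n_pos/n_neg: all subjects by screening-test result
    v_pos/v_neg: subjects verified with the gold standard
    d_pos/d_neg: diseased among the verified, by test result
    """
    p_d_pos = d_pos / v_pos  # P(disease | test positive), from verified subset
    p_d_neg = d_neg / v_neg  # P(disease | test negative), from verified subset
    tp = n_pos * p_d_pos     # expected true positives among all subjects
    fn = n_neg * p_d_neg     # expected false negatives among all subjects
    sens = tp / (tp + fn)
    spec = n_neg * (1 - p_d_neg) / (n_neg * (1 - p_d_neg)
                                    + n_pos * (1 - p_d_pos))
    return sens, spec

# Illustrative screening study in which negatives are rarely verified
sens, spec = begg_greenes(n_pos=200, n_neg=800, v_pos=180, v_neg=80,
                          d_pos=90, d_neg=8)
```

With few verified negatives, the estimate of P(disease | test negative), and hence the corrected sensitivity, rests on a handful of subjects, which is exactly the fragility the simulation study examines.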

  3. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sarkar, Avik [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sundaresan, Sankaran [Princeton Univ., NJ (United States)

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.

  4. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-03-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from different parameter uncertainty estimation methods. The Generalized Likelihood Uncertainty Estimation (GLUE) method, a modified version of GLUE, and the Shuffle Complex Evolution Metropolis (SCEM) are used to generate model ensembles for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of parameter uncertainty, one that is commensurate with the dimension of the ensembles themselves. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
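
    Among the probabilistic verification measures borrowed from the atmospheric sciences, the continuous ranked probability score (CRPS) is a common choice because it collapses to the mean absolute error for a deterministic forecast. A sample-based sketch for a small ensemble (the values are illustrative):

```python
def crps_ensemble(members, obs):
    """Sample CRPS: E|X - obs| - 0.5 * E|X - X'| over ensemble members."""
    m = len(members)
    error_term = sum(abs(x - obs) for x in members) / m
    spread_term = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return error_term - spread_term

# Toy 3-member streamflow ensemble scored against one observation
print(crps_ensemble([1.0, 2.0, 3.0], obs=2.0))
```

The spread term rewards an ensemble for honestly representing its own uncertainty, which is precisely what deterministic scores applied to the ensemble mean cannot do.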

  5. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered as reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The paper is illustrated by an automatic assembly system.

  6. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
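
    The workshop's statistical framework itself is not reproduced in this abstract; as a minimal illustration of the kind of calculation involved, the standard two-group z-approximation gives the biospecimen count per group needed to detect a given biomarker difference:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.90):
    """Two-sample z-approximation: subjects per group needed to detect a
    mean difference delta with common SD sigma (two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# An effect of one standard deviation at 90% power
print(n_per_group(delta=1.0, sigma=1.0))
```

Halving the detectable effect quadruples the required cohort, which is why underpowered designs are such a common failure mode in biomarker pipelines.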

  7. Studying the potential of point detectors in time-resolved dose verification of dynamic radiotherapy

    DEFF Research Database (Denmark)

    Beierholm, Anders Ravnsborg; Behrens, C. F.; Andersen, Claus E.

    2015-01-01

    for quality assurance and dose verification. In this context, traceable in-phantom dosimetry using a well-characterized point detector is often an important supplement to 2D-based quality assurance methods based on radiochromic film or detector arrays. In this study, an in-house developed dosimetry system... in dose delivery, although exact positioning of detectors remains critical.

  8. Probabilistic UML statecharts for specification and verification: a case study

    NARCIS (Netherlands)

    Jansen, D.N.; Jürjens, J.; Cengarle, M.V.; Fernandez, E.B.; Rumpe, B.; Sander, R.

    2002-01-01

    This paper introduces a probabilistic extension of UML statecharts. A requirements-level semantics of statecharts is extended to include probabilistic elements. Desired properties for probabilistic statecharts are expressed in the probabilistic logic PCTL, and verified using the model checker Prism.
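
    The core of checking a PCTL property such as P=? [F goal] against the Markov chain underlying a probabilistic statechart is a reachability-probability computation, sketched here on a toy three-state chain (the chain is invented; a tool like PRISM solves this exactly and at scale):

```python
def reach_prob(chain, goal, iters=200):
    """Fixed-point iteration for the probability of eventually reaching goal."""
    x = {s: (1.0 if s == goal else 0.0) for s in chain}
    for _ in range(iters):
        x = {s: (1.0 if s == goal else
                 sum(p * x[t] for t, p in chain[s].items()))
             for s in chain}
    return x

# Toy DTMC: a retry loop that succeeds w.p. 0.9 and fails w.p. 0.05 per attempt
chain = {
    "try":  {"ok": 0.90, "fail": 0.05, "try": 0.05},
    "ok":   {"ok": 1.0},
    "fail": {"fail": 1.0},
}
print(reach_prob(chain, "ok")["try"])
```

The fixed point satisfies x_try = 0.9 + 0.05 * x_try, i.e. x_try = 0.9/0.95, exactly the kind of quantitative property (reach "ok" with probability at least p) that a PCTL model checker verifies.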

  9. Formal verification technique for grid service chain model and its application

    Institute of Scientific and Technical Information of China (English)

    XU Ke; WANG YueXuan; WU Cheng

    2007-01-01

    Ensuring the correctness and reliability of large-scale resource sharing and complex job processing is an important task for grid applications. From a formal method perspective, a grid service chain model based on state Pi calculus is proposed in this work as the theoretical foundation for the service composition and collaboration in grid. Following the idea of the Web Service Resource Framework (WSRF), state Pi calculus enables the life-cycle management of system states by associating the actions in the original Pi calculus with system states. Moreover, model checking technique is exploited for the design-time and run-time logical verification of grid service chain models. A grid application scenario of the dynamic analysis of material deformation structure is also provided to show the effectiveness of the proposed work.

  10. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  11. Verification of extended model of goal directed behavior applied on aggression

    Directory of Open Access Journals (Sweden)

    Katarína Vasková

    2016-01-01

    behavioral desire. An important impact of this factor on the pre-volitional stages of aggressive behavior was also identified. The next important predictor of behavioral desire was the anticipation of positive emotions, but not of negative emotions. These results correspond with self-regulation theory, in which behavior focused on goal attainment is accompanied by positive emotions (see, for example, Cacioppo, Gardner, & Berntson, 1999; Carver, 2004). The results confirmed not only a sufficient model fit but also explained 53% of the variance in behavioral desire, 68% in intention, and 37% in behavior. Some limitations should be mentioned, especially the unequal gender representation in the second sample; some results could also be affected by the smaller sample size. For future work we recommend verifying the EMGB on other types of aggressive behavior as well, and incorporating inhibition into the model in a more comprehensive way. Finally, this study is correlational in character, so further research should manipulate the key variables experimentally to appraise the main characteristics of the stated theoretical background.

  12. Automatic Verification of Biochemical Network Using Model Checking Method

    Institute of Scientific and Technical Information of China (English)

    Jinkyung Kim; Younghee Lee; Il Moon

    2008-01-01

    This study focuses on automatic searching and verification methods for the reachability, transition logics, and hierarchical structure in all possible paths of biological processes using model checking. Automatic search and verification of alternative paths within complex and large networks in a biological process can yield a considerable number of solutions, which is difficult to handle manually. Model checking is an automatic method for verifying whether a circuit or a condition, expressed as a concurrent transition system, satisfies a set of properties expressed in a temporal logic such as computational tree logic (CTL). This article shows that model checking is feasible for biochemical network verification and that it has certain advantages over simulation for querying and searching for special behavioral properties in biochemical processes.
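
    The reachability queries described above can be illustrated with a toy explicit-state check of the CTL formula EF p ("some path eventually reaches a state satisfying p"); this is a minimal sketch of the idea, not the verification tool used in the study:

```python
def ef_holds(transitions, init, prop):
    """CTL 'EF prop': does some execution path from `init` reach a
    state satisfying `prop`?  Explicit-state depth-first search."""
    seen, stack = set(), [init]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if prop(s):
            return True
        stack.extend(transitions.get(s, ()))
    return False

# Toy biochemical network: states are sets of active species.
net = {
    frozenset({"A"}): [frozenset({"A", "B"})],
    frozenset({"A", "B"}): [frozenset({"C"})],
}
# Is the pathway product C reachable from the initial state {A}?
print(ef_holds(net, frozenset({"A"}), lambda s: "C" in s))  # True
```

A real biochemical model would have far more states; the point is that the search is exhaustive over all paths, which manual inspection is not.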

  13. Feasibility Study on Applying Radiophotoluminescent Glass Dosimeters for CyberKnife SRS Dose Verification.

    Science.gov (United States)

    Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting; Huang, Yung-Hui; Huang, David Y C

    2017-01-01

    CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy of the CyberKnife procedure and to validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, a thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation using BEAMnrc was performed to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose delivered by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of the thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has some advantages for CyberKnife dose measurements, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a potential candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification, with clinically acceptable accuracy within 5%.

  14. Design and verification of a simple 3D dynamic model of speed skating which mimics observed forces and motions.

    Science.gov (United States)

    van der Kruk, E; Veeger, H E J; van der Helm, F C T; Schwab, A L

    2017-09-14

    Advice about the optimal coordination pattern for an individual speed skater could be addressed by simulation and optimization of a biomechanical speed skating model. But before getting to this optimization approach, one needs a model that can reasonably match observed behaviour. Therefore, the objective of this study is to present a verified three-dimensional inverse skater model with minimal complexity, which models the speed skating motion on the straights. The model simulates the upper-body transverse translation of the skater together with the forces exerted by the skates on the ice. The input of the model is the changing distance between the upper body and the skate, referred to as the leg extension (Euclidean distance in 3D space). Verification shows that the model mimics the observed forces and motions well. The model is most accurate for the position and velocity estimation (respectively 1.2% and 2.9% maximum residuals) and least accurate for the force estimations (underestimation of 4.5-10%). The model can be used to further investigate variables in the skating motion. For this, the input of the model, the leg extension, can be optimized to obtain a maximal forward velocity of the upper body. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
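
    The model input named above, the leg extension, is simply the 3D Euclidean distance between the upper-body and skate positions; a minimal sketch (the coordinates are illustrative, not the paper's data):

```python
import math

def leg_extension(upper_body, skate):
    """Euclidean distance in 3D between the upper-body point and the
    skate: the single input driving the inverse skater model."""
    return math.dist(upper_body, skate)

# e.g. upper body roughly 1.1 m above and slightly beside the skate
print(leg_extension((0.0, 0.0, 1.1), (0.3, 0.4, 0.0)))
```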

  15. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.;

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in depth case study...... of a leader election protocol; Modeling it in Ecdar as Timed input/output automata Specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification showing a huge difference...

  16. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Baba, H; Tachibana, H [The National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Kamima, T; Takahashi, R [The Cancer Institute Hospital of JFCR, Koutou-ku, Tokyo (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Sugawara, Y [The National Center for Global Health and Medicine, Shinjuku-ku, Tokyo (Japan); Yamamoto, T [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Sato, A [Itabashi Central General Hospital, Itabashi-ku, Tokyo (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan)

    2015-06-15

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head-and-neck (HN) sites were collected from the institutes, where the planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based, and CT images were used to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement between the dose computed in patient CT images using the TPS and that computed using the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with a similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.
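
    The agreement figures quoted (mean ± SD of percent dose differences, checked against a tolerance) can be computed in a few lines; the function and the sample doses below are illustrative, not taken from the SMU software:

```python
from statistics import mean, stdev

def percent_differences(reference, independent):
    """Per-plan percent difference of the independent calculation
    relative to the reference (TPS or measurement)."""
    return [100.0 * (i - r) / r for r, i in zip(reference, independent)]

def agreement(reference, independent, tolerance=5.0):
    """Mean and SD of the percent differences, plus a pass/fail flag
    against the stated tolerance level."""
    d = percent_differences(reference, independent)
    return mean(d), stdev(d), all(abs(x) <= tolerance for x in d)

# Hypothetical per-plan doses (Gy): reference vs. independent check
m, s, ok = agreement([2.00, 2.00, 2.00], [1.96, 1.95, 1.97])
```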

  17. Wave dispersion in the hybrid-Vlasov model: verification of Vlasiator

    CERN Document Server

    Kempf, Yann; von Alfthan, Sebastian; Vaivads, Andris; Palmroth, Minna; Koskinen, Hannu E J

    2013-01-01

    Vlasiator is a new hybrid-Vlasov plasma simulation code aimed at simulating the entire magnetosphere of the Earth. The code treats ions (protons) kinetically through Vlasov's equation in the six-dimensional phase space, while electrons are a massless charge-neutralizing fluid [M. Palmroth et al., Journal of Atmospheric and Solar-Terrestrial Physics 99, 41 (2013); A. Sandroos et al., Parallel Computing 39, 306 (2013)]. For the first global simulations of the magnetosphere, it is critical to verify and validate the model by established methods. Here, as part of the verification of Vlasiator, we characterize the low-β plasma wave modes described by this model and compare with the solution computed by the Waves in Homogeneous, Anisotropic Multicomponent Plasmas (WHAMP) code [K. Rönnmark, Kiruna Geophysical Institute Reports 179 (1982)], using dispersion curves and surfaces produced with both programs. The match between the two fundamentally different approaches is excellent in the low-frequency, long wavelength...

  18. Formal Modeling and Verification of Context-Aware Systems using Event-B

    Directory of Open Access Journals (Sweden)

    Hong Anh Le

    2014-12-01

    Full Text Available Context awareness is a computing paradigm that makes applications responsive and adaptive to their environment. Formal modeling and verification of context-aware systems are challenging issues in development, as such systems are complex and uncertain. In this paper, we propose an approach that uses the formal method Event-B to model and verify such systems. First, we specify a context-aware system's components, such as context data entities, context rules, and context relations, with Event-B notions. In the next step, we use the Rodin platform to verify the system's desired properties, such as context constraint preservation. The approach aims to benefit from the natural representation of context-awareness concepts in Event-B and from the proof obligations generated by the refinement mechanism to ensure the correctness of systems. We illustrate the use of our approach on a scenario of an Adaptive Cruise Control system.

  19. Verification Test of the SURF and SURFplus Models in xRage: Part III Effect of Mesh Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-15

    The previous studies used an underdriven detonation wave in 1-dimension (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a verification test of the implementation of the SURF and SURFplus models in the xRage code. Since the SURF rate is a function of the lead shock pressure, the question arises as to the effect on accuracy of variations in the detected shock pressure due to the alignment of the shock front with the mesh. To study the effect of mesh alignment we simulate a cylindrically diverging detonation wave using a planar 2-D mesh. The leading issue is the magnitude of azimuthal asymmetries in the numerical solution. The 2-D test case does not have an exact analytic solution. To quantify the accuracy, the 2-D solution along rays through the origin are compared to a highly resolved 1-D simulation in cylindrical geometry.

  20. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range of the cMUT cell's response can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed here for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under a uniform excitation. An extension of this model and the corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.
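
    The mode-superposition idea underlying the circuit model can be sketched generically: each eigenmode contributes a second-order resonant term, and the wide-band response is their sum. The parameter values below are illustrative placeholders, not the paper's:

```python
def modal_response(omega, modes):
    """Frequency response by mode superposition: each mode with
    natural frequency w_k, modal participation phi_k, and damping
    ratio zeta_k contributes one second-order resonant term."""
    return sum(
        phi**2 / (wk**2 - omega**2 + 2j * zeta * wk * omega)
        for wk, phi, zeta in modes
    )

# Fundamental mode plus one harmonic mode of a hypothetical cMUT cell
# (angular frequencies in rad/s, lightly damped)
modes = [(2e6, 1.0, 0.02), (7e6, 0.3, 0.02)]
resp = modal_response(1.9e6, modes)  # evaluated near the fundamental
```

Truncating the sum to the fundamental mode alone recovers the usual single-resonance equivalent circuit; the extra terms are what give the wide-band behavior.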

  1. Numerical verification of similar Cam-clay model based on generalized potential theory

    Institute of Scientific and Technical Information of China (English)

    钟志辉; 杨光华; 傅旭东; 温勇; 张玉成

    2014-01-01

    From mathematical principles, the generalized potential theory can be employed to create constitutive models of geomaterials directly. The similar Cam-clay model, which is created based on the generalized potential theory, has fewer assumptions, a clearer mathematical basis, and better computational accuracy. Theoretically, it is more scientific than the traditional Cam-clay models. The particle flow code PFC3D was used to perform numerical tests to verify the rationality and practicality of the similar Cam-clay model. The verification process was as follows: 1) creating the soil sample for the numerical test in PFC3D, and then simulating the conventional triaxial compression test, isotropic compression test, and isotropic unloading test in PFC3D; 2) determining the parameters of the similar Cam-clay model from the results of the above tests; 3) predicting the sample's behavior in triaxial tests under different stress paths with the similar Cam-clay model, and comparing the predictions with those of the Cam-clay model and the modified Cam-clay model. The analysis results show that the similar Cam-clay model has relatively high prediction accuracy, as well as good practical value.
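
    For reference, the modified Cam-clay model used as a comparison point has the classical elliptical yield surface f = q²/M² + p′(p′ − p′c); a minimal check (the stress values are illustrative):

```python
def mcc_yield(p, q, M, pc):
    """Modified Cam-clay yield function in (p', q) space:
    f < 0 inside the ellipse (elastic), f = 0 on the yield surface.
    p  : mean effective stress p'
    q  : deviatoric stress
    M  : slope of the critical state line
    pc : preconsolidation pressure p'_c"""
    return q**2 / M**2 + p * (p - pc)

# At the apex of the ellipse (p' = pc/2), yield occurs at q = M*pc/2:
f = mcc_yield(50.0, 1.2 * 50.0, 1.2, 100.0)
```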

  2. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    Science.gov (United States)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited to cases where surface wave heights are significant compared to the mean water depth, such as estuaries and coastal regions. The latter is suited to small surface wave heights compared to depth, because surface elevation is removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  3. Hybrid Enrichment Verification Array: Module Characterization Studies Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mace, Emily K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-01

    The work presented in this report is focused on the characterization and refinement of the HEVA approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. In comparison to previous versions, the new design boosts the high energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  4. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS, Public Services and Procurement Canada (PSPC, using a Leica C10 and P40 (exterior and large interior spaces and a Faro Focus (small to mid-sized interior spaces. Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  5. Verification of a laboratory-based dilation model for in situ conditions using continuum models

    Institute of Scientific and Technical Information of China (English)

    G. Walton; M.S. Diederichs; L.R. Alejano; J. Arzúa

    2014-01-01

    With respect to constitutive models for continuum modeling applications, the post-yield domain remains the area of greatest uncertainty. Recent studies based on laboratory testing have led to the development of a number of models for brittle rock dilation, which account for both the plastic shear strain and confining stress dependencies of this phenomenon. Although these models are useful in providing an improved understanding of how dilatancy evolves during a compression test, there has been relatively little work performed examining their validity for modeling brittle rock yield in situ. In this study, different constitutive models for rock dilation are reviewed and then tested, in the context of a number of case studies, using a continuum finite-difference approach (FLAC). The uncertainty associated with the modeling of brittle fracture localization is addressed, and the overall ability of mobilized dilation models to replicate in situ deformation measurements and yield patterns is evaluated.
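
    The two dependencies that the reviewed dilation models capture can be illustrated schematically: the mobilized dilation angle decays with accumulated plastic shear strain, and its peak decreases with confining stress. The functional form and constants below are a generic illustration, not any specific model from the study:

```python
import math

def dilation_angle(gamma_p, sigma3, psi0=10.0, sigma_ref=5.0, gamma_star=0.01):
    """Illustrative mobilized dilation angle (degrees): a peak value
    psi0 reduced by confining stress sigma3 (MPa) and decaying
    exponentially with plastic shear strain gamma_p."""
    peak = psi0 / (1.0 + sigma3 / sigma_ref)       # confinement dependence
    return peak * math.exp(-gamma_p / gamma_star)  # plastic-strain decay

# Dilation is largest at low confinement and low plastic strain:
print(dilation_angle(0.0, 0.0))  # 10.0
```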

  6. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  7. Community Radiative Transfer Model for Inter-Satellites Calibration and Verification

    Science.gov (United States)

    Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.

    2014-12-01

    Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1], operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring long-term trending, and satellite retrieved products [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements in a time series and angular dependency. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is a very useful tool for cross-sensor verifications. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between measurements of the two instruments. The CRTM is particularly useful to reduce the difference between instruments for climate studies [4]. In this study, we will carry out the assessment of the Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and data for Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive bands. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high quality radiosondes were launched when Suomi NPP flew over NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data includes air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS [2] Liu, Q
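
    The double-difference method mentioned above subtracts each instrument's observation-minus-simulation bias; whatever the radiative-transfer simulations explain (spectral response, viewing geometry) cancels, leaving the inter-instrument offset. A schematic sketch with hypothetical brightness temperatures:

```python
def double_difference(obs_a, sim_a, obs_b, sim_b):
    """(Obs_A - Sim_A) - (Obs_B - Sim_B), channel pair by channel
    pair.  Differences the simulations account for cancel, exposing
    the relative calibration offset between the two instruments."""
    return [(oa - sa) - (ob - sb)
            for oa, sa, ob, sb in zip(obs_a, sim_a, obs_b, sim_b)]

# Hypothetical brightness temperatures (K): instrument A carries a
# +0.3 K calibration bias; the simulations match the true scene.
truth = [250.0, 260.0]
dd = double_difference([t + 0.3 for t in truth], truth, truth, truth)
```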

  8. Verification of communication protocols in web services model-checking service compositions

    CERN Document Server

    Tari, Zahir; Mukherjee, Anshuman

    2014-01-01

    Verification of Communication Protocols in Web Services: Model-Checking Service Compositions gathers recent advancements in the field of self-organizing wireless sensor networks and provides readers with essential, state-of-the-art information about sensor networking. In the near future, wireless sensor networks will become an integral part of our day-to-day life; to solve the sensor-networking issues this raises, researchers have put a great deal of effort into coming up with innovative ideas. The book introduces current technological trends, particularly in node organization, and provides implementation details for each networking type to help readers set up sensor networks in their related job fields. In addition, it identifies the limitations of current technologies, as well as future research directions.

  9. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  10. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Abstract Background: A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods: A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area under the curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results: In a single simulated data set, varying the number of false negatives from 0 to 4 led to verification-bias-corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th–97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion: Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
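
    The instability can be reproduced with a small Monte Carlo sketch of a Begg-Greenes-type inverse-probability correction (an illustration, not the authors' code): all test-positives receive the gold standard, only a fraction of test-negatives do, and the corrected sensitivity then hinges on a handful of verified false negatives:

```python
import random

def corrected_sensitivity(n, prev, se, sp, f_neg, seed):
    """Simulate a screening study with partial verification of
    test-negatives and return the corrected sensitivity, weighting
    each verified false negative by 1/f_neg (inverse probability)."""
    rng = random.Random(seed)
    tp = fn = 0
    for _ in range(n):
        diseased = rng.random() < prev
        positive = rng.random() < (se if diseased else 1.0 - sp)
        if positive and diseased:
            tp += 1          # test-positives are always verified
        elif not positive and diseased and rng.random() < f_neg:
            fn += 1          # only a fraction of negatives verified
    return tp / (tp + fn / f_neg)

# With se = 0.99 there are almost no false negatives, so the
# corrected estimate swings noticeably from one sample to the next.
ests = [corrected_sensitivity(20000, 0.05, 0.99, 0.9, 0.05, s)
        for s in range(5)]
```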

  11. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    Directory of Open Access Journals (Sweden)

    Zhukov Ilya S.

    2016-01-01

    Full Text Available On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, verification of the earlier developed physical and mathematical model of the processes in a hybrid solid-propellant rocket engine under a quasi-steady-state flow regime was performed. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  12. Verification of Model of Calculation of Intra-Chamber Parameters In Hybrid Solid-Propellant Rocket Engines

    OpenAIRE

    Zhukov Ilya S.; Borisov Boris V.; Bondarchuk Sergey S.; Zhukov Alexander S.

    2016-01-01

On the basis of an analytical estimate of the characteristics of a hybrid solid-propellant rocket engine, the earlier developed physical and mathematical model of processes in such an engine was verified for the quasi-steady-state flow regime. Comparative analysis of the calculated and analytical data indicated satisfactory agreement of the simulation results.

  13. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Printer materials were selected to correspond to mouse tissue. To represent lung, the selected material was used together with an air layer. To verify material equivalence, photon attenuation characteristics were compared with those of a Super-Flex bolus. In the case of lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to mature, 3D-printer-based small-animal phantoms should increase the reliability of absorbed-dose verification in small animals for preclinical studies.

  14. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    Science.gov (United States)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is a part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, construction and verification of airport simulation model using Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  15. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
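The swarm idea can be illustrated with a toy sketch (not the paper's implementation): many small, independently seeded, resource-bounded searches over the same state space, each shuffling successor order differently so that the members diversify their coverage instead of duplicating one another's work.

```python
import random

def successors(state):
    # Toy transition system: a state is an integer with two successor moves.
    return [(3 * state + 1) % 10007, (state + 42) % 10007]

def random_dfs(seed, start, target, max_steps=2000):
    """One swarm member: a depth-first search that shuffles successor
    order with its own seed, so each member explores a different
    region of the state space within its step budget."""
    rng = random.Random(seed)
    stack, seen = [start], {start}
    steps = 0
    while stack and steps < max_steps:
        s = stack.pop()
        steps += 1
        if s == target:
            return True, len(seen)
        succ = successors(s)
        rng.shuffle(succ)
        for n in succ:
            if n not in seen:
                seen.add(n)
                stack.append(n)
    return False, len(seen)

# A "swarm" of independent, differently seeded searches; in a real setting
# these would run in parallel on separate machines.
results = [random_dfs(seed, start=1, target=4242) for seed in range(8)]
hits = sum(1 for found, _ in results if found)
print(f"{hits}/8 swarm members reached the target state within their budget")
```

Each member uses only a bounded amount of memory and time, so the swarm trades exhaustive coverage for cheap, embarrassingly parallel partial searches.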

  16. Modelling Chinese Smart Grid: A Stochastic Model Checking Case Study

    CERN Document Server

    Yüksel, Ender; Nielson, Flemming; Zhu, Huibiao; Huang, Heqing

    2012-01-01

Cyber-physical systems integrate information and communication technology functions into the physical elements of a system for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. In this context, an important issue is the verification of certain quantitative properties of the system. In this technical report, we consider a specific Chinese Smart Grid implementation and address the verification problem for certain quantitative properties, including performance and battery consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  17. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    CERN Document Server

    Andova, Suzana; Engelen, Luc; 10.4204/EPTCS.56.5

    2011-01-01

    A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre or post processing.

  18. Prototyping the Semantics of a DSL using ASF+SDF: Link to Formal Verification of DSL Models

    Directory of Open Access Journals (Sweden)

    Suzana Andova

    2011-06-01

A formal definition of the semantics of a domain-specific language (DSL) is a key prerequisite for the verification of the correctness of models specified using such a DSL and of transformations applied to these models. For this reason, we implemented a prototype of the semantics of a DSL for the specification of systems consisting of concurrent, communicating objects. Using this prototype, models specified in the DSL can be transformed to labeled transition systems (LTS). This approach of transforming models to LTSs allows us to apply existing tools for visualization and verification to models with little or no further effort. The prototype is implemented using the ASF+SDF Meta-Environment, an IDE for the algebraic specification language ASF+SDF, which offers efficient execution of the transformation as well as the ability to read models and produce LTSs without any additional pre- or post-processing.

  19. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    Science.gov (United States)

    Amna, S.; Samreen, N.; Khalid, B.; Shamim, A.

    2013-06-01

Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events with significant impacts on humans, including all socioeconomic activities and health issues, which change according to the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) to model seasonal weather hind-casts of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data comprised statistical temperature records of 32 years for the months of June, July and August. This study was based on EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts using the standard probabilistic measures of Brier Score, Brier Skill Score, Cross Validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. Other models gave significant results by omitting particular initial conditions.
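Two of the verification measures named in this record are straightforward to compute. The sketch below uses invented heat-wave outcomes and ensemble probabilities purely for illustration: the Brier score is the mean squared difference between forecast probability and the 0/1 outcome, and the Brier skill score compares it against a climatological reference forecast.

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and outcome (0/1)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to always forecasting the climatological event frequency.
    Positive values mean the forecast beats climatology."""
    clim = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([clim] * len(outcomes), outcomes)
    bs = brier_score(probs, outcomes)
    return 1.0 - bs / bs_ref

# Hypothetical heat-wave occurrences (1 = event) and ensemble probabilities.
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
probs    = [0.9, 0.2, 0.7, 0.8, 0.1, 0.3, 0.6, 0.2]

print(f"Brier score:       {brier_score(probs, outcomes):.3f}")
print(f"Brier skill score: {brier_skill_score(probs, outcomes):.3f}")
```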

  20. Modeling the Magnetospheric X-ray Emission from Solar Wind Charge Exchange with Verification from XMM-Newton Observations

    Science.gov (United States)

    2016-08-26

Department of Physics and Astronomy, University of Leicester, Leicester, UK; Finnish Meteorological Institute, Helsinki, Finland. Abstract (fragmentary): An MHD-based model of terrestrial magnetospheric X-ray emission from solar wind charge exchange is verified against XMM-Newton observations; a comparison of newly calculated model count rates confirms that the analysis should continue with the new simulations.

  1. Case study of verification, validation, and testing in the Automated Data Processing (ADP) system development life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, C.A.

    1990-05-01

Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by organizational participants in the Department of Veterans Affairs (VA) that conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, which were then evaluated against VV&T standards.

  2. Decoloration of Amaranth by the white-rot fungus Trametes versicolor. Part II. Verification study.

    Science.gov (United States)

    Gavril, Mihaela; Hodson, Peter V

    2007-02-01

The involvement of lignin peroxidase (LiP) in the decoloration of the mono-azo-substituted naphthalenic dye Amaranth was investigated with pure enzymes and whole cultures of Trametes versicolor. The verification study confirmed that LiP has a direct influence on the initial decoloration rate and showed that another enzyme, which does not need hydrogen peroxide to function and is not a laccase, also plays a role during decoloration. These results confirm those of a previous statistical analysis. Furthermore, the fungal mycelium affects the performance of the decoloration process.

  3. Verification of forward kinematics of the numerical and analytical model of Fanuc AM100iB robot

    Science.gov (United States)

    Cholewa, A.; Świder, J.; Zbilski, A.

    2016-08-01

The article presents the verification of the forward kinematics of the Fanuc AM100iB robot. The developed kinematic model of the machine was verified through tests on an actual robot. The tests consisted of positioning the robot, operating in the mode of controlling the values of natural angles, at selected points of its workspace, and reading the coordinate values of the TCP point in the robot's global coordinate system on the operator panel. Validation of the model consisted of entering the same values of natural angles that were used for positioning the robot into the inputs of the machine's CAE model, calculating the coordinate values of its TCP, and then comparing the results obtained with the values read. These results are an introduction to the partial verification of the dynamic model of the analysed device.

  4. Real-time Performance Verification of Core Protection and Monitoring System with Integrated Model for SMART Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Bon-Seung; Kim, Sung-Jin; Hwang, Dae-Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

In keeping with these purposes, a real-time model of the digital core protection and monitoring systems for simulator implementation was developed on the basis of the SCOPS and SCOMS algorithms. Important features of the software models are explained for application to the SMART simulator, and the real-time performance of the models linked via DLL was examined for various simulation scenarios. In this paper, a real-time performance verification of the core protection and monitoring software for the SMART simulator was performed with the integrated simulator model. Various DLL connection tests were done for software algorithm changes. In addition, typical accident scenarios of SMART were simulated with 3KEYMASTER, and the simulated results were compared with those of the DLL-linked core protection and monitoring software. Each calculational result showed good agreement.

  5. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    Science.gov (United States)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

The rigid-lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest and grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data for both summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  6. Verification and optimization of a PLC control schedule

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika; Fehnker, Ansgar

    2002-01-01

    We report on the use of model checking techniques for both the verification of a process control program and the derivation of optimal control schedules. Most of this work has been carried out as part of a case study for the EU VHS project (Verification of Hybrid Systems), in which the program for a

  7. Manufactured solutions and the verification of three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2013-01-01

The manufactured solution technique is used for the verification of computational models in many fields. In this paper, we construct manufactured solutions for the three-dimensional, isothermal, nonlinear Stokes model for flows in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet, basal sliding parameters, and ice softness. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests. The upper surface is altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Simulation results from the computational model show good convergence to the manufactured analytic solution.
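The manufactured-solution workflow (pick an analytic solution, derive the forcing it implies, then check the solver's convergence rate against it) can be illustrated on a far simpler problem than Stokes flow. The 1-D Poisson sketch below is only an analogy to the procedure this record describes, not the paper's model: we manufacture u(x) = sin(πx), which forces f(x) = π²sin(πx) for -u'' = f, solve with second-order finite differences, and confirm the error falls by a factor of about four when the grid is halved.

```python
import math

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with a second-order
    finite-difference scheme and the Thomas tridiagonal algorithm,
    returning the max error against the manufactured solution sin(pi x)."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]  # manufactured forcing
    # Tridiagonal system for interior unknowns u_1..u_{n-1}:
    # (-u_{i-1} + 2 u_i - u_{i+1}) / h^2 = f_i
    a = [-1.0] * (n - 1)              # sub-diagonal
    b = [2.0] * (n - 1)               # diagonal
    c = [-1.0] * (n - 1)              # super-diagonal
    d = [h * h * f[i] for i in range(1, n)]
    for i in range(1, n - 1):         # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):    # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(ui - math.sin(math.pi * x[i + 1])) for i, ui in enumerate(u))

e1, e2 = solve_poisson(32), solve_poisson(64)
print(f"max error n=32: {e1:.2e}, n=64: {e2:.2e}, ratio: {e1 / e2:.2f}")
# A ratio near 4 confirms second-order convergence, the verification criterion.
```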

  8. Verification of a 1-dimensional model for predicting shallow infiltration at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Hevesi, J.A.; Flint, A.L. [Geological Survey, Mercury, NV (United States); Flint, L.E. [Foothill Engineering Consultants, Mercury, Nevada (United States)

    1994-12-31

A characterization of net infiltration rates is needed for site-scale evaluation of groundwater flow at Yucca Mountain, Nevada. Shallow infiltration caused by precipitation may be a potential source of net infiltration. A 1-dimensional finite difference model of shallow infiltration with a moisture-dependent evapotranspiration function and a hypothetical root-zone was calibrated and verified using measured water content profiles, measured precipitation, and estimated potential evapotranspiration. Monthly water content profiles obtained from January 1990 through October 1993 were measured by geophysical logging of 3 boreholes located in the alluvium channel of Pagany Wash on Yucca Mountain. The profiles indicated seasonal wetting and drying of the alluvium in response to winter-season precipitation and summer-season evapotranspiration above a depth of 2.5 meters. A gradual drying trend below a depth of 2.5 meters was interpreted as long-term redistribution and/or evapotranspiration following a deep infiltration event caused by runoff in Pagany Wash during 1984. An initial model, calibrated using the 1990 to 1992 record, did not provide a satisfactory prediction of water content profiles measured in 1993 following a relatively wet winter season. A re-calibrated model using a modified, seasonally-dependent evapotranspiration function provided an improved fit to the total record. The new model provided a satisfactory verification using water content changes measured at a distance of 6 meters from the calibration site, but was less satisfactory in predicting changes at a distance of 18 meters.

  9. Verification of a 1-dimensional model for predicting shallow infiltration at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Hevesi, J.; Flint, A.L. [Geological Survey, Mercury, NV (United States); Flint, L.E. [Foothill Eng. Consultants, Mercury, NV (United States)

    1994-12-31

    A characterization of net infiltration rates is needed for site-scale evaluation of groundwater flow at Yucca Mountain, Nevada. Shallow infiltration caused by precipitation may be a potential source of net infiltration. A 1-dimensional finite difference model of shallow infiltration with a moisture-dependent evapotranspiration function and a hypothetical root-zone was calibrated and verified using measured water content profiles, measured precipitation, and estimated potential evapotranspiration. Monthly water content profiles obtained from January 1990 through October 1993 were measured by geophysical logging of 3 boreholes located in the alluvium channel of Pagany Wash on Yucca Mountain. The profiles indicated seasonal wetting and drying of the alluvium in response to winter season precipitation and summer season evapotranspiration above a depth of 2.5 meters. A gradual drying trend below a depth of 2.5 meters was interpreted as long-term redistribution and/or evapotranspiration following a deep infiltration event caused by runoff in Pagany Wash during 1984. An initial model, calibrated using the 1990 to 1992 record, did not provide a satisfactory prediction of water content profiles measured in 1993 following a relatively wet winter season. A re-calibrated model using a modified, seasonally-dependent evapotranspiration function provided an improved fit to the total record. The new model provided a satisfactory verification using water content changes measured at a distance of 6 meters from the calibration site, but was less satisfactory in predicting changes at a distance of 18 meters.

  10. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, to ensure that end user requests and requirements were considered during all stages of model development and implementation, and to lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting the VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of

  11. Fiction and reality in the modelling world - Balance between simplicity and complexity, calibration and identifiability, verification and falsification

    DEFF Research Database (Denmark)

    Harremoës, P.; Madsen, H.

    1999-01-01

Where is the balance between simplicity and complexity in model prediction of urban drainage structures? The calibration/verification approach to testing of model performance gives an exaggerated sense of certainty. Frequently, the model structure and the parameters are not identifiable by calibration/verification on the basis of the data series available, which generates elements of sheer guessing - unless the universality of the model is based on induction, i.e. experience from the sum of all previous investigations. There is a need to deal more explicitly with uncertainty and to incorporate that in the design, operation and control of urban drainage structures. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved.

  12. Model based correction of placement error in EBL and its verification

    Science.gov (United States)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

In maskmaking, the main source of error contributing to placement error is charging. DISPLACE software corrects the placement error for any layout based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model. A test layout on a single calibration mask was used for calibration. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, a sophisticated layout that was very different from the calibration mask was used for the verification. The placement correction results were predicted by DISPLACE. A good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  13. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
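Operator splitting as described in this record (solve transport, then reaction, sequentially within each time step) can be sketched in one dimension. The scheme and parameters below are illustrative, not ParFlow.RT's: explicit upwind advection followed by an exactly integrated first-order decay reaction, with total mass checked against the analytic decay factor.

```python
import math

def advect_react(c, u, k, dx, dt, steps):
    """Operator-split time stepping: explicit first-order upwind advection
    (u > 0, zero inflow at the left boundary), then an exact solve of the
    decay reaction dc/dt = -k c, applied sequentially each step."""
    decay = math.exp(-k * dt)
    for _ in range(steps):
        # Transport sub-step.
        cn = c[:]
        for i in range(1, len(c)):
            cn[i] = c[i] - u * dt / dx * (c[i] - c[i - 1])
        cn[0] = 0.0
        # Reaction sub-step, integrated exactly over dt.
        c = [ci * decay for ci in cn]
    return c

# Hypothetical setup: a concentration pulse advecting and decaying.
nx, dx, u, k, dt = 100, 1.0, 0.5, 0.01, 1.0   # CFL = u*dt/dx = 0.5
c0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(nx)]
c = advect_react(c0, u, k, dx, dt, steps=50)
print(f"mass remaining fraction: {sum(c) / sum(c0):.3f}")  # exp(-k*dt*steps)
```

Because the pulse never reaches the outflow boundary, upwind advection conserves mass exactly here, so any deviation of the remaining-mass fraction from exp(-0.5) would expose a bug in the split scheme, a cheap verification check of the kind the abstract mentions.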

  14. Modelling horizontal steam generator with ATHLET. Verification of different nodalization schemes and implementation of verified constitutive equations

    Energy Technology Data Exchange (ETDEWEB)

    Beliaev, J.; Trunov, N.; Tschekin, I. [OKB Gidropress (Russian Federation); Luther, W. [GRS Garching (Germany); Spolitak, S. [RNC-KI (Russian Federation)

    1995-12-31

Currently, the ATHLET code is widely applied for modelling several power plants of WWER type with horizontal steam generators. A main drawback of all these applications is the insufficient verification of the models for the steam generator. This paper presents nodalization schemes for the secondary side of the steam generator, the results of stationary calculations, and preliminary comparisons with experimental data. The consideration of circulation in the water inventory of the secondary side proves to be necessary. (orig.). 3 refs.

  15. Performance and Probabilistic Verification of Regional Parameter Estimates for Conceptual Rainfall-runoff Models

    Science.gov (United States)

    Franz, K.; Hogue, T.; Barco, J.

    2007-12-01

Identification of appropriate parameter sets for simulation of streamflow in ungauged basins has become a significant challenge for both operational and research hydrologists. This is especially difficult in the case of conceptual models, whose parameters typically must be "calibrated" or adjusted to match streamflow conditions in specific systems (i.e., some of the parameters are not directly observable). This paper addresses the performance and uncertainty associated with transferring conceptual rainfall-runoff model parameters between basins within large-scale ecoregions. We use the National Weather Service's (NWS) operational hydrologic model, the SACramento Soil Moisture Accounting (SAC-SMA) model. A Multi-Step Automatic Calibration Scheme (MACS), using Shuffled Complex Evolution (SCE), is used to optimize SAC-SMA parameters for a group of watersheds with extensive hydrologic records from the Model Parameter Estimation Experiment (MOPEX) database. We then explore "hydroclimatic" relationships between basins to facilitate regionalization of parameters for an established ecoregion in the southeastern United States. The impact of regionalized parameters is evaluated via standard model performance statistics as well as through generation of hindcasts and probabilistic verification procedures to evaluate streamflow forecast skill. Preliminary results show climatology ("climate neighbor") to be a better indicator of transferability than physical similarity or proximity ("nearest neighbor"). The mean and median of all the parameters within the ecoregion are the poorest choice for the ungauged basin. The choice of regionalized parameter set affected the skill of the ensemble streamflow hindcasts; however, all parameter sets showed little skill in forecasts after five weeks (i.e., climatology is as good an indicator of future streamflows). In addition, the optimum parameter set changed seasonally, with the "nearest neighbor" showing the highest skill in the
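The "climate neighbor" regionalization idea mentioned in this record (transfer parameters from the gauged basin most similar in climate, rather than the geographically nearest one) reduces to a nearest-neighbor search in a climate-attribute space. The attributes, basin names, and values below are invented for illustration.

```python
def climate_neighbor(target, gauged):
    """Pick the donor basin whose normalized climate attributes
    (e.g. mean precipitation, aridity, temperature) are closest to
    those of the ungauged target basin."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(gauged, key=lambda basin: dist(basin[1], target))

# Hypothetical normalized climate attributes: (precip, aridity, temperature).
gauged = [("basin_A", (0.80, 0.30, 0.60)),
          ("basin_B", (0.40, 0.70, 0.60)),
          ("basin_C", (0.75, 0.35, 0.45))]

# The ungauged basin inherits calibrated parameters from its climate neighbor.
donor = climate_neighbor((0.78, 0.32, 0.48), gauged)
print(f"transfer parameters from: {donor[0]}")
```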

  16. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that calculation results in COMSOL Multiphysics obtained using the "Finer" grid (the ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm-1) corresponded best, both qualitatively and quantitatively, to practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of the load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in load thickness within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal fluctuations and the electrical admittance from 26.58 to 26.35 kHz and from 0.86 to 0.44 mS, respectively.

  17. Rheological-dynamical continuum damage model for concrete under uniaxial compression and its experimental verification

    Directory of Open Access Journals (Sweden)

    Milašinović Dragan D.

    2015-01-01

A new analytical model for the prediction of concrete response under uniaxial compression and its experimental verification are presented in this paper. The proposed approach, referred to as the rheological-dynamical continuum damage model, combines rheological-dynamical analogy and damage mechanics. Within the framework of this approach, the key continuum parameters such as the creep coefficient, Poisson's ratio and damage variable are functionally related. The critical values of the creep coefficient and damage variable under peak stress are used to describe the failure mode of the concrete cylinder. The ultimate strain is determined in the post-peak regime only, using the secant stress-strain relation from damage mechanics. The post-peak branch is used for the energy analysis. Experimental data for five concrete compositions were obtained during the examination presented herein. The principal difference between compressive failure and tensile fracture is that there is a residual stress in the specimens, which is a consequence of uniformly accelerated motion of load during the examination of compressive strength. The critical interpenetration displacements and crushing energy are obtained theoretically based on the concept of global failure analysis. [Projects of the Ministry of Science of the Republic of Serbia: No. ON 174027, Computational Mechanics in Structural Engineering, and No. TR 36017, Utilization of by-products and recycled waste materials in concrete composites for sustainable construction development in Serbia: Investigation and environmental assessment of possible applications]

  18. Verification test of the SURF and SURFplus models in xRage

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-18

    As a verification test of the SURF and SURFplus models in the xRage code we use a propagating underdriven detonation wave in 1-D. This is about the only test case for which an accurate solution can be determined based on the theoretical structure of the solution. The solution consists of a steady ZND reaction zone profile joined with a scale-invariant rarefaction or Taylor wave and followed by a constant state. The end of the reaction profile and the head of the rarefaction coincide with the sonic CJ state of the detonation wave. The constant state is required to match a rigid-wall boundary condition. For a test case, we use PBX 9502 with the same EOS and burn rate as previously used to test the shock detector algorithm utilized by the SURF model. The detonation wave is propagated for 10 μs (slightly under 80 mm). As expected, the pointwise errors are largest in the neighborhood of discontinuities: the pressure discontinuity at the lead shock front and the pressure-derivative discontinuities at the head and tail of the rarefaction. As a quantitative measure of the overall accuracy, the L2 norm of the difference between the numerical pressure and the exact solution is used. Results are presented for simulations using both a uniform grid and an adaptive grid that refines the reaction zone.
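
    The quantitative accuracy measure used above, the discrete L2 norm of the difference between the numerical and the exact solution, can be sketched as follows (a minimal illustration on a uniform 1-D grid; the grid, solution, and error values are hypothetical, not from the report):

```python
import numpy as np

def l2_error(p_num, p_exact, dx):
    """Discrete L2 norm of the pointwise difference between a numerical
    solution and the exact solution on a uniform 1-D grid of spacing dx."""
    return np.sqrt(np.sum((p_num - p_exact) ** 2) * dx)

# Toy check: a solution that is off by a constant epsilon everywhere on a
# unit-length domain has an L2 error of roughly epsilon.
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
p_exact = np.sin(np.pi * x)
p_num = p_exact + 1e-3
err = l2_error(p_num, p_exact, dx)
```

On an adaptive grid the same sum would be taken with the local cell width in place of the constant dx.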

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    Science.gov (United States)

    The U.S. Environmental Protection Agency, Through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  20. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Full Text Available Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset to generate AAM images; we then compute the gradient orientation representation on a hierarchical model, which is the GOP appearance. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public-domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.
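
    The gradient-orientation-pyramid idea, orientations computed at several image scales, can be roughly sketched as below (a hypothetical illustration using naive 2x subsampling; the published GOP construction uses its own smoothing and feature comparison):

```python
import numpy as np

def gradient_orientations(img):
    """Per-pixel gradient orientation (radians) of a grayscale image."""
    gy, gx = np.gradient(img.astype(float))
    return np.arctan2(gy, gx)

def orientation_pyramid(img, levels=3):
    """Orientations at successively downsampled copies of the image.
    Naive 2x subsampling is used here purely for illustration."""
    pyramid, cur = [], img.astype(float)
    for _ in range(levels):
        pyramid.append(gradient_orientations(cur))
        cur = cur[::2, ::2]
    return pyramid

img = np.outer(np.arange(16.0), np.ones(16))  # vertical intensity ramp
pyr = orientation_pyramid(img, levels=3)
```

A verification pipeline in this spirit would compare per-level orientation maps of two face images and feed the similarity features to an SVM.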

  1. Verification of three-dimensional neutron kinetics model of TRAP-KS code regarding reactivity variations

    Energy Technology Data Exchange (ETDEWEB)

    Uvakin, Maxim A.; Alekhin, Grigory V.; Bykov, Mikhail A.; Zaitsev, Sergei I. [EDO ' GIDROPRESS' , Moscow Region, Podolsk (Russian Federation)

    2016-09-15

    This work deals with TRAP-KS code verification. TRAP-KS is used for coupled neutron-kinetics and thermo-hydraulic calculations of VVER reactors. The three-dimensional neutron kinetics model enables consideration of space effects, which are produced by variations of the energy field and feedback parameters. This feature has to be investigated especially for asymmetrical variations of core multiplying properties, power fluctuations and strong local perturbation insertion. The presented work consists of three test definitions. First, an asymmetrical control rod (CR) ejection during power operation is defined. This process leads to fast reactivity insertion with a short-time power spike. As a second task, xenon oscillations are considered. Here, a small negative reactivity insertion leads to a power decrease and induces space oscillations of the xenon concentration. In the late phase, these oscillations are suppressed by external actions. As the last test, an international code comparison for a hypothetical main steam line break (V1000CT-2, task 2) was performed. This scenario is interesting for asymmetrical positive reactivity insertion caused by decreasing coolant temperature in the affected loop.

  2. Kinematic Modelling and Simulation of a 2-R Robot Using SolidWorks and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Mahmoud Gouasmi

    2012-12-01

    Full Text Available The simulation of robot systems is becoming very popular, especially with the lowering of the cost of computers, and it can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. The trajectory planning of redundant manipulators is a very active area, since many tasks require special characteristics to be satisfied. The importance of redundant manipulators has increased over the last two decades because of the possibility of avoiding singularities as well as obstacles within the course of motion. The angle that the last link of a 2-DOF manipulator makes with the x-axis is required in order to find the solution for the inverse kinematics problem. This angle could be optimized with respect to a given specified key factor (time, velocity, torques) while the end-effector performs a chosen trajectory (i.e., avoiding an obstacle) in the task space. Modeling and simulation of robots can be achieved using any of the following models: the geometrical model (positions, postures), the kinematic model and the dynamic model. To do so, the modelization of a 2-R robot type is implemented. Our main tasks are comparing two robot postures with the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters. SolidWorks and MATLAB/Simulink software are used to check the theory and the robot motion simulation. This could be easily generalized to a 3-R robot and possibly therefore to any serial robot (SCARA, PUMA, etc.). The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the validity of the chosen model and obtain the right conclusions. The results of the simulations are discussed, and an agreement between the two software packages is certainly obtained.
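
    The forward and inverse kinematics of such a planar 2-R arm have a well-known closed form; a minimal sketch follows (link lengths and the target point are illustrative, and the elbow argument selects between the two postures the paper compares):

```python
import numpy as np

L1, L2 = 1.0, 0.8  # link lengths (illustrative values)

def forward(theta1, theta2):
    """End-effector position of a planar 2-R arm."""
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return x, y

def inverse(x, y, elbow=+1):
    """Closed-form inverse kinematics; elbow=+1/-1 picks the posture."""
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    theta2 = elbow * np.arccos(np.clip(c2, -1.0, 1.0))
    theta1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(theta2),
                                           L1 + L2 * np.cos(theta2))
    return theta1, theta2

# Round trip: IK of a reachable point reproduces that point under FK.
t1, t2 = inverse(1.2, 0.5)
x, y = forward(t1, t2)
```

The sign of `elbow` corresponds to the elbow-up versus elbow-down posture; comparing the two along the same path is exactly the kind of study the abstract describes.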

  3. A study on the experimental verification for the pipe whip problem in a Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeong Shin; Choi, Myeong Hwan; Kim, Yeong Wan; Hyeon, Joong Sup; Han, Jae Do; Kang, Yun Gee [Chungnam National Univ., Taejon (Korea, Republic of)

    1993-12-15

    The purpose of this study is to investigate the experimental verification analysis for pipe whip problems and to obtain quantitative evaluation technologies for the design of pipe whip restraints. These will contribute to the advance of nuclear regulatory technologies and enhance nuclear power plant safety. This study presents the experimental and transient analytical results of pipe whip tests using 4' and 6' diameter pipes and U-shaped restraints. In the tests, the effects of the overhang length, clearance and impact height on the pipe whip behavior of the pipe-restraint system were investigated. The transient impact analysis of the pipe-restraint system was conducted with the finite element program ABAQUS. The applicability of the ABAQUS program to pipe whip analysis is made clear through this analysis.

  4. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    Science.gov (United States)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  5. Vacuum-assisted resin transfer molding (VARTM) model development, verification, and process analysis

    Science.gov (United States)

    Sayre, Jay Randall

    2000-12-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process

  6. Kinematic Modeling and Simulation of a 2-R Robot by Using Solid Works and Verification by MATLAB/Simulink

    Directory of Open Access Journals (Sweden)

    Fernini Brahim

    2012-05-01

    Full Text Available Simulation of robot systems, which is getting very popular especially with the lowering cost of computers, can be used for layout evaluation, feasibility studies, presentations with animation and off-line programming. Object staging modelization using robots involves, whether for the object or the robot, the following models: the geometric one, the kinematic one and the dynamic one. To do so, the modelization of a 2-R robot type is implemented. Comparing two robot postures with the same trajectory (path) and for the same length of time, and establishing a computing code to obtain the kinematic and dynamic parameters, are the main tasks. SolidWorks and MATLAB/Simulink software are used to check the theory and the robot motion simulation. The verification of the obtained results by both software packages allows us to qualitatively evaluate and underline the rightness of the chosen model and to draw the right conclusions. The results of the simulations are discussed, and an agreement between the two software packages is certainly obtained.

  7. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of the high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION(registered trademark) and Online Dynamics' Autolev(trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and specifically to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while yielding a closed-form solution easily employed using COTS control hardware.

  8. A study of applications scribe frame data verifications using design rule check

    Science.gov (United States)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer-owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a great deal of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We would like to show the scheme of the scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  9. Feature-Aware Verification

    CERN Document Server

    Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk

    2011-01-01

    A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions, situations in which the combination of features leads to emergent and possibly critical behavior, are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    Science.gov (United States)

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  11. An empirical model for independent dose verification of the Gamma Knife treatment planning.

    Science.gov (United States)

    Phaisangittisakul, Nakorn; Ma, Lijun

    2002-09-01

    A formalism for independent dose verification of Gamma Knife treatment planning is developed. It is based on the approximation that the isodose distribution for a single shot has the shape of an ellipsoid in three-dimensional space. The dose profiles for a phantom along each of the three major axes are fitted to a function that contains terms representing the contributions from a point source, extrafocal scattering, and a flat background. The fitting parameters are extracted for all four helmet collimators, at various shot locations, and with different skull shapes. The 33 parameters of a patient's skull shape obtained from the Skull Scaling Instrument measurements are modeled for individual patients. The relative doses for a treatment volume, in the form of a 31 x 31 x 31 matrix of points, are extracted from the treatment planning system, the Leksell GammaPlan (LGP). Our model evaluates the relative doses using the same input parameters as the LGP: skull measurement data, shot location, weight, gamma-angle of the head frame, and helmet collimator size. For 29 single-shot cases, the discrepancy of the dose at the focus point between the calculation and the LGP is found to be within -1% to 2%. For multi-shot cases, the value and the coordinate of the maximum dose point from the calculation agree within +/-7% and +/-3 mm with the LGP results. In general, the calculated doses agree with the LGP calculations within +/-10% for off-center locations. Results of the calculation with this method for the dimension and location of the 50% isodose line are in good agreement with results from the Leksell GammaPlan. Therefore, this method can serve as a useful tool for secondary quality assurance of Gamma Knife treatment plans.
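
    The ellipsoidal single-shot approximation at the heart of this formalism can be illustrated with a toy profile (Gaussian fall-off along the three major axes; the paper's fitted function also includes point-source, extrafocal-scatter and flat-background terms, and the axis widths here are hypothetical):

```python
import math

def relative_dose(x, y, z, sx, sy, sz):
    """Toy single-shot relative dose with Gaussian profiles along the
    three major axes, so the isodose surfaces are ellipsoids."""
    return math.exp(-(x / sx) ** 2 - (y / sy) ** 2 - (z / sz) ** 2)

# Two points lying on the same ellipsoid receive the same relative dose.
d_axis_x = relative_dose(3.0, 0.0, 0.0, sx=3.0, sy=3.0, sz=4.0)
d_axis_y = relative_dose(0.0, 3.0, 0.0, sx=3.0, sy=3.0, sz=4.0)
```

An independent check in this spirit evaluates such a profile product over the 31 x 31 x 31 grid and compares it point by point with the planning system's output.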

  12. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    Full Text Available BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify assumptions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predisposition in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level triathletes (men) who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used a confirmatory factor analysis (FA) and path diagram to verify the model, which allows us to explain the mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in triathlon for men aged 17–20 years, which is composed of seventeen indicators (tests) and explains 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic and psychological predispositions. Aerobic predispositions showed the highest importance to the general factor (1.00; 0). Running predispositions were measured as a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63 and cycling (0.53; 0

  13. Verification test on an innovated method for the studies on inheritance of resistance to rice sheath blight

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We have recently developed a systematic method for the study of the inheritance of resistance to sheath blight. The key to the system is an innovative method of inoculation and investigation, along with the employment of a permanent population. This paper reports the procedure of the system and the result of its verification.

  14. Assessing the therapeutic usefulness of Ricinus communis: A multicentric observational clinical verification study

    Directory of Open Access Journals (Sweden)

    P S Chakraborty

    2014-01-01

    Full Text Available Introduction: Clinical verification is an ongoing research programme of the Central Council for Research in Homoeopathy, under which many symptoms of Indian and rarely used drugs in Homoeopathy have been clinically verified. Objectives: To clinically verify the symptomatology of Ricinus communis as observed during its proving conducted by the Council, and to ascertain the clinical symptoms relieved in the process of verification. Materials and Methods: Two hundred and twenty-five patients of all age groups and both sexes were enrolled from the outpatient departments (OPDs) of the institutes and units of the Council, following the exclusion and inclusion criteria of the protocol and after obtaining written consent. The presenting signs and symptoms were recorded in a predefined case recording proforma, and if Ricinus communis was found very closely similar to the symptoms of the patient, the patient was enrolled in the study. The medicine was prescribed in different potencies as per the need of the case and in accordance with homoeopathic principles. The progress was noted in a follow-up sheet to determine the effects of the medicine in relieving the symptoms of the patient. Results: Forty-eight out of fifty-three symptoms obtained from the proving of Ricinus communis could be clinically verified. The characteristic indications were left-sided affinity, aggravation from sun, amelioration in open air, dryness of the mucous membrane of the gastrointestinal tract, and dissatisfaction leading to irritability and anger. The usefulness of the medicine was most marked in relieving headache, coryza, aphthae, gastritis, diarrhoea, constipation and acne. All the verified symptoms indicated the scope of its therapeutic action. Conclusion: Ricinus communis can be considered an important medicine for the management of acne, aphthae, backache, colic, constipation, coryza, cough, diarrhoea, dyspepsia, fever, gastritis, headache and irritability.

  15. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for a one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid and entrained liquid. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. The fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving the wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and the transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly, and that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability. A further study would be required to enhance the code's capability in this regard.

  16. Ontology Model Verification Approach Based on OCL

    Institute of Scientific and Technical Information of China (English)

    钱鹏飞; 王英林; 张申生

    2015-01-01

    In this paper, by combining set and relation theory with the ontology model and by introducing and expanding the Object Constraint Language (OCL) from object-oriented technology, we present an OCL-based ontology verification method. The method extracts an ontology definition meta-model (ODM), based on set and relation theory, from a large number of ontology models. The ontology model is divided into 'entity-related elements' and 'constraint-rule-related elements', and through a series of OCL expansion functions the formalised expression of these two kinds of ontology model elements is completed, so as to fulfil the OCL-based formalised ontology model verification. In the end, the issue of realising ontology model conflict inspection and reconciliation using this model verification approach is further discussed through an ontology model verification example, the 'vehicle management ontology slice of the Baosteel information sharing platform'.
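
    The flavor of such constraint-rule checking can be conveyed with a toy example (hypothetical entities and relations, loosely inspired by the paper's vehicle-management sample; real OCL invariants are far richer):

```python
# Toy invariant in the spirit of an OCL constraint: every relation must
# connect two entities that are declared in the ontology.
entities = {"Vehicle", "Driver", "Plant"}
relations = [("Driver", "drives", "Vehicle"),
             ("Vehicle", "locatedAt", "Plant")]

def relation_violations(entities, relations):
    """Return the relations whose subject or object is undeclared."""
    return [r for r in relations
            if r[0] not in entities or r[2] not in entities]

violations = relation_violations(entities, relations)  # empty: model is consistent
```

Conflict inspection in the paper's sense would collect such violation sets per constraint and then decide how to reconcile them.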

  17. Establishment and experimental verification of the photoresist model considering interface slip between photoresist and concave spherical substrate

    Directory of Open Access Journals (Sweden)

    S. Yang

    2015-07-01

    Full Text Available A thickness distribution model of photoresist spin-coating on a concave spherical substrate (CSS) has been developed via both theoretical studies and experimental verification. The stress of photoresist on a rotating CSS is analyzed and the boundary conditions of the hydrodynamic equation are presented under the non-lubricating condition. Moreover, a multivariable polynomial equation of photoresist-layer thickness distribution is derived by analyzing and deducing the flow equation, where the evaporation rate, substrate topography, interface slip between liquid and CSS, and the variation of rotational speed and photoresist parameters are considered in detail. Importantly, the photoresist-layer thickness at various CSS rotational speeds and liquid concentrations can be obtained from the theoretical equation. The photoresist viscosity and concentration parameters required for different coating thicknesses at a given coating speed can also be solved through this equation. It is noted that the calculated theoretical values are well consistent with the experimental results, which were measured at various CSS rotational speeds and liquid concentrations at steady state. Therefore, both our experimental results and theoretical analysis provide guidance for photoresist dilution and pave the way for potential improvements and microfabrication applications in the future.
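
    The qualitative dependence of coating thickness on spin speed and solution concentration can be sketched with a classic first-order scaling (hypothetical constants; the paper's full model additionally accounts for evaporation, substrate curvature, and interface slip):

```python
import math

def film_thickness(k, conc, omega):
    """First-order spin-coating scaling: thickness grows with solids
    concentration and falls as 1/sqrt(spin speed). k is a process
    constant fitted per resist (hypothetical value here)."""
    return k * conc / math.sqrt(omega)

h_slow = film_thickness(k=1.0, conc=0.2, omega=1000.0)  # 1000 rpm
h_fast = film_thickness(k=1.0, conc=0.2, omega=4000.0)  # 4000 rpm
```

Under this scaling, quadrupling the spin speed halves the predicted thickness, which is the kind of trade-off the paper's equation resolves more precisely.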

  18. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...

  19. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    -dependent. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types: target-wrong and impostor-wrong while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    Science.gov (United States)

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  1. Electric Machine Analysis, Control and Verification for Mechatronics Motion Control Applications, Using New MATLAB Built-in Function and Simulink Model

    Directory of Open Access Journals (Sweden)

    Farhan A. Salem

    2014-05-01

    Full Text Available This paper proposes a new, simple and user-friendly MATLAB built-in function, together with mathematical and Simulink models, to be used to identify system-level problems early, to ensure that all design requirements are met, and, generally, to simplify the mechatronics motion control design process, including performance analysis and verification of a given electric DC machine, and proper controller selection and verification for a desired output speed or angle.
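
    A minimal sketch of the kind of DC-machine speed analysis such a function automates is shown below (a first-order armature-controlled motor model integrated with explicit Euler; all parameter values are illustrative, not from the paper):

```python
# Speed dynamics of an armature-controlled DC motor:
#   J*dw/dt = (Kt/R)*V - (Kt*Ke/R + b)*w
J, b = 0.01, 0.1    # rotor inertia, viscous friction (illustrative)
Kt = Ke = 0.05      # torque and back-EMF constants
R = 1.0             # armature resistance
V = 12.0            # step input voltage

def simulate_speed(t_end, dt=1e-4):
    """Explicit-Euler integration of the speed response to a voltage step."""
    w, t = 0.0, 0.0
    while t < t_end:
        w += dt * (Kt * V / R - (Kt * Ke / R + b) * w) / J
        t += dt
    return w

w_ss = (Kt * V / R) / (Kt * Ke / R + b)  # analytic steady-state speed
w_sim = simulate_speed(t_end=2.0)        # simulated speed after transients decay
```

Comparing the simulated step response against the analytic steady state is the simplest form of the model verification the abstract describes.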

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  3. Ethylene Decomposition Initiated by Ultraviolet Radiation from Low Pressure Mercury Lamps: Kinetics Model Prediction and Experimental Verification.

    Science.gov (United States)

    Jozwiak, Zbigniew Boguslaw

    1995-01-01

    Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to formation of ozone. The objective of the present study was to determine the set of physical and chemical actions leading to the disappearance of ethylene from artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time dependent concentrations of chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however the model predicted lower than measured rate of ethylene disappearance. 
Some reasons for the model-experiment disagreement are radiation intensity averaging, the experimental
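
    A static kinetics model of the kind described above can be sketched in miniature: a small photochemical reaction set integrated in time with a classical Runge-Kutta stepper. The three-reaction mechanism and every rate constant below are illustrative placeholders, not the reaction set or values from the study.

```python
# Toy photochemical mechanism (hypothetical rate constants, arbitrary units):
#   R1: O2 + hv   -> 2 O       (photolysis)
#   R2: O  + O2   -> O3
#   R3: O3 + C2H4 -> products  (ethylene destruction)
J_O2, K_O_O2, K_O3_C2H4 = 1e-3, 5e-2, 1e-2

def rhs(y):
    """d/dt of the state [O2, O, O3, C2H4] under the toy mechanism."""
    o2, o, o3, c2h4 = y
    r1 = J_O2 * o2
    r2 = K_O_O2 * o * o2
    r3 = K_O3_C2H4 * o3 * c2h4
    return [-r1 - r2, 2 * r1 - r2, r2 - r3, -r3]

def rk4_step(y, dt):
    """One classical 4th-order Runge-Kutta step."""
    def axpy(a, k, s):
        return [ai + s * ki for ai, ki in zip(a, k)]
    k1 = rhs(y)
    k2 = rhs(axpy(y, k1, dt / 2))
    k3 = rhs(axpy(y, k2, dt / 2))
    k4 = rhs(axpy(y, k3, dt))
    return [y[i] + dt / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(len(y))]

def simulate(y0, dt=0.1, t_end=600.0):
    """Integrate the mechanism forward and return the final state."""
    y = list(y0)
    for _ in range(int(t_end / dt)):
        y = rk4_step(y, dt)
    return y

final = simulate([210.0, 0.0, 0.0, 1.0])  # [O2, O, O3, C2H4]
```

    The simulated ethylene concentration decays as ozone accumulates, which is the qualitative behavior the thesis's much larger mechanism predicts.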

  4. Forecast Verification for North American Mesoscale (NAM) Operational Model over Karst/Non-Karst regions

    Science.gov (United States)

    Sullivan, Z.; Fan, X.

    2014-12-01

    Karst is defined as a landscape containing highly soluble rocks such as limestone, gypsum, and marble, in which caves, underground water systems, sinkholes that develop over time, vertical shafts, and subterranean river systems form. The cavities and voids within a karst system affect the hydrology of the region and, consequently, can affect the moisture and energy budgets at the surface, planetary boundary layer development, convection, and precipitation. Carbonate karst landscapes comprise about 40% of the land area over the continental U.S. east of Tulsa, Oklahoma. Currently, due to the lack of knowledge of the effects karst has on the atmosphere, no existing weather model has the capability to represent karst landscapes and simulate their impact. One way to assess the impact of karst regions on the atmosphere is to compare the performance of existing weather models over karst and non-karst regions. The North American Mesoscale (NAM) operational forecast is well suited for this purpose because its historical forecasts have been archived. Variables such as precipitation, maximum/minimum temperature, dew point, evapotranspiration, and surface winds were taken into account when checking the model performance over karst versus non-karst regions. The forecast verification focused on the five-year period 2007-2011. Surface station observations, a gridded observational dataset, and the North American Regional Reanalysis (for certain variables with insufficient observations) were used. Thirteen regions of differing climate, size, and landscape composition were chosen across the Contiguous United States (CONUS) for the investigation. Equitable threat score (ETS), frequency bias (fBias), and root-mean-square error (RMSE) scores were calculated and analyzed for precipitation. RMSE and mean bias (Bias) were analyzed for the other variables. 
ETS, fBias, and RMSE scores generally show lower forecast skill, a greater magnitude of error, and a greater underprediction of precipitation over karst than
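
    The precipitation scores named in this record are standard functions of a 2x2 contingency table, where an "event" is precipitation exceeding a chosen threshold at a grid point or station. A minimal sketch of their textbook definitions (the counts in the usage line are hypothetical):

```python
def ets(hits, false_alarms, misses, correct_negatives):
    """Equitable Threat Score: threat score corrected for random hits."""
    total = hits + false_alarms + misses + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

def frequency_bias(hits, false_alarms, misses):
    """fBias > 1: event area overforecast; fBias < 1: underforecast."""
    return (hits + false_alarms) / (hits + misses)

def rmse(forecast, observed):
    """Root-mean-square error of paired continuous values."""
    n = len(forecast)
    return (sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n) ** 0.5

# Hypothetical counts: 50 hits, 30 false alarms, 20 misses, 900 correct negatives.
score = ets(50, 30, 20, 900)  # ≈ 0.47
```

    ETS of 1 is a perfect forecast and 0 is no skill beyond chance, which is why lower ETS over karst regions indicates degraded model performance there.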

  5. Operative temperature and thermal comfort in the sun - Implementation and verification of a model for IDA ICE

    DEFF Research Database (Denmark)

    Karlsen, Line; Grozman, Grigori; Heiselberg, Per Kvols;

    2015-01-01

    (MRT) model for IDA Indoor Climate and Energy (IDA ICE). The new feature of the model is that it includes the effect of shortwave radiation in the room and contributes to a more comprehensive prediction of operative temperature, e.g. of a person exposed to direct sun light. The verification...... comfort of persons affected by direct solar radiation. This may further have implications on the predicted energy use and design of the façade, since e.g. an enlarged need for local cooling or use of dynamic solar shading might be discovered....

  6. Experimental Verification of the Physical Model for Droplet-Particles Cleaning in Pulsed Bias Arc Ion Plating

    Institute of Scientific and Technical Information of China (English)

    Yanhui ZHAO; Guoqiang LIN; Chuang DONG; Lishi WEN

    2005-01-01

    It has been reported that the application of pulsed biases in arc ion plating can effectively eliminate droplet particles. The present paper aims at experimental verification of a physical model proposed previously by us, which is based on particle charging and repulsion in the pulsed plasma sheath. An orthogonal experiment was designed for this purpose, using the electrical parameters of the pulsed bias for the deposition of TiN films on stainless steel substrates. The effect of these parameters on the amount and the size distribution of the particles was analyzed, and the results provided sufficient evidence for the physical model.

  7. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business process handled by a workflow model, and it plays a critical role in analyzing, executing, and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult issue in the workflow field, and we study it intensively in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer some constraint rules that help avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in a workflow prototype system, e-ScopeWork.
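
    The flavor of structural check such graph-based verification performs can be illustrated with a minimal sketch (a generic reachability check, not the paper's actual graph-reduction algorithm): every activity in the schema must be reachable from the start node and must be able to reach the end node, otherwise the schema contains dead activities or guaranteed deadlocks.

```python
from collections import deque

def reachable(graph, start):
    """Breadth-first search: all nodes reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def check_schema(graph, start, end):
    """Return a list of structural errors found in the workflow graph."""
    errors = []
    fwd = reachable(graph, start)           # reachable from start
    rev_graph = {}
    for u, vs in graph.items():
        for v in vs:
            rev_graph.setdefault(v, []).append(u)
    bwd = reachable(rev_graph, end)         # can reach end
    for node in set(graph) | {end}:
        if node not in fwd:
            errors.append(f"unreachable from start: {node}")
        if node not in bwd:
            errors.append(f"cannot reach end: {node}")
    return errors

# Activity "c" is dangling: no transition from the start path reaches it.
errors = check_schema({"start": ["a"], "a": ["b"], "b": ["end"], "c": ["end"]},
                      "start", "end")
```

    A real schema verifier layers checks like this for split/join mismatches and cycles on top of such reachability analysis.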

  8. Implementation of the Short-Term Ensemble Prediction System (STEPS) in Belgium and verification of case studies

    Science.gov (United States)

    Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent

    2014-05-01

    The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an eight-level multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This reproduces the empirical observation that the small scales evolve in time faster than the large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also treated by stochastic modelling, in order to reflect their typical spatial and temporal variability. Recently, a four-year national research programme, PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"), has been initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium and three other partners. The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the uncertainty in the input rainfall estimates and forecasts. In support of the PLURISK project, the RMI aims at integrating STEPS into the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. 
The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts
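
    The hierarchy of autoregressive processes mentioned above can be sketched as one AR(1) process per cascade level, with level-dependent correlation times so that the small-scale levels decorrelate faster than the large-scale ones. The correlation times, time step, and level count below are illustrative, not STEPS's operational settings.

```python
import math
import random

def evolve_cascade(levels, steps, corr_times, dt=5.0, seed=42):
    """Evolve one scalar state per cascade level as an AR(1) process.

    levels:     initial (standardized) cascade-level states
    corr_times: Lagrangian correlation time per level, in minutes
    dt:         nowcast time step in minutes
    """
    rng = random.Random(seed)
    state = list(levels)
    trajectory = [list(state)]
    for _ in range(steps):
        new = []
        for lev, tau in zip(state, corr_times):
            phi = math.exp(-dt / tau)                 # lag-1 autocorrelation
            noise = rng.gauss(0.0, math.sqrt(1.0 - phi * phi))
            new.append(phi * lev + noise)             # variance-preserving AR(1)
        state = new
        trajectory.append(list(state))
    return trajectory

# Eight levels: large scales persist for hours, small scales for minutes.
taus = [240.0, 120.0, 60.0, 30.0, 15.0, 10.0, 7.0, 5.0]
traj = evolve_cascade([0.0] * 8, steps=12, corr_times=taus)
```

    In the full scheme each "level" is a 2-D field obtained by FFT band-pass filtering of the radar image, and the stochastic noise is spatially correlated, but the temporal update follows this same autoregressive pattern.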

  9. Study and characterization of arrays of detectors for dosimetric verification of radiotherapy, analysis of business solutions; Estudio y caracterizacion de materiales de detectores para verificacion dosimetrica de radioterapia, analisis de las soluciones comerciales

    Energy Technology Data Exchange (ETDEWEB)

    Gago Arias, A.; Brualla Gonzalez, L.; Gomez Rodriguez, F.; Gonzalez Castano, D. M.; Pardo Montero, J.; Luna Vega, V.; Mosquera Sueiro, J.; Sanchez Garcia, M.

    2011-07-01

    This paper presents a comparative study of the detector arrays developed by different commercial vendors in response to the demand for devices that speed up the verification process. We analyze the effect of the spatial response of the individual detectors on the measurement of dose distributions, model that response, and assess the ability of the arrays to detect variations in a delivered treatment.

  10. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The program in question is Odeon 3.0 Industrial and Odeon 3.0 Combined, which allows the modelling of point sources, surface sources and line sources. Combining these three source types it is possible to model huge machinery in an easy and visually clear way. Traditionally room acoustic simulations have been aimed at auditorium acoustics. The aim of the simulations has been to model the room acoustic measuring setup consisting...

  11. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  12. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification. 2; Experimental Verification

    Science.gov (United States)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data is indicative not only of the presence and density of stacking errors, but also that it can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions and it appears that it is similar to the relation developed by others to explain the formation of the corresponding polytypes.

  13. Synergy between Emissions Verification for Climate and Air Quality: Results from Modeling Analysis over the Contiguous US using CMAQ

    Science.gov (United States)

    Liu, Z.; Bambha, R.; Pinto, J. P.; Zeng, T.; Michelsen, H. A.

    2013-12-01

    The synergy between emissions-verification exercises for fossil-fuel CO2 and traditional air pollutants (TAPs, e.g., NOx, SO2, CO, and PM) stems from the common physical processes underlying the generation, transport, and perturbations of their emissions. Better understanding and characterizing such a synergetic relationship are of great interest and benefit for science and policy. To this end, we have been developing a modeling framework that allows for studying CO2 along with TAPs on regional-through-urban scales. The framework is based on the EPA Community Multi-Scale Air Quality (CMAQ) modeling system and has been implemented on a domain over the contiguous US, where abundant observational data and complete emissions information are available. In this presentation, we will show results from a comprehensive analysis of atmospheric CO2 and an array of TAPs observed from multiple networks and platforms (in situ and satellite observations) and those simulated by CMAQ over the contiguous US for the full year of 2007. We will first present the model configurations and input data used for CMAQ CO2 simulations and the results from model evaluations [1]. In light of the unique properties of CO2 compared to TAPs, we tested the sensitivity of model-simulated CO2 to different initial and boundary conditions, biosphere-atmosphere bidirectional fluxes, and fossil-fuel emissions. We then examined the variability of CO2 and TAPs simulated by CMAQ and observed from the NOAA ESRL tall-tower network, the EPA AQS network, and satellites (e.g., SCIAMACHY and OMI) at various spatial and temporal scales. Finally, we diagnosed in CMAQ the roles of fluxes and transport in regulating the covariance between CO2 and TAPs manifested in both surface concentrations and column-integrated densities. We will discuss the implications of these results for understanding trends and characteristics of fossil-fuel emissions by exploiting and combining currently available observational and modeling

  14. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6. Temperature Modeling of Applegate Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model. Environmental Laboratory. Tammy L. Threadgill, Daniel F. Turner, Laurie A. Nicholas, Barry W. Bunch, Dorothy H. Tillman, and David L. Smith. May 2017. Approved for public release; distribution is unlimited. The U.S. Army Engineer Research and

  15. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification

    Directory of Open Access Journals (Sweden)

    Md. Sharif Hasan

    2016-01-01

    Full Text Available The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterization of these complexes was done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscopy (SEM). Complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. The reversed-phase high-performance liquid chromatography (RP-HPLC) method for Naproxen outlined in USP was also verified for the Naproxen-metal complexes with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule, suggesting their thermal stability. In the forced degradation study, the complexes were found to be more stable than Naproxen itself under all conditions: acidic, basic, oxidative, and reductive media. All the HPLC verification parameters were found within acceptable values. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be more stable drug entities and may offer better efficacy and longer shelf life than the parent Naproxen.

  16. Verification and Validation of the Spalart-Allmaras Turbulence Model for Strand Grids

    Science.gov (United States)

    2013-01-01

    ...unpredictable, thus making its prediction and simulation difficult. Nobel Laureate Richard Feynman famously described turbulence as "the most important

  17. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao;

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  18. A mathematical model of the nickel converter: Part I. Model development and verification

    Science.gov (United States)

    Kyllo, A. K.; Richards, G. G.

    1991-04-01

    A mathematical model of the nickel converter has been developed. The primary assumption of the model is that the three phases in the converter are in thermal and chemical equilibrium. All matte, slag, and gas in the converter is brought to equilibrium at the end of each of a series of short time steps throughout an entire charge. An empirical model of both the matte and slag is used to characterize the activity coefficients in each phase. Two nickel sulfide species were used to allow for the modeling of sulfur-deficient mattes. A heat balance is carried out over each time step, considering the major heat flows in the converter. The model was validated by a detailed comparison with measured data from six industrial charges. The overall predicted mass balance was shown to be close to that seen in actual practice, and the heat balance gave a good fit of converter temperature up to the last two or three blows of a charge. At this point, reactions in the converter begin to deviate strongly from “equilibrium,” probably due to the converter reactions coming under liquid-phase mass-transfer control. While the equilibrium assumption does work, it is not strictly valid, and the majority of the charge is probably under gas-phase mass-transfer control.

  19. A Survey of Workflow Modeling Approaches and Model Verification%工作流过程建模方法及模型的形式化验证

    Institute of Scientific and Technical Information of China (English)

    杨东; 王英林; 张申生; 傅谦

    2003-01-01

    Workflow technology is widely used in business process modeling and software process modeling, as well as in enterprise information integration. At present, there exist a variety of workflow modeling approaches, which differ in ease of modeling, expressiveness, and formalism. In this paper, the modeling approaches most used in research projects and workflow products are compared, and the verification of workflow models is also dealt with. We argue that an ideal workflow modeling approach is a hybrid one, i.e., the integration of the above approaches.

  20. Manufactured solutions and the numerical verification of isothermal, nonlinear, three-dimensional Stokes ice-sheet models

    Directory of Open Access Journals (Sweden)

    W. Leng

    2012-07-01

    Full Text Available The technique of manufactured solutions is used for verification of computational models in many fields. In this paper we construct manufactured solutions for models of three-dimensional, isothermal, nonlinear Stokes flow in glaciers and ice sheets. The solution construction procedure starts with kinematic boundary conditions and is mainly based on the solution of a first-order partial differential equation for the ice velocity that satisfies the incompressibility condition. The manufactured solutions depend on the geometry of the ice sheet and other model parameters. Initial conditions are taken from the periodic geometry of a standard problem of the ISMIP-HOM benchmark tests and altered through the manufactured solution procedure to generate an analytic solution for the time-dependent flow problem. We then use this manufactured solution to verify a parallel, high-order accurate, finite element Stokes ice-sheet model. Results from the computational model show excellent agreement with the manufactured analytic solutions.
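
    The manufactured-solution workflow the paper applies to the Stokes system follows the same recipe for any PDE solver: pick an analytic field, derive the forcing that makes it an exact solution, run the discrete solver with that forcing, and confirm the error shrinks at the scheme's design order. A minimal sketch for a 1D Poisson problem (a stand-in for the much larger Stokes case) with a second-order finite-difference scheme:

```python
import math

def solve_poisson(n, f, ua, ub):
    """Second-order FD solve of u'' = f on [0,1] with Dirichlet BCs,
    via the Thomas algorithm on the tridiagonal system."""
    h = 1.0 / n
    a = [1.0] * (n - 1)      # sub-diagonal
    b = [-2.0] * (n - 1)     # diagonal
    c = [1.0] * (n - 1)      # super-diagonal
    d = [h * h * f((i + 1) * h) for i in range(n - 1)]
    d[0] -= ua               # fold boundary values into the RHS
    d[-1] -= ub
    for i in range(1, n - 1):            # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):       # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [ua] + u + [ub]

u_exact = lambda x: math.sin(math.pi * x)                  # manufactured solution
forcing = lambda x: -math.pi ** 2 * math.sin(math.pi * x)  # its second derivative

def max_error(n):
    u = solve_poisson(n, forcing, 0.0, 0.0)
    return max(abs(ui - u_exact(i / n)) for i, ui in enumerate(u))

# Observed order of accuracy from two grid resolutions; ≈ 2 confirms the scheme.
order = math.log2(max_error(32) / max_error(64))
```

    For the ice-sheet model the manufactured velocity field must additionally satisfy incompressibility, which is why the paper builds it from a first-order PDE, but the error-convergence check at the end is identical in spirit.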

  1. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  2. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  3. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  4. Studies of a proton phase beam monitor for range verification in proton therapy

    Energy Technology Data Exchange (ETDEWEB)

    Werner, T.; Golnik, C.; Enghardt, W.; Petzoldt, J.; Kormoll, T.; Pausch, G. [Technische Universitaet Dresden, OncoRay, PF 41, 01307 Dresden, (Germany); Straessner, A. [Technische Universitaet Dresden, Institute for Nuclear and Particle Physics, Zellescher Weg 19, 01069 Dresden, (Germany); Roemer, K.; Dreyer, A.; Hueso-Gonzalez, F.; Enghardt, W. [Helmholtz-Zentrum Dresden-Rossendorf, PF 510 119, 01314 Dresden, (Germany)

    2015-07-01

    A primary subject of the present research in particle therapy is to ensure the precise irradiation of the target volume. The prompt gamma timing (PGT) method provides one possibility for in vivo range verification during the irradiation of patients. Prompt gamma rays with high energies are emitted promptly due to nuclear reactions of protons with tissue. The arrival time of these gammas at the detector reflects the stopping process of the primary protons in tissue and is directly correlated with the range. Due to the time resolution of the detector and the proton bunch time spread, as well as drifts of the bunch phase with respect to the accelerator frequency, timing spectra are smeared out and compromise the accuracy of range information intended for future clinical applications. Nevertheless, counteracting this limitation and recovering range information from the PGT measured spectra, corrections using a phase beam monitor can be performed. A first prototype of the phase beam monitor was tested at GSI Darmstadt, where measurements of the energy profile of the ion bunches were performed. At the ELBE accelerator at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), set up to provide bremsstrahlung photons in very short pulses, a constant fraction algorithm for the incoming digital signals was evaluated, which is used for optimizing the time resolution. Scattering experiments with different thin targets and detector positions are accomplished at OncoRay Dresden, where a clinical proton beam is available. These experiments allow a basic characterization of the proton bunch structure and the detection yield. (authors)

  5. Studies of a Proton Bunch Phase Monitor for Range Verification in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Werner, T.; Golnik, C.; Enghardt, W.; Petzoldt, J.; Kormoll, T.; Pausch, G. [Technische Universitaet Dresden, OncoRay, PF 41, 01307 Dresden (Germany); Straessner, A. [Technische Universitaet Dresden, Institute for Nuclear and Particle Physics, Zellescher Weg 19, 01069 Dresden (Germany); Roemer, K.; Dreyer, A.; Hueso-Gonzalez, F.; Enghardt, W. [Helmholtz-Zentrum Dresden-Rossendorf, PF 510 119, 01314 Dresden (Germany)

    2015-07-01

    A primary subject of the present research in particle therapy is to ensure the precise irradiation of the target volume. The prompt gamma timing (PGT) method provides one possibility for in vivo range verification during the irradiation of patients. Prompt gamma rays with high energies are emitted promptly due to nuclear reactions of protons with tissue. The arrival time of these gammas at the detector reflects the stopping process of the primary protons in tissue and is directly correlated with the range. Due to the time resolution of the detector and the proton bunch time spread, as well as drifts of the bunch phase with respect to the accelerator frequency, timing spectra are smeared out and compromise the accuracy of range information intended for future clinical applications. Nevertheless, counteracting this limitation and recovering range information from the PGT measured spectra, corrections using a bunch phase monitor can be performed. A first prototype of the bunch phase monitor was tested at GSI Darmstadt, where measurements of the energy correlation profile of the ion bunches were performed. At the ELBE accelerator at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), set up to provide bremsstrahlung photons in very short pulses, a constant fraction algorithm for the incoming digital signals was evaluated, which is used for optimizing the time resolution. Scattering experiments with different thin targets and detector positions are accomplished at OncoRay Dresden, where a clinical proton beam is available. These experiments allow a basic characterization of the proton bunch structure and the detection yield. (authors)

  6. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    Science.gov (United States)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
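
    Interactive consistency of the kind such a special instruction implements can be sketched with the classic one-round oral-messages exchange for four nodes tolerating one fault: the source broadcasts its value, every receiver relays what it got, and each node majority-votes over the direct value plus the relayed copies. This is a generic sketch of the algorithm family under the simplifying assumption that only the source may be faulty, not the verified microprocessor design itself.

```python
from collections import Counter

def om1(source, values_sent):
    """One round of the oral-messages exchange among nodes 0..3.

    values_sent[j] is the value the source sends to node j; a faulty
    source may send different values to different nodes. The non-source
    nodes relay honestly in this sketch.
    """
    nodes = [n for n in range(4) if n != source]
    decided = {}
    for j in nodes:
        # j hears values_sent[j] directly, plus each other node's relay.
        heard = [values_sent[j]] + [values_sent[k] for k in nodes if k != j]
        decided[j] = Counter(heard).most_common(1)[0][0]
    return decided

# A faulty source sends conflicting values; the loyal nodes still agree.
decision = om1(source=0, values_sent={1: "attack", 2: "attack", 3: "retreat"})
# all three loyal nodes decide "attack"
```

    With four nodes and one fault this single relay round suffices, which is why the agreement exchange could be compressed into one hardware instruction.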

  7. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. 
This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
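
    As a rough sketch of what such verification scripts automate, the following assembles and runs one MET grid_stat invocation for a forecast/analysis pair. The file and config names are hypothetical examples, the MET binaries are assumed to be on PATH, and the actual SPoRT scripts are written in Perl with richer configuration handling.

```python
import subprocess

def build_grid_stat_cmd(fcst_file, obs_file, config_file, out_dir):
    """Assemble a MET grid_stat command line for one forecast/analysis
    pair (file names are placeholders)."""
    return ["grid_stat", fcst_file, obs_file, config_file,
            "-outdir", out_dir]

def run_grid_stat(fcst_file, obs_file, config_file, out_dir):
    """Invoke grid_stat; raises if the tool exits with an error."""
    cmd = build_grid_stat_cmd(fcst_file, obs_file, config_file, out_dir)
    return subprocess.run(cmd, check=True)
```

    A wrapper like this would typically loop over forecast hours and then aggregate the resulting statistics files for display to forecasters.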

  8. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin;

    2012-01-01

    In this study, we proposed a Continuous Time Markov Chain model of the availability of n-node clusters of a Distributed Rendering System. Since the model is infinite, we formalized it and, based on the model, implemented software which can automatically generate models in the PRISM language. With this tool, whenever the number of nodes n and related parameters vary, we can create the PRISM model file rapidly and then use the PRISM model checker to verify related system properties. At the end of this study, we analyzed and verified the availability distributions of the Distributed Cluster Rendering System.
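
    Automatic model generation of this kind can be sketched as a function that emits a PRISM CTMC module parameterized by the node count. The failure/repair rates and the single-counter module structure below are illustrative assumptions for a repairable n-node cluster, not the paper's actual model.

```python
def prism_ctmc(n, fail_rate=0.001, repair_rate=0.1):
    """Emit a PRISM CTMC module where 'up' counts working nodes;
    rates are per-node and purely illustrative."""
    lines = [
        "ctmc",
        f"const int N = {n};",
        f"const double lambda = {fail_rate}; // per-node failure rate",
        f"const double mu = {repair_rate}; // per-node repair rate",
        "module cluster",
        "  up : [0..N] init N;",
        "  [fail]   up > 0 -> up * lambda : (up' = up - 1);",
        "  [repair] up < N -> (N - up) * mu : (up' = up + 1);",
        "endmodule",
    ]
    return "\n".join(lines)
```

    The generated text can be written to a `.sm` file and checked with PRISM properties such as steady-state availability of at least k nodes.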

  9. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, which makes safety a key property for CPS deployed in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed into an HP based on this definition. The safety of the CPS is verified by inputting the HP into KeYmaera. The advantage of the approach is that it models a CPS intuitively and verifies its safety strictly while avoiding state space explosion.

  10. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    Science.gov (United States)

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-02

    The non-ideal recycling equilibrium-cell model, including the effects of extra-column dispersion, is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. A peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop are presented, as well as equations for the resolution between single solute peaks of neighboring cycles, for the resolution of peaks in the recycling chromatogram, and for the resolution between the chromatograms of neighboring cycles. It is shown that, unlike conventional chromatography, increasing the extra-column volume (the recycling line length) may allow better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated good agreement between theory and experiment.
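
    The equilibrium-cell picture can be illustrated with the standard tanks-in-series elution profile (a Gamma density in time) together with the usual chromatographic resolution definition. This is a textbook simplification for intuition only, not the authors' non-ideal recycling model with extra-column dispersion.

```python
import math

def cell_model_peak(t, n_cells, tau):
    """Elution profile of the ideal equilibrium-cell (tanks-in-series)
    model: Gamma density with shape n_cells and per-cell residence
    time tau. Peaks at t = (n_cells - 1) * tau."""
    return (t ** (n_cells - 1)) * math.exp(-t / tau) / (
        math.factorial(n_cells - 1) * tau ** n_cells)

def resolution(t1, t2, w1, w2):
    """Chromatographic resolution from retention times and baseline
    peak widths: R = 2|t2 - t1| / (w1 + w2)."""
    return 2 * abs(t2 - t1) / (w1 + w2)
```

    In a recycling scheme, each pass through the loop adds cells (broadening the peaks) while also increasing peak-to-peak separation, which is why more cycles can still improve resolution.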

  11. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  12. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  14. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification.

    Directory of Open Access Journals (Sweden)

    William Weston

    Full Text Available Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF against the data in the digital images of the original records. 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints.

  15. Quantitative Verification in Practice

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.

    2010-01-01

    Soon after the birth of model checking, the first theoretical achievements have been reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities

  16. Automatic verification of SSD and generation of respiratory signal with lasers in radiotherapy: a preliminary study.

    Science.gov (United States)

    Prabhakar, Ramachandran

    2012-01-01

    Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for determining the respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central axis transverse computed tomography (CT) cut is imported into the software. Another important aspect in radiotherapy is the generation of the respiratory signal. The changes in laser separation as the patient breathes are converted to produce a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD observed with the optical distance indicator (ODI) on the machine and the SSD measured by the SSDLas software were found to agree within the tolerance limit. The methodology described for generating respiratory signals will be useful for the treatment of mobile tumors in sites such as the lung, liver, breast, and pancreas. The technique described for determining the SSD and generating respiratory signals using lasers is cost-effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
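
    The geometry behind laser-based SSD estimation can be sketched under the assumption that the two laser spots converge linearly with distance, so the measured spot separation maps linearly to SSD. All parameter names and numeric values below are illustrative assumptions, not SSDLas calibration values.

```python
import math

def ssd_from_separation(sep_mm, sep_at_iso_mm=50.0, iso_ssd_mm=1000.0,
                        convergence_deg=1.0):
    """Estimate SSD from the measured separation of two laser spots.
    Assumes the lasers converge toward the beam axis, so separation
    decreases linearly as SSD grows (illustrative geometry only)."""
    # change in spot separation per mm of SSD, from the convergence angle
    slope = 2 * math.tan(math.radians(convergence_deg))
    return iso_ssd_mm + (sep_at_iso_mm - sep_mm) / slope
```

    Tracking the same spot separation frame by frame as the chest rises and falls yields the respiratory trace described in the abstract.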

  17. Predictive permeability model of faults in crystalline rocks; verification by joint hydraulic factor (JH) obtained from water pressure tests

    Science.gov (United States)

    Barani, Hamidreza Rostami; Lashkaripour, Gholamreza; Ghafoori, Mohammad

    2014-08-01

    In the present study, a new model is proposed to predict the permeability per fracture in fault zones by a new parameter named the joint hydraulic factor (JH). JH is obtained from the Water Pressure Test (WPT) and modified by the degree of fracturing. The results of JH correspond with quantitative fault zone descriptions, qualitative fracture, and fault rock properties. In this respect, a case study was done based on the data collected from the Seyahoo dam site located in the east of Iran to provide the permeability prediction model of fault zone structures. Datasets including scan-lines, drill cores, and water pressure tests in the terrain of andesite and basalt rocks were used to analyse the variability of in-situ relative permeability of a range from fault zones to host rocks. The rock mass joint permeability quality, therefore, is defined by the JH. JH data analysis showed that the background sub-zone had commonly > 3 Lu (less than 5 × 10−5 m3/s) per fracture, whereas the fault core had permeability characteristics nearly as low as the outer damage zone, represented by 8 Lu (1.3 × 10−4 m3/s) per fracture, with occasional peaks towards 12 Lu (2 × 10−4 m3/s) per fracture. The maximum JH value belongs to the inner damage zone, marginal to the fault core, with 14-22 Lu (2.3 × 10−4-3.6 × 10−4 m3/s) per fracture, locally exceeding 25 Lu (4.1 × 10−4 m3/s) per fracture. This gives a proportional relationship for JH of approximately 1:4:2 between the fault core, inner damage zone, and outer damage zone of extensional fault zones in crystalline rocks. The results of the verification exercise revealed that the new approach would be efficient and that the JH parameter is a reliable scale for the fracture permeability change. It can be concluded that using short duration hydraulic tests (WPTs) and fracture frequency (FF) to calculate the JH parameter provides a possibility to describe a complex situation and compare, discuss, and weigh the hydraulic quality to make predictions as to the permeability models and permeation amounts of different
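
    A minimal sketch of the JH bookkeeping, assuming JH is read as the WPT Lugeon value normalized per fracture (one plausible reading of the abstract, not necessarily the paper's exact definition). The Lugeon-to-flow conversion uses 1 Lu = 1 L/min, which is consistent with the flow figures quoted in the abstract to rounding.

```python
# 1 Lugeon corresponds to 1 L/min of water take, i.e. about 1.67e-5 m3/s
LU_TO_M3_PER_S = 1.0 / 60_000.0

def joint_hydraulic_factor(lugeon, fracture_frequency):
    """JH as the WPT Lugeon value per fracture (assumed reading);
    fracture_frequency is fractures per tested interval."""
    if fracture_frequency <= 0:
        raise ValueError("fracture frequency must be positive")
    return lugeon / fracture_frequency
```

    Normalizing by fracture frequency is what lets the abstract compare fault core, inner damage zone, and outer damage zone on a per-fracture basis.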

  18. Predictive permeability model of faults in crystalline rocks; verification by joint hydraulic factor (JH) obtained from water pressure tests

    Indian Academy of Sciences (India)

    Hamidreza Rostami Barani; Gholamreza Lashkaripour; Mohammad Ghafoori

    2014-08-01

    In the present study, a new model is proposed to predict the permeability per fracture in fault zones by a new parameter named the joint hydraulic factor (JH). JH is obtained from the Water Pressure Test (WPT) and modified by the degree of fracturing. The results of JH correspond with quantitative fault zone descriptions, qualitative fracture, and fault rock properties. In this respect, a case study was done based on the data collected from the Seyahoo dam site located in the east of Iran to provide the permeability prediction model of fault zone structures. Datasets including scan-lines, drill cores, and water pressure tests in the terrain of andesite and basalt rocks were used to analyse the variability of in-situ relative permeability of a range from fault zones to host rocks. The rock mass joint permeability quality, therefore, is defined by the JH. JH data analysis showed that the background sub-zone had commonly > 3 Lu (less than 5 × 10−5 m3/s) per fracture, whereas the fault core had permeability characteristics nearly as low as the outer damage zone, represented by 8 Lu (1.3 × 10−4 m3/s) per fracture, with occasional peaks towards 12 Lu (2 × 10−4 m3/s) per fracture. The maximum JH value belongs to the inner damage zone, marginal to the fault core, with 14–22 Lu (2.3 × 10−4 –3.6 × 10−4 m3/s) per fracture, locally exceeding 25 Lu (4.1 × 10−4 m3/s) per fracture. This gives a proportional relationship for JH of approximately 1:4:2 between the fault core, inner damage zone, and outer damage zone of extensional fault zones in crystalline rocks. The results of the verification exercise revealed that the new approach would be efficient and that the JH parameter is a reliable scale for the fracture permeability change. It can be concluded that using short duration hydraulic tests (WPTs) and fracture frequency (FF) to calculate the JH parameter provides a possibility to describe a complex situation and compare, discuss, and weigh the hydraulic quality to make

  19. Monte Carlo based verification of a beam model used in a treatment planning system

    Science.gov (United States)

    Wieslander, E.; Knöös, T.

    2008-02-01

    Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leaves which are complemented with a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, thus a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful to understand the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.

  20. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC (North American Electric Reliability Corporation) MOD reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing a "living" homepage to establish an online resource for transmission planners.

  1. EXPERIMENTAL VERIFICATION OF THE THREE-DIMENSIONAL THERMAL-HYDRAULIC MODELS IN THE BEST-ESTIMATE CODE BAGIRA.

    Energy Technology Data Exchange (ETDEWEB)

    KALINICHENKO,S.D.KROSHILIN,A.E.KROSHILIN,V.E.SMIRNOV,A.V.KOHUT,P.

    2004-03-15

    In this paper we present verification results of the BAGIRA code that was performed using data from integral thermal-hydraulic experimental test facilities as well as data obtained from operating nuclear power plants. BAGIRA is a three-dimensional numerical best-estimate code that includes non-homogeneous modeling. Special consideration was given to the recently completed experimental data from the PSB-VVER integral test facility (EREC, Electrogorsk, Russia)--a new Russian large-scale four-loop unit, which has been designed to model the primary circuits of VVER-1000 type reactors. It is demonstrated that the code BAGIRA can be used to analyze nuclear reactor behavior under normal and accident conditions.

  2. Comparison between a Computational Seated Human Model and Experimental Verification Data

    Directory of Open Access Journals (Sweden)

    Christian G. Olesen

    2014-01-01

    Full Text Available Sitting-acquired deep tissue injuries (SADTI) are the most serious type of pressure ulcers. In order to investigate the aetiology of SADTI a new approach is under development: a musculo-skeletal model which can predict forces between the chair and the human body at different seated postures. This study focuses on comparing results from a model developed in the AnyBody Modeling System with data collected from an experimental setup. A chair with force-measuring equipment was developed, an experiment was conducted with three subjects, and the experimental results were compared with the predictions of the computational model. The results show that the model predicted the reaction forces for different chair postures well. The correlation coefficients between experiment and model for the seat angle, backrest angle, and footrest height were 0.93, 0.96, and 0.95, respectively. The study shows good agreement between experimental data and model predictions of the forces between a human body and a chair. The model can in the future be used in designing wheelchairs or automotive seats.
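
    The agreement scores reported above are correlation coefficients between measured and predicted forces, which can be computed with the standard Pearson formula. This is the generic definition, not the study's actual analysis code.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series,
    e.g. measured vs. model-predicted chair reaction forces."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

    A value near 1 (as for the 0.93 to 0.96 reported above) indicates the model tracks the measured forces closely across postures.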

  3. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  4. Verification of mathematical models for calculation of viscosity of molten oxide systems

    Directory of Open Access Journals (Sweden)

    S. Rosypalová

    2014-06-01

    Full Text Available The subject of this work is the comparison of numerically obtained values of dynamic viscosity, calculated using different types of mathematical models, with experimentally measured viscosity data for oxide systems. The ternary system SiO2-CaO-Al2O3, which represents a simplified basis of the casting powders used in the technological process, was submitted to the experiment. Experimental research of dynamic viscosity is highly limited by its complexity, which is why model studies play such an important role in this field. For the mathematical calculation of viscosity, the NPL, Iida, and Urbain models were chosen. The results of the simulation were compared with the experimentally obtained values of viscosity.

  5. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    Science.gov (United States)

    Shafer, Jaclyn A.; Watson, Leela R.

    2015-01-01

    Customer: NASA's Launch Services Program (LSP), Ground Systems Development and Operations (GSDO), and Space Launch System (SLS) programs. NASA's LSP, GSDO, SLS and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER), for example, to determine whether they need to limit activities such as vehicle transport to the launch pad, protect people, structures, or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as a guide for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67-kilometer Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the AMU high-resolution WRF Environmental Modeling System (EMS) model (Watson 2013) in real time. The tasking group agreed to this proposal; therefore, the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The model was set up with a triple-nested grid configuration over KSC/CCAFS based on previous AMU work (Watson 2013). The outer domain (D01) has 12-kilometer grid spacing, the middle domain (D02) has 4-kilometer grid spacing, and the inner domain (D03) has 1.33-kilometer grid spacing. The model runs a 12-hour forecast every hour; D01 and D02 output is available once an hour, and D03 output every 15 minutes during the forecast period. The AMU assessed the WRF-EMS 1

  6. Simulation studies for the in-vivo dose verification of particle therapy

    Energy Technology Data Exchange (ETDEWEB)

    Rohling, Heide

    2015-06-08

    An increasing number of cancer patients is treated with proton beams or other light ion beams, which allow dose to be delivered precisely to the tumor. However, the depth dose distribution of these particles, which enables this precision, is sensitive to deviations from the treatment plan, e.g., anatomical changes. Thus, to assure the quality of the treatment, a non-invasive in-vivo dose verification is highly desired. This monitoring of particle therapy relies on the detection of secondary radiation which is produced by interactions between the beam particles and the nuclei of the patient's tissue. Up to now, the only clinically applied method for in-vivo dosimetry is Positron Emission Tomography, which makes use of the β⁺-activity produced during the irradiation (PT-PET). Since the applied dose cannot be directly deduced from a PT-PET measurement, the simulated distribution of β⁺-emitting nuclei is used as a basis for the analysis of the measured PT-PET data. Therefore, reliable modeling of the production rates and the spatial distribution of the β⁺-emitters is required. PT-PET applied during instead of after the treatment is referred to as in-beam PET. A challenge concerning in-beam PET is the design of the PET camera, because a standard full-ring scanner is not feasible. Thus, for in-beam PET and prompt gamma imaging (PGI), dedicated detection systems and, moreover, profound knowledge about the corresponding radiation fields are required. Using various simulation codes, this thesis contributes to the modeling of the β⁺-emitters and photons produced during particle irradiation, as well as to the evaluation and optimization of hardware for both techniques. Concerning the modeling of the production of the relevant β⁺-emitters, the abilities of the Monte Carlo simulation code PHITS and of the deterministic, one-dimensional code HIBRAC were assessed. HIBRAC was substantially extended to enable the modeling of the depth-dependent yields of specific

  7. Outcomes of the JNT 1955 Phase I Viability Study of Gamma Emission Tomography for Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, Holly R.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2017-05-17

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a “universal” GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.

  8. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  9. Towards a phase field model of the microstructural evolution of duplex steel with experimental verification

    DEFF Research Database (Denmark)

    Poulsen, Stefan Othmar; Voorhees, P.W.; Lauridsen, Erik Mejdal

    2012-01-01

    A phase field model to study the microstructural evolution of a polycrystalline dual-phase material with conserved phase fraction has been implemented, and 2D simulations have been performed. For 2D simulations, the model predicts the cubic growth well-known for diffusion-controlled systems. Some interphase boundaries are found to show a persistent non-constant curvature, which seems to be a feature of multi-phase materials. Finally, it is briefly outlined how this model is to be applied to investigate microstructural evolution in duplex steel. © (2012) Trans Tech Publications, Switzerland.

  10. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Barahona, B.; Jonkman, J.; Damiani, R.; Robertson, A.; Hayman, G.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
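
The Craig-Bampton reduction described above can be sketched on a toy problem. The block below is not SubDyn's implementation; the 5-DOF spring-mass chain and every value in it are invented for illustration. It builds one interface constraint mode plus two fixed-interface normal modes and checks that the reduced model reproduces the full model's lowest natural frequency:

```python
import numpy as np

# Toy 1-D chain: 5 DOFs, grounded at one end; the last DOF is the retained
# interface, the first four are interior DOFs condensed via Craig-Bampton.
n, k, m = 5, 1.0e3, 1.0
K = np.zeros((n, n))
for a in range(n - 1):
    K[a, a] += k; K[a + 1, a + 1] += k
    K[a, a + 1] -= k; K[a + 1, a] -= k
K[0, 0] += k                      # ground spring at the fixed end
M = m * np.eye(n)

i = np.arange(n - 1)              # interior DOFs
b = np.array([n - 1])             # boundary (interface) DOF
Kii, Kib, Mii = K[np.ix_(i, i)], K[np.ix_(i, b)], M[np.ix_(i, i)]

# Constraint mode: static interior response to a unit interface displacement
Phi_c = -np.linalg.solve(Kii, Kib)

# Fixed-interface normal modes (Mii is a scalar multiple of I, so eigh applies)
w2_int, Phi_n = np.linalg.eigh(np.linalg.solve(Mii, Kii))
n_keep = 2
Phi_n = Phi_n[:, :n_keep]         # retain the two lowest modes

# Transformation T maps (interface, modal) coordinates to physical DOFs
T = np.zeros((n, len(b) + n_keep))
T[i[:, None], np.arange(len(b))] = Phi_c
T[i[:, None], len(b) + np.arange(n_keep)] = Phi_n
T[b, np.arange(len(b))] = 1.0

Kr, Mr = T.T @ K @ T, T.T @ M @ T # reduced stiffness and mass matrices

w2_full = np.linalg.eigvalsh(np.linalg.solve(M, K)).min()
w2_red = np.min(np.real(np.linalg.eigvals(np.linalg.solve(Mr, Kr))))
```

Because Craig-Bampton is a Ritz reduction, the reduced eigenvalue bounds the full one from above and, with the constraint mode included, approximates it closely.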

  11. Verification of a Predictive Model of Psychological Health at Work in Canada and France

    Directory of Open Access Journals (Sweden)

    Jean-Sébastien Boudrias

    2014-01-01

    Full Text Available The purpose of this study was to test the invariance of a predictive model of psychological health at work (PHW) in Canada and France. The model (a) defines PHW as an integrative second-order variable (low distress, high well-being) and (b) includes three categories of PHW inductors (job demands, personal resources and social-organizational resources) and one psychological intermediate variable (needs satisfaction) that were found to be directly or indirectly related to PHW in a previous study on a sample of French teachers (Boudrias, Desrumaux, Gaudreau, Nelson, Savoie and Brunet, 2011). To test if this model is invariant across countries, these data from French teachers (N = 391) were reanalyzed and compared with data from a sample of Canadian teachers (N = 480) who completed the same set of questionnaires. Results from structural equation modeling analyses indicated that the model is completely invariant across the two samples. Therefore, pathways to PHW appeared to generalize across these samples of teachers without the addition of other cultural variables. This PHW model suggests that personal resources exert considerable influence directly and indirectly on psychological health through multiple mediators. Research implications and study limitations are discussed.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS: TRI-DIM FILTER CORP. PREDATOR II MODEL 8VADTP123C23

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Predator II, Model 8VADTP123C23CC000 air filter for dust and bioaerosol filtration manufactured by Tri-Dim Filter Corporation. The pressure drop across the filter was 138 Pa clean and...

  13. Mechanistic Model for Predicting NO3—N Uptake by Plants and Its Verification

    Institute of Scientific and Technical Information of China (English)

    XUAN Jia-xiang; ZHANG Li-gan; et al.

    1991-01-01

    Some mechanistic models have been proposed to predict the NO3- concentrations in the soil solution at the root surface and the NO3-N uptake by plants, but these relatively effective non-steady state models have not yet been verified by a soil culture experiment. In the present study, a mathematical model based on nutrient transport to the roots, root length and root uptake kinetics, and taking account of inter-root competition, was used for calculation, and soil culture experiments with rice, wheat and rape plants grown on alkali, neutral and acid soils in rhizoboxes with a nylon screen as an isolator were carried out to evaluate the prediction ability of the model by comparing the measured NO3- concentrations at the root surface and the N uptake with the calculated values. Whether inter-root competition for nutrients was accounted for in the model was of less importance to the calculated N uptake, but it could induce significant changes in the relative concentrations of NO3- at the root surface. For the three soils and crops, the measured NO3-N uptake agreed well with the calculated one, and the calculated relative concentrations at the root surface were close to the measured values. However, an appropriate rectification is necessary under some conditions when the plant uptake parameter obtained in a solution culture experiment is applied to soil culture. In contrast with the present non-steady state model, the relative concentrations predicted by Phillips' steady-state model, which show an accumulation, were distinct from the measured values, which show a depletion, indicating that the present model has a better prediction ability than the steady-state model.
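
As a hedged sketch of the kind of non-steady-state calculation such mechanistic models perform (this is not the authors' model; the cylindrical-root geometry, Michaelis-Menten uptake kinetics and every parameter value are illustrative assumptions), the block below solves radial diffusion of NO3- in soil solution toward a root and reports the relative concentration at the root surface:

```python
import numpy as np

# Illustrative parameters (not from the paper)
De = 1e-6            # effective diffusion coefficient, cm^2/s
r0, r1 = 0.02, 0.5   # root radius and inter-root mid-distance, cm
C0 = 2.0             # initial solution concentration, umol/cm^3
Imax, Km = 5e-6, 0.05  # Michaelis-Menten uptake: max influx, Michaelis constant

nr, nt = 100, 20000
r = np.linspace(r0, r1, nr)
dr = r[1] - r[0]
dt = 0.2 * dr**2 / De            # explicit-scheme stability margin
C = np.full(nr, C0)

for _ in range(nt):
    flux_in = Imax * C[0] / (Km + C[0])   # uptake flux at the root surface
    Cn = C.copy()
    # interior points: dC/dt = De * (1/r) d/dr (r dC/dr)
    Cn[1:-1] = C[1:-1] + dt * De * (
        (C[2:] - 2*C[1:-1] + C[:-2]) / dr**2
        + (C[2:] - C[:-2]) / (2 * dr * r[1:-1])
    )
    Cn[0] = Cn[1] - dr * flux_in / De     # flux boundary at the root surface
    Cn[-1] = Cn[-2]                       # zero flux at the inter-root midpoint
    C = Cn

relative_conc_at_root = C[0] / C0   # < 1 indicates depletion at the root
```

A value below 1 reproduces, qualitatively, the depletion at the root surface that the non-steady-state model predicts and the steady-state model misses.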

  14. Acculturation and mental health--empirical verification of J.W. Berry's model of acculturative stress

    DEFF Research Database (Denmark)

    Koch, M W; Bjerregaard, P; Curtis, C

    2004-01-01

    OBJECTIVES: Many studies concerning mental health among ethnic minorities have used the concept of acculturation as a model of explanation, in particular J.W. Berry's model of acculturative stress. But Berry's theory has only been empirically verified a few times. The aims of the study were to examine whether Berry's hypothesis about the connection between acculturation and mental health can be empirically verified for Greenlanders living in Denmark, and to analyse whether acculturation plays a significant role in mental health among Greenlanders living in Denmark. STUDY DESIGN AND METHODS: The study used data from the 1999 Health Profile for Greenlanders in Denmark. As a measure of mental health we applied the General Health Questionnaire (GHQ-12). Acculturation was assessed from answers to questions about how the respondents value the fact that children maintain their traditional cultural...

  15. Study of verification, validation, and testing in the automated data processing system at the Department of Veterans Affairs

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, A. (Argonne National Lab., IL (USA). Energy Systems Div.); Formento, J.W.; Hill, L.G.; Riemer, C.A. (Argonne National Lab., IL (USA). Environmental Assessment and Information Sciences Div.)

    1990-01-01

    Argonne National Laboratory (ANL) studied the role of verification, validation, and testing (VV&T) in the Department of Veterans Affairs (VA) automated data processing (ADP) system development life cycle (SDLC). In this study, ANL reviewed and compared standard VV&T practices in the private and government sectors with those in the VA. The methodology included extensive interviews with, and surveys of, users, analysts, and staff in the Systems Development Division (SDD) and Systems Verification and Testing Division (SV&TD) of the VA, as well as representatives of private and government organizations, and a review of ADP standards. The study revealed that VA's approach to VV&T already incorporates some industry practices -- in particular, the use of an independent organization that relies on the acceptability of test results to validate a software system. Argonne recommends that the role of SV&TD be limited to validation and acceptance testing (defined as formal testing conducted independently to determine whether a software system satisfies its acceptance criteria). It also recommends that the role of the SDD be expanded to include verification testing (defined as formal testing or reevaluation conducted by the developer to determine whether a software development satisfies design criteria). Integrated systems testing should be performed by Operations in a production-like environment under stressful situations to assess how trouble-free and acceptable the software is to the end user. A separate, independent, quality assurance group should be responsible for ADP auditing and for helping to establish policies for managing software configurations and should report directly to the VA central office. Finally, and of no less importance, an in-house training program and procedures manual should be instituted for the entire SDLC for all involved staff; it should incorporate or reference ADP standards.

  16. DISCRETE DYNAMIC MODEL OF BEVEL GEAR – VERIFICATION THE PROGRAM SOURCE CODE FOR NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-06-01

    Full Text Available This article presents a new physical and mathematical model of a bevel gear for studying the influence of design parameters and operating factors on the dynamic state of the gear transmission. It discusses the process of verifying the correct operation of the author's calculation program used to determine solutions of the dynamic model of the bevel gear, and presents the block diagram of the computational algorithm that was used to create the program for the numerical simulation. The program source code is written in MATLAB, an interactive environment for scientific and engineering calculations.
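
A minimal sketch of the kind of discrete dynamic model such a simulation program integrates, written here in Python rather than MATLAB, and using an invented two-degree-of-freedom torsional pinion-wheel pair instead of the article's bevel-gear model (all parameter values are assumptions):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative 2-DOF torsional gear pair: pinion and wheel inertias coupled
# through mesh stiffness k_m and damping c_m along the line of action.
J1, J2 = 2e-3, 8e-3        # inertias, kg m^2
r1, r2 = 0.04, 0.08        # base radii, m
k_m, c_m = 2e8, 500.0      # mesh stiffness (N/m) and damping (N s/m)
T_in, T_out = 20.0, -10.0  # driving / load torque, N m

def rhs(t, y):
    th1, w1, th2, w2 = y
    delta = r1 * th1 - r2 * th2          # dynamic transmission error, m
    ddelta = r1 * w1 - r2 * w2
    F = k_m * delta + c_m * ddelta       # mesh force along the line of action
    return [w1, (T_in - r1 * F) / J1,
            w2, (T_out + r2 * F) / J2]

sol = solve_ivp(rhs, (0.0, 0.05), [0.0, 0.0, 0.0, 0.0],
                max_step=1e-5, rtol=1e-8, atol=1e-10)
F_end = k_m * (r1 * sol.y[0, -1] - r2 * sol.y[2, -1])
```

After the mesh oscillation decays, the mesh force settles to the quasi-steady value fixed by the torque balance, about 312.5 N for these numbers.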

  17. Marine boundary layer simulation and verification during BOBMEX-Pilot using NCMRWF model

    Indian Academy of Sciences (India)

    Swati Basu

    2000-06-01

    A global spectral model (T80L18) that is operational at NCMRWF is utilized to study the structure of the marine boundary layer over the Bay of Bengal during the BOBMEX-Pilot period. The vertical profiles of various meteorological parameters within the boundary layer are studied and verified against the available observations. The diurnal variation of various surface fields is also studied. The impact of a non-local closure scheme for the boundary layer parameterisation is seen in the simulation of the flow pattern as well as of the boundary layer structure over the oceanic region.

  18. Verification of the multi-layer SNOWPACK model with different water transport schemes

    Science.gov (United States)

    Wever, N.; Schmid, L.; Heilig, A.; Eisen, O.; Fierz, C.; Lehning, M.

    2015-12-01

    The widely used detailed SNOWPACK model has undergone constant development over the years. A notable recent extension is the introduction of a Richards equation (RE) solver as an alternative for the bucket-type approach for describing water transport in the snow and soil layers. In addition, continuous updates of snow settling and new snow density parameterizations have changed model behavior. This study presents a detailed evaluation of model performance against a comprehensive multiyear data set from Weissfluhjoch near Davos, Switzerland. The data set is collected by automatic meteorological and snowpack measurements and manual snow profiles. During the main winter season, snow height (RMSE: manually observed snow profiles do not support this conclusion. This discrepancy suggests that the implementation of RE partly mimics preferential flow effects.
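
The bucket-type approach that the Richards equation solver replaces can be sketched in a few lines (with illustrative layer capacities, not SNOWPACK's parameterizations): each layer holds liquid water up to a capacity, and any excess drains instantly to the layer below.

```python
# Minimal bucket-scheme sketch: route a melt/rain influx (mm w.e.) down a
# layered snowpack; each layer retains water up to its capacity and spills
# the excess to the next layer. Values are illustrative.
def bucket_percolation(water_mm, capacity_mm, influx_mm):
    water = list(water_mm)
    flux = influx_mm
    for j, cap in enumerate(capacity_mm):
        water[j] += flux
        flux = max(0.0, water[j] - cap)   # excess spills to the next layer
        water[j] -= flux
    return water, flux                     # flux leaving the snowpack base

layers, runoff = bucket_percolation(
    water_mm=[0.5, 0.0, 1.8], capacity_mm=[1.0, 2.0, 2.0], influx_mm=4.0)
```

Unlike a Richards equation solver, this scheme has no capillary or preferential-flow physics: water moves strictly downward and instantaneously, which is exactly the behavioral difference the study evaluates.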

  19. Verification of a binary fluid solidification model in the finite-volume flow solver

    CERN Document Server

    Waclawczyk, Tomasz

    2015-01-01

    The aim of this paper is to verify the new numerical implementation of a binary fluid, heat conduction dominated solidification model. First, we extend a semi-analytical solution to the heat diffusion equation; next, the range of its applicability is investigated. It was found that the linearization introduced to the heat diffusion equation negatively affects the ability to predict solidus and liquidus line positions whenever the magnitude of the latent heat of fusion exceeds a certain value. Next, a binary fluid solidification model is coupled with a flow solver, and is used in a numerical study of Al-4.1%Cu alloy solidification in a two-dimensional rectangular cavity. An accurate coupling between the solidification model and the flow solver is crucial for the correct forecast of solidification front positions and macrosegregation patterns.
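
As a hedged illustration of heat-conduction-dominated solidification (not the paper's model or its semi-analytical solution; the material values are invented, not those of Al-4.1%Cu), the block below uses an apparent-heat-capacity treatment in which latent heat is released linearly between the liquidus and solidus, and locates both isotherms in a chilled 1D bar:

```python
import numpy as np

# Illustrative material data
rho, cp, kth, L = 2700.0, 900.0, 100.0, 3.9e5
T_s, T_l = 820.0, 920.0          # solidus / liquidus temperatures, K
nx, dx = 200, 1e-4               # 2 cm bar, 100 um cells
T = np.full(nx, 950.0)           # initially all liquid
T[0] = 300.0                     # chilled wall (fixed temperature)

def apparent_cp(Ti):
    # latent heat smeared over the mushy interval [T_s, T_l]
    return cp + np.where((Ti > T_s) & (Ti < T_l), L / (T_l - T_s), 0.0)

alpha_max = kth / (rho * cp)     # largest diffusivity (outside the mush)
dt = 0.4 * dx**2 / alpha_max     # explicit stability margin
for _ in range(5000):
    c_app = apparent_cp(T)
    Tn = T.copy()
    Tn[1:-1] += dt * kth / (rho * c_app[1:-1]) \
        * (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
    Tn[-1] = Tn[-2]              # insulated far end
    T = Tn

# Grid indices of the solidus and liquidus isotherms
i_sol = int(np.searchsorted(T, T_s))
i_liq = int(np.searchsorted(T, T_l))
```

The gap between `i_sol` and `i_liq` is the mushy zone whose position the linearized semi-analytical solution mispredicts at large latent heat.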

  20. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, R; Kamima, T [The Cancer Institute Hospital of JFCR, Koto-ku, Tokyo (Japan); Tachibana, H; Baba, H [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Ishibashi, S; Higuchi, Y [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Yamamoto, T [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Sugawara, Y [The National Center for Global Health and Medicine, Shinjuku-ku, Tokyo (Japan); Sato, A [Itabashi Central General Hospital, Itabashi-ku, Tokyo (Japan); Nishiyama, S [Kuki General Hospital, Kuki, Saitama (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Miyaoka, S [Kamitsuga General Hospital, Kanuma, Tochigi (Japan)

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans, based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias from the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with the Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle{sup 3} (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean±2SD) of the dose difference between the TPS and the Indp were evaluated for 18 sites (head, breast, lung, pelvis, etc.). Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0±3.7 %, 2.0±2.5 % and 6.2±4.4 %, respectively. In conventional plans, most of the sites were within the 5 % action level of TG-114. However, there were systematic differences (4.0±4.0 % and 2.5±5.8 % for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy between the TPS and the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity corrections can strongly affect the dose distribution.
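
The confidence-limit bookkeeping used above can be sketched as follows; the per-field percentage differences here are synthetic, and the 5 % action level is an illustrative choice in the spirit of TG-114:

```python
import numpy as np

# Synthetic per-field dose differences (TPS vs. independent calculation), %
rng = np.random.default_rng(0)
diff_pct = rng.normal(loc=1.0, scale=1.5, size=200)

# Confidence limit CL = mean +/- 2*SD
mean, sd = diff_pct.mean(), diff_pct.std(ddof=1)
cl_low, cl_high = mean - 2 * sd, mean + 2 * sd

action_level = 5.0  # percent, illustrative site-specific tolerance
within_action = (abs(cl_low) <= action_level) and (abs(cl_high) <= action_level)
```

A nonzero mean flags a systematic offset (as reported above for breast and lung), while a wide ±2SD band flags field-to-field scatter; both must fit inside the action level.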

  1. Wind-wave coupling in the atmospheric boundary layer over a reservoir: field measurements and verification of the model

    Science.gov (United States)

    Troitskaya, Yuliya; Papko, Vladislav; Baidakov, Georgy; Vdovin, Maxim; Kandaurov, Alexander; Sergeev, Daniil

    2013-04-01

    This paper presents the results of field experiments conducted at the Gorky Reservoir to test a quasi-linear model of the atmospheric boundary layer [1]. In the course of the experiment we simultaneously measured profiles of wind speed and surface wave spectra using instruments placed on a Froude buoy, which measures the following parameters: (i) the magnitude and direction of the wind speed, with Gill WindSonic ultrasonic wind sensors located at four levels between 0.1 and 5 m; (ii) the profile of the surface waves, with a 3-channel string wave gauge with a base of 5 cm; (iii) the temperature of the water and air, with a resistive sensor. From the measured profiles of wind speed we calculated the basic parameters of the atmospheric boundary layer: the friction velocity u*, the wind speed at the standard height of 10 m, U10, and the drag coefficient CD. Data on CD(U10) obtained at the Gorky Reservoir were compared with similar data obtained on Lake George in Australia during the Australian Shallow Water Experiment (AUSWEX) conducted in 1997-1999 [2,3]. Good agreement was obtained between data measured at two inland water bodies with strongly different parameters: the deep Gorky Reservoir and the shallow Lake George. To elucidate the reasons for this coincidence of the drag coefficients under strongly different conditions, an analysis of the surface waves was conducted. Measurements have shown that in both water bodies the surface wave spectra have almost the same asymptotics (spatial spectrum ~ k^-3, frequency spectrum ~ omega^-5), corresponding to the Phillips saturation spectrum. These spectra are typically observed for steep surface waves, for which the basic dissipation mechanism is wave breaking. The similarity of the short-wave parts of the spectra can be regarded as a probable cause of the coincidence of the dependence of the drag coefficient of the water surface on wind speed. Quantitative verification of this hypothesis was carried out in the framework of the quasi-linear model of the wind
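
The reduction from a measured wind profile to u*, U10 and CD can be sketched under the standard logarithmic-layer assumption U(z) = (u*/κ) ln(z/z0); the four-level profile below is synthetic, not the buoy data:

```python
import numpy as np

kappa = 0.4                                   # von Karman constant
z = np.array([0.5, 1.0, 2.0, 5.0])            # sensor heights, m
u_true, z0_true = 0.25, 2e-4                  # synthetic u* (m/s) and roughness (m)
U = (u_true / kappa) * np.log(z / z0_true)    # synthetic "measured" speeds

# Linear regression of U on ln(z): slope = u*/kappa, intercept fixes z0
slope, intercept = np.polyfit(np.log(z), U, 1)
u_star = kappa * slope
z0 = np.exp(-intercept / slope)

U10 = (u_star / kappa) * np.log(10.0 / z0)    # extrapolate to 10 m height
CD = (u_star / U10) ** 2                      # drag coefficient
```

With real profiles the regression residuals also indicate how well the logarithmic-layer assumption holds at each averaging interval.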

  2. Development and experimental verification of a genome-scale metabolic model for Corynebacterium glutamicum

    Directory of Open Access Journals (Sweden)

    Hirasawa Takashi

    2009-08-01

    Full Text Available Abstract Background In silico genome-scale metabolic models enable the analysis of the characteristics of metabolic systems of organisms. In this study, we reconstructed a genome-scale metabolic model of Corynebacterium glutamicum on the basis of genome sequence annotation and physiological data. The metabolic characteristics were analyzed using flux balance analysis (FBA), and the results of FBA were validated using data from culture experiments performed at different oxygen uptake rates. Results The reconstructed genome-scale metabolic model of C. glutamicum contains 502 reactions and 423 metabolites. We collected the reactions and biomass components from databases and the literature, and made the model usable for flux balance analysis by filling gaps in the reaction networks and removing inadequate loop reactions. Using the framework of FBA and our genome-scale metabolic model, we first simulated the changes in the metabolic flux profiles that occur on changing the oxygen uptake rate. The predicted production yields of carbon dioxide and organic acids agreed well with the experimental data. The metabolic profiles of amino acid production phases were also investigated. A comprehensive gene deletion study was performed in which the effects of gene deletions on metabolic fluxes were simulated; this helped in the identification of several genes whose deletion resulted in an improvement in organic acid production. Conclusion The genome-scale metabolic model provides useful information for the evaluation of the metabolic capabilities and prediction of the metabolic characteristics of C. glutamicum. This can form a basis for the in silico design of C. glutamicum metabolic networks for improved bioproduction of desirable metabolites.
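
Flux balance analysis itself reduces to a linear program. The toy network below (four invented reactions and two metabolites, not the C. glutamicum reconstruction) maximizes a stand-in biomass flux under the steady-state constraint S v = 0:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v0 uptake A_ext->A, v1 A->B, v2 B->biomass, v3 B->C_ext
S = np.array([
    [1, -1,  0,  0],   # metabolite A balance
    [0,  1, -1, -1],   # metabolite B balance
])
c = np.array([0.0, 0.0, -1.0, 0.0])   # linprog minimizes, so negate biomass
bounds = [(0, 10.0),                   # uptake capped at 10 (illustrative)
          (0, None), (0, None), (0, None)]

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
v_biomass = res.x[2]
```

Gene deletions are simulated in this framework by forcing the corresponding reaction bounds to zero and re-solving, which is how the comprehensive deletion study described above operates at scale.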

  3. [Verification of the double dissociation model of shyness using the implicit association test].

    Science.gov (United States)

    Fujii, Tsutomu; Aikawa, Atsushi

    2013-12-01

    The "double dissociation model" of shyness proposed by Asendorpf, Banse, and Mücke (2002) was demonstrated in Japan by Aikawa and Fujii (2011). However, the generalizability of the double dissociation model of shyness was uncertain. The present study examined whether the results reported in Aikawa and Fujii (2011) would be replicated. In Study 1, college students (n = 91) completed explicit self-ratings of shyness and other personality scales. In Study 2, forty-eight participants completed an Implicit Association Test (IAT) for shyness, and their friends (n = 141) rated those participants on various personality scales. The results revealed that only the explicit self-concept ratings predicted other-rated low praise-seeking behavior, sociable behavior and high rejection-avoidance behavior (controlled shy behavior). Only the implicit self-concept measured by the shyness IAT predicted other-rated high interpersonal tension (spontaneous shy behavior). The results of this study are similar to the findings of the previous research, which supports the generalizability of the double dissociation model of shyness.

  4. Nanofibre distribution in composites manufactured with epoxy reinforced with nanofibrillated cellulose: model prediction and verification

    Science.gov (United States)

    Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina

    2016-07-01

    In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.
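
A stripped-down Monte Carlo sketch in the spirit of the model described above (not the authors' code: the lognormal length distribution, the row geometry and the length-dependent capture rule are all invented assumptions) samples nanofibres and tracks how deep each one penetrates before a row of fabric fibres captures it:

```python
import numpy as np

rng = np.random.default_rng(42)
n_fibres = 10000
length_um = rng.lognormal(mean=0.0, sigma=0.5, size=n_fibres)  # ~1 um median
row_spacing_um = 4.0     # assumed spacing between rows of fabric fibres
n_rows = 50              # fabric depth in rows

# Assumed capture rule: longer nanofibres are sieved out sooner
p_capture = np.clip(0.15 * length_um, 1e-4, 1.0)

# Number of rows passed before capture follows a geometric distribution
rows_survived = rng.geometric(p_capture) - 1
depth_um = np.minimum(rows_survived, n_rows) * row_spacing_um

median_depth = float(np.median(depth_um))
frac_within_20um = float(np.mean(depth_um <= 20.0))
```

The histogram of `depth_um` plays the role of the filtration profile: a large fraction captured within the first rows mirrors the ~20 µm filtering the model predicts in the absence of inter-bundle channels.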

  5. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    Science.gov (United States)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  6. Design and verifications of an eye model fitted with contact lenses for wavefront measurement systems

    Science.gov (United States)

    Cheng, Yuan-Chieh; Chen, Jia-Hong; Chang, Rong-Jie; Wang, Chung-Yen; Hsu, Wei-Yao; Wang, Pei-Jen

    2015-09-01

    Contact lenses are typically measured by the wet-box method because of the high optical power resulting from the anterior central curvature of the cornea, even though the back vertex power of the lenses is small. In this study, an optical measurement system based on the Shack-Hartmann wavefront principle was established to investigate the aberrations of soft contact lenses. Fitting conditions were mimicked to study the optical design of an eye model with various topographical shapes of the anterior cornea. Initially, the contact lenses were measured by the wet-box method, and then by fitting the various topographical shapes of the cornea to the eye model. In addition, an optics simulation program was employed to determine the sources of errors and assess the accuracy of the system. Finally, samples of soft contact lenses with various diopters were measured, and both simulation and experimental results were compared to resolve the controversies of fitting contact lenses to an eye model for optical measurements. More importantly, the results show that the proposed system can be employed for the study of primary aberrations in contact lenses.

  7. Coupled groundwater flow and transport: 1. Verification of variable density flow and transport models

    Science.gov (United States)

    Kolditz, Olaf; Ratke, Rainer; Diersch, Hans-Jörg G.; Zielke, Werner

    This work examines variable density flow and corresponding solute transport in groundwater systems. Fluid dynamics of salty solutions with significant density variations are of increasing interest in many problems of subsurface hydrology. The mathematical model comprises a set of non-linear, coupled, partial differential equations to be solved for pressure/hydraulic head and mass fraction/concentration of the solute component. The governing equations and underlying assumptions are developed and discussed. The equation of solute mass conservation is formulated in terms of mass fraction and mass concentration. Different levels of the approximation of density variations in the mass balance equations are used for convection problems (e.g. the Boussinesq approximation and its extension, fully density approximation). The impact of these simplifications is studied by use of numerical modelling. Numerical models for nonlinear problems, such as density-driven convection, must be carefully verified in a particular series of tests. Standard benchmarks for proving variable density flow models are the Henry, Elder, and salt dome (HYDROCOIN level 1 case 5) problems. We studied these benchmarks using two finite element simulators: ROCKFLOW, which was developed at the Institute of Fluid Mechanics and Computer Applications in Civil Engineering, and FEFLOW, which was developed at the Institute for Water Resources Planning and Systems Research Ltd. Although both simulators are based on the Galerkin finite element method, they differ in many approximation details such as temporal discretization (Crank-Nicolson vs predictor-corrector schemes), spatial discretization (triangular and quadrilateral elements), finite element basis functions (linear, bilinear, biquadratic), iteration schemes (Newton, Picard) and solvers (direct, iterative). The numerical analysis illustrates discretization effects and defects arising from the different levels of density approximation. 
We contribute

  8. Constraining millennial scale dynamics of a Greenland tidewater glacier for the verification of a calving criterion based numerical model

    Science.gov (United States)

    Lea, J.; Mair, D.; Rea, B.; Nick, F.; Schofield, E.

    2012-04-01

    The ability to successfully model the behaviour of Greenland tidewater glaciers is pivotal to understanding the controls on their dynamics and potential impact on global sea level. However, to have confidence in the results of numerical models in this setting, the evidence required for robust verification must extend well beyond the existing instrumental record. Perhaps uniquely for a major Greenland outlet glacier, both the advance and retreat dynamics of Kangiata Nunata Sermia (KNS), Nuuk Fjord, SW Greenland over the last ~1000 years can be reasonably constrained through a combination of geomorphological, sedimentological and archaeological evidence. It is therefore an ideal location to test the ability of the latest generation of calving criterion based tidewater models to explain millennial scale dynamics. This poster presents geomorphological evidence recording the post-Little Ice Age maximum dynamics of KNS, derived from high-resolution satellite imagery. This includes evidence of annual retreat moraine complexes suggesting controlled rather than catastrophic retreat between pinning points, in addition to a series of ice dammed lake shorelines, allowing detailed interpretation of the dynamics of the glacier as it thinned and retreated. Pending ground truthing, this evidence will contribute towards the calibration of results obtained from a calving criterion numerical model (Nick et al, 2010), driven by an air temperature reconstruction for the KNS region determined from ice core data.

  9. Experimental verification of a precooled mixed gas Joule-Thomson cryoprobe model

    Science.gov (United States)

    Passow, Kendra Lynn; Skye, Harrison; Nellis, Gregory; Klein, Sanford

    2012-06-01

    Cryosurgery is a medical technique that uses a cryoprobe to apply extreme cold to undesirable tissue such as cancers. Precooled Mixed Gas Joule-Thomson (pMGJT) cycles with Hampson-style recuperators are integrated with the latest generation of cryoprobes to create more powerful and compact instruments. Selection of gas mixtures for these cycles is not a trivial process; the focus of this research is the development of a detailed model that can be integrated with an optimization algorithm to select optimal gas mixtures. A test facility has been constructed to experimentally tune and verify this model. The facility uses a commercially available cryoprobe system that was modified to integrate measurement instrumentation sufficient to determine the performance of the system and its component parts. Spatially resolved temperature measurements allow detailed measurements of the heat transfer within the recuperator and therefore computation of the spatially resolved conductance. These data can be used to study the multiphase, multicomponent heat transfer process in the complicated recuperator geometry. The optimization model has been expanded to model the pressure drop associated with the flow to more accurately predict the performance of the system. The test facility has been used to evaluate the accuracy and usefulness of this improvement.
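
The computation of spatially resolved conductance from temperature measurements can be sketched, for a single counterflow recuperator segment, using a log-mean temperature difference. The function name and all numerical values below are hypothetical, not the facility's data.

```python
import math

def segment_conductance(q_seg, t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Local conductance UA = q / LMTD for one counterflow segment.

    q_seg: heat transferred in the segment [W]; temperatures in kelvin.
    """
    dt1 = t_hot_in - t_cold_out   # terminal temperature difference, hot end
    dt2 = t_hot_out - t_cold_in   # terminal temperature difference, cold end
    if abs(dt1 - dt2) < 1e-9:     # limit case: LMTD -> dt1
        lmtd = dt1
    else:
        lmtd = (dt1 - dt2) / math.log(dt1 / dt2)
    return q_seg / lmtd

# Hypothetical segment: 50 W transferred, 10 K and 5 K terminal differences
ua = segment_conductance(50.0, 290.0, 250.0, 245.0, 280.0)
```

Summing such per-segment values along the recuperator gives the spatially resolved conductance profile the abstract refers to.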

  10. Radial distribution of dose within heavy charged particle tracks. Models and experimental verification using LiF:Mg,Cu,P TL detectors

    CERN Document Server

    Gieszczyk, W; Olko, P; Obryk, B

    2014-01-01

    A new method of experimental verification of radial dose distribution models using solid state thermoluminescent (TL) detectors LiF:Mg,Cu,P has recently been proposed. In this work the method was applied to verify the spatial distribution of energy deposition within a single 131Xe ion track. Detectors were irradiated at the Department of Physics of the University of Jyväskylä, Finland. The obtained results have been compared with theoretical data, calculated according to the Zhang et al., Cucinotta et al. and Geiss et al. radial dose distribution (RDD) models. At the lowest dose range the Zhang et al. RDD model exhibited the best agreement with the experimental data. In the intermediate dose range, up to 10^4 Gy, the best agreement was found for the RDD model of Cucinotta et al. The probability of occurrence of doses higher than 10^4 Gy within a single 131Xe ion track was found to be lower than predicted by all the studied RDD models. This may be a result of diffusion of the charge, which is then capt...

  11. Verification of a Monte-Carlo planetary surface radiation environment model using gamma-ray data from Lunar Prospector and 2001 Mars Odyssey

    Energy Technology Data Exchange (ETDEWEB)

    Skidmore, M.S., E-mail: mss16@star.le.ac.u [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom); Ambrosi, R.M. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom)

    2010-01-01

    Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the gamma-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include gamma-ray spectroscopy, gamma-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled gamma-ray data is in good agreement with gamma-ray data obtained by the gamma-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.

  12. Experimental verification of bridge seismic damage states quantified by calibrating analytical models with empirical field data

    Institute of Scientific and Technical Information of China (English)

    Swagata Banerjee; Masanobu Shinozuka

    2008-01-01

    Bridges are one of the most vulnerable components of a highway transportation network system subjected to earthquake ground motions. Prediction of resilience and sustainability of bridge performance in a probabilistic manner provides valuable information for pre-event system upgrading and post-event functional recovery of the network. The current study integrates bridge seismic damageability information obtained through empirical, analytical and experimental procedures and quantifies threshold limits of bridge damage states consistent with the physical damage description given in HAZUS. Experimental data from a large-scale shaking table test are utilized for this purpose. This experiment was conducted at the University of Nevada, Reno, where a research team from the University of California, Irvine, participated. Observed experimental damage data are processed to identify and quantify bridge damage states in terms of rotational ductility at bridge column ends. In parallel, a mechanistic model for fragility curves is developed in such a way that the model can be calibrated against empirical fragility curves that have been constructed from damage data obtained during the 1994 Northridge earthquake. This calibration quantifies threshold values of bridge damage states and makes the analytical study consistent with damage data observed in past earthquakes. The mechanistic model is transportable and applicable to most types and sizes of bridges. Finally, calibrated damage state definitions are compared with those obtained using experimental findings. Comparison shows excellent consistency among results from analytical, empirical and experimental observations.

  13. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    Science.gov (United States)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+ activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  14. Modelling and verification of single slope solar still using ANSYS-CFX

    Energy Technology Data Exchange (ETDEWEB)

    Panchal, Hitesh N. [Research Scholar, Kadi Sarvavishwavidyalaya University, Gandhinagar (India); Shah, P.K. [Principal, Silver Oak College of Engineering and Technology, Ahmedabad (India)

    2011-07-01

    Solar distillation is an easy, small-scale and cost-effective technique for providing safe water. It requires an energy input as heat, and solar radiation can be the source of this energy. A solar still is a device which uses the process of solar distillation. Here, a two-phase, three-dimensional model of the evaporation and condensation processes in a solar still was built and simulated using ANSYS CFX. Simulation results were compared with experimental data from a single-basin solar still under the climate conditions of Mehsana (23{sup o}12' N, 72{sup o}30'). There is good agreement between the experimental and simulation results for distillate output, water temperature and heat transfer coefficients. Overall, the study shows that ANSYS CFX is a powerful tool for diagnosis as well as analysis of solar stills.

  15. Verification of the exponential model of body temperature decrease after death in pigs.

    Science.gov (United States)

    Kaliszan, Michal; Hauser, Roman; Kaliszan, Roman; Wiczling, Paweł; Buczyński, Janusz; Penkowski, Michal

    2005-09-01

    The authors have conducted a systematic study in pigs to verify the models of post-mortem body temperature decrease currently employed in forensic medicine. Twenty-four hour automatic temperature recordings were performed in four body sites starting 1.25 h after pig killing in an industrial slaughterhouse under typical environmental conditions (19.5-22.5 degrees C). The animals had been randomly selected under a regular manufacturing process. The temperature decrease time plots drawn starting 75 min after death for the eyeball, the orbit soft tissues, the rectum and muscle tissue were found to fit the single-exponential thermodynamic model originally proposed by H. Rainy in 1868. In view of the actual intersubject variability, the addition of a second exponential term to the model was demonstrated to be statistically insignificant. Therefore, the two-exponential model for death time estimation frequently recommended in the forensic medicine literature, even if theoretically substantiated for individual test cases, provides no advantage as regards the reliability of estimation in an actual case. The improvement of the precision of time of death estimation by the reconstruction of an individual curve on the basis of two dead body temperature measurements taken 1 h apart or taken continuously for a longer time (about 4 h), has also been proved incorrect. It was demonstrated that the reported increase of precision of time of death estimation due to use of a multiexponential model, with individual exponential terms to account for the cooling rate of the specific body sites separately, is artifactual. The results of this study support the use of the eyeball and/or the orbit soft tissues as temperature measuring sites at times shortly after death. A single-exponential model applied to the eyeball cooling has been shown to provide a very precise estimation of the time of death up to approximately 13 h after death. 
For the period thereafter, a better estimation of the time
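
The single-exponential (Newtonian) cooling model favoured by the study, and its inversion for the post-mortem interval, can be sketched as follows. The rate constant and temperatures are illustrative, not the study's fitted values.

```python
import math

def temperature(t_hours, t0, t_env, k):
    """Single-exponential cooling: T(t) = T_env + (T0 - T_env) * exp(-k*t)."""
    return t_env + (t0 - t_env) * math.exp(-k * t_hours)

def time_since_death(t_measured, t0, t_env, k):
    # Invert the model for the post-mortem interval; valid while the body
    # is still measurably warmer than the environment.
    return -math.log((t_measured - t_env) / (t0 - t_env)) / k

# Illustrative values only: 37 C at death, 20 C ambient, k = 0.1 / h,
# eyeball temperature measured at 30 C
pmi = time_since_death(30.0, 37.0, 20.0, 0.1)
```

The study's point is that adding a second exponential term, or re-fitting the curve from two closely spaced measurements, does not improve on this one-parameter-per-site inversion in practice.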

  16. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources--for example, via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, freedom from chemical production or loss, and freedom from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.
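
Radon's value as a transport tracer rests on its fixed radioactive half-life (about 3.8 days for Rn-222), so measured concentrations can be decay-corrected without any chemistry. A minimal sketch:

```python
RN222_HALF_LIFE_DAYS = 3.82  # approximate half-life of Rn-222

def decay_fraction(days):
    """Fraction of an initial Rn-222 amount remaining after `days`:
    N(t)/N0 = 2 ** (-t / T_half)."""
    return 2.0 ** (-days / RN222_HALF_LIFE_DAYS)

# After one half-life exactly half remains; after ~1 day roughly 83%
# remains, which is why boundary-layer radon still tags air recently
# convected to the upper troposphere.
```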

  17. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was
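
The buffer-distance idea, choosing a clearance margin from the distribution of model-vs-measurement discrepancies at a target collision probability, can be sketched as follows. The Gaussian-tail treatment and all numbers are illustrative assumptions, not the authors' exact procedure.

```python
import statistics

def safety_buffer(discrepancies_cm, z_score):
    """Buffer = mean + z * stdev of clearance discrepancies.

    A one-sided Gaussian tail is assumed: z ~= 3.09 corresponds to a
    0.1% probability that a true clearance is smaller than modeled.
    """
    mu = statistics.mean(discrepancies_cm)
    sigma = statistics.stdev(discrepancies_cm)
    return mu + z_score * sigma

# Hypothetical (measured - modeled) clearance discrepancies [cm]
sample = [0.2, -0.1, 0.5, 0.3, 0.0, 0.4, -0.2, 0.6, 0.1, 0.2]
buffer_cm = safety_buffer(sample, 3.09)
```

In the study the analogous buffers are derived per treatment site (head, lung, prostate) from the 300 measured orientations.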

  18. Analytical design model for a piezo-composite unimorph actuator and its verification using lightweight piezo-composite curved actuators

    Science.gov (United States)

    Yoon, K. J.; Park, K. H.; Lee, S. K.; Goo, N. S.; Park, H. C.

    2004-06-01

    This paper describes an analytical design model for a layered piezo-composite unimorph actuator and its numerical and experimental verification using a LIPCA (lightweight piezo-composite curved actuator) that is lighter than other conventional piezo-composite type actuators. The LIPCA is composed of top fiber composite layers with high modulus and low CTE (coefficient of thermal expansion), a middle PZT ceramic wafer, and base layers with low modulus and high CTE. The advantages of the LIPCA design are to replace the heavy metal layer of THUNDER by lightweight fiber-reinforced plastic layers without compromising the generation of high force and large displacement and to have design flexibility by selecting the fiber direction and the number of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use a resin prepreg system. A piezo-actuation model for a laminate with piezo-electric material layers and fiber composite layers is proposed to predict the curvature and residual stress of the LIPCA. To predict the actuation displacement of the LIPCA with curvature, a finite element analysis method using the proposed piezo-actuation model is introduced. The predicted deformations are in good agreement with the experimental ones.

  19. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture

  20. THE FLOOD RISK IN THE LOWER GIANH RIVER: MODELLING AND FIELD VERIFICATION

    Directory of Open Access Journals (Sweden)

    NGUYEN H. D.

    2016-03-01

    Full Text Available Problems associated with flood risk definitely represent a highly topical issue in Vietnam. The case of the lower Gianh River in the central area of Vietnam, with a watershed area of 353 km², is particularly interesting. In this area, periodically subject to flood risk, the scientific question is strongly linked to risk management. In addition, flood risk is the consequence of the hydrological hazard of an event and the damages related to this event. For this reason, our approach is based on hydrodynamic modelling using Mike Flood to simulate the runoff during a flood event. Unfortunately, the data in the studied area are quite limited. Our computation of the flood risk is based on a three-step modelling process, using rainfall data coming from 8 stations, cross sections, the topographic map and the land-use map. The first step consists of creating a 1-D model using Mike 11, in order to simulate the runoff in the minor river bed. In the second step, we use Mike 21 to create a 2-D model to simulate the runoff in the flood plain. The last step allows us to couple the two models in order to precisely describe the variables for the hazard analysis in the flood plain (the water level, the speed, the extent of the flooding). Moreover, the model is calibrated and verified using observational data of the water level at hydrologic stations and field control data (on the one hand flood height measurements, on the other hand interviews with the community and with the local councillors). We then generate GIS maps in order to improve flood hazard management, which allows us to create flood hazard maps by coupling the flood plain map and the runoff speed map. Our results show that the flood peak, caused by typhoon Nari, reached more than 6 m on October 16th 2013 at 4 p.m. (its area was extended by 149 km²),
and that the typhoon constitutes an extreme flood hazard for 11.39%, a very high hazard for 10.60%, high for 30.79%, medium for 31.91% and a light flood hazard for 15

  1. Documenting Differences between Early Stone Age Flake Production Systems: An Experimental Model and Archaeological Verification.

    Science.gov (United States)

    Presnyakova, Darya; Archer, Will; Braun, David R; Flear, Wesley

    2015-01-01

    This study investigates morphological differences between flakes produced via "core and flake" technologies and those resulting from bifacial shaping strategies. We investigate systematic variation between two technological groups of flakes using experimentally produced assemblages, and then apply the experimental model to the Cutting 10 Mid-Pleistocene archaeological collection from Elandsfontein, South Africa. We argue that a specific set of independent variables--and their interactions--including external platform angle, platform depth, measures of thickness variance and flake curvature should distinguish between these two technological groups. The role of these variables in technological group separation was further investigated using the Generalized Linear Model as well as Linear Discriminant Analysis. The Discriminant model was used to classify archaeological flakes from the Cutting 10 locality in terms of their probability of association, within either experimentally developed technological group. The results indicate that the selected independent variables play a central role in separating core and flake from bifacial technologies. Thickness evenness and curvature had the greatest effect sizes in both the Generalized Linear and Discriminant models. Interestingly the interaction between thickness evenness and platform depth was significant and played an important role in influencing technological group membership. The identified interaction emphasizes the complexity in attempting to distinguish flake production strategies based on flake morphological attributes. The results of the discriminant function analysis demonstrate that the majority of flakes at the Cutting 10 locality were not associated with the production of the numerous Large Cutting Tools found at the site, which corresponds with previous suggestions regarding technological behaviors reflected in this assemblage.
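
The discriminant step can be sketched with a two-class Fisher linear discriminant on synthetic flake attributes. The two features stand in for measures such as platform depth and thickness evenness; the data are synthetic, not the Cutting 10 measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic attribute samples: group A "core and flake", group B "bifacial"
core_flake = rng.normal([12.0, 0.30], [2.0, 0.05], size=(50, 2))
bifacial = rng.normal([7.0, 0.55], [2.0, 0.05], size=(50, 2))

def fisher_lda(a, b):
    """Fisher's two-class discriminant: w = Sw^-1 (mu_a - mu_b)."""
    mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
    sw = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)  # pooled scatter
    w = np.linalg.solve(sw, mu_a - mu_b)
    threshold = w @ (mu_a + mu_b) / 2.0  # midpoint decision boundary
    return w, threshold

w, thr = fisher_lda(core_flake, bifacial)

def classify(flake):
    # Score above the midpoint -> closer to the "core and flake" group
    return "core-and-flake" if w @ np.asarray(flake) > thr else "bifacial"
```

A probabilistic variant of this projection (posterior class probabilities rather than a hard threshold) is what lets the study assign each archaeological flake a probability of association with either technological group.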

  2. Verification SEBAL and Hargreaves –Samani Models to Estimate Evapotranspiration by Lysimeter Data

    Directory of Open Access Journals (Sweden)

    Ali Morshedi

    2017-02-01

    Full Text Available Introduction: Evapotranspiration (ET) is an important component of the hydrological cycle, the surface energy balance and the water balance. ET estimates are needed in various fields of science, such as hydrology, agriculture, forestry, pasture management and water resources management. Conventional methods estimate evapotranspiration from point measurements. Remote sensing models have the capability to estimate ET over larger scales using surface albedo, surface temperature and vegetation indices. The Surface Energy Balance Algorithm for Land (SEBAL) estimates ET at the moment of the satellite pass as the residual of the energy balance equation for each pixel. In this study, ET from the Hargreaves-Samani (HS) and SEBAL models was compared with alfalfa lysimeter data from the Shahrekord plain within the Karun basin. Satellite imagery was based on Landsat 7 ETM+ sensor data for seven satellite passes on path 164, row 38 of the World Reference System, covering the lysimeter sampling period from April to October 2011. SEBAL uses the energy balance equation to estimate evapotranspiration. Equation (1) gives the energy balance for an evaporative surface: λET = Rn - G - H (1). In this equation Rn, H, G and λET represent the net radiation flux at the surface (W/m²), the sensible heat flux (W/m²), the soil heat flux (W/m²) and the latent heat flux (W/m²), respectively. Only vertical fluxes are considered; the horizontal fluxes of energy are neglected, so the equation should be applied to large surfaces with uniform, full plant cover. SEBAL estimates ET using a minimum of ground-measured data. The model has been applied and tested in more than 30 countries with an accuracy of about 85% at field scale and 95% at daily and seasonal scales. In the Borkhar watershed (east of Isfahan, Iran), ASTER and MODIS satellite imagery was used with SEBAL for comparison against the Penman-Monteith model. Results showed that estimated
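
The residual step of the energy balance, and the conversion of a latent heat flux in W/m² to an evaporation rate in mm/day, can be sketched as follows. The flux values are illustrative, not the Shahrekord measurements.

```python
# Sketch of the SEBAL residual step: latent heat flux is what remains of
# net radiation after soil and sensible heat are subtracted.
LAMBDA = 2.45e6  # latent heat of vaporization [J/kg], ~20 C

def latent_heat_flux(rn, g, h):
    """Energy balance residual: lambda*ET = Rn - G - H  [W/m^2]."""
    return rn - g - h

def et_mm_per_day(le_flux):
    # kg/m^2/s of evaporated water equals mm/s of depth; scale to mm/day
    return le_flux / LAMBDA * 86400.0

le = latent_heat_flux(rn=500.0, g=50.0, h=150.0)  # illustrative fluxes
et = et_mm_per_day(le)  # daily-equivalent rate if the flux were sustained
```

In SEBAL proper, the instantaneous pixel-level value is scaled to a daily ET using the evaporative fraction rather than assuming a constant flux.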

  3. Steps in the construction and verification of an explanatory model of psychosocial adjustment

    Directory of Open Access Journals (Sweden)

    Arantzazu Rodríguez-Fernández

    2016-06-01

    Full Text Available The aim of the present study was to empirically test an explanatory model of psychosocial adjustment during adolescence, with psychosocial adjustment during this stage being understood as a combination of school adjustment (or school engagement) and subjective well-being. According to the hypothesized model, psychosocial adjustment depends on self-concept and resilience, which in turn act as mediators of the influence of perceived social support (from family, peers and teachers) on this adjustment. Participants were 1250 secondary school students (638 girls and 612 boys) aged between 12 and 15 years (Mean = 13.72; SD = 1.09). The results provided evidence of: (a) the influence of all three types of perceived support on subjects' resilience and self-concept, with perceived family support being particularly important in this respect; (b) the influence of the support received from teachers on school adjustment and of support received from the family on psychological well-being; and (c) the absence of any direct influence of peer support on psychosocial adjustment, although indirect influence was observed through the psychological variables studied. These results are discussed from an educational perspective and in terms of future research.

  4. Verification of AADL Models with Timed Abstract State Machines

    Institute of Scientific and Technical Information of China (English)

    杨志斌; 胡凯; 赵永望; 马殿富; Jean-Paul BODEVEIX

    2015-01-01

    This paper presents a formal verification method for AADL (architecture analysis and design language) models by TASM (timed abstract state machine) translation. The abstract syntax of the chosen subset of AADL and of TASM are given. The translation rules are defined clearly by semantic functions expressed in an ML-like language. Furthermore, the translation is implemented in the model transformation tool AADL2TASM, built on the open-source AADL modeling environment OSATE (Open Source AADL Tool Environment), which provides model checking and simulation for AADL models. Finally, a case study of a spacecraft GNC (guidance, navigation and control) system is provided.

  5. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
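
A much-simplified sketch of the underlying idea, estimating the observed order of convergence from the error model E(h) = C·h^p and taking a median over pairwise estimates rather than a single least-squares mean, is shown below. It omits the constrained-optimization framework and expert-judgment bounds, and uses synthetic data.

```python
import math
import statistics

def pairwise_orders(hs, errors):
    """Order estimates p = ln(E_i/E_j) / ln(h_i/h_j) for all level pairs."""
    ps = []
    for i in range(len(hs)):
        for j in range(i + 1, len(hs)):
            ps.append(math.log(errors[i] / errors[j]) /
                      math.log(hs[i] / hs[j]))
    return ps

# Synthetic second-order mesh sequence with one anomalous (polluted) level,
# mimicking an "ill-behaved" verification data set
hs = [0.1, 0.05, 0.025, 0.0125]
errors = [1.0e-2, 2.5e-3, 1.5e-3, 1.56e-4]  # third level is the outlier
orders = pairwise_orders(hs, errors)
robust_p = statistics.median(orders)  # median guards against the outlier
```

A plain mean of the pairwise orders would be dragged toward the polluted level; the median recovers a value near the nominal order of 2, which is the flavor of robustness the methodology formalizes.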

  6. Dakota Gasification Company CO{sub 2} sequestration verification project-a case study of greenhouse gas reduction verification and marketing

    Energy Technology Data Exchange (ETDEWEB)

    Huxley, Doug [CH2M HILL Inc., Ben Feldman/Natsource LLC, David Peightal/Dakota Gasification Company, CH2M HILL, 9193 South Jamaica Street, Englewood, CO 80112 (United States)

    2006-02-15

    The Dakota Gasification Company (DGC) and EnCana Corporation have jointly implemented a project to allow CO{sub 2} emissions from the Great Plains Synfuels Plant in North Dakota to be captured and used for enhanced oil recovery operations at the Weyburn Oil Field in Saskatchewan. This paper shares the experience of CH2M HILL and Natsource, LLC in quantification of primary and secondary greenhouse gas impacts of the project, registration of the project with DOE's 1605(b) program, development of a monitoring and verification plan to allow tracking of past and future emissions reductions, and analysis of strategies for monetizing the emissions reductions resulting from this cross-border project. The monetary values of the emission reductions were assessed under several different scenarios for future GHG policy. (author)

  7. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    Science.gov (United States)

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  8. Informational model verification of ZVS Buck quasi-resonant DC-DC converter

    Science.gov (United States)

    Vakovsky, Dimiter; Hinov, Nikolay

    2016-12-01

    The aim of this paper is to create a polymorphic informational model of a ZVS Buck quasi-resonant DC-DC converter for the purpose of modeling the object. The model is built on flexible open standards for setting, storing, publishing and exchanging data in a distributed information environment. The resulting model supports the creation of many variants of different types, with different configurations of the constituent elements and different inner models of the examined object.

  9. Verification and Validation of the Coastal Modeling System. Report 2: CMS-Wave

    Science.gov (United States)

    2011-12-01

    wave models in this category are ideal for generation, growth and transformation of wind-waves over large distances (fetches) in regional-scale...quantitative model-to-data intercomparison or model-to-model intercomparison. Both evaluations involve assessment of the methods and data required for...combined wind and wave modeling capabilities of CMS-Wave in a large tidally-dominated inlet environment with an energetic wave climate. Extensive field

  10. Dynamic modeling of double-helical gear with Timoshenko beam theory and experiment verification

    Directory of Open Access Journals (Sweden)

    Jincheng Dong

    2016-05-01

    Full Text Available In the dynamic study of the double-helical gear transmission, the coupling shaft in the middle of the two helical gears is difficult to handle accurately. In this article, the coupling shaft is treated as Timoshenko beam elements and is synthesized with the lumped-mass method for the two helical gear pairs. Then, the numerical integration method is used to solve the amplitude–frequency responses and dynamic factors under diverse operating conditions. A gear vibration test rig with a closed power circuit is developed for in-depth experimental measurements and model validation. After comparing the theoretical data with the practical results, the following conclusions are drawn: (1) the dynamic model with the Timoshenko beam element is quite appropriate and reliable in the dynamic analysis of double-helical gear transmission and is of great theoretical value in accurate dynamic research of the double-helical gear transmission; (2) in both theoretical analysis and experimental measurements, the dynamic factors of the gear pair diminish with increasing input torque and grow with increasing input speed; (3) the deviation ratio between the theoretical data and the experimental results decreases with increasing input torque, reaching its minimum at the highest input speed.

  11. Quantum Chemical Mass Spectrometry: Verification and Extension of the Mobile Proton Model for Histidine

    Science.gov (United States)

    Cautereels, Julie; Blockhuys, Frank

    2017-06-01

    The quantum chemical mass spectrometry for materials science (QCMS2) method is used to verify the proposed mechanism for proton transfer by histidine, the Mobile Proton Model (MPM), for ten XHS tripeptides, based on quantum chemical calculations at the DFT/B3LYP/6-311+G* level of theory. The fragmentations of the different intermediate structures in the MPM mechanism are studied within the QCMS2 framework, and the energetics of the proposed mechanism itself and those of the fragmentations of the intermediate structures are compared, leading to computational confirmation of the MPM. In addition, the calculations suggest that the mechanism should be extended from considering only the formation of five-membered ring intermediates to include larger-ring intermediates.

  12. The lonely mouse: verification of a separation-induced model of depression in female mice.

    Science.gov (United States)

    Martin, Alison L; Brown, Richard E

    2010-02-11

    Animal models of depression seldom test females, even though women are twice as likely as men to suffer from major depressive disorder. Since female mice are sensitive to social isolation, we tested a separation-based model of depression in three experiments. In experiment 1 female C57BL/6J mice were housed in three conditions: isolated (housed individually from 8 weeks of age), separated (housed in groups and then separated and housed individually at 23 weeks of age) and grouped (housed in groups from 8 weeks of age). At 24 weeks of age, there was a significant increase in weight and in immobility in individually housed mice in the forced swim test (FST) and tail suspension test (TST), a reduction in transitions in the L/D box, a reduced startle response and reduced prepulse inhibition, but no differences in cued or context fear conditioning. Experiment 2 showed that fluoxetine treatment administered via drinking water attenuated depressive-like behaviour in the FST and TST in individually housed female C57BL/6J mice, but had no effect on anxiety-like behaviour. Experiment 3 found that group-housed females had higher baseline corticosterone (CORT) levels than isolated females and fluoxetine had no effect on CORT levels. Thus, separation from group housing is a reliable and valid method for inducing depression-like behaviour in female mice. This procedure is both versatile, allowing for the study of genetic and environmental interactions, and accessible, making it useful for studying depression and testing new drugs for its treatment.

  13. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  14. Hysteresis modelling and experimental verification of a Fe–Ga alloy magnetostrictive actuator

    Science.gov (United States)

    Wei, Zhu; Lei Xiang, Bian; Gangli, Chen; Shuxin, Liu; Qinbo, Zhou; Xiaoting, Rui

    2017-03-01

    In order to accurately describe the asymmetric rate- and bias-dependent hysteresis of a Fe–Ga alloy magnetostrictive actuator, a comprehensive model is put forward, composed of a phenomenological model describing hysteresis with the modified Bouc–Wen hysteresis operator, and a theoretical model representing the dynamic characteristics. An experimental system is set up to verify the performance of the comprehensive model. Results show that the modified Bouc–Wen model can effectively describe the dynamics and hysteresis characteristics of the Fe–Ga alloy magnetostrictive actuator, and highlight significantly improved accuracy in modelling the actuator.

  15. Modelling and Simulation of Variable Speed Thruster Drives with Full-Scale Verification

    Directory of Open Access Journals (Sweden)

    Jan F. Hansen

    2001-10-01

    Full Text Available In this paper, considerations about the modelling and simulation of variable speed thruster drives are made, with comparison to full-scale measurements from the Varg FPSO. For special-purpose vessels with electric propulsion operating in DP (Dynamic Positioning) mode, the thruster drives are essential for vessel operation. Different modelling strategies for thruster drives are discussed. An advanced thruster drive model with a dynamic motor model and the field vector control principle is shown. Simulations are performed with both the advanced model and a simplified model, and are compared with full-scale measurements from the Varg FPSO. The simulation results correspond well with the measurements, for both the simplified model and the advanced model.

  16. Experimental verification of optical models of graphene with multimode slab waveguides.

    Science.gov (United States)

    Chang, Zeshan; Chiang, Kin Seng

    2016-05-01

    We compare three optical models of graphene, namely, the interface model, the isotropic model, and the anisotropic model, and verify them experimentally with two multimode slab waveguide samples operating at the wavelengths of 632.8 and 1536 nm. By comparing the calculated graphene-induced losses and the measurement data, we confirm that the interface model and the anisotropic model give correct results for both the transverse electric (TE) and transverse magnetic modes, while the isotropic model gives correct results only for the TE modes. With the experimental data, we also quantitatively verify the widely used expression for the surface conductivity of graphene in the optical regime. Our findings clarify the issue of modeling graphene in the analysis of graphene-incorporated waveguides and offer deeper insight into the optical properties of graphene for waveguide applications.
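
    As a quick numerical cross-check of the conductivity expression the experiments probe: in the optical regime, graphene's surface conductivity approaches the universal value σ0 = e²/(4ħ), which for a free-standing sheet corresponds to the well-known single-pass absorption πα ≈ 2.3% (α being the fine-structure constant). The sketch below simply evaluates these constants from CODATA values; it is an illustrative check, not the paper's waveguide models:

```python
import math

# CODATA 2018 values (SI units)
e = 1.602176634e-19        # elementary charge [C]
hbar = 1.054571817e-34     # reduced Planck constant [J s]
eps0 = 8.8541878128e-12    # vacuum permittivity [F/m]
c = 2.99792458e8           # speed of light [m/s]

sigma0 = e**2 / (4 * hbar)                      # universal optical conductivity [S]
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)  # fine-structure constant ~ 1/137
absorption = math.pi * alpha                    # single-pass absorption of suspended graphene
```

In a waveguide, the graphene-induced modal loss depends on how the evanescent field overlaps the sheet, which is exactly where the interface/isotropic/anisotropic modeling choices above diverge.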

  17. Analysis of the State of the Art Contingency Analysis Model (SOTACA), Air Module Verification

    Science.gov (United States)

    1990-03-01

    Contents include: General Information; Model Uses; Proponent and Users; System Requirements; History; Reasons to Use; Model Operation; Preprocessor; Area Files...Definition File (ADF); Theater Data; Munitions and Target Effects Data; Air Data; Decision Threshold File (DTF); Scenario; Test Cases; Null Case...information, model uses, proponent and users, system requirements, model history, and reasons to use SOTACA. This section also covers the four phases of

  18. Verification Test of the SURF and SURFplus Models in xRage: Part II

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-20

    The previous study used an underdriven detonation wave (steady ZND reaction zone profile followed by a scale-invariant rarefaction wave) for PBX 9502 as a validation test of the implementation of the SURF and SURFplus models in the xRage code. Even with a fairly fine uniform mesh (12,800 cells for 100 mm) the detonation wave profile had limited resolution due to the thin reaction zone width (0.18 mm) for the fast SURF burn rate. Here we study the effect of finer resolution by comparing results of simulations with cell sizes of 8, 2 and 1 μm, which correspond to 25, 100 and 200 points within the reaction zone. With finer resolution the lead shock pressure is closer to the von Neumann spike pressure, and there is less noise in the rarefaction wave due to fluctuations within the reaction zone. As a result the average error decreases. The pointwise error is still dominated by the smearing of the pressure kink in the vicinity of the sonic point, which occurs at the end of the reaction zone.

  19. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  20. METHANOGENESIS AND SULFATE REDUCTION IN CHEMOSTATS: II. MODEL DEVELOPMENT AND VERIFICATION

    Science.gov (United States)

    A comprehensive dynamic model is presented that simulates methanogenesis and sulfate reduction in a continuously stirred tank reactor (CSTR). This model incorporates the complex chemistry of anaerobic systems. A salient feature of the model is its ability to predict the effluent ...

  1. Derivation, calibration and verification of macroscopic model for urban traffic flow. Part 1

    CERN Document Server

    Kholodov, Yaroslav A; Kholodov, Aleksandr S; Vasiliev, Mikhail O; Kurzhanskiy, Alexander A

    2016-01-01

    In this paper we present a second-order hydrodynamic traffic model that generalizes the existing second-order models of Payne-Whitham, Zhang and Aw-Rascle. In the proposed model, we introduce the pressure equation describing the dependence of "traffic pressure" on traffic density. The pressure equation is constructed for each road segment from the fundamental diagram that is estimated using measurements from traffic detectors. We show that properties of any phenomenological model are fully defined by the pressure equation. We verify the proposed model through simulations of the Interstate 580 freeway segment in California, USA, with traffic measurements from the Performance Measurement System (PeMS).
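
    As a rough illustration of the estimation step described above, the sketch below fits a Greenshields fundamental diagram Q(ρ) = v_f·ρ·(1 − ρ/ρ_max) to detector measurements by linear least squares on the equilibrium speed V = Q/ρ; the paper's pressure equation is then built per road segment from such an estimated diagram. The Greenshields functional form and the function names are illustrative assumptions, not the authors' choice:

```python
import numpy as np

def fit_greenshields(rho, q):
    """Fit Q(rho) = v_f * rho * (1 - rho/rho_max) from detector data by
    linear least squares on V = Q/rho = v_f - (v_f/rho_max) * rho."""
    v = q / rho
    A = np.column_stack([np.ones_like(rho), -rho])
    (v_f, slope), *_ = np.linalg.lstsq(A, v, rcond=None)
    return v_f, v_f / slope   # free-flow speed, jam density

# synthetic detector data: v_f = 30 (speed units), rho_max = 120 (density units)
rho = np.linspace(5.0, 110.0, 50)
q = 30.0 * rho * (1.0 - rho / 120.0)
v_f, rho_max = fit_greenshields(rho, q)
```

With real loop-detector data the flow-density scatter is noisy and multi-branched, which is why the per-segment estimation (and hence the pressure equation) matters for model accuracy.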

  2. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification (SW). Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification, paying particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model checking are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis.
    Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and

  3. FEM modeling for 3D dynamic analysis of deep-ocean mining pipeline and its experimental verification

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    3D dynamic analysis models of a 1000 m deep-ocean mining pipeline, including the steel lift pipe, pump, buffer and flexible hose, were established by the finite element method (FEM). The coupling effect of the steel lift pipe and flexible hose, and the main external loads on the pipeline, were considered in the models, such as gravity, buoyancy, hydrodynamic forces, internal and external fluid pressures, concentrated suspension buoyancy on the flexible hose, and the torsional moment and axial force induced by pump operation. Relevant FEM models and solution techniques were developed for the various 3D transient behaviors of the integrated deep-ocean mining pipeline, including the towing motions of track-keeping operation and the launch process of the pipeline. Meanwhile, an experimental verification system in a towing water tank, with characteristics similar to the designed mining pipeline, was developed to verify the accuracy of the FEM models and dynamic simulation. The experiments show that the measured and simulated pipe stresses coincide. Further simulations of the 1000 m deep-ocean mining pipeline show that, to form a saddle-shaped configuration, the total concentrated suspension buoyancy of the flexible hose should be 95%-105% of the weight of the flexible hose in water, with the first suspension point carrying 1/3 of the total buoyancy and the second suspension point carrying 2/3. When the towing velocity of the mining system is less than 0.5 m/s, the towing track of the buffer coincides with the ship's planned route on the whole, and the configuration of the flexible hose is also kept well.

  4. Fingerprint verification based on wavelet subbands

    Science.gov (United States)

    Huang, Ke; Aviyente, Selin

    2004-08-01

    Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae detection based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial improvement of image quality and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature for each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm in this paper is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
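
    The per-subband GGD fit described above is commonly done by moment matching: for p(x) ∝ exp(−(|x|/α)^β), the ratio m1²/m2 of the first absolute moment squared to the second moment equals Γ(2/β)²/(Γ(1/β)Γ(3/β)), which can be solved numerically for the shape β; the scale α then follows from m1. The sketch below is an illustrative reconstruction of this parameter estimation (the paper's rule for combining parameters across subbands to match two fingerprints is not reproduced):

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_fit(coeffs):
    """Moment-matching estimate of Generalized Gaussian Density parameters
    (alpha = scale, beta = shape) for one wavelet subband."""
    x = np.abs(np.asarray(coeffs, dtype=float))
    m1, m2 = x.mean(), (x**2).mean()
    r = m1**2 / m2  # theoretical value: gamma(2/b)^2 / (gamma(1/b)*gamma(3/b))
    beta = brentq(lambda b: gamma(2.0/b)**2 / (gamma(1.0/b) * gamma(3.0/b)) - r,
                  0.2, 10.0)
    alpha = m1 * gamma(1.0/beta) / gamma(2.0/beta)
    return alpha, beta

# sanity check on Gaussian "coefficients": beta should come out near 2,
# and alpha near sigma*sqrt(2)
rng = np.random.default_rng(0)
alpha, beta = ggd_fit(rng.normal(0.0, 1.5, 200000))
```

Real subband coefficients are heavy-tailed (β well below 2), which is precisely what makes the (α, β) pair a discriminative texture feature.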

  5. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/ MDP/ Mask/ silicon lithography flow. The important potential sources of variation we focus on here originate on the basis of VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood for costly long-loop iterations between OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or by judicious design of VSB shots and dose assignment.

  6. Initial Experimental Verification of the Neutron Beam Modeling for the LBNL BNCT Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; McDonald, R.J.; Smith, A.R.; Stone, N.A.; Vuji, J.

    1999-01-19

    In preparation for future clinical BNCT trials, neutron production via the 7Li(p,n) reaction, as well as subsequent moderation to produce epithermal neutrons, has been studied. Proper design of a moderator and filter assembly is crucial in producing an optimal epithermal neutron spectrum for brain tumor treatments. Based on in-phantom figures-of-merit, desirable assemblies have been identified. Experiments were performed at the Lawrence Berkeley National Laboratory's 88-inch cyclotron to characterize epithermal neutron beams created using several microamperes of 2.5 MeV protons on a lithium target. The neutron moderating assembly consisted of Al/AlF3 and Teflon, with a lead reflector, to produce an epithermal spectrum strongly peaked at 10-20 keV. The thermal neutron fluence was measured as a function of depth in a cubic lucite head phantom by neutron activation of gold foils. Portions of the neutron spectrum were measured by in-air activation of six cadmium-covered materials (Au, Mn, In, Cu, Co, W) with high epithermal neutron absorption resonances. The results are reasonably reproduced by Monte Carlo computational models, confirming their validity.

  7. Establishment and Verification of MCNP Neutron Transport Model About Tianwan Nuclear Power Plant

    Institute of Scientific and Technical Information of China (English)

    ZHOU; Qi

    2012-01-01

    In order to calculate the neutron flux in the surveillance box and reactor pressure vessel of the Tianwan NPP, we need to build a neutron transport model using the Monte Carlo code MCNP. The core of the NPP is very complicated to model, so we put forward some simplifying assumptions for the neutron transport model. Extensive calculations have been carried out to show that the assumptions are valid and suitable.

  8. EMPIRICAL VERIFICATION OF ANISOTROPIC HYDRODYNAMIC TRAFFIC MODEL IN TRAFFIC ANALYSIS AT INTERSECTIONS IN URBAN AREA

    Institute of Scientific and Technical Information of China (English)

    WEI Yan-fang; GUO Si-ling; XUE Yu

    2007-01-01

    In this article, a traffic hydrodynamic model considering the driver's reaction time was applied to traffic analysis at intersections on real roads. In numerical simulation with the model, a pinch effect of the right-turning vehicle flow was found, which mainly leads to traffic jamming on the straight lane. All of the results, in accordance with the empirical data, confirm the applicability of this model.

  9. Experimental verification of computational model for wind turbine blade geometry design

    Directory of Open Access Journals (Sweden)

    Štorch Vít

    2015-01-01

    Full Text Available A 3D potential flow solver with an unsteady force-free wake model, intended for optimization of blade shape for wind power generation, is applied to a test case formed by a wind turbine with a vertical axis of rotation. The calculation is sensitive to correct modelling of the wake and its interaction with the blades. The validity of the flow solver is verified by comparing experimentally obtained performance data of the model rotor with the numerical results.

  10. Derivation, calibration and verification of macroscopic model for urban traffic flow. Part 2

    CERN Document Server

    Alekseenko, Andrey E; Kholodov, Aleksandr S; Goreva, Anna I; Kurzhanskiy, Alexander A; Chehovich, Yuriy V; Starozhilets, Vsevolod M

    2016-01-01

    In this paper, we propose a unified procedure for the calibration of macroscopic second-order multilane traffic models. The focus is on calibrating the fundamental diagram using a combination of stationary detector data and GPS traces; the GPS traces are used to estimate the deceleration wave speed. The calibrated model adequately represents the three phases of traffic: free flow, synchronized flow and the wide moving jam. The proposed approach was validated in simulation using stationary detector data and GPS traces from the Moscow Ring Road. Simulation showed that the proposed second-order model is more accurate than the first-order LWR model.

  11. Modeling and experimental verification for a broad beam light transport in optical tomography.

    Science.gov (United States)

    Janunts, Edgar; Pöschinger, Thomas; Eisa, Fabian; Langenbucher, Achim

    2010-01-01

    This paper describes a general theoretical model for computing a broad beam excitation light transport in a 3D diffusion medium. The model is based on the diffusion approximation of the radiative transport equation. An analytical approach for the light propagation is presented by deriving a corresponding Green's function. A finite cylindrical domain with a rectangular cross section was considered as a 3D homogeneous phantom model. The results of the model are compared with corresponding experimental data. The measurements are done on solid and liquid phantoms replicating tissue-like optical properties.
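
    For the infinite-medium limit of the diffusion approximation used above, the point-source Green's function has the standard closed form Φ(r) = exp(−μ_eff·r)/(4π·D·r), with D = 1/(3(μ_a + μ_s′)) and μ_eff = √(3·μ_a·(μ_a + μ_s′)). The paper's Green's function for a finite rectangular-cross-section cylinder reduces to this far from the boundaries. The sketch below evaluates only the infinite-medium form; the coefficient values are illustrative tissue-like numbers, not the phantom properties from the paper:

```python
import math

def diffusion_point_source(r, mu_a, mu_s_prime):
    """Fluence rate of an isotropic point source in an infinite homogeneous
    diffusive medium (diffusion approximation of the radiative transport
    equation). r in mm, optical coefficients in 1/mm."""
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))                 # diffusion coefficient [mm]
    mu_eff = math.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))  # effective attenuation [1/mm]
    return math.exp(-mu_eff * r) / (4.0 * math.pi * D * r)

# illustrative tissue-like properties: mu_a = 0.01/mm, mu_s' = 1.0/mm
phi = [diffusion_point_source(r, 0.01, 1.0) for r in (1.0, 5.0, 10.0)]
```

A broad-beam excitation, as in the paper, is then obtained by superposing such point-source responses over the illuminated surface.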

  12. Predicted and actual indoor environmental quality: Verification of occupants' behaviour models in residential buildings

    DEFF Research Database (Denmark)

    Andersen, Rune Korsholm; Fabi, Valentina; Corgnati, Stefano P.

    2016-01-01

    performance using building energy performance simulations (BEPS). However, the validity of these models has only been sparsely tested. In this paper, stochastic models of occupants' behaviour from literature were tested against measurements in five apartments. In a monitoring campaign, measurements of indoor...... station close by. The stochastic models of window opening and heating set-point adjustments were implemented in the BEPS tool IDA ICE. Two apartments from the monitoring campaign were simulated using the implemented models and the measured weather data. The results were compared to measurements from...

  13. Modeling of bubble detachment in reduced gravity under the influence of electric fields and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Herman, Cila [Johns Hopkins University, Department of Mechanical Engineering, Baltimore, MD 21218 (United States); Iacona, Estelle [Johns Hopkins University, Department of Mechanical Engineering, Baltimore, MD 21218 (United States); Ecole Centrale, Laboratoire EM2C, Paris UPR 288 (France)

    2004-10-01

    A simple model for predicting bubble volume and shape at detachment in reduced gravity under the influence of electric fields is described in the paper. The model is based on relatively simple thermodynamic arguments and relies on and combines several models described in the literature. It accounts for the level of gravity and the magnitude of the electric field. For certain conditions of bubble development the properties of the bubble source are also considered. Computations were carried out for a uniform unperturbed electric field for a range of model parameters, and the significance of model assumptions and simplifications is discussed for the particular method of bubble formation. Experiments were conducted in terrestrial conditions and reduced gravity (during parabolic flights in NASA's KC-135 aircraft) by injecting air bubbles through an orifice into the electrically insulating working fluid, PF5052. Bubble shapes visualized experimentally were compared with model predictions. Measured data and model predictions show good agreement. The results suggest that the model can provide quick engineering estimates concerning bubble formation for a range of conditions (both for formation at an orifice and boiling) and such a model reduces the need for complex and expensive numerical simulations for certain applications. (orig.)

  14. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object...... of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may...

  15. Verification of stochastic behavioural models of occupants' interactions with windows in residential buildings

    DEFF Research Database (Denmark)

    Fabi, Valentina; Andersen, Rune Korsholm; Corgnati, Stefano

    2015-01-01

    Realistic characterisation of occupants' window opening behaviour is crucial for reliable prediction of building performance by means of building energy performance simulations. Window opening behaviour has been investigated by several researchers, leading to a variety of logistic regression models....... Initially three models from literature were investigated by comparison of predicted probabilities and the actual measured state of the windows. Data from one of the papers was reanalysed to create new models, based on measurements from single dwellings. These models were used to predict window transition...

  16. Mathematical modelling of a single-line flow-injection analysis systems with single-layer enzyme electrode detection. Pt. 3; Experimental verification of the model

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, S.D. (Sofia Univ. (Bulgaria). Khimicheski Fakultet); Nagy, Geza; Pungor, Ernoe (Budapesti Mueszaki Egyetem, Budapest (Hungary). Altalanos es Analitikai Kemia Tanszek)

    1991-11-20

    Glucose and urea electrodes, prepared by two different enzyme immobilization techniques and used as detectors in a single-line flow-injection manifold, were experimentally investigated for elucidating the influence of their most important parameters, i.e., the initial substrate concentration in the sample, the enzyme concentration in the reaction layer and its thickness and the buffer concentration, on the output signal. The results obtained were compared with the theoretical predictions based on simulations of the model for single-line flow-injection systems with single-layer enzyme electrode detection. The good qualitative agreement which was observed is a convincing experimental verification of this model and the guidelines for the production of flow-through biocatalytic electrodes with optimum design based upon it. (author). 12 refs.; 6 figs.

  17. Probabilistic verification of Architectural software models using SoftArc and Prism

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.; Kuntz, G.W.M.; Leitner-Fischer, F.; Remke, Anne Katharina Ingrid; Roolvink, S.

    2010-01-01

    In this paper we will describe the SoftArc approach. With the SoftArc approach it is possible to model and analyse safety-critical embedded and distributed systems that consist of both hard- and software. We are going to present the SoftArc modelling language, its syntax and semantics. The semantics ...

  18. Using formal concept analysis for the verification of process-data matrices in conceptual domain models

    NARCIS (Netherlands)

    Poelmans, J.; Dedene, G.; Snoeck, M.; Viaene, S.; Fox, R.; Golubski, W.

    2010-01-01

    One of the first steps in a software engineering process is the elaboration of the conceptual domain model. In this paper, we investigate how Formal Concept Analysis can be used to formally underpin the construction of a conceptual domain model. In particular, we demonstrate that intuitive verification ...

  19. Erythrocyte lysis in isotonic solution of ammonium chloride: Theoretical modelling and experimental verification

    NARCIS (Netherlands)

    Chernyshev, A.V.; Tarasov, P.A.; Semianov, K.A.; Nekrasov, V.M.; Hoekstra, A.G.; Maltsev, V.P.

    2008-01-01

    A mathematical model of erythrocyte lysis in isotonic solution of ammonium chloride is presented within the framework of a statistical approach. The model is used to evaluate several parameters of mature erythrocytes (volume, surface area, hemoglobin concentration, number of anionic exchangers on membrane, elasticity, ...).

  20. GNSS-R nonlocal sea state dependencies: Model and empirical verification

    Science.gov (United States)

    Chen-Zhang, David D.; Ruf, Christopher S.; Ardhuin, Fabrice; Park, Jeonghwan

    2016-11-01

    Global Navigation Satellite System Reflectometry (GNSS-R) is an active, bistatic remote sensing technique operating at L-band frequencies. GNSS-R signals scattered from a rough ocean surface are known to interact with longer surface waves than traditional scatterometry and altimetry signals. A revised forward model for GNSS-R measurements is presented which assumes an ocean surface wave spectrum that is forced by other sources than just the local near-surface winds. The model is motivated by recent spaceborne GNSS-R observations that indicate a strong scattering dependence on significant wave height, even after controlling for local wind speed. This behavior is not well represented by the most commonly used GNSS-R scattering model, which features a one-to-one relationship between wind speed and the mean-square-slope of the ocean surface. The revised forward model incorporates a third generation wave model that is skillful at representing long waves, an anchored spectral tail model, and a GNSS-R electromagnetic scattering model. In comparisons with the spaceborne measurements, the new model is much better able to reproduce the empirical behavior.

  1. Verification of mesoscopic models of viscoelastic fluids with a non-monotonic flow curve

    Science.gov (United States)

    Kuznetsova, Julia L.; Skul'skiy, Oleg I.

    2016-02-01

    The non-monotonic flow curve of a 1 wt.% polyacrylonitrile solution in dimethyl sulfoxide is described by two mesoscopic models: the modified Vinogradov-Pokrovsky model and the model proposed by Remmelgas, Harrison and Leal. To obtain an adequate description of the experimental curve, we have selected suitable internal parameters for these models. Analytical solutions for the Couette-Poiseuille flow problems are determined in parametric form, which allows us to plot the distribution of stress components and anisotropy tensor as well as the velocity profiles containing closed loops and weak tangential discontinuities. It is shown that both models predict a similar qualitative picture of structure evolution, but exhibit a significant discrepancy in the quantitative description of the magnitude of molecular chain stretching.

  2. Car-following model with relative-velocity effect and its experimental verification

    Science.gov (United States)

    Shamoto, Daisuke; Tomoeda, Akiyasu; Nishi, Ryosuke; Nishinari, Katsuhiro

    2011-04-01

    In driving a vehicle, drivers respond to the changes of both the headway and the relative velocity to the vehicle in front. In this paper a new car-following model including these maneuvers is proposed. The acceleration of the model becomes infinite (has a singularity) when the distance between two vehicles is zero, and the asymmetry between the acceleration and the deceleration is incorporated in a nonlinear way. The model is simple but contains enough features of driving for reproducing real vehicle traffic. From the linear stability analysis, we confirm that the model shows the metastable homogeneous flow around the critical density, beyond which a traffic jam emerges. Moreover, we perform experiments to verify this model. From the data it is shown that the acceleration of a vehicle has a positive correlation with the relative velocity.
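    A minimal numerical sketch of a car-following model with a relative-velocity term is shown below; the optimal-velocity form and all parameters are invented stand-ins rather than the authors' singular model:

```python
import math

# Minimal car-following sketch: optimal-velocity relaxation plus a
# relative-velocity term (dv = v_leader - v). Generic stand-in with
# invented parameters, not the model proposed in the paper.
def accel(headway, v, dv, a_max=1.0, h0=2.0, v_max=30.0, lam=0.5):
    v_opt = v_max * (math.tanh(headway - h0) + math.tanh(h0)) / (1 + math.tanh(h0))
    return a_max * (v_opt - v) + lam * dv

# A follower starting 100 m behind a leader cruising at 20 m/s:
dt, x_lead, v_lead = 0.1, 100.0, 20.0
x, v = 0.0, 25.0
for _ in range(2000):
    a = accel(x_lead - x, v, v_lead - v)
    v = max(0.0, v + a * dt)
    x += v * dt
    x_lead += v_lead * dt
print(round(v, 2))  # the follower relaxes toward the leader's speed
```

With both the headway term and the relative-velocity term active, the follower settles into an equilibrium gap at the leader's speed, mirroring the positive correlation between acceleration and relative velocity reported above.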

  3. Analysis of the Properties of Adjoint Equations and Accuracy Verification of Adjoint Model Based on FVM

    Directory of Open Access Journals (Sweden)

    Yaodeng Chen

    2014-01-01

    Full Text Available There are two different approaches to formulating an adjoint numerical model (ANM). Aiming at the disputes arising from the construction methods of ANM, the differences between the nonlinear shallow water equation and its adjoint equation are analyzed, and the hyperbolicity and homogeneity of the adjoint equation are discussed. Then, based on unstructured meshes and the finite volume method, a new adjoint model is advanced by obtaining the numerical model of the adjoint equations directly. Using a gradient check, the correctness of the adjoint model is verified. The results of twin experiments to invert the bottom friction coefficient (Manning's roughness coefficient) indicate that the adjoint model can extract the observation information and produce good-quality inversions. The reasons for the disputes about construction methods of ANM are also discussed in the paper.
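    A gradient check of the kind mentioned above can be sketched with a toy cost function (illustrative only, not the paper's shallow-water code):

```python
# Toy gradient check of the kind used to verify an adjoint model: the
# ratio (J(x + a*h) - J(x)) / (a * <grad J, h>) should approach 1 as the
# step a shrinks. Here J is a simple quadratic and grad() plays the role
# of the gradient an adjoint code would deliver.
def cost(x):
    return sum((xi - 1.0) ** 2 for xi in x)

def grad(x):
    return [2.0 * (xi - 1.0) for xi in x]

def gradient_check(x, h, alphas):
    gh = sum(gi * hi for gi, hi in zip(grad(x), h))
    j0 = cost(x)
    return [(cost([xi + a * hi for xi, hi in zip(x, h)]) - j0) / (a * gh)
            for a in alphas]

x = [0.3, -1.2, 2.0]
h = [1.0, 1.0, 1.0]
ratios = gradient_check(x, h, [1e-1, 1e-2, 1e-3, 1e-4])
print(ratios)  # values approach 1 as the perturbation shrinks
```

If the adjoint-supplied gradient were wrong, the ratios would converge to some value other than 1, which is why this test is a standard correctness check for adjoint codes.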

  4. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that characterizes ...

  5. Digital Evaluation of Three Splinting Materials Used to Fabricate Verification Jigs for Full-Arch Implant Prostheses: A Comparative Study.

    Science.gov (United States)

    Papaspyridakos, Panos; Kim, Yong-Jeong; Finkelman, Matthew; El-Rafie, Khaled; Weber, Hans-Peter

    2017-04-01

    The primary aim of this study was to assess the accuracy of different splinting materials to make implant cast verification jigs. The secondary aim was to assess the effect of 20° implant angulation on the accuracy of casts. An edentulous mandibular arch with five internal connection tissue level implants served as control. The three implants in the anterior region were parallel to each other and the two implants in the posterior region were distally tilted 20° bilaterally. Verification jigs were fabricated with three different materials by splinting prefabricated bars to temporary abutments, resulting in three different groups (n = 15 specimens). Test casts were fabricated with low expansion type IV stone, and subsequently digitized with reference scanner. The STL files from the test casts and the control were superimposed, in order to determine the three-dimensional (3D) deviations. Group 1 (GC Pattern Resin) had a mean (SD) value of 36.59 (12.47) μm; Group 2 (Fixpeed Resin) had a mean (SD) value of 35.9 (10.13) μm; and Group 3 (Triad Gel) had a mean (SD) of 34.12 (7.10) μm. One-way ANOVA showed no statistically significant difference between groups (p = 0.790). For the comparative analysis of the effect of implant angulation, data were normally distributed for Groups 1 and 3 (GC Resin and Triad Gel), but not for group 2 (Fixpeed Resin). The difference between parallel and tilted implants was significant for all three groups: GC Resin (p = 0.024; paired t-test), Fixpeed Resin (p = 0.002; Wilcoxon signed-rank test), and Triad Gel (p = 0.002; paired t-test). There were no statistically significant differences between the 3D deviations of the test casts fabricated from verification jigs made by three materials (GC Pattern Resin, Fixpeed Resin, and Triad Gel). Parallel implants had nominally significantly less 3D deviations compared with 20° distally tilted implants, but not clinically significant. The results of the present study indicate

  6. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    Science.gov (United States)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems are characterized by fewer control inputs than degrees of freedom, m < n. Determining the control inputs that force such a system to complete a set of m specified motion tasks is a challenging problem, and the existence of an explicit solution is conditioned on the differential flatness of the problem. A flatness-based solution means that all 2n states and m control inputs can be expressed algebraically in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system (n = 2, m = 1) composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc. Experimental verification of the inverse simulation control methodology is reported.
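    For a two-disc system of this type, the flatness-based inverse dynamics can be written in closed form. The sketch below assumes the simple equations J1*th1'' = k*(th2 - th1) and J2*th2'' = u - k*(th2 - th1), with invented parameter values rather than the paper's data:

```python
import math

# Flatness-based inverse dynamics for two discs coupled by a torsional
# spring: disc 1 (inertia J1) is unactuated, the control torque u acts on
# disc 2 (inertia J2). All parameter values are invented for illustration.
#   J1 * th1'' = k * (th2 - th1)
#   J2 * th2'' = u  - k * (th2 - th1)
J1, J2, k = 0.5, 0.8, 50.0
A, w = 0.2, 3.0  # prescribed motion th1(t) = A * sin(w * t)

def control_torque(t):
    th1 = A * math.sin(w * t)
    th1dd = -w * w * th1
    th2 = th1 + J1 * th1dd / k   # solve the first equation for th2
    th2dd = -w * w * th2         # th2 is sinusoidal at the same frequency
    return J2 * th2dd + k * (th2 - th1)

print(round(control_torque(0.5), 4))
```

Summing the two equations of motion gives u = J1*th1'' + J2*th2'', which provides an easy consistency check on the computed torque.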

  7. A study on the factors that affect the advanced mask defect verification

    Science.gov (United States)

    Woo, Sungha; Jang, Heeyeon; Lee, Youngmo; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    Defect verification has become significantly more difficult at higher technology nodes over the years. The traditional primary method of defect (including repair point) control consists of inspection, AIMS and repair steps. Among these, the AIMS process requires various wafer lithography conditions, such as NA, inner/outer sigma, illumination shape, etc. Its ability to analyze every layer accurately is limited because the AIMS tool uses a physical aperture system, and it requires meticulous management of the exposure conditions and CD target values, which change frequently in advanced masks. We report on the influence of several AIMS parameters on defect analysis, including repair points. Under various illumination conditions with different patterns, the defect analysis results showed significant correlations. Defects can be analyzed within a certain error budget based on the management specification required for each layer. In addition, this provided one of the clues in the analysis of wafer repeating defects. Finally, we present an 'optimal specification' for defect management with a common AIMS recipe and suggest an advanced mask process flow.

  8. Implementation and verification of interface constitutive model in FLAC3D

    Directory of Open Access Journals (Sweden)

    Hai-min WU

    2011-09-01

    Full Text Available Due to the complexity of soil-structure interaction, simple constitutive models typically used for interface elements in general computer programs cannot satisfy the requirements of discontinuous deformation analysis of structures that contain different interfaces. In order to simulate the strain-softening characteristics of interfaces, a nonlinear strain-softening interface constitutive model was incorporated into fast Lagrangian analysis of continua in three dimensions (FLAC3D through a user-defined program in the FISH environment. A numerical simulation of a direct shear test for geosynthetic interfaces was conducted to verify that the interface model was implemented correctly. Results of the numerical tests show good agreement with the results obtained from theoretical calculations, indicating that the model incorporated into FLAC3D can simulate the nonlinear strain-softening behavior of interfaces involving geosynthetic materials. The results confirmed the validity and reliability of the improved interface model. The procedure and method of implementing an interface constitutive model into a commercial computer program also provide a reference for implementation of a new interface constitutive model in FLAC3D.

  9. VERIFICATION OF MATHEMATICAL MODEL FOR SEDIMENT TRANSPORT BY UNSTEADY FLOW IN THE LOWER YELLOW RIVER

    Institute of Scientific and Technical Information of China (English)

    Jianjun ZHOU; Bingnan LIN

    2004-01-01

    Field data from the Lower Yellow River (LYR) covering a period of ten consecutive years are used to test a mathematical model for one dimensional sediment transport by unsteady flow developed previously by the writers. Data of the first year of the said period, i.e., 1976, are used to calibrate the model and those of the remaining years to verify it. Items investigated include discharge, water stage, rate of transport of suspended sediment and riverbed erosion/deposition. Comparisons between computed and observed data indicate that the proposed model may well simulate sediment transport in the LYR under conditions of unsteady flow with sufficient accuracy.

  10. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    Science.gov (United States)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed loop systems. For these systems the dynamic stability and function over long duration are essential. However, the only evaluation and rating methods for ECLSS up to now are either expensive trial and error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand water has a pivotal role in the human organism. Water accounts for about 60% of the total body mass and is a reactant and product of numerous metabolic reactions. It is a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those compartments ...
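    A drastically simplified sketch of such a compartment model, with invented linear exchange coefficients and set points, illustrates the conservation and redistribution behaviour:

```python
# Drastically simplified three-compartment water balance (plasma,
# interstitial, intracellular). The linear exchange coefficients and set
# points are invented for illustration; total volume is conserved exactly.
def simulate(p, i, c, dt=0.01, t_end=200.0, k1=0.2, k2=0.1,
             pset=3.0, iset=11.0, cset=28.0):
    for _ in range(int(t_end / dt)):
        q1 = k1 * ((p - pset) - (i - iset))  # plasma -> interstitial flow
        q2 = k2 * ((i - iset) - (c - cset))  # interstitial -> intracellular
        p -= q1 * dt
        i += (q1 - q2) * dt
        c += q2 * dt
    return p, i, c

# 0.5 L of ingested water entering the plasma redistributes over all
# compartments while the total volume stays constant:
p, i, c = simulate(3.5, 11.0, 28.0)
print(round(p, 3), round(i, 3), round(c, 3))
```

The antisymmetric flow terms guarantee mass conservation at every step, which is exactly the property an ECLSS-coupled human model must maintain over long simulated missions.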

  11. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Takahashi, R; Kamima, T [The Cancer Institute Hospital of JFCR, Koutou-ku, Tokyo (Japan); Baba, H [The National Cancer Center Hospital East, Kashiwa-city, Chiba prefecture (Japan); Yamamoto, T; Kubo, Y [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Ishibashi, S; Higuchi, Y [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Takahashi, H [St Lukes International Hospital, Chuou-ku, Tokyo (Japan); Tachibana, H [National Cancer Center Hospital East, Kashiwa, Chiba (Japan)

    2015-06-15

    Purpose: The accuracy of a dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans shown in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based, and CT images were used to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average ± 2SD) of the dose differences between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC) and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs from the CLs were evaluated. A Clarkson-based system shows a large systematic variation because of the inhomogeneity correction, and the AAA also showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.
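    The confidence-limit statistic (average ± 2SD) is simple to reproduce; the percent dose differences below are invented for illustration:

```python
import statistics

# Confidence limit in the AAPM TG-114 sense: average +/- 2*SD of the
# percent dose differences between the primary TPS calculation and the
# independent check. The difference values below are invented.
def confidence_limit(pct_diffs):
    m = statistics.mean(pct_diffs)
    return m, 2 * statistics.stdev(pct_diffs)

diffs = [1.2, -0.5, 2.1, 0.8, -1.4, 0.3, 1.9, -0.2]
m, band = confidence_limit(diffs)
print(f"CL = {m:.2f} % +/- {band:.2f} %")
```

A nonzero mean indicates a systematic offset between the two calculations (like the Clarkson-vs-AAA offset reported above), while the 2SD band captures the random plan-to-plan variation that sets the tolerance level.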

  12. The Bilevel Design Problem for Communication Networks on Trains: Model, Algorithm, and Verification

    Directory of Open Access Journals (Sweden)

    Yin Tian

    2014-01-01

    Full Text Available This paper proposes a novel method to solve the problem of train communication network design. First, we put forward a general description of the problem. Then, taking advantage of bilevel programming theory, we create the cost-reliability-delay (CRD) model, which consists of two parts: the physical topology part aims at obtaining the network with maximum reliability under a cost constraint, while the logical topology part focuses on the communication paths yielding minimum delay, based on the physical topology delivered from the upper level. We also suggest a method to solve the CRD model, which combines the genetic algorithm and the Floyd-Warshall algorithm. Finally, we use a practical example to verify the accuracy and effectiveness of the CRD model and further apply the novel method on a train with six carriages.
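    The lower-level search for minimum-delay communication paths can rely on the standard Floyd-Warshall all-pairs shortest-path algorithm, sketched below on an invented 4-node topology:

```python
# Standard Floyd-Warshall all-pairs minimum-delay computation over a
# physical topology given as an adjacency matrix of link delays
# (INF = no direct link). The 4-node backbone and delays are invented.
INF = float("inf")

def floyd_warshall(delay):
    n = len(delay)
    d = [row[:] for row in delay]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

links = [[0, 2, INF, 7],
         [2, 0, 3, INF],
         [INF, 3, 0, 1],
         [7, INF, 1, 0]]
print(floyd_warshall(links)[0][3])  # 0 -> 1 -> 2 -> 3 = 6, beats the direct link of 7
```

In a bilevel scheme of the kind described, the upper-level genetic algorithm proposes candidate physical topologies and this O(n³) routine scores the resulting logical-path delays for each candidate.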

  13. A combined aeroelastic-aeroacoustic model for wind turbine noise: Verification and analysis of field measurements

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Fischer, Andreas

    2017-01-01

    In this paper, semi-empirical engineering models for the three main wind turbine aerodynamic noise sources, namely, turbulent inflow, trailing edge and stall noise, are introduced. They are implemented into the in-house aeroelastic code HAWC2 commonly used for wind turbine load calculations and design. The results of the combined aeroelastic and aeroacoustic model are compared with field noise measurements of a 500 kW wind turbine. Model and experimental data are in fairly good agreement in terms of noise levels and directivity. The combined model allows separating the various noise sources and highlights a number of mechanisms that are difficult to differentiate when only the overall noise from a wind turbine is measured.

  14. Empirical Verification of Fault Models for FPGAs Operating in the Subcritical Voltage Region

    DEFF Research Database (Denmark)

    Birklykke, Alex Aaen; Koch, Peter; Prasad, Ramjee

    2013-01-01

    ... fault models might provide insight that would allow subcritical scaling by changing digital design practices or by simply accepting errors if possible. To facilitate further work in this direction, we present probabilistic error models that allow us to link error behavior with statistical properties of the binary signals, and based on a two-FPGA setup we experimentally verify the correctness of candidate models. For all experiments, the observed error rates exhibit a polynomial dependency on the outcome probability of the binary inputs, which corresponds to the behavior predicted by the proposed timing error model. Furthermore, our results show that the fault mechanism is fully deterministic, mimicking temporary stuck-at errors. As a result, given knowledge about a given signal, errors are fully predictable in the subcritical voltage region.

  15. Quantitative Verification of a Force-based Model for Pedestrian Dynamics

    CERN Document Server

    Chraibi, Mohcine; Schadschneider, Andreas; Mackens, Wolfgang

    2009-01-01

    This paper introduces a spatially continuous force-based model for simulating pedestrian dynamics. The main intention of this work is the quantitative description of pedestrian movement through bottlenecks and in corridors. Measurements of flow and density at bottlenecks will be presented and compared with empirical data. Furthermore the fundamental diagram for the movement in a corridor is reproduced. The results of the proposed model show a good agreement with empirical data.

  16. Verification and Validation of ICME Methods and Models for Aerospace Applications

    Science.gov (United States)

    2012-06-11

    most dominant and expected processing and behavioral mechanisms. Correspondingly, defining customer needs, the reality-of-interest, accuracy... challenge some common shortcomings of materials modeling and include the following: Understanding customer needs. An ICME model should serve a real... ICME predictions provide equal or better confidence than historical precedents. Dialog with the customer needs to continue throughout the ICME V&V.

  17. Verification of high-speed solar wind stream forecasts using operational solar wind models

    OpenAIRE

    Reiss, Martin A.; Temmer, Manuela; Veronig, Astrid M.; Nikolic, Ljubomir; Vennerstrom, Susanne; Schoengassner, Florian; Hofmeister, Stefan J.

    2016-01-01

    High-speed solar wind streams emanating from coronal holes are frequently impinging on the Earth's magnetosphere causing recurrent, medium-level geomagnetic storm activity. Modeling high-speed solar wind streams is thus an essential element of successful space weather forecasting. Here we evaluate high-speed stream forecasts made by the empirical solar wind forecast (ESWF) and the semiempirical Wang-Sheeley-Arge (WSA) model based on the in situ plasma measurements from the ACE spacecraft for ...

  18. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
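    The M3 idea, mixing first and reactions from the residual, can be illustrated with a two end-member sketch; all concentrations below are invented, and the real code uses principal component analysis over many constituents:

```python
# Two end-member illustration of the M3 approach: mixing proportions are
# computed first (here from a single conservative tracer, Cl), and any
# residual in a reactive element is attributed to reactions (sources or
# sinks, in mg/L). All concentrations are invented for illustration.
def mixing_and_mass_balance(sample, end_a, end_b, tracer="Cl", element="HCO3"):
    f = (sample[tracer] - end_b[tracer]) / (end_a[tracer] - end_b[tracer])
    predicted = f * end_a[element] + (1 - f) * end_b[element]
    return f, sample[element] - predicted  # fraction of A, reaction term in mg/L

brine = {"Cl": 47000.0, "HCO3": 50.0}
glacial = {"Cl": 0.5, "HCO3": 10.0}
sample = {"Cl": 4700.45, "HCO3": 120.0}
f, reaction = mixing_and_mass_balance(sample, brine, glacial)
print(round(f, 3), round(reaction, 1))  # 10% brine; ~106 mg/L HCO3 from reactions
```

This mirrors the report's order of operations: mixing is evaluated first from the measured composition, and only the part that mixing cannot explain is reported as a reaction source or sink.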

  20. Mathematical modeling and microbiological verification of ohmic heating of a multicomponent mixture of particles in a continuous flow ohmic heater system with electric field parallel to flow.

    Science.gov (United States)

    Kamonpatana, Pitiya; Mohamed, Hussein M H; Shynkaryk, Mykola; Heskitt, Brian; Yousef, Ahmed E; Sastry, Sudhir K

    2013-11-01

    To accomplish continuous flow ohmic heating of a low-acid food product, sufficient heat treatment needs to be delivered to the slowest-heating particle at the outlet of the holding section. This research was aimed at developing mathematical models for sterilization of a multicomponent food in a pilot-scale ohmic heater with electric field oriented parallel to the flow, and validating microbial inactivation by inoculated particle methods. The model involved 2 sets of simulations, one for determination of fluid temperatures, and a second for evaluating the worst-case scenario. A residence time distribution study was conducted using radio frequency identification methodology to determine the residence time of the fastest-moving particle from a sample of at least 300 particles. Thermal verification of the mathematical model showed good agreement between calculated and experimental fluid temperatures (P > 0.05) at heater and holding tube exits, with a maximum error of 0.6 °C. To achieve a specified target lethal effect at the cold spot of the slowest-heating particle, the length of holding tube required was predicted to be 22 m for a 139.6 °C process temperature with a volumetric flow rate of 1.0 × 10⁻⁴ m³/s and a diameter of 0.05 m. To verify the model, a microbiological validation test was conducted using at least 299 chicken-alginate particles inoculated with Clostridium sporogenes spores per run. The inoculated pack study indicated the absence of viable microorganisms at the target treatment and its presence for a subtarget treatment, thereby verifying model predictions.
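    The target lethal effect at the cold spot is conventionally expressed as an F-value; a minimal sketch of that integral, with an invented temperature history, is:

```python
# Process lethality (F-value) at the cold spot: integrate 10**((T-Tref)/z)
# over the temperature history (trapezoid-in-temperature approximation).
# Tref = 121.1 C and z = 10 C are the conventional reference values; the
# temperature profile below is invented, not taken from the study.
def lethality(times, temps, t_ref=121.1, z=10.0):
    f = 0.0
    for k in range(1, len(times)):
        t_mid = 0.5 * (temps[k] + temps[k - 1])
        f += 10 ** ((t_mid - t_ref) / z) * (times[k] - times[k - 1])
    return f  # equivalent minutes at t_ref

times = [0, 1, 2, 3, 4, 5]             # min
temps = [90, 110, 125, 130, 128, 100]  # degC at the slowest-heating point
print(round(lethality(times, temps), 2))
```

Because the kernel is exponential in temperature, the minutes spent near the peak dominate the integral, which is why the model's prediction for the slowest-heating particle at the holding-tube exit governs the required tube length.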

  1. Catastrophe model and its experimental verification of static loading rock system under impact load

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    According to the catastrophe model for impact buckling of statically loaded structures, a new catastrophe model for impact-loading failure of a statically loaded rock system was established, and a one-dimensional (1D) catastrophe model was analyzed. The analysis indicates that the bifurcation set where catastrophe may take place is decided not only by the mechanical system itself but also by the exterior loading, which differs from the result obtained under mono-static loading, where the bifurcation set is determined solely by the mechanics of the system itself and has nothing to do with exterior loading. In addition, a corresponding 1D coupled static-dynamic loading experiment was designed to verify the analysis results of the catastrophe model. The test was done with an Instron 1342 electro-servo controlled testing system, in which a medium strain rate is caused by a monotonically rising dynamic load. The parameters are obtained by combining the theoretical model with the experiment. The experimental and theoretical curves of critical dynamic load vs. static load largely coincide, thus the new model is proved to be correct.

  2. Verification of GRAPES unified global and regional numerical weather prediction model dynamic core

    Institute of Scientific and Technical Information of China (English)

    YANG XueSheng; HU JiangLin; CHEN DeHui; ZHANG HongLiang; SHEN XueShun; CHEN JiaBin; JI LiRen

    2008-01-01

    During the past few years, most newly developed numerical weather prediction models have adopted a multi-scale strategy. Accordingly, the China Meteorological Administration has been developing a new-generation global and regional multi-scale model since 2003. To validate both the scientific design and the program coding of the GRAPES (Global and Regional Assimilation and PrEdiction System) model, a suite of idealized tests was proposed and conducted, comprising a density flow test, a three-dimensional mountain wave test, and a cross-polar flow test. The density flow experiment indicates that the dynamic core can simulate fine-scale nonlinear flow structures and their transient features, while the three-dimensional mountain wave test shows that the model reproduces the horizontal and vertical propagation of internal gravity waves quite well. The cross-polar flow test demonstrates the soundness of both the semi-Lagrangian departure-point calculation and the discretization of the model near the poles. Real-case forecasts reveal that the model can predict large-scale summer weather regimes, such as the subtropical high, and capture the major synoptic patterns in the mid and high latitudes.

  3. Experimental verification and comparison of the rubber V- belt continuously variable transmission models

    Science.gov (United States)

    Grzegożek, W.; Dobaj, K.; Kot, A.

    2016-09-01

    The paper analyzes the cooperation of the rubber V-belt with the CVT transmission pulleys. The forces and torques acting in the CVT transmission were analyzed based on the calculated characteristics of the centrifugal regulator and the torque regulator. Accurate estimation of the regulator surface curvature allowed calculation of the relation between the driving-wheel axial force, the engine rotational speed, and the gear ratio of the CVT transmission. Simplified analytical models of the rubber V-belt-pulley cooperation follow three basic approaches. The Dittrich model assumes two contact regions on the driven and driving wheels. The Kim-Kim model additionally considers radial friction, which results in the absence of a fully developed friction area on the driving pulley. The third approach, formulated in the Cammalleri model, assumes a sliding angle that varies along the wrap arc and describes it as a result of the belt's longitudinal and transverse flexibility. Theoretical torque on the driven and driving wheels was calculated from the known regulator characteristics and compared to the measured loading torque. The best agreement over the centrifugal regulator's range of operation was obtained for the Kim-Kim model.
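    All three belt models build on the classical Euler (capstan) friction relation, modified for the wedge geometry of a V-belt. A brief sketch of that common starting point; the groove angle, friction coefficient, wrap angle, and tensions below are illustrative assumptions, not the paper's data:

    ```python
    # Hedged sketch: the Euler/capstan relation for a V-belt, the common core
    # of the Dittrich, Kim-Kim and Cammalleri models. The groove wedge raises
    # effective friction to mu_eff = mu / sin(groove half-angle).
    # All numerical values are illustrative only.

    import math

    def tension_ratio(mu, groove_half_angle, wrap_angle):
        """Tight/slack tension ratio at fully developed friction."""
        mu_eff = mu / math.sin(groove_half_angle)  # wedge action of the V-groove
        return math.exp(mu_eff * wrap_angle)

    def transmittable_torque(t_slack, mu, groove_half_angle, wrap_angle, radius):
        """Maximum torque on one pulley before gross belt slip."""
        t_tight = t_slack * tension_ratio(mu, groove_half_angle, wrap_angle)
        return (t_tight - t_slack) * radius

    ratio = tension_ratio(mu=0.3, groove_half_angle=math.radians(15),
                          wrap_angle=math.pi)
    print(round(ratio, 1))                    # V-belt ratio over a 180 deg wrap
    print(round(math.exp(0.3 * math.pi), 1))  # flat-belt ratio, for comparison
    ```

    The refinements in the three models concern how much of the wrap arc actually develops this friction and in which direction the friction force acts.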

  4. Verification of MC²-3 Doppler Sample Models in ZPPR-15D Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Min Jae; Hartanto, Donny; Kim, Sang Ji [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this paper, the change of reaction rate and the broadened cross sections were estimated with as-built MCNP models for a metallic uranium sample in ZPPR-15D using the ENDF/B-VII.0 library, and the results were compared with the deterministic calculations of previous work. Doppler broadening is an instantaneous feedback mechanism that improves safety and stability for both thermal and fast reactors; the accuracy of the Doppler coefficient is therefore an important parameter in reactor design as well as in safety analysis. The capability of Doppler worth calculation by a modern computer code suite, such as MC²-3 and DIF3DVARIANT, has been validated against the Zero Power Physics Reactor-15 (ZPPR-15) Doppler worth measurement experiments. For the same experiments, our previous work suggested four different MC²-3 Doppler sample models for enhanced accuracy, combining heterogeneous models and the super-cell approach. The MOC and MOC-SPC models showed the smallest error in estimating the U-238 total cross section of Doppler sample N-11, and the Doppler broadening effects are better reflected in the cross section than with the other two models, HOM and SPC. The effect of the super-cell approach can hardly be seen, since the broadened cross section is almost the same with and without it. Comparing the transition of reaction density, the MOC and MOC-SPC models also show behavior similar to MCNP's, with minor errors. In conclusion, more consistent broadened cross sections and reaction-density transitions can be obtained by providing heterogeneous models from MC²-3's MOC module.

  5. Experimental Verification of Modeled Thermal Distribution Produced by a Piston Source in Physiotherapy Ultrasound

    Directory of Open Access Journals (Sweden)

    M. I. Gutierrez

    2016-01-01

    Full Text Available Objectives. To present a quantitative comparison of thermal patterns produced by the piston-in-a-baffle approach with those generated by a physiotherapy ultrasonic device and to show the dependency between thermal patterns and acoustic intensity distributions. Methods. The finite element (FE) method was used to model an ideal acoustic field and the produced thermal pattern to be compared with the experimental acoustic and temperature distributions produced by a real ultrasonic applicator. A thermal model using the measured acoustic profile as input is also presented for comparison. Temperature measurements were carried out with thermocouples inserted in muscle phantom. The insertion place of thermocouples was monitored with ultrasound imaging. Results. Modeled and measured thermal profiles were compared within the first 10 cm of depth. The ideal acoustic field did not adequately represent the measured field, having different temperature profiles (errors 10% to 20%). The experimental field was concentrated near the transducer, producing a region with higher temperatures, while the modeled ideal temperature was linearly distributed along the depth. The error was reduced to 7% when introducing the measured acoustic field as the input variable in the FE temperature modeling. Conclusions. Temperature distributions are strongly related to the acoustic field distributions.
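    The FE thermal model described here solves a heat equation driven by absorbed acoustic power. A much-reduced 1D finite-difference sketch with an exponentially decaying plane-wave source; the water-like phantom properties, attenuation, intensity, and 60 s exposure are assumptions, not the paper's values:

    ```python
    # Hedged sketch: 1D explicit finite-difference heat equation with an
    # ultrasonic volumetric source Q(z) = 2*alpha*I0*exp(-2*alpha*z), standing
    # in for the paper's FE model. All material and exposure parameters are
    # illustrative assumptions.

    import numpy as np

    k, rho, c = 0.6, 1000.0, 4186.0   # W/m/K, kg/m^3, J/kg/K (water-like phantom)
    alpha = 8.0                        # amplitude attenuation, 1/m (assumed)
    I0 = 3.0e4                         # surface intensity, W/m^2 (= 3 W/cm^2)

    n, L = 101, 0.10                   # grid over the first 10 cm of depth
    dz = L / (n - 1)
    kappa = k / (rho * c)
    dt = 0.4 * dz**2 / kappa           # explicit scheme stable for dt <= dz^2/(2*kappa)
    steps = int(60.0 / dt)             # about 60 s of insonation

    z = np.linspace(0.0, L, n)
    Q = 2.0 * alpha * I0 * np.exp(-2.0 * alpha * z)   # absorbed power, W/m^3

    T = np.full(n, 37.0)
    for _ in range(steps):
        lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
        T[1:-1] += dt * (kappa * lap + Q[1:-1] / (rho * c))
        T[0] = T[1]                    # insulated transducer face
        T[-1] = 37.0                   # far boundary held at baseline

    print(f"peak rise after ~60 s: {T.max() - 37.0:.1f} K, near the face")
    ```

    Even this crude sketch reproduces the paper's qualitative finding: the heating pattern follows the deposited acoustic intensity, concentrating the temperature rise near the transducer.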

  6. Experimental Verification of Modeled Thermal Distribution Produced by a Piston Source in Physiotherapy Ultrasound

    Science.gov (United States)

    Lopez-Haro, S. A.; Leija, L.

    2016-01-01

    Objectives. To present a quantitative comparison of thermal patterns produced by the piston-in-a-baffle approach with those generated by a physiotherapy ultrasonic device and to show the dependency between thermal patterns and acoustic intensity distributions. Methods. The finite element (FE) method was used to model an ideal acoustic field and the produced thermal pattern to be compared with the experimental acoustic and temperature distributions produced by a real ultrasonic applicator. A thermal model using the measured acoustic profile as input is also presented for comparison. Temperature measurements were carried out with thermocouples inserted in muscle phantom. The insertion place of thermocouples was monitored with ultrasound imaging. Results. Modeled and measured thermal profiles were compared within the first 10 cm of depth. The ideal acoustic field did not adequately represent the measured field, having different temperature profiles (errors 10% to 20%). The experimental field was concentrated near the transducer, producing a region with higher temperatures, while the modeled ideal temperature was linearly distributed along the depth. The error was reduced to 7% when introducing the measured acoustic field as the input variable in the FE temperature modeling. Conclusions. Temperature distributions are strongly related to the acoustic field distributions. PMID:27999801

  7. The Verification of Structural Decision-Making Model for Evaluating Education on Facebook

    Directory of Open Access Journals (Sweden)

    Kozel Roman

    2013-09-01

    Full Text Available The aim of this paper is to present the work of a research team that constructed a model exploring students' general opinions about education on Facebook, as well as their opinions about education via the social page of the E-marketing course, using structural equation modeling. Facebook is already present at universities, as students use it as a primary source of information about course news, assignments, and so on. The research team carried out an experiment in the E-marketing course at FE of VŠB – TUO, in which Facebook was used as a tool for communication between students and teachers. Students' attitudes towards education on Facebook were surveyed using predefined variables. The first form of the model was designed by factor analysis with varimax rotation, which defined six groups of factors affecting respondents' opinions about education. A structural equation model was used to verify the validity of the model. According to the testing performed, four groups of factors mainly affect respondents' attitudes to this type of education: Engagement, Information and Modern Technologies, Lecturers and Scores, and Education on Facebook. The research team also determined the statistically most significant variables within these factors, i.e., those that most affect students' opinions about education.

  8. Verification of Beam Models for Ionic Polymer-Metal Composite Actuator

    Institute of Scientific and Technical Information of China (English)

    Ai-hong Ji; Hoon Cheol Park; Quoc Viet Nguyen; Jang Woo Lee; Young Tai Yoo

    2009-01-01

    An Ionic Polymer-Metal Composite (IPMC) can work as an actuator under an applied voltage of a few volts. A thick IPMC actuator, in which a Nafion-117 membrane was synthesized with a polypyrrole/alumina composite filler, was analyzed to verify the equivalent beam and equivalent bimorph beam models. The blocking force and tip displacement of the IPMC actuator were measured with a DC power supply, and the Young's modulus of the IPMC strip was measured by bending and tensile tests, respectively. The maximum tip displacement and Young's modulus calculated by the equivalent beam model were almost identical to the corresponding measured data. Finite element analysis with a thermal-analogy technique was used in the equivalent bimorph beam model to numerically reproduce the force-displacement relationship of the IPMC actuator, and the results agreed well with the measured force-displacement relationship. It is confirmed that the equivalent beam and equivalent bimorph beam models are practically and effectively suitable for predicting the tip displacement, blocking force, and Young's modulus of IPMC actuators with different thicknesses and different ionic polymer membrane compositions.
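    The equivalent beam idea reduces the IPMC strip to a cantilever, so Young's modulus follows from a bending test via the standard tip-deflection formula delta = F L^3 / (3 E I). A sketch with illustrative dimensions and readings, not the paper's measured data:

    ```python
    # Hedged sketch of the cantilever relation behind the equivalent beam model:
    # delta = F L^3 / (3 E I), rectangular section I = w t^3 / 12.
    # The strip dimensions and the force/deflection pair below are invented
    # for illustration; they are not the paper's measurements.

    def youngs_modulus_from_bending(force_n, deflection_m, length_m,
                                    width_m, thickness_m):
        """Back out E (Pa) from one force-deflection point of a cantilever."""
        inertia = width_m * thickness_m**3 / 12.0
        return force_n * length_m**3 / (3.0 * deflection_m * inertia)

    E = youngs_modulus_from_bending(force_n=5e-3, deflection_m=2e-3,
                                    length_m=0.03, width_m=0.01,
                                    thickness_m=0.5e-3)
    print(f"E = {E/1e6:.0f} MPa")  # -> E = 216 MPa for these sample values
    ```

    Inverting the same relation gives the blocking force for a prescribed tip displacement, which is how the two measured quantities tie back to one modulus.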

  9. Verification of high-speed solar wind stream forecasts using operational solar wind models

    Science.gov (United States)

    Reiss, Martin A.; Temmer, Manuela; Veronig, Astrid M.; Nikolic, Ljubomir; Vennerstrom, Susanne; Schöngassner, Florian; Hofmeister, Stefan J.

    2016-07-01

    High-speed solar wind streams emanating from coronal holes frequently impinge on the Earth's magnetosphere, causing recurrent, medium-level geomagnetic storm activity. Modeling high-speed solar wind streams is thus an essential element of successful space weather forecasting. Here we evaluate high-speed stream forecasts made by the empirical solar wind forecast (ESWF) and the semiempirical Wang-Sheeley-Arge (WSA) model based on the in situ plasma measurements from the Advanced Composition Explorer (ACE) spacecraft for the years 2011 to 2014. While the ESWF makes use of an empirical relation between the coronal hole area observed in Solar Dynamics Observatory (SDO)/Atmospheric Imaging Assembly (AIA) images and solar wind properties at the near-Earth environment, the WSA model establishes a link between properties of the open magnetic field lines extending from the photosphere to the corona and the background solar wind conditions. We found that both solar wind models are capable of predicting the large-scale features of the observed solar wind speed (root-mean-square error, RMSE ≈ 100 km/s) but tend to either overestimate (ESWF) or underestimate (WSA) the number of high-speed solar wind streams (threat score, TS ≈ 0.37). The predicted high-speed streams show typical uncertainties in the arrival time of about 1 day and in the speed of about 100 km/s. General advantages and disadvantages of the investigated solar wind models are diagnosed and outlined.
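    The two scores quoted above (RMSE and threat score) can be computed directly from paired speed series. A short sketch with invented sample data; the 500 km/s threshold defining a high-speed stream is an assumption for illustration:

    ```python
    # Hedged sketch of the verification metrics quoted above: RMSE of the
    # predicted speed series, and the threat score (critical success index)
    # for events above a chosen speed threshold. The data are invented.

    import numpy as np

    def rmse(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        return float(np.sqrt(np.mean((pred - obs) ** 2)))

    def threat_score(pred, obs, threshold):
        """TS = hits / (hits + misses + false alarms); correct negatives ignored."""
        p = np.asarray(pred) >= threshold
        o = np.asarray(obs) >= threshold
        hits = np.sum(p & o)
        misses = np.sum(~p & o)
        false_alarms = np.sum(p & ~o)
        denom = hits + misses + false_alarms
        return float(hits / denom) if denom else float("nan")

    obs  = [350, 420, 600, 650, 380, 550, 400]   # km/s, invented
    pred = [380, 400, 640, 560, 420, 460, 390]

    print(round(rmse(pred, obs), 1))                       # -> 54.5
    print(round(threat_score(pred, obs, 500.0), 2))        # -> 0.67
    ```

    Note that TS ignores correct rejections, so it rewards catching streams without over-alerting, which is why the paper reports it alongside RMSE.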

  10. Experimental verification of a model describing the intensity distribution from a single mode optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Moro, Erik A [Los Alamos National Laboratory; Puckett, Anthony D [Los Alamos National Laboratory; Todd, Michael D [UCSD

    2011-01-24

    The intensity distribution of a transmission from a single mode optical fiber is often approximated using a Gaussian-shaped curve. While this approximation is useful for some applications such as fiber alignment, it does not accurately describe transmission behavior off the axis of propagation. In this paper, another model is presented, which describes the intensity distribution of the transmission from a single mode optical fiber. A simple experimental setup is used to verify the model's accuracy, and agreement between model and experiment is established both on and off the axis of propagation. Displacement sensor designs based on the extrinsic optical lever architecture are presented. The behavior of the transmission off the axis of propagation dictates the performance of sensor architectures where large lateral offsets (25-1500 µm) exist between transmitting and receiving fibers. The practical implications of modeling accuracy over this lateral offset region are discussed as they relate to the development of high-performance intensity-modulated optical displacement sensors. In particular, the sensitivity, linearity, resolution, and displacement range of a sensor are functions of the relative positioning of the sensor's transmitting and receiving fibers. Sensor architectures with high combinations of sensitivity and displacement range are discussed. It is concluded that the utility of the accurate model lies in its predictive capability and that this research could lead to an improved methodology for high-performance sensor design.
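    For reference, the Gaussian approximation being tested can be written down in a few lines. The mode-field radius and wavelength below are typical single-mode (SMF-28-like) values assumed for illustration, not the paper's parameters:

    ```python
    # Hedged sketch of the Gaussian approximation of the fundamental fiber
    # mode, the model that the paper shows breaks down off-axis. W0 and LAM
    # are typical single-mode values, assumed here.

    import math

    W0 = 5.2e-6                    # mode-field radius at the fiber face, m
    LAM = 1550e-9                  # wavelength, m
    ZR = math.pi * W0**2 / LAM     # Rayleigh range

    def spot_radius(z):
        """1/e^2 beam radius after propagating a gap z from the fiber face."""
        return W0 * math.sqrt(1.0 + (z / ZR) ** 2)

    def rel_intensity(r, z):
        """I(r, z) / I(0, 0) for the Gaussian beam (r = lateral offset)."""
        w = spot_radius(z)
        return (W0 / w) ** 2 * math.exp(-2.0 * r**2 / w**2)

    z = 100e-6                                 # a 100 um gap between fibers
    print(round(spot_radius(z) * 1e6, 1))      # spot radius in um, ~doubled
    print(rel_intensity(25e-6, z) < rel_intensity(0.0, z))  # off-axis falloff
    ```

    The paper's point is precisely that at lateral offsets of tens to hundreds of µm the real transmitted intensity departs from this exp(-2r²/w²) profile, which matters for the optical-lever sensor geometries discussed.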

  11. [On-site measurement of landfill gas yield and verification of IPCC model].

    Science.gov (United States)

    Luo, Yu-Xiang; Wang, Wei; Gao, Xing-Bao

    2009-11-01

    In order to obtain an accurate landfill gas yield for the Yulongkeng Landfill, Shenzhen, an improved pumping test was conducted. The methane production rates of the influence regions were determined as 14.67 × 10⁻⁵, 9.46 × 10⁻⁵, 9.55 × 10⁻⁵, and 4.28 × 10⁻⁵ m³/(t·h), respectively. Based on these rates, the total methane yield of the Yulongkeng Landfill in 2005 was 322 m³/h, indicating that the landfill had entered its stationary phase and that recovery of the landfill gas was no longer worthwhile. The IPCC model was verified against the measured data. The degradation half-life of the waste is the key parameter governing the prediction accuracy of the IPCC model. In China, the degradable fraction of municipal solid waste is mainly kitchen waste with a short degradation period, so the degradation half-life is shorter than the default value proposed in the IPCC model. To improve the prediction accuracy of landfill gas yield, the model parameters should be chosen on the basis of a full survey of waste characteristics in China, which will improve the applicability of the IPCC model.
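    The IPCC approach referred to here is a first-order decay (FOD) model in which the release rate is set by the degradation half-life, k = ln 2 / t½. A hedged sketch of that structure with invented deposit masses and generation potential; the real IPCC method adds waste-composition, dissimilation, and oxidation factors omitted here:

    ```python
    # Hedged sketch of the first-order decay (FOD) idea behind the IPCC model:
    # each year's deposited waste releases CH4 at rate k = ln2 / t_half.
    # A short half-life (kitchen-waste-dominated refuse, as noted above)
    # shifts generation earlier. All numbers are illustrative, not IPCC defaults.

    import math

    def fod_methane(deposits_t, l0_m3_per_t, t_half_years, years):
        """Annual CH4 generation (m^3/yr) from yearly deposits, FOD kinetics."""
        k = math.log(2.0) / t_half_years
        out = []
        for y in range(years):
            q = 0.0
            for dep_year, mass in enumerate(deposits_t):
                age = y - dep_year
                if age >= 0:
                    q += mass * l0_m3_per_t * k * math.exp(-k * age)
            out.append(q)
        return out

    # Three years of 100 kt/yr deposits, L0 = 100 m^3 CH4/t, 4-year half-life:
    annual = fod_methane([1e5, 1e5, 1e5], l0_m3_per_t=100.0,
                         t_half_years=4.0, years=20)
    print(round(annual[2]))           # generation peaks just after closure
    print(annual[19] < annual[2] / 4) # -> True: the tail decays strongly
    ```

    The paper's stationary-phase finding corresponds to measuring a site far out on this decaying tail, where a mis-specified half-life produces large relative errors.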

  12. Emissivity of rough sea surface for 8-13 µm: modeling and verification.

    Science.gov (United States)

    Wu, X; Smith, W L

    1997-04-20

    The emissivity model for rough sea surface [Remote Sensing Environ. 24, 313-329 (1988)] is inspected in light of the measured surface emissivity. In the presence of moderate wind (5 m/s or less), the emissivity model is found to be adequate for small to moderate view angles. For large view angles, the discrepancy between the computed and the measured emissivity is large, but one can reduce this considerably by incorporating the reflected sea surface emission into the emissivity model. In addition, examination of the spectral variation of the observed and computed emissivity suggests the need for refined measurements of the complex refractive index. An improved model is constructed to calculate the rough sea surface emissivity that can be used to provide accurate estimates of sea surface skin temperatures from remotely sensed radiometric measurements. An important feature of the improved model is that the computed sea surface emissivity is only weakly dependent on wind speed for most view angles used in practice.

  13. Numerical modeling and verification of gas flow through a network of crossed narrow v-grooves

    Science.gov (United States)

    Bejhed, Johan; Nguyen, Hugo; Åstrand, Peter; Eriksson, Anders; Köhler, Johan

    2006-10-01

    The gas flow through a network of crossing thin micro-machined channels has been successfully modeled and simulated. The crossings are formed by two sets of v-grooves that intersect as two silicon wafers are bonded together. The gas is distributed from inlets via a manifold of channels to the narrow v-grooves, which could also serve as a particle filter. The fluidic model is derived from the Navier-Stokes equation, assumes laminar isothermal flow, and incorporates small-Knudsen-number corrections and Poiseuille number calculations. The simulations use the finite element method. Several elements of the full crossing-network model are treated separately before lumping them together: the straight v-grooves, a single crossing in an infinite set, and a set of exactly four crossings along the flow path. The introduction of a crossing effectively corresponds to a virtual reduction of the length of the flow path, thereby defining a new effective length. The first and last crossings of each flow path together contribute a pressure drop equal to that of three ordinary crossings. The derived full network model has been compared to previous experimental results on several differently shaped crossed v-groove networks. Within the experimental errors, the model agrees with the mass flow and pressure drop measurements. The main error source is the uncertainty in v-groove width, which has a profound impact on the fluidic behavior.

  14. Verification of high-speed solar wind stream forecasts using operational solar wind models

    DEFF Research Database (Denmark)

    Reiss, Martin A.; Temmer, Manuela; Veronig, Astrid M.

    2016-01-01

    High-speed solar wind streams emanating from coronal holes are frequently impinging on the Earth's magnetosphere causing recurrent, medium-level geomagnetic storm activity. Modeling high-speed solar wind streams is thus an essential element of successful space weather forecasting. Here we evaluate...... high-speed stream forecasts made by the empirical solar wind forecast (ESWF) and the semiempirical Wang-Sheeley-Arge (WSA) model based on the in situ plasma measurements from the Advanced Composition Explorer (ACE) spacecraft for the years 2011 to 2014. While the ESWF makes use of an empirical relation...... between the coronal hole area observed in Solar Dynamics Observatory (SDO)/Atmospheric Imaging Assembly (AIA) images and solar wind properties at the near-Earth environment, the WSA model establishes a link between properties of the open magnetic field lines extending from the photosphere to the corona...

  15. Parameters identification for GTN model and their verification on 42CrMo4 steel

    Energy Technology Data Exchange (ETDEWEB)

    Kozak, V.; Vlcek, L. [Inst. of Physics of Materials, AS of CR, Brno (Czech Republic)

    2005-07-01

    This paper is based on precise laboratory measurement of deformation and fracture material characteristics, evaluation of these parameters, and their application in finite element models of the fracture behaviour of components with defects. The work deals with ductile fracture of the forged steel 42CrMo4. The R-curve is modelled by 3D FEM using WARP3D and Abaqus, with crack extension simulated by element-extinction algorithms. Determination of the micromechanical parameters is based on a combination of tensile tests and microscopic observation. The input parameters for the subsequent computation and simulation, namely f_N and f_0, were obtained by image analysis. The possibility of transferring these parameters to other specimens is discussed. (orig.)

  16. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    A novel geometry ICPC solar collector was developed at the University of Chicago and Colorado State University. A ray tracing model has been designed to investigate the optical performance of both the horizontal and vertical fin versions of this collector. Solar radiation is modeled as discrete...... passing through transparent media, the size of the gap between the glass tube and fin, reflectivity of the reflective surface, absorptivity of the fin and blocking and displacement of the rays by adjacent tubes. . Presentation of the progressive animation of individual rays and associated summary graphics...... to the desired incident angle of the sun’s rays, performance of the novel ICPC solar collector at various specified angles along the transverse and longitudinal evacuated tube directions were experimentally determined. To validate the ray tracing model, transverse and longitudinal performance predictions...

  17. A phase-field model for ductile fracture at finite strains and its experimental verification

    Science.gov (United States)

    Ambati, Marreddy; Kruse, Roland; De Lorenzis, Laura

    2016-01-01

    In this paper, a phase-field model for ductile fracture previously proposed in the kinematically linear regime is extended to the three-dimensional finite strain setting, and its predictions are qualitatively and quantitatively compared with several experimental results, both from ad-hoc tests carried out by the authors and from the available literature. The proposed model is based on the physical assumption that fracture occurs when a scalar measure of the accumulated plastic strain reaches a critical value, and such assumption is introduced through the dependency of the phase-field degradation function on this scalar measure. The proposed model is able to capture the experimentally observed sequence of elasto-plastic deformation, necking and fracture phenomena in flat specimens; the occurrence of cup-and-cone fracture patterns in axisymmetric specimens; the role played by notches and by their size on the measured displacement at fracture; and the sequence of distinct cracking events observed in more complex specimens.

  18. Verification of short lead time forecast models: applied to Kp and Dst forecasting

    Science.gov (United States)

    Wintoft, Peter; Wik, Magnus

    2016-04-01

    In the ongoing EU/H2020 project PROGRESS, models that predict Kp, Dst, and AE from L1 solar wind data will be used as inputs to radiation belt models. The possible lead times from L1 measurements (tens of minutes to hours) are shorter than the typical duration of the physical phenomena to be forecast. Under these circumstances, several metrics fail to distinguish skilful forecasts from trivial ones, such as persistence. In this work we explore metrics and approaches for short-lead-time forecasts and apply them to current Kp and Dst forecast models. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637302.
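    The point about persistence can be made concrete: at short lead times a useful metric compares the model against the trivial forecast that simply carries the last observation forward. A sketch with an invented Kp-like series; the specific skill-score form (1 minus the RMSE ratio) is one common choice, not necessarily the project's:

    ```python
    # Hedged sketch: skill relative to persistence, the trivial baseline that
    # the abstract says plain metrics fail to single out. The Kp-like series
    # and the model errors below are invented for illustration.

    import numpy as np

    def rmse(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return float(np.sqrt(np.mean((a - b) ** 2)))

    def skill_vs_persistence(obs, model_pred, lead):
        """1 - RMSE_model / RMSE_persistence at a given lead (in steps)."""
        target = obs[lead:]
        persistence = obs[:-lead]        # last observed value carried forward
        model = model_pred[lead:]
        return 1.0 - rmse(model, target) / rmse(persistence, target)

    obs = np.array([2, 2, 3, 4, 4, 5, 6, 5, 4, 3, 2, 2], float)  # invented Kp
    good = obs + np.array([0.2, -0.1, 0.1, 0.0, -0.2, 0.1,
                           0.2, -0.1, 0.0, 0.1, -0.1, 0.2])      # small errors

    s = skill_vs_persistence(obs, good, lead=1)
    print(round(s, 2))  # -> 0.85: positive, so the model beats persistence
    ```

    A model whose raw RMSE looks impressive but whose skill score is near zero is effectively re-issuing the last observation, which is exactly the failure mode this kind of metric exposes.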

  19. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model are compared with experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. All predictions fall inside a ±10% error band, with a mean average error below 4% using the slightly modified analytical solution, which is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental database obtained by Liu and Theofanous; the data are predicted within a ±30% error band. (author)

  20. Modelling and experimental verification on concrete-filled steel tubular columns with L or T section

    Institute of Scientific and Technical Information of China (English)

    LU Xilin; LI Xueping; WANG Dan

    2007-01-01

    Concrete-filled steel tubular columns with L or T sections are analyzed in this paper. Based on the confining mechanism, a stress-strain constitutive model is put forward, and calculated results are compared with experimental records. Hysteretic rules for the in-filled concrete are then constructed for analysis of the seismic behavior of composite members. The simulation analysis was implemented in Fortran. The models in this paper can be applied in time-history analysis programs for tall buildings with concrete-filled steel tubular columns of L or T section.

  1. Bounded Model Checking and Inductive Verification of Hybrid Discrete-Continuous Systems

    DEFF Research Database (Denmark)

    Becker, Bernd; Behle, Markus; Eisenbrand, Fritz

    2004-01-01

    We present a concept to significantly advance the state of the art for bounded model checking (BMC) and inductive verification (IV) of hybrid discrete-continuous systems. Our approach combines the expertise of partners coming from different domains, like hybrid systems modeling and digital circuit...... verification, bounded planning and heuristic search, combinatorial optimization and integer programming. After sketching the overall verification flow we present first results indicating that the combination and tight integration of different verification engines is a first step to pave the way to fully automated BMC...

  2. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Science.gov (United States)

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to the original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods lacking this background information. PMID:27123451

  3. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to the original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods lacking this background information.

  4. A Model for the Timing of CAPTCHA-Based Verification-Code Challenges

    Institute of Scientific and Technical Information of China (English)

    朴顺姬; 戴德伟

    2012-01-01

    To make rational use of verification codes and thereby inhibit the spread of worm viruses, mathematical modeling is used to build a CAPTCHA-based model of the timing of verification-code challenges. To compensate for the added user burden and network load that CAPTCHA introduces, an improvement strategy is proposed for finding a suitable time to send the challenge, allowing a qualitative analysis of the best moment to use verification codes.

  5. DATA VERIFICATION IN ISSUE SUPERVISING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. S. Katerinenko

    2013-01-01

    Full Text Available The paper proposes a method of data verification in issue tracking systems by means of production rules. This model makes it possible to declaratively formulate conditions that the information content should comply with and to apply reasoning procedures. Practical application of the proposed verification system in a real software development project is described.
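    A minimal sketch of the production-rule idea: each rule is a declarative condition plus a message, and verification is just evaluating all rules against a record. The issue fields and the rules themselves are invented for illustration, not taken from the paper:

    ```python
    # Hedged sketch of rule-based record verification in the spirit described
    # above. Field names and the three sample rules are illustrative inventions.

    RULES = [
        ("closed-needs-resolution",
         lambda rec: rec.get("status") != "closed" or bool(rec.get("resolution")),
         "a closed issue must carry a resolution"),
        ("assignee-before-progress",
         lambda rec: rec.get("status") != "in_progress" or bool(rec.get("assignee")),
         "an issue in progress must have an assignee"),
        ("estimate-positive",
         lambda rec: rec.get("estimate_h", 0) >= 0,
         "the effort estimate cannot be negative"),
    ]

    def verify(record):
        """Return the (name, message) pairs of all rules the record violates."""
        return [(name, msg) for name, check, msg in RULES if not check(record)]

    bad = {"status": "closed", "resolution": "", "assignee": "rk", "estimate_h": 4}
    print(verify(bad))
    # -> [('closed-needs-resolution', 'a closed issue must carry a resolution')]
    ```

    Keeping the conditions as data rather than code is what makes the approach declarative: new project-specific constraints can be added without touching the verification engine.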

  6. Verification of the predictive capabilities of the 4C code cryogenic circuit model

    Science.gov (United States)

    Zanino, R.; Bonifetto, R.; Hoa, C.; Richard, L. Savoldi

    2014-01-01

    The 4C code was developed to model thermal-hydraulics in superconducting magnet systems and related cryogenic circuits. It consists of three coupled modules: a quasi-3D thermal-hydraulic model of the winding; a quasi-3D model of heat conduction in the magnet structures; an object-oriented a-causal model of the cryogenic circuit. In the last couple of years the code and its different modules have undergone a series of validation exercises against experimental data, including also data coming from the supercritical He loop HELIOS at CEA Grenoble. However, all this analysis work was done each time after the experiments had been performed. In this paper a first demonstration is given of the predictive capabilities of the 4C code cryogenic circuit module. To do that, a set of ad-hoc experimental scenarios have been designed, including different heating and control strategies. Simulations with the cryogenic circuit module of 4C have then been performed before the experiment. The comparison presented here between the code predictions and the results of the HELIOS measurements gives the first proof of the excellent predictive capability of the 4C code cryogenic circuit module.

  7. Prediction and simulator verification of state-space rotor modelling on helicopter manoeuvring flight

    NARCIS (Netherlands)

    Gori, R.; Gennaretti, M.; Pavel, M.D.; Stroosma, O.; Miletovic, I.

    2015-01-01

    Among the many fundamental components of a flight simulator, the mathematical representation of the vehicle dynamics stands out for complexity and importance. This is especially true for helicopters, whose complex dynamics prevent simple models from being sufficiently accurate without …

  8. Verification of high-speed solar wind stream forecasts using operational solar wind models

    CERN Document Server

    Reiss, Martin A; Veronig, Astrid M; Nikolic, Ljubomir; Vennerstrom, Susanne; Schoengassner, Florian; Hofmeister, Stefan J

    2016-01-01

    High-speed solar wind streams emanating from coronal holes frequently impinge on the Earth's magnetosphere, causing recurrent, medium-level geomagnetic storm activity. Modeling high-speed solar wind streams is thus an essential element of successful space weather forecasting. Here we evaluate high-speed stream forecasts made by the empirical solar wind forecast (ESWF) and the semiempirical Wang-Sheeley-Arge (WSA) model based on the in situ plasma measurements from the ACE spacecraft for the years 2011 to 2014. While the ESWF makes use of an empirical relation between the coronal hole area observed in Solar Dynamics Observatory (SDO)/Atmospheric Imaging Assembly (AIA) images and solar wind properties at the near-Earth environment, the WSA model establishes a link between properties of the open magnetic field lines extending from the photosphere to the corona and the background solar wind conditions. We found that both solar wind models are capable of predicting the large-scale features of the observed solar wind …
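
Forecast evaluations of this kind typically reduce to dichotomous event verification: a "high-speed stream day" either occurs or does not. A minimal sketch, with an invented speed threshold and invented data (not the paper's values):

```python
# Dichotomous event verification for high-speed-stream forecasts: an
# "event" is a sample in which the solar wind speed exceeds a threshold.
# Threshold and series below are illustrative placeholders.

THRESHOLD = 500.0  # km/s, assumed for illustration

def skill_scores(forecast, observed, threshold=THRESHOLD):
    hits = misses = false_alarms = 0
    for f, o in zip(forecast, observed):
        f_event, o_event = f >= threshold, o >= threshold
        if f_event and o_event:
            hits += 1
        elif o_event:
            misses += 1          # event observed but not forecast
        elif f_event:
            false_alarms += 1    # event forecast but not observed
    pod = hits / (hits + misses)               # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    return pod, far

pod, far = skill_scores([620, 480, 510, 390], [600, 520, 470, 380])
```

From such a contingency table, further skill scores (threat score, Heidke skill score) follow directly.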

  9. Tunable n-path notch filters for blocker suppression: modeling and verification

    NARCIS (Netherlands)

    Ghaffari, A.; Klumperink, Eric A.M.; Nauta, Bram

    2013-01-01

    N-path switched-RC circuits can realize filters with very high linearity and compression point while they are tunable by a clock frequency. In this paper, both differential and single-ended N-path notch filters are modeled and analyzed. Closed-form equations provide design equations for the main …

  10. Predictive Simulation of Material Failure Using Peridynamics-Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    solver, which is applied to the fully coarsened level-M grid, uses J = a·n·K_M arithmetic operations, where a and n are constants and K_M is the total number … Mohammadnejad and AR Khoei. Hydro-mechanical modeling of cohesive crack propagation in multiphase porous media using the extended finite element method.

  11. Verification of Overall Safety Factors In Deterministic Design Of Model Tested Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    2001-01-01

    The paper deals with concepts of safety implementation in design. An overall safety factor concept is evaluated on the basis of a reliability analysis of a model tested rubble mound breakwater with monolithic super structure. Also discussed are design load identification and failure mode limit st...

  12. Analytical Performance Verification of FCS-MPC Applied to Power Electronic Converters: A Model Checking Approach

    DEFF Research Database (Denmark)

    Novak, Mateja; Nyman, Ulrik Mathias; Dragicevic, Tomislav

    2017-01-01

    Since the introduction of finite control set model predictive control (FCS-MPC) in power electronics the algorithm has been missing an important aspect that would speed up its implementation in industry: a simple method to verify the algorithm performance. This paper proposes to use a statistical...

  13. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
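
Part of why Horn clauses work well as an intermediate verification language is that derivability over definite propositional Horn clauses is decidable by a simple forward-chaining fixpoint. A toy sketch of that idea (the clause names are invented; SeaHorn itself works over constrained Horn clauses with arithmetic, solved by tools like Spacer):

```python
# Forward-chaining fixpoint over definite Horn clauses.
# Each clause is (body_atoms, head_atom); a fact has an empty body.

def solve_horn(clauses, goal):
    """Return True iff `goal` is derivable from the Horn clauses."""
    derived = set()
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if head not in derived and all(a in derived for a in body):
                derived.add(head)
                changed = True
    return goal in derived

# A miniature "verification condition": the error state requires a
# broken invariant, which is never derivable, so the program is safe.
clauses = [
    ([], "init"),                          # fact: initial state reachable
    (["init"], "loop_entry"),              # entry implies the invariant
    (["loop_entry"], "loop_entry"),        # the loop preserves it
    (["loop_entry", "neg_inv"], "error"),  # error needs a broken invariant
]
safe = not solve_horn(clauses, "error")
```

Unreachability of `error` here plays the role of a discharged verification condition.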

  14. Global sand and dust storms in 2008: Observation and HYSPLIT model verification

    Science.gov (United States)

    Wang, Yaqiang; Stein, Ariel F.; Draxler, Roland R.; de la Rosa, Jesús D.; Zhang, Xiaoye

    2011-11-01

    The HYSPLIT model has been applied to simulate the global dust distribution for 2008 using two different dust emission schemes. The first one assumes that emissions can occur from any land-use grid cell defined in the model as desert. The second emission approach uses an empirically derived algorithm based on satellite observations. To investigate the dust storm features and verify the model performance, a global dataset of Integrated Surface Hourly (ISH) observations has been analyzed to map the spatial distribution and seasonal variation of sand and dust storms. Furthermore, the PM10 concentration data at four stations in Northern China and two stations in Southern Spain, and the AOD data from a station located at the center of the Sahara Desert, have been compared with the model results. The spatial distribution of observed dust storm frequency from ISH shows the known high-frequency areas located in North Africa, the Middle East, Mongolia and Northwestern China. Some sand and dust storms have also been observed in Australia, Mexico, Argentina, and other sites in South America. Most of the dust events in East Asia occur in the spring; however, this seasonal feature is not so evident in other dust source regions. In general, the model reproduces the dust storm frequency for most of the regions for the two emission approaches. Also, a good quantitative performance is achieved at the ground stations in Southern Spain and Western China when using the desert land-use based emissions, although HYSPLIT overestimates the dust concentration at downwind areas of East Asia and underestimates the column in the center of the Sahara Desert. On the other hand, the satellite-based emission approach improves the dust forecast performance in the Sahara, but underestimates the dust concentrations in East Asia.
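
Station-by-station comparisons like the PM10 evaluation above usually report mean bias and root-mean-square error between modeled and observed concentrations. A minimal sketch with invented values:

```python
# Standard continuous verification statistics for paired model/observation
# series. Concentrations below are invented, in micrograms per cubic meter.
import math

def verify_series(model, obs):
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n          # mean error
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    return bias, rmse

bias, rmse = verify_series([80.0, 120.0, 100.0], [70.0, 110.0, 90.0])
# positive bias -> the model overestimates on average
```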

  15. Modeling and experimental verification of an ultra-wide bandgap in 3D phononic crystal

    Science.gov (United States)

    D'Alessandro, L.; Belloni, E.; Ardito, R.; Corigliano, A.; Braghin, F.

    2016-11-01

    This paper reports a comprehensive modeling and experimental characterization of a three-dimensional phononic crystal composed of a single material, endowed with an ultra-wide complete bandgap. The phononic band structure shows a gap-to-midgap ratio of 132%, by far the largest full 3D bandgap reported in the literature for any kind of phononic crystal. A prototype of the finite crystal structure was manufactured in polyamide by means of additive manufacturing technology and tested to assess the transmission spectrum of the crystal. The transmission spectrum was numerically calculated taking into account a frequency-dependent elastic modulus and a Rayleigh model for damping. The measured and numerical transmission spectra are in good agreement and show up to 75 dB of attenuation for a three-layer crystal.

  16. Verification and Diagnosis Infrastructure of SoC HDL-model

    CERN Document Server

    Hahanov, Vladimir; Litvinova, Eugenia; Chumachenko, Svetlana

    2012-01-01

    This article describes a technology for diagnosing SoC HDL-models based on a transaction graph. The diagnosis method is focused on considerably decreasing fault-detection time and the memory needed to store the diagnosis matrix, by forming ternary relations between tests, monitors, and functional components. The following problems are solved: creation of a digital system model in the form of a transaction graph and a multi-tree of fault detection tables, as well as ternary matrices for activating functional components in tests, relative to the selected set of monitors; and development of a method for analyzing the activation matrix to detect faults with a given depth and to synthesize logic functions for subsequent embedded hardware fault diagnosis.

  17. Microscopic driving theory with oscillatory congested states: model and empirical verification

    CERN Document Server

    Tian, Junfang; Ma, Shoufeng; Jia, Bin; Zhang, Wenyi

    2014-01-01

    The essential distinction between the Fundamental Diagram Approach (FDA) and Kerner's Three-Phase Theory (KTPT) is the existence of a unique gap-speed (or flow-density) relationship in the former class. In order to verify this relationship, empirical data are analyzed with the following findings: (1) linear relationship between the actual space gap and speed can be identified when the speed difference between vehicles approximates zero; (2) vehicles accelerate or decelerate around the desired space gap most of the time. To explain these phenomena, we propose that, in congested traffic flow, the space gap between two vehicles will oscillate around the desired space gap in the deterministic limit. This assumption is formulated in terms of a cellular automaton. In contrast to FDA and KTPT, the new model does not have any congested steady-state solution. Simulations under periodic and open boundary conditions reproduce the empirical findings of KTPT. Calibrating and validating the model to detector data produces...
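
For readers unfamiliar with the cellular-automaton formalism in which the model is expressed, here is a generic deterministic Nagel-Schreckenberg-style update on a ring road. It illustrates the formalism only; it is not the authors' model, and the parameters are arbitrary.

```python
# One deterministic update of a ring-road cellular automaton: each cell
# holds a vehicle's speed (in cells per step) or None if empty. A vehicle
# accelerates by one unit per step, capped by a maximum speed and by the
# gap to the vehicle ahead, then advances by its new speed.

V_MAX = 5  # arbitrary speed cap for the illustration

def step(road):
    n = len(road)
    new_road = [None] * n
    for i, v in enumerate(road):
        if v is None:
            continue
        gap = 0                        # count empty cells ahead of car i
        while road[(i + gap + 1) % n] is None:
            gap += 1
        v = min(v + 1, V_MAX, gap)     # accelerate, capped by the gap
        new_road[(i + v) % n] = v      # move forward v cells
    return new_road

road = [0, None, None, 2, None, None, None, None]
road = step(road)
```

Because each vehicle's speed is capped by its gap, vehicles never collide or overtake, which is what makes such single-lane CA updates well defined.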

  18. Modeling of numerical simulation and experimental verification for carburizing-nitriding quenching process

    Institute of Scientific and Technical Information of China (English)

    R. MUKAI; T. MATSUMOTO; JU Dong-ying; T. SUZUKI; H. SAITO; Y. ITO

    2006-01-01

    A model considering quantitative effects of diffused carbon and nitrogen gradients and kinetics of phase transformation is presented to examine metallo-thermo-mechanical behavior during carburized and nitrided quenching. Coupled simulation of diffusion, phase transformation and stress/strain provides the final distribution of carbon and nitrogen contents as well as residual stress and distortion. Effects of both transformation and lattice expansion induced by carbon and nitrogen absorption were introduced into calculating the evolution of the internal stress and strain. In order to verify the method and the results, the simulated distributions of carbon and nitrogen content and residual stress/strain of a ring model during carburized and nitrided quenching were compared with the measured data.

  19. Contact Modelling in Resistance Welding, Part I: Algorithms and Numerical Verification

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Finite element analysis of resistance welding involves contact problems between different parts. The contact problem in resistance welding includes not only mechanical contact but also thermal and electrical contact. In this paper a contact model based on the penalty method is developed for simulation of resistance spot and projection welding. After a description of the algorithms, several numerical examples are presented to validate the mechanical contact algorithm.

  20. Security Policy Development: Towards a Life-Cycle and Logic-Based Verification Model

    Directory of Open Access Journals (Sweden)

    Luay A. Wahsheh

    2008-01-01

    Full Text Available Although security plays a major role in the design of software systems, security requirements and policies are usually added to an already existing system rather than created in conjunction with the product. As a result, there are often numerous problems with the overall design. In this paper, we discuss the relationship between software engineering, security engineering, and policy engineering, and present a security policy life-cycle: an engineering methodology for policy development in high-assurance computer systems. The model provides system security managers with a procedural engineering process for developing security policies. We also present an executable Prolog-based model as a formal specification and knowledge representation method, using a theorem prover to verify system correctness with respect to security policies in their life-cycle stages.

  1. Toward Better Intraseasonal and Seasonal Prediction: Verification and Evaluation of the NOGAPS Model Forecasts

    Science.gov (United States)

    2012-09-30

    Pacific, the ITCZ has strong impacts on tropical cyclone activity. The displacement of the ITCZ in the NOGAPS forecasts may lead to errors in the tropical cyclone prediction. (Figure 2: comparison of the NOGAPS (left) and GFS (right) 6-day precipitation forecasts with the CMORPH observations.)

  2. Vacuum-Assisted Resin Transfer Molding (VARTM) Model Development, Verification, and Process Analysis

    OpenAIRE

    Sayre, Jay Randall

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this techni...

  3. Improvement, Verification, and Refinement of Spatially-Explicit Exposure Models in Risk Assessment - SEEM

    Science.gov (United States)

    2015-06-01

    on the NPL in 1990, and is located east of the demolition debris site. The shot fall zone is to the east of Gun Club Creek marshes (GPC, 2009 … vegetation that comprised the rodents' diets. Vegetation concentrations come from plant material that was collected at each trap site where a small … through either incidental soil ingestion with the diet, or by direct contact with the soil while foraging. The model was developed for APG Gun Club

  4. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many new algorithms and design techniques are suggested every year by industry and academic researchers, with claims of improved measurement accuracy [8], [9]. In the absence of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time-consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike as an exploration and productivity-enhancement tool during the research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment in which new ideas can be rapidly evaluated long before real implementation.
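
The core computation any such behavioural model must reproduce is the "ratio of ratios" of the red and infrared photoplethysmogram components, mapped to SpO2 by a calibration curve. A sketch with textbook-style constants, not those of any particular device:

```python
# Ratio-of-ratios SpO2 estimate: each LED channel contributes a pulsatile
# (AC) and a steady (DC) component; their normalized ratio R is mapped to
# oxygen saturation. The linear calibration below is a common classroom
# approximation; real devices use empirically derived lookup tables.

def ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    return (red_ac / red_dc) / (ir_ac / ir_dc)

def spo2_percent(r):
    return 110.0 - 25.0 * r   # illustrative linearisation only

r = ratio_of_ratios(red_ac=0.02, red_dc=1.0, ir_ac=0.04, ir_dc=2.0)
spo2 = spo2_percent(r)
```

A behavioural simulator can feed synthetic AC/DC waveforms through this pipeline to exercise new artifact-rejection or averaging algorithms before hardware exists.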

  5. Linear Modeling, Simulation and Experimental Verification of a Pressure Regulator for CNG Injection Systems

    Directory of Open Access Journals (Sweden)

    Dirk Hübner

    2008-08-01

    Full Text Available The number of motor vehicles powered by internal combustion engines keeps growing despite shrinking oil reserves. As a result, compressed natural gas (CNG is gaining currency as an emerging combustion engine fuel. To this day, CNG systems – e.g., in passenger cars – are not fully integrated into the development process as conducted by vehicle or engine manufacturers. Instead, they are usually "adapted in" at a downstream stage by small, specialized companies. The present paper initially outlines the state of the art in advanced gas injection technologies. Especially the development towards sequential injection systems is described. A pressure regulator for CNG driven combustion engines is examined in detail, given its role as a highly sensitive and critical system component. Based on a precise theoretical analysis, a linear model of this pressure regulator is derived and subjected to dynamic simulation. The analytical approach is accompanied by an experimental investigation of the device. On a test rig developed at the Trier University of Applied Sciences, the static and dynamic features of the pressure regulator can be measured with the requisite precision. The comparison of measured and simulated data yields a validation of the dynamic simulation model. With the approaches developed it is now possible for the first time to model, simulate and optimize single- or multi-stage pressure regulators for CNG driven engines with less effort and higher accuracy.

  6. System-level modeling and verification of a micro pitch-tunable grating

    Science.gov (United States)

    Lv, Xianglian; Xu, Jinghui; Yu, Yiting; He, Yang; Yuan, Weizheng

    2010-10-01

    A micro pitch-tunable grating based on microelectromechanical systems (MEMS) technology can modulate the grating period dynamically by controlling the drive voltage. The device is too complex to model and simulate with the FEA method or with an analytical macromodel alone. In this paper, a new hybrid system-level modeling method is presented. First, the grating is decomposed into function components such as the grating beams, supporting beams, and electrostatic comb-driver. The block Arnoldi algorithm is used to obtain numerical macromodels of the grating beams and supporting beams; analytical macromodels, called multi-port elements (MPEs), of the comb-driver and other parts are also established, and the elements are connected into a hybrid network representing the system-level model of the grating in MEME Garden, a MEMS CAD tool developed by the Micro and Nano Electromechanical Systems Laboratory, Northwestern Polytechnical University. Both frequency- and time-domain simulations were implemented. The grating was fabricated using a silicon-on-glass (SOG) process. The measured working displacement is 16.5 μm at a driving voltage of 40 V. The simulation result is 17.6 μm, which shows acceptable agreement with the measurement within an error tolerance of 6.7%. The method proposed in this paper solves the voltage-displacement simulation problem for this kind of complex grating. It can also be adapted to simulations of similar MEMS/MOEMS devices.

  7. Development and experimental verification of a model for an air jet penetrated by plumes

    Directory of Open Access Journals (Sweden)

    Xin Wang

    2015-03-01

    Full Text Available This article presents the fluid mechanics of a ventilation system formed by a momentum source and buoyancy sources. We investigate the interaction between plumes and a non-isothermal air jet for separate sources of buoyancy produced by the plumes and the momentum of the air jet. The mathematical model represents the situation in which plumes rise from two heat sources causing buoyancy, and the model is used to discuss the interactions involved. The effects of parameters such as the power of the source and the air-flow volume used in the mathematical-physical model are also discussed. An expression is deduced for the trajectory of the non-isothermal air jet penetrated by plumes. Experiments were also carried out to illustrate the effect on the flow of the air jet and to validate the theoretical work. The results show that the buoyancy sources' baffling of the descent of the cold air can even reverse the direction of the trajectory; however, increasing the distance between the plumes reduces their effect on the jet curve. It is also apparent that when the velocity of the air supply increases, the interference caused by the plumes is reduced.

  8. Modeling and experimental verification of tubular product formation during spray forming

    Institute of Scientific and Technical Information of China (English)

    LIU Dong-ming; ZHAO Jiu-zhou; LI Mu-sen

    2009-01-01

    A mathematical model is formulated to predict the shape evolution and the final geometry of a tubular product prepared by spray forming. The effects of several important processing parameters on the shape evolution of the tube are investigated. The model is validated against experiments on spray-formed large-diameter tubes. The experimental and modeling results show that there are three distinct regions in the preform, i.e., the left transition region, the middle uniform-diameter region and the right transition region. The results show that the atomization parameters a_s and b_s, the traversing speed v of the substrate, the outer diameter D_0 of the substrate, and the initial deposition distance d_0 play important roles in the contour and the wall thickness of the spray-formed tube, but the angular velocity ω of the substrate has little effect on the buildup of the deposit. After a certain time from the beginning of the process, the deposit reaches a steady growth state. In addition, an equation is provided to estimate the wall thickness of the deposit under the steady growth state based on mass conservation.
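
The steady-state wall-thickness estimate from mass conservation can be illustrated as follows: in steady growth, the metal mass deposited per unit length of substrate travel fills an annular shell. All symbols and values are illustrative placeholders, not the authors' equation.

```python
# Mass-conservation sketch: the shell cross-section area equals the
# deposited mass per unit traverse length divided by the deposit density;
# solving the annulus geometry then gives the wall thickness.
import math

def wall_thickness(m_dot, eta, rho, v, d0):
    """m_dot: mass flow rate [kg/s], eta: deposition efficiency,
    rho: deposit density [kg/m^3], v: substrate traverse speed [m/s],
    d0: outer diameter of the substrate [m]. All values hypothetical."""
    area = m_dot * eta / (rho * v)   # shell cross-section [m^2]
    # solve pi/4 * ((d0 + 2t)^2 - d0^2) = area for the thickness t
    d_out = math.sqrt(d0 ** 2 + 4.0 * area / math.pi)
    return (d_out - d0) / 2.0

t = wall_thickness(m_dot=0.1, eta=0.7, rho=7800.0, v=0.001, d0=0.2)
```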

  9. Modeling and experimental verification of frequency-, amplitude-, and magneto-dependent viscoelasticity of magnetorheological elastomers

    Science.gov (United States)

    Xin, Fu-Long; Bai, Xian-Xu; Qian, Li-Jun

    2016-10-01

    Magnetorheological elastomers (MREs), a smart composite, exhibit dual characteristics of both MR materials and particle reinforced composites, i.e., the viscoelasticity of MREs depends on external magnetic field as well as strain amplitude and excitation frequency. In this article, the principle of a frequency-, amplitude-, and magneto-dependent linear dynamic viscoelastic model for isotropic MREs is proposed and investigated. The viscoelasticity of MREs is divided into frequency- and amplitude-dependent mechanical viscoelasticity and frequency-, amplitude-, and magneto-dependent magnetic viscoelasticity. Based on the microstructures of ferrous particles and matrix, the relationships between mechanical shear modulus corresponding to the mechanical viscoelasticity and strain amplitude and excitation frequency are obtained. The relationships between magnetic shear modulus corresponding to the magnetic viscoelasticity with strain amplitude, excitation frequency, and further external magnetic field are derived using the magneto-elastic theory. The influence of magnetic saturation on the MR effect is also considered. The dynamic characteristics of a fabricated isotropic MRE sample under different strain amplitudes, excitation frequencies and external magnetic fields are tested. The parameters of the proposed model are identified with the experimental data and the theoretical expressions of shear storage modulus and shear loss modulus of the MRE sample are obtained. In the light of the theoretical expressions, the loss factors of the MRE sample under different loading conditions are analyzed and compared with the test results to evaluate the effectiveness of the proposed model.
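
The quantities the model identifies can be illustrated numerically: shear storage modulus G', shear loss modulus G'', and the loss factor tan δ = G''/G'. The split into a mechanical and a field-dependent magnetic contribution mirrors the article's decomposition; the numbers below are invented.

```python
# Loss factor of a viscoelastic material from its complex shear modulus
# G* = G' + iG''. The two-term sums stand for the mechanical part plus a
# magnetic, field-dependent part (values are illustrative only).

def loss_factor(storage_modulus, loss_modulus):
    return loss_modulus / storage_modulus

g_storage = 0.8e6 + 0.4e6   # Pa: mechanical + magnetic storage modulus
g_loss = 0.12e6 + 0.03e6    # Pa: mechanical + magnetic loss modulus
tan_delta = loss_factor(g_storage, g_loss)
```

In this picture the "MR effect" is the growth of the magnetic term with applied field, saturating once the particles are magnetically saturated.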

  10. Face verification for mobile personal devices

    NARCIS (Netherlands)

    Tao, Qian

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face sequences.

  11. Spectroscopic study of prompt-gamma emission for range verification in proton therapy.

    Science.gov (United States)

    Kelleter, Laurent; Wrońska, Aleksandra; Besuglow, Judith; Konefał, Adam; Laihem, Karim; Leidner, Johannes; Magiera, Andrzej; Parodi, Katia; Rusiecka, Katarzyna; Stahl, Achim; Tessonnier, Thomas

    2017-02-01

    We present the results of an investigation of the prompt-gamma emission from the interaction of a proton beam with phantom materials. Measurements were conducted with a novel setup allowing the precise selection of the investigated depth in the phantom, featuring three different materials composed of carbon, oxygen and hydrogen. We studied two beam energies of 70.54 and 130.87 MeV and two detection angles: 90° and 120°. The results are presented in the form of profiles of the prompt-gamma yield as a function of depth. In the analysis we focused on the transitions with the largest cross sections: (12)C 4.44→g.s. and (16)O 6.13→g.s. We compare the profiles obtained under various irradiation conditions, with emphasis on the shape of the distal fall-off. The results are also compared to calculations including different cross-section models. They are in agreement with the model exploiting published cross-section data, but the comparison with the Talys model shows discrepancies. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  12. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes on an important role. The conventional Petri net approach studied recently for verifying knowledge bases has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
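
The reachability analysis at the heart of Petri-net-based knowledge-base verification is, in essence, a breadth-first exploration of markings, where each rule firing is a transition. A toy sketch (the net below is invented, not the alarm-processing knowledge base, and colored tokens are omitted):

```python
# Breadth-first enumeration of reachable Petri-net markings. A marking is
# a frozenset of (place, token_count) pairs; a transition is a pair of
# dicts giving the tokens it consumes and produces.
from collections import deque

def reachable_markings(initial, transitions):
    seen = {initial}
    queue = deque([initial])
    while queue:
        marking = dict(queue.popleft())
        for consume, produce in transitions:
            # a transition is enabled when every input place has enough tokens
            if all(marking.get(p, 0) >= n for p, n in consume.items()):
                new = dict(marking)
                for p, n in consume.items():
                    new[p] -= n
                for p, n in produce.items():
                    new[p] = new.get(p, 0) + n
                frozen = frozenset(new.items())
                if frozen not in seen:
                    seen.add(frozen)
                    queue.append(frozen)
    return seen

initial = frozenset({"alarm_raw": 1, "ack": 0}.items())
transitions = [
    ({"alarm_raw": 1}, {"alarm_filtered": 1}),    # rule: filter the alarm
    ({"alarm_filtered": 1}, {"alarm_shown": 1}),  # rule: display the alarm
]
markings = reachable_markings(initial, transitions)
```

Anomalies such as unreachable rules or dead ends then show up as transitions that are never enabled, or as markings from which no transition can fire.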

  13. The Woodward Effect: Math Modeling and Continued Experimental Verifications at 2 to 4 MHz

    Science.gov (United States)

    March, Paul; Palfreyman, Andrew

    2006-01-01

    The Woodward Effect (W-E), the supposition that energy-storing ions experience a transient mass fluctuation near their rest mass when accelerated, has been tentatively verified using linear electrical thrusters based on the Heaviside-Lorentz force transformation. This type of electromagnetic field thruster, or Mach-Lorentz Thruster (MLT), purports to create a transient mass differential that is expressed in a working medium to produce a net thrust in the dielectric material contained in several capacitors. These mass differentials are hypothesized to result from gravity/inertia-based Wheeler-Feynman radiation reactions with the rest of the mass in the universe (per Mach's Principle) in order to conserve momentum. Thus if a net unidirectional force is produced in such a device, then mass fluctuations in the working media should be present. A net unidirectional and reversible force on the order of +/- 3.14 milli-Newton or 0.069% of the suspended test article mass was recorded by us in our first high frequency 2.2 MHz test article. The authors also developed a W-E model that integrates the various engineering parameters affecting the design, construction, and performance of W-E based MLTs for the next generation of systems. When Woodward's (2004a, 2004b, 2005) and our test results were compared with the model's predictions, the test results exceeded predictions by one to two orders of magnitude. Efforts are underway to understand the discrepancies and update the model. The test results imply that these devices, when fully developed, could be competitive with ion engines intended for use on satellite station keeping and/or orbital transfers.

  14. Including oxygen enhancement ratio in ion beam treatment planning: model implementation and experimental verification

    Science.gov (United States)

    Scifoni, E.; Tinganelli, W.; Weyrather, W. K.; Durante, M.; Maier, A.; Krämer, M.

    2013-06-01

    We present a method for adapting biologically optimized treatment planning for particle beams to a spatially inhomogeneous tumor sensitivity due to hypoxia, as detected e.g. by PET functional imaging. The TRiP98 code, an established treatment planning system for particles, has been extended to include the oxygen enhancement ratio (OER) explicitly in the biological effect calculation, providing the first dedicated ion-beam treatment planning approach directed at hypoxic tumors, TRiP-OER, reported here together with experimental tests. A simple semi-empirical model for calculating the OER as a function of oxygen concentration and dose-averaged linear energy transfer, generating input tables for the program, is introduced. The code is then extended to import such tables, from the present or alternative models, and to perform forward and inverse planning, i.e., predicting the survival response of differently oxygenated areas as well as optimizing the dose required to restore a uniform survival effect in the whole irradiated target. The multiple-field optimization results show how the program selects the best beam components for treating the hypoxic regions. The calculations performed for different ions provide indications of the possible clinical advantages of a multi-ion treatment. Finally, the predictivity of the code is tested through dedicated cell culture experiments on extended-target irradiation using specially designed hypoxic chambers, providing qualitative agreement, despite some limits in full survival calculations arising from the RBE assessment. Comparisons of the predictions obtained using different model tables are also reported.
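
The kind of relation such input tables encode can be sketched with an Alper-Howard-Flanders-type dependence of the OER on oxygen tension. The constants and the naive forward dose scaling below are illustrative placeholders, not TRiP-OER's model, which additionally folds in the LET dependence and RBE-weighted optimization.

```python
# Illustrative OER curve: the dose needed at oxygen tension p, relative to
# the fully oxic dose, falls from a maximum (~3 for low-LET radiation)
# toward 1 as oxygenation increases. m and k are assumed textbook-style
# constants, not fitted values.

def oer(p_mmHg, m=3.0, k=3.0):
    """Ratio of dose needed at oxygen tension p to the fully oxic dose."""
    return m * (p_mmHg + k) / (m * p_mmHg + k)

def compensated_dose(planned_dose_gy, p_mmHg):
    # naive forward scaling: boost the dose in a hypoxic voxel to restore
    # the oxic-level effect (real inverse planning optimizes per field)
    return planned_dose_gy * oer(p_mmHg)

d = compensated_dose(2.0, 0.5)   # a nearly anoxic voxel needs a higher dose
```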

  15. PEPT: An invaluable tool for 3-D particle tracking and CFD simulation verification in hydrocyclone studies

    Directory of Open Access Journals (Sweden)

    Hoffmann Alex C.

    2013-05-01

    Particle tracks in a hydrocyclone generated both experimentally by positron emission particle tracking (PEPT) and numerically with Eulerian-Lagrangian CFD have been studied and compared. A hydrocyclone with a cylinder-on-cone design was used in this study, the geometries used in the CFD simulations and in the experiments being identical. It is shown that it is possible to track a fast-moving particle in a hydrocyclone using PEPT with high temporal and spatial resolutions. The numerical 3-D particle trajectories were generated using the Large Eddy Simulation (LES) turbulence model for the fluid and Lagrangian particle tracking for the particles. The behaviors of the particles were analyzed in detail and were found to be consistent between experiments and CFD simulations. The tracks of the particles are discussed and related to the fluid flow field visualized in the CFD simulations using the cross-sectional static pressure distribution.

  16. Experimental verification of a weak zone model for timber in bending

    DEFF Research Database (Denmark)

    Källsner, B.; Ditlevsen, Ove Dalager; Salmela, K.

    1997-01-01

    to 15% lower than is predicted by the proposed hierarchical model. Energy considerations show that the reduction in strength of long beams may not be solely a statistical effect caused by an increased number of possible failure modes in the long beams as compared to the short test specimens. The large...... elastic energy released in a long, highly bent beam at the onset of failure may mean that a later, higher external load level cannot be realised as it can in a controlled, slowly progressing failure....

  17. A mesoscale model used in the Polar regions: modification and verification

    Institute of Scientific and Technical Information of China (English)

    Ma Yan; Chen Shang

    2006-01-01

    A polar version of the mesoscale model MM5, Polar MM5, is introduced in this paper. The modifications to the Polar MM5 dynamics and physics relative to the standard MM5 are described. Additionally, parallel simulations with Polar MM5 and the original MM5 reveal that Polar MM5 produces better near-surface variable forecasts than the original MM5 over the North American Arctic regions. The well-predicted near-surface temperature and mixing ratio confirm that the modified physical parameterization schemes in Polar MM5 are appropriate for the research region.

  18. Ion collection by planar Langmuir probes: Sheridan's model and its verification

    Science.gov (United States)

    Lee, Dongsoo; Hershkowitz, Noah

    2007-03-01

    Data analysis from planar Langmuir probes normally assumes that the sheath effects are not significant in determining electron density and temperature when the Debye length is small compared to the probe radius. However, analysis of ion saturation current requires careful attention due to sheath expansion near the probe electrode. It is experimentally verified for the first time that Sheridan's numerical model [T. E. Sheridan, Phys. Plasmas 7, 3084 (2000)] provides a correct method to measure the ion saturation current for which the ion density agrees with the electron density in argon plasmas.

  19. Verification of the Model of Inductive Coupling between a Josephson Oscillator and a Stripline

    Science.gov (United States)

    Kudo, Keisuke; Yoshida, Keiji; Enpuku, Keiji; Yamafuji, Kaoru

    1993-01-01

    In order to realize an efficient coupling between a flux-flow-type Josephson oscillator (FFO) and a stripline, we have carried out experiments to verify the mathematical model of the inductive coupling scheme between the FFO and a stripline resonator in the frequency range between 50 GHz and 350 GHz. It is shown that the simulation using the proposed equivalent circuit for the inductive coupling scheme explains the experimental results well. The experimentally obtained center frequency and bandwidth of the matching circuit were as large as 120 GHz and 40 GHz, respectively, which are also in reasonable agreement with those obtained in the simulation.

  20. Cross-Cultural Validity of the TIMSS-1999 Mathematics Test: Verification of a Cognitive Model

    Science.gov (United States)

    Chen, Yi-Hsin; Gorin, Joanna S.; Thompson, Marilyn S.; Tatsuoka, Kikumi K.

    2008-01-01

    As with any test administered across linguistically and culturally diverse groups, evidence suggesting the equivalence of score meaning across countries is needed for valid comparisons. The current study examines the cross-cultural equivalence of score interpretations from the Trends in International Mathematics and Science Study (TIMSS)-1999 from…

  1. Computational Verification of the NASA-CRM Drag Prediction Model

    Institute of Scientific and Technical Information of China (English)

    Qi Shuni; Guo Chengpeng; Zhang Jinwei; Dong Jun

    2015-01-01

    Computational verification is carried out using the in-house unstructured-grid flow solver UNSMB at AVICARI for the wing-body (WB) and wing-body-horizontal-tail (WBT) configurations of the NASA-CRM model provided for the 4th AIAA Drag Prediction Workshop. Grid convergence characteristics, lift and drag curves, Reynolds number effects, downwash effects and the pressure distribution of the WBT configuration are analyzed in detail, and all computational results are compared with those provided by ONERA at the workshop. The analysis indicates good agreement with ONERA's results, verifying to some extent the drag prediction accuracy of the UNSMB flow solver.

  2. Development and verification of the modified dynamic two-fluid model GOPS

    Science.gov (United States)

    Song, Chengyi; Li, Yuxing; Meng, Lan; Wang, Haiyan

    2013-07-01

    In the oil and gas industry, many software packages have been developed to calculate the flow parameters of multiphase flow, but the existing software is not perfect. To improve accuracy, a new code, GOPS, has been developed by the Daqing Oilfield Construction Design and Research Institute and the China University of Petroleum. GOPS modifies the general extended two-fluid model and considers a gas bubble phase in the liquid and a liquid droplet phase in the gas. Its governing equations comprise four continuity equations, two momentum equations, one mixture energy-conservation equation and one pressure-conservation equation. These governing equations are combined with a flow pattern transition model and closure relationships for every flow pattern. In this way, GOPS can simulate the dynamic variation of multiphase flow. To verify GOPS, experiments were carried out at the Surface Engineering Pilot Test Center, CNPC. The experimental pressure gradients are compared with the results from GOPS, and the accuracy of GOPS is found to be high.

  3. Multi-Mode GF-3 Satellite Image Geometric Accuracy Verification Using the RPC Model.

    Science.gov (United States)

    Wang, Taoyang; Zhang, Guo; Yu, Lei; Zhao, Ruishan; Deng, Mingjun; Xu, Kai

    2017-09-01

    The GaoFen-3 (GF-3) satellite is the first C-band multi-polarization synthetic aperture radar (SAR) imaging satellite with a resolution up to 1 m in China. It is also the only SAR satellite of the High-Resolution Earth Observation System designed for civilian use. There are 12 different imaging modes to meet the needs of different industry users. However, to use SAR satellite images for related applications, they must possess high geometric accuracy. In order to verify the geometric accuracy achieved by the different modes of GF-3 images, we analyze the SAR geometric error sources and perform geometric correction tests based on the RPC model, with and without ground control points (GCPs), for five imaging modes: spotlight (SL), ultra-fine strip (UFS), Fine Strip I (FSI), Full Polarized Strip I (QPSI), and standard strip (SS). Experimental results show that the check-point residuals are large and consistent without GCPs, but the root mean square error of the independent check points with four corner control points is better than 1.5 pixels, a level of geometric positioning accuracy similar to that of international satellites. We conclude that the GF-3 satellite can be used for high-accuracy geometric processing and related industry applications.
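The 1.5-pixel figure above is a root mean square error over independent check points; a minimal sketch of that bookkeeping follows (the residual values are made up for illustration):

```python
import math

def planimetric_rmse(dx_px, dy_px):
    """RMSE of check-point residuals in pixels, combining the two
    image-plane components (dx, dy) of each check point."""
    n = len(dx_px)
    return math.sqrt(sum(x * x + y * y for x, y in zip(dx_px, dy_px)) / n)

# hypothetical residuals (pixels) after RPC-based correction with 4 corner GCPs
dx = [0.6, -0.9, 0.4, -1.1]
dy = [0.8, 0.3, -0.7, 0.5]
accuracy_ok = planimetric_rmse(dx, dy) < 1.5  # the paper's reported threshold
```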

  4. Approach for the Semi-Automatic Verification of 3d Building Models

    Science.gov (United States)

    Helmholz, P.; Belton, D.; Moncrieff, S.

    2013-04-01

    In the field of spatial sciences there are a large number of disciplines and techniques for capturing data to solve a variety of tasks and problems for different applications. Examples include traditional survey for boundary definitions, aerial imagery for building models, and laser scanning for heritage facades. These techniques differ in attributes such as the number of dimensions, accuracy and precision, and data format. Because of the number of applications and jobs, data sets captured from different sensor platforms and for different purposes will often overlap over time. In most cases, while these data are archived, they are not used in future applications to add value to the data capture campaigns of current projects. Likewise, newly acquired data are often not used to improve existing models and data integrity. The purpose of this paper is to discuss a methodology and infrastructure to support this concept automatically: that is, based on a job specification, to automatically query existing and newly acquired data based on temporal and spatial relations, and to automatically combine them and generate the best solution. To this end, there are three main challenges to examine: change detection, thematic accuracy and data matching.

  5. Development and verification of fuel burn-up calculation model in a reduced reactor geometry

    Energy Technology Data Exchange (ETDEWEB)

    Sembiring, Tagor Malem [Center for Reactor Technology and Nuclear Safety (PTKRN), National Nuclear Energy Agency (BATAN), Kawasan PUSPIPTEK Gd. No. 80, Serpong, Tangerang 15310 (Indonesia)], E-mail: tagorms@batan.go.id; Liem, Peng Hong [Research Laboratory for Nuclear Reactor (RLNR), Tokyo Institute of Technology (Tokyo Tech), O-okayama, Meguro-ku, Tokyo 152-8550 (Japan)

    2008-02-15

    A fuel burn-up model in a reduced reactor geometry (2-D) is successfully developed and implemented in the Batan in-core fuel management code, Batan-FUEL. Considering the bank-mode operation of the control rods, several interpolation functions are investigated which best approximate the 3-D fuel assembly radial power distributions across the core as a function of the insertion depth of the control rods. Concerning the applicability of the interpolation functions, it can be concluded that their optimal coefficients are not very sensitive to the core configuration and core or fuel composition in the RSG GAS (MPR-30) reactor. Consequently, once the optimal interpolation function and its coefficients are derived, they can be used for routine 2-D operational in-core fuel management without repeating the expensive 3-D neutron diffusion calculations. At the selected fuel elements (at the H-9 and G-6 core grid positions), the discrepancies of the FECFs (fuel element channel power peaking factors) between the 2-D and 3-D models are within 3.637 × 10⁻⁴, 3.241 × 10⁻⁴ and 7.556 × 10⁻⁴ for the oxide core, the silicide core with 250 g ²³⁵U/FE and the silicide core with 300 g ²³⁵U/FE, respectively.
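The interpolation functions themselves are not reproduced in the record; as a generic illustration of replacing repeated 3-D calculations with a fitted 1-D function, the sketch below passes a polynomial through a few hypothetical 3-D reference points of (control rod insertion depth, radial power factor). The reference values are invented for illustration.

```python
def lagrange_interp(points, x):
    """Evaluate the Lagrange polynomial through the given (depth, factor)
    reference points at insertion depth x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# hypothetical 3-D reference calculations: (insertion depth in cm, power factor)
ref = [(0.0, 1.00), (20.0, 1.08), (40.0, 1.15)]
factor_at_30cm = lagrange_interp(ref, 30.0)
```

Once fitted, such a function can be evaluated at any rod insertion depth during 2-D routine calculations instead of rerunning the 3-D diffusion solver.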

  6. Erythrocyte lysis in isotonic solution of ammonium chloride: theoretical modeling and experimental verification.

    Science.gov (United States)

    Chernyshev, Andrey V; Tarasov, Peter A; Semianov, Konstantin A; Nekrasov, Vyacheslav M; Hoekstra, Alfons G; Maltsev, Valeri P

    2008-03-07

    A mathematical model of erythrocyte lysis in an isotonic solution of ammonium chloride is presented in the framework of a statistical approach. The model is used to evaluate several parameters of mature erythrocytes (volume, surface area, hemoglobin concentration, number of anionic exchangers on the membrane, elasticity and critical tension of the membrane) through their sphering and lysis measured by a scanning flow cytometer (SFC). The SFC measures the light-scattering pattern (indicatrix) of an individual cell over the angular range from 10° to 60°. Comparison of the experimentally measured and theoretically calculated light-scattering patterns allows discrimination of spherical from non-spherical erythrocytes and evaluation of the volume and hemoglobin concentration of individual spherical cells. Three different processes were applied for erythrocyte sphering: (1) colloid osmotic lysis in an isotonic solution of ammonium chloride, (2) isovolumetric sphering in the presence of sodium dodecyl sulphate and albumin in neutrally buffered isotonic saline, and (3) an osmotic fragility test in hypotonic media. For the hemolysis in ammonium chloride, the evolution of the distributions of sphered erythrocytes over volume and hemoglobin content was monitored in real-time experiments. The analysis of the experimental data was performed in the context of a statistical approach, taking into account that the parameters of erythrocytes vary from cell to cell.

  7. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Valeriy Vyatkin

    2008-03-01

    This paper deals with further development of a graphical specification language resembling timing diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  8. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    This paper deals with further development of a graphical specification language resembling timing diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  9. Verification in referral-based crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Victor Naroditskiy

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information-gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications and difficult or costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge, where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification, and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  10. Fractional Market Model and its Verification on the Warsaw STOCK Exchange

    Science.gov (United States)

    Kozłowska, Marzena; Kasprzak, Andrzej; Kutner, Ryszard

    We analyzed the rising and relaxation of the cusp-like local peaks superposed with oscillations which were well defined by the Warsaw Stock Exchange index WIG in a daily time horizon. We found that the falling paths of all index peaks were described by a generalized exponential function, or the Mittag-Leffler (ML) function, superposed with various types of oscillations. However, the rising paths (except the first one of WIG, which rises exponentially, and the most important last one, which rises again according to the ML function) can be better described by bullish anti-bubbles or inverted bubbles [2-4]. The ML function superposed with oscillations is a solution of the nonhomogeneous fractional relaxation equation, which here defines our Fractional Market Model (FMM) of index dynamics, also called the Rheological Model of the Market. This solution is a generalized analog of an exactly solvable fractional version of the Standard or Zener Solid Model of viscoelastic materials commonly used in modern rheology [5]. For example, we found that the falling paths of the index can be considered a system in an intermediate state lying between two complex ones, defined by the short- and long-time limits of the Mittag-Leffler function; these limits are given by the Kohlrausch-Williams-Watts (KWW) law for initial times and the power law, or Nutting law, for asymptotic times. Some rising paths (i.e., the bullish anti-bubbles) are a kind of log-periodic oscillation of the market in the bullish state initiated by a crash. The peaks of the index can be viewed as precritical or precrash ones since: (i) the financial market changes its state too early, from the bullish to the bearish one, before it reaches a scaling region (defined by the diverging power law of return per unit time), and (ii) they are affected by a finite-size effect. These features could be a reminiscence of a significant risk aversion of the investors and their finite number, respectively. However, this means that the
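The Mittag-Leffler function at the heart of the FMM is easy to evaluate from its power series for moderate arguments; a minimal sketch (the tau and alpha values are illustrative only, not fitted WIG parameters):

```python
import math

def mittag_leffler(alpha, z, terms=120):
    """One-parameter Mittag-Leffler function E_alpha(z) via its power series
    E_alpha(z) = sum_{k>=0} z**k / Gamma(alpha*k + 1). The series converges
    for all z, but direct summation is only numerically safe for moderate |z|.
    E_1(z) reduces to exp(z), recovering ordinary exponential relaxation."""
    return sum(z ** k / math.gamma(alpha * k + 1) for k in range(terms))

def ml_relaxation(t, tau=1.0, alpha=0.7):
    """Fractional relaxation E_alpha(-(t/tau)**alpha): stretched-exponential
    (KWW-like) behavior at short times, power-law (Nutting-like) decay at
    long times, matching the two limits described in the abstract."""
    return mittag_leffler(alpha, -((t / tau) ** alpha))
```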

  11. Verification and comparison of four numerical schemes for a 1D viscoelastic blood flow model.

    Science.gov (United States)

    Wang, Xiaofei; Fullana, Jose-Maria; Lagrée, Pierre-Yves

    2015-01-01

    A reliable and fast numerical scheme is crucial for the 1D simulation of blood flow in compliant vessels. In this paper, a 1D blood flow model is incorporated with a Kelvin-Voigt viscoelastic arterial wall. This leads to a nonlinear hyperbolic-parabolic system, which is then solved with four numerical schemes, namely: MacCormack, Taylor-Galerkin, a monotonic upwind scheme for conservation laws, and local discontinuous Galerkin. The numerical schemes are tested on a single vessel, a simple bifurcation and a network with 55 arteries. The numerical solutions are checked favorably against analytical or semi-analytical solutions and clinical observations. Among the numerical schemes, comparisons are made in four important aspects: accuracy, ability to capture shock-like phenomena, computational speed and implementation complexity. The suitable conditions for the application of each scheme are discussed.
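Of the four schemes, MacCormack is the simplest to sketch; below is a minimal predictor-corrector step for linear scalar advection on a periodic domain, a stand-in for the hyperbolic part of the blood-flow system rather than the authors' full viscoelastic solver:

```python
def maccormack_advection(u, a, dt, dx, steps):
    """MacCormack scheme for u_t + a*u_x = 0 with periodic boundaries:
    forward-difference predictor, backward-difference corrector, averaged.
    Second-order accurate for smooth solutions; stable for |a|*dt/dx <= 1."""
    n = len(u)
    lam = a * dt / dx
    u = list(u)
    for _ in range(steps):
        # predictor: provisional state from a forward difference
        up = [u[i] - lam * (u[(i + 1) % n] - u[i]) for i in range(n)]
        # corrector: backward difference on the predicted state, then average
        u = [0.5 * (u[i] + up[i] - lam * (up[i] - up[i - 1])) for i in range(n)]
    return u
```

Advecting a smooth profile once around a periodic domain and comparing with the initial condition is the elementary kind of verification that the paper's analytical test cases generalize.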

  12. Verification and comparison of four numerical schemes for a 1D viscoelastic blood flow model

    CERN Document Server

    Wang, Xiaofei; Lagrée, Pierre-Yves

    2013-01-01

    In this paper, we present four numerical schemes for a 1D viscoelastic blood flow model. In the case of a small nonlinearity (small wave amplitude), asymptotic analysis predicts several behaviours of the wave: propagation in a uniform tube, attenuation of the amplitude due to skin friction, diffusion due to the viscosity of the wall, and reflection and transmission at a branching point. These predictions compare very favorably with all of the numerical solutions. The schemes are also tested in a case with a larger nonlinearity. Finally, we apply all of the schemes to a relatively realistic arterial system with 55 arteries. The schemes are compared in four aspects: the spatial and temporal convergence speed, the ability to capture shock phenomena, the computation speed and the complexity of the implementation. The suitable conditions for the application of the various schemes are discussed.

  13. Description and verification of a novel flow and transport model for silicate-gel emplacement

    Science.gov (United States)

    Walther, Marc; Solpuker, Utku; Böttcher, Norbert; Kolditz, Olaf; Liedl, Rudolf; Schwartz, Frank W.

    2014-02-01

    We present a novel approach for the numerical simulation of the gelation of silicate solutions under density-dependent flow conditions. The method utilizes an auxiliary, non-density-coupled solute that is subject to a linear decay function, providing the temporal information used to describe the viscosity change of the fluid. By comparing the modeling results to experimental data, we are able to simulate the behavior and the gelation process of the injected solute for three different compositions, including the long-term stability of the gelled area and the non-gelation of low concentrations due to hydrodynamic dispersion. This approach can also be used for other types of solutes with this gelling property and is useful in a variety of applications in geological, civil and environmental engineering.
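The coupling between the decaying auxiliary solute and viscosity is described only qualitatively above; a deliberately simplified sketch of the idea follows. The decay constant, threshold and viscosity values are all assumed for illustration, and this sketch ignores the dilution effects the authors handle (non-gelation of dispersed low concentrations requires extra logic).

```python
import math

def auxiliary_concentration(c0, k, t):
    """Auxiliary, non-density-coupled tracer decaying as c0 * exp(-k * t);
    its remaining value acts as a local clock since injection."""
    return c0 * math.exp(-k * t)

def viscosity(c, c_gel=0.5, mu_sol=1.0e-3, mu_gel=10.0):
    """Map the tracer value to fluid viscosity (Pa*s): once the auxiliary
    concentration has decayed below the threshold c_gel, the gel time has
    been reached and the fluid is treated as set."""
    return mu_gel if c <= c_gel else mu_sol
```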

  14. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  15. Modular verification of linked lists with views via separation logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2010-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for C#. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...... mathematical model of lists with views, and formulate succinct modular abstract specifications of the operations on the data structure. To show that the concrete implementation realizes the specification, we use fractional permissions in a novel way to capture the sharing of data between views...

  16. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...... model of lists with views, and formulate succinct modular abstract specifications of the operations on the data structure. To show that the concrete implementation realizes the specification, we use fractional permissions in a novel way to capture the sharing of data between views and their underlying...

  17. Verification of hydrological processes using the ACRU agro-hydrological modelling system for simulating potential climate change impacts in an alpine watershed in Alberta, Canada

    Science.gov (United States)

    Nemeth, M. W.; Kienzle, S. W.; Byrne, J. M.

    2009-12-01

    The upper North Saskatchewan River (UNSR) watershed is situated south-west of Edmonton, Alberta, with a watershed area of slightly over 20,000 km². This ongoing research models the UNSR watershed to help predict future streamflows in the UNSR under potential climate change and land use changes within the watershed, by setting up the ACRU agro-hydrological modelling system (Schulze et al., 2004). The watershed was divided into hydrological response units (HRUs) using soil, land cover, climate, and stream data that were collected and processed in a GIS. Each HRU was set up individually, so that a daily time series of at least 30 years is available to simulate all elements of the hydrological cycle for each HRU. Initial model runs were completed with simulated output for many hydrological variables, including snowpack development and snowmelt, actual evaporation, transpiration, soil moisture storage, storm flow and groundwater contributions. Simulated temperatures, streamflow, actual evaporation, snow cover, and snow water equivalent (SWE) were used to verify that the hydrological processes simulated by the model are consistent with those of the watershed. Observed temperatures from six high-elevation fire lookout stations were used to verify simulated temperatures. Four years of MODIS images were used to verify that spatial snow cover over the watershed was being adequately simulated. Snow water equivalent was verified using observed data from thirteen snow courses and two snow pillows in the watershed. Observed evaporation (A-pan) data from three meteorological stations just outside the study area were used to determine whether simulated evaporation values were within physically meaningful ranges for this region. Observed naturalized streamflow data from sixteen gauging stations around the watershed were used to help verify that streamflows from different areas of the watershed were being properly simulated. Verification analysis is

  18. Verification and Validation of the Spring Model Parachute Air Delivery System in Subsonic Flow

    Science.gov (United States)

    2015-02-27

    Modularization is emphasized in our code development. The parachute module is an independent application program. This new module consists of

  19. Empirical Verification of Fault Models for FPGAs Operating in the Subcritical Voltage Region

    DEFF Research Database (Denmark)

    Birklykke, Alex Aaen; Koch, Peter; Prasad, Ramjee

    2013-01-01

    We present a rigorous empirical study of the bit-level error behavior of field programmable gate arrays operating in the subcritical voltage region. This region is of significant interest as voltage scaling under normal circumstances is halted by the first occurrence of errors. However, accurate

  20. Tracer experimental techniques for CFD model verification and validation in sugar crystallizer

    NARCIS (Netherlands)

    Griffith, J.; Borroto, J.; Dominguez, J.; Derivet, M.; Cuesta, J.; Flores, P.; Fernandez Rivas, D.; Amor, A.; Franklin, B.

    2004-01-01

    In the framework of the CRP, improvement of the experimental design for RTD tests at a pilot crystallizer was performed. A new approach to RTD studies in non-Newtonian fluids for flow-pattern characterization at the pilot crystallizer was carried out. The batch mixing process was tested and the homogen