WorldWideScience

Sample records for model quantitatively simulates

  1. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    The simulation and optimization of an actual physical system are usually built on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification is proposed, based on a hierarchical model structure framework comprising the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method describes the complex system more comprehensively, and that introducing qualitative models into quantitative simulation yields a higher survival probability for the target.

  2. Qualitative vs. quantitative software process simulation modelling: conversion and comparison

    OpenAIRE

    Zhang, He; Kitchenham, Barbara; Jeffery, Ross

    2009-01-01

    Software Process Simulation Modeling (SPSM) research has increased in the past two decades. However, most of these models are quantitative, which requires detailed understanding and accurate measurement. As a continuation of our previous studies on qualitative modeling of software processes, this paper aims to investigate the structural equivalence and model conversion between quantitative and qualitative process modeling, and to compare the characteristics and performance o...

  3. Quantitative identification of technological discontinuities using simulation modeling

    CERN Document Server

    Park, Hyunseok

    2016-01-01

    The aim of this paper is to develop and test metrics to quantitatively identify technological discontinuities in a knowledge network. We developed five metrics based on innovation theories and tested them on a simulated knowledge network with a hypothetically designed discontinuity. The designed discontinuity is modeled as a node which combines two different knowledge streams and whose knowledge is dominantly persistent in the knowledge network. The performance of the proposed metrics was evaluated by how well each metric can distinguish the designed discontinuity from other nodes in the knowledge network. The simulation results show that the persistence multiplied by the number of converging main paths provides the best performance in identifying the designed discontinuity: the designed discontinuity was identified as one of the top 3 patents with 96-99% probability by Metric 5, which is, depending on the size of the domain, 12-34% better than the performance of the second-best metric. Beyond the simulation ...

  4. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrated qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by applying a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrated framework can feasibly learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.

  5. Quantitative Modeling of Entangled Polymer Rheology: Experiments, Tube Models and Slip-Link Simulations

    Science.gov (United States)

    Desai, Priyanka Subhash

    Rheological properties are sensitive indicators of molecular structure and dynamics. The relationship between rheology and polymer dynamics is captured in the constitutive model, which, if accurate and robust, would greatly aid molecular design and polymer processing. This dissertation is thus focused on building accurate and quantitative constitutive models that can help predict linear and non-linear viscoelasticity. In this work, we have used a multi-pronged approach based on tube theory, coarse-grained slip-link simulations, and advanced polymeric synthesis and characterization techniques to confront some of the outstanding problems in entangled polymer rheology. First, we modified simple tube-based constitutive equations in extensional rheology and developed functional forms to test the effect of Kuhn segment alignment on a) tube diameter enlargement and b) monomeric friction reduction between subchains. We then used these functional forms to model extensional viscosity data for polystyrene (PS) melts and solutions. We demonstrated that the idea of reduction in segmental friction due to Kuhn alignment is successful in explaining the qualitative difference between melts and solutions in extension, as revealed by recent experiments on PS. Second, we compiled literature data and used it to develop a universal tube model parameter set, prescribing parameter values and uncertainties for 1,4-PBd by comparing linear viscoelastic G' and G" mastercurves for 1,4-PBds of various branching architectures. The high frequency transition region of the mastercurves superposed very well for all the 1,4-PBds irrespective of their molecular weight and architecture, indicating universality in high frequency behavior. Therefore, all three parameters of the tube model were extracted from this high frequency transition region alone. Third, we compared predictions of two versions of the tube model, the Hierarchical model and the BoB model, against linear viscoelastic data of blends of 1,4-PBd

  6. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    Science.gov (United States)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we decide the effect of each arc using contribution degrees and sum all effects. Application to practical models confirms that, at the 5% significance level, there is no difference between results obtained from quantitative relations and results obtained by the proposed method.
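
    As a minimal illustration of the propagation step just described, the sketch below divides each factor's range by landmarks and combines arc effects weighted by contribution degrees inside a Monte Carlo loop. The landmark positions, contribution degrees, and per-factor noise are illustrative assumptions, not values from the paper.

      import random

      # Hypothetical landmark values dividing a normalized factor range.
      LANDMARKS = [-1.0, -0.3, 0.3, 1.0]

      def qualitative_effect(x):
          """Map a sampled effect onto the midpoint of its landmark interval."""
          for lo, hi in zip(LANDMARKS, LANDMARKS[1:]):
              if lo <= x <= hi:
                  return (lo + hi) / 2.0
          return LANDMARKS[0] if x < LANDMARKS[0] else LANDMARKS[-1]

      def propagate(source_effects, contributions, n_trials=10000):
          """Combine incoming arc effects weighted by contribution degrees."""
          total = 0.0
          for _ in range(n_trials):
              total += sum(c * qualitative_effect(random.gauss(mu, 0.2))
                           for mu, c in zip(source_effects, contributions))
          return total / n_trials

      # Two upstream factors with contribution degrees 0.7 and 0.3 (assumed).
      print(propagate([0.5, -0.4], [0.7, 0.3]))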

  7. Quantitative Genetics and Functional-Structural Plant Growth Models: Simulation of Quantitative Trait Loci Detection for Model Parameters and Application to Potential Yield Optimization

    CERN Document Server

    Letort, Véronique; Cournède, Paul-Henry; De Reffye, Philippe; Courtois, Brigitte (doi:10.1093/aob/mcm197)

    2010-01-01

    Background and Aims: Prediction of phenotypic traits from new genotypes under untested environmental conditions is crucial to build simulations of breeding strategies to improve target traits. Although the plant response to environmental stresses is characterized by both architectural and functional plasticity, recent attempts to integrate biological knowledge into genetics models have mainly concerned specific physiological processes or crop models without architecture, and thus may prove limited when studying genotype x environment interactions. Consequently, this paper presents a simulation study introducing genetics into a functional-structural growth model, which gives access to more fundamental traits for quantitative trait loci (QTL) detection and thus to promising tools for yield optimization. Methods: The GreenLab model was selected as a reasonable choice to link growth model parameters to QTL. Virtual genes and virtual chromosomes were defined to build a simple genetic model that drove the settings ...

  8. A quantitative exposure model simulating human norovirus transmission during preparation of deli sandwiches.

    Science.gov (United States)

    Stals, Ambroos; Jacxsens, Liesbeth; Baert, Leen; Van Coillie, Els; Uyttendaele, Mieke

    2015-03-02

    Human noroviruses (HuNoVs) are a major cause of foodborne gastroenteritis worldwide. They are often transmitted via infected and shedding food handlers manipulating foods such as deli sandwiches. The presented study aimed to simulate HuNoV transmission during the preparation of deli sandwiches in a sandwich bar. A quantitative exposure model was developed by combining the GoldSim® and @Risk® software packages. Input data were collected from scientific literature and from a two-week observational study performed at two sandwich bars. The model included three food handlers working during a three-hour shift on a shared working surface where deli sandwiches are prepared. The model consisted of three components. The first component simulated the preparation of the deli sandwiches and contained the HuNoV reservoirs: locations within the model allowing the accumulation of HuNoV and the working of intervention measures. The second component covered the contamination sources, being (1) the initially HuNoV-contaminated lettuce used on the sandwiches and (2) HuNoV originating from a shedding food handler. The third component included four possible intervention measures to reduce HuNoV transmission: hand and surface disinfection during preparation of the sandwiches, hand gloving, and hand washing after a restroom visit. A single HuNoV-shedding food handler could cause mean levels of 43±18, 81±37 and 18±7 HuNoV particles on the deli sandwiches, hands and working surfaces, respectively. Introduction of contaminated lettuce as the only source of HuNoV resulted in the presence of 6.4±0.8 and 4.3±0.4 HuNoV particles on the food and hand reservoirs. The inclusion of hand and surface disinfection or hand gloving as a single intervention measure was not effective in the model, as only marginal reductions of HuNoV levels were noticeable in the different reservoirs. High compliance with hand washing after a restroom visit did reduce HuNoV presence substantially on all reservoirs.
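
    The reservoir-and-transfer logic of such a model can be illustrated with a toy Monte Carlo sketch. The shedding load, number of sandwiches, and per-contact transfer probabilities below are placeholders, not the fitted inputs of the published model.

      import numpy as np

      rng = np.random.default_rng(1)

      def simulate_shift(n_sandwiches=90, hand_load=1e4,
                         p_hand_food=0.005, p_hand_surface=0.002):
          """One shift: virus moves from contaminated hands to food and surface."""
          hands, surface = hand_load, 0.0
          food = np.zeros(n_sandwiches)
          for i in range(n_sandwiches):
              to_food = rng.binomial(int(hands), p_hand_food)        # hand -> sandwich
              to_surface = rng.binomial(int(hands), p_hand_surface)  # hand -> surface
              hands -= to_food + to_surface
              surface += to_surface
              food[i] = to_food
          return food.mean(), hands, surface

      mean_per_sandwich, hands_left, surface_load = simulate_shift()
      print(f"mean HuNoV per sandwich: {mean_per_sandwich:.1f}")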

  9. Can a quantitative simulation of an Otto engine be accurately rendered by a simple Novikov model with heat leak?

    Science.gov (United States)

    Fischer, A.; Hoffmann, K.-H.

    2004-03-01

    In this case study a complex Otto engine simulation provides data including, but not limited to, effects from losses due to heat conduction, exhaust losses and frictional losses. This data is used as a benchmark to test whether the Novikov engine with heat leak, a simple endoreversible model, can reproduce the complex engine behavior quantitatively by an appropriate choice of model parameters. The reproduction obtained proves to be of high quality.
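
    For readers unfamiliar with the Novikov model, a short numerical sketch follows: heat flows from the hot reservoir through a finite conductance into the working fluid, a reversible engine converts it, and a parasitic heat leak bypasses the engine. The temperatures and conductances are arbitrary illustrative choices, not the parameters fitted to the Otto-engine benchmark.

      import numpy as np

      T_h, T_c = 2000.0, 300.0     # reservoir temperatures [K] (assumed)
      K, K_leak = 1.0, 0.05        # conductances [kW/K] (assumed)

      T_i = np.linspace(T_c + 1.0, T_h - 1.0, 500)  # working-fluid hot-side temperature
      q_in = K * (T_h - T_i)                        # heat through the finite conductance
      power = q_in * (1.0 - T_c / T_i)              # reversible conversion below T_i
      q_total = q_in + K_leak * (T_h - T_c)         # add the parasitic heat leak
      eta = power / q_total                         # efficiency including the leak

      best = np.argmax(power)
      print(f"max power {power[best]:.1f} kW at efficiency {eta[best]:.3f}")
      print(f"Curzon-Ahlborn efficiency (no leak): {1 - np.sqrt(T_c / T_h):.3f}")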

  10. Comparing Simulation Output Accuracy of Discrete Event and Agent Based Models: A Quantitative Approach

    CERN Document Server

    Majid, Mazlina Abdul; Siebers, Peer-Olaf

    2010-01-01

    In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human centric complex systems. In this paper we focus on human reactive behaviour as it is possible in both modelling approaches to implement human reactive behaviour in the model by using standard methods. As a case study we have chosen the retail sector, and here in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation through modelling the reactive behaviour of staff and customers of the department. First, we have carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step a multi-scenario experimen...

  11. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy; Fisichelli, Nicholas A.; Miller, Brian; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  12. Quantitative Simulation of Granular Collapse Experiments with Visco-Plastic Models

    Science.gov (United States)

    Mangeney, A.; Ionescu, I. R.; Bouchut, F.; Roche, O.

    2014-12-01

    One of the key issues in landslide modeling is to define the appropriate rheological behavior of these natural granular flows. In particular, the description of the static and flowing states of granular media is still an open issue, and it plays a crucial role in erosion/deposition processes. A first step to address this issue is to derive models able to reproduce laboratory experiments of granular flows. We propose here a mechanical and numerical model of dry granular flows that quantitatively reproduces granular column collapse over inclined planes, with rheological parameters directly derived from the laboratory experiments. We reformulate the so-called μ(I) rheology proposed by Jop et al. (2006), where I is the so-called inertial number, in the framework of Drucker-Prager plasticity with yield stress and a viscosity η(||D||, p) depending on both the pressure p and the norm of the strain rate tensor ||D||. The resulting dynamic viscosity varies from very small values near the free surface and near the front to 1.5 Pa.s within the quasi-static zone. We show that taking into account a constant mean viscosity during the flow (η = 1 Pa.s here) provides results very similar to those obtained with the variable viscosity deduced from the μ(I) rheology, while significantly reducing the computational cost. This has important implications for application to real landslides and rock avalanches. The numerical results show that the flow is essentially located in a surface layer behind the front, while the whole granular material is flowing near the front where basal sliding occurs. The static/flowing interface changes as a function of space and time, in good agreement with experimental observations. Heterogeneities are observed within the flow, with low and high pressure zones, localized small upward velocity zones, and vortices near the transition between the flowing and static grains. These instabilities create 'sucking zones' and have some characteristics similar
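
    The μ(I) law and the resulting pressure- and rate-dependent viscosity are compact enough to sketch. The parameter values below are typical of glass-bead experiments and are assumptions, not the authors' calibrated values.

      import numpy as np

      mu_s, mu_2, I0 = 0.38, 0.64, 0.28   # friction limits and reference inertial number
      d, rho_s = 0.7e-3, 2500.0           # grain diameter [m], grain density [kg/m^3]

      def viscosity(shear_rate, p):
          """Effective viscosity eta(||D||, p) = mu(I) * p / ||D||."""
          I = shear_rate * d / np.sqrt(p / rho_s)        # inertial number
          mu = mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)     # mu(I) friction law
          return mu * p / shear_rate

      # Quasi-static zone (slow shear, moderate pressure) vs. fast surface flow.
      print(viscosity(shear_rate=1.0, p=500.0))    # large viscosity
      print(viscosity(shear_rate=100.0, p=100.0))  # small viscosity near the surface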

  13. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Molecular Modeling on Berberine Derivatives toward BuChE: An Integrated Study with Quantitative Structure-Activity Relationships Models, Molecular Docking, and Molecular Dynamics Simulations.

    Science.gov (United States)

    Fang, Jiansong; Pang, Xiaocong; Wu, Ping; Yan, Rong; Gao, Li; Li, Chao; Lian, Wenwen; Wang, Qi; Liu, Ai-lin; Du, Guan-hua

    2016-05-01

    A dataset of 67 berberine derivatives for the inhibition of butyrylcholinesterase (BuChE) was studied based on the combination of quantitative structure-activity relationship models, molecular docking, and molecular dynamics methods. First, a series of berberine derivatives were reported, and their inhibitory activities toward BuChE were evaluated. In 2D quantitative structure-activity relationship studies, the best model, built by partial least-squares, had a conventional correlation coefficient for the training set (R²) of 0.883, a cross-validation correlation coefficient (Q²cv) of 0.777, and a conventional correlation coefficient for the test set (R²pred) of 0.775. The model was also confirmed by Y-randomization examination. In addition, molecular docking and molecular dynamics simulation were performed to better elucidate the inhibitory mechanism of three typical berberine derivatives (berberine, C2, and C55) toward BuChE. The predicted binding free energy results were consistent with the experimental data and showed that the van der Waals energy term (ΔEvdw) difference played the most important role in differentiating the activity among the three inhibitors (berberine, C2, and C55). The developed quantitative structure-activity relationship models provide details on the fine relationship linking structure and activity and offer clues for structural modifications, and the molecular simulation helps to understand the inhibitory mechanism of the three typical inhibitors. In conclusion, the results of this study provide useful clues for new drug design and discovery of BuChE inhibitors from berberine derivatives.
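
    A hedged sketch of the 2D-QSAR workflow (partial least-squares regression with a cross-validated Q² and an external-test R²) is given below; random placeholder descriptors stand in for the 67 berberine derivatives, whose actual data are not reproduced here.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict, train_test_split
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(67, 20))     # placeholder 2D descriptors
      y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.3, size=67)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      pls = PLSRegression(n_components=3)
      pls.fit(X_tr, y_tr)

      r2_train = r2_score(y_tr, pls.predict(X_tr))                      # R^2
      q2_cv = r2_score(y_tr, cross_val_predict(pls, X_tr, y_tr, cv=5))  # Q^2(cv)
      r2_pred = r2_score(y_te, pls.predict(X_te))                       # R^2(pred)
      print(f"R2={r2_train:.3f}  Q2cv={q2_cv:.3f}  R2pred={r2_pred:.3f}")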

  15. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  16. A quantitative comparison between the flow factor approach model and the molecular dynamics simulation results for the flow of a confined molecularly thin fluid film

    Science.gov (United States)

    Zhang, Yongbin

    2015-06-01

    Quantitative comparisons were made between the flow factor approach model and molecular dynamics simulation (MDS) results, both of which describe the flow of a molecularly thin fluid film confined between two solid walls. Although these two approaches calculate the flow of a confined molecularly thin fluid film in different ways, very good agreement was found between them when the Couette and Poiseuille flows calculated from each were compared. This strongly indicates the validity of the flow factor approach model in modeling the flow of a confined molecularly thin fluid film.

  17. Surgical Simulations Based on Limited Quantitative Data: Understanding How Musculoskeletal Models Can Be Used to Predict Moment Arms and Guide Experimental Design.

    Science.gov (United States)

    Nichols, Jennifer A; Bednar, Michael S; Murray, Wendy M

    2016-01-01

    The utility of biomechanical models and simulations to examine clinical problems is currently limited by the need for extensive amounts of experimental data describing how a given procedure or disease affects the musculoskeletal system. Methods capable of predicting how individual biomechanical parameters are altered by surgery are necessary for the efficient development of surgical simulations. In this study, we evaluate to what extent models based on limited amounts of quantitative data can be used to predict how surgery influences muscle moment arms, a critical parameter that defines how muscle force is transformed into joint torque. We specifically examine proximal row carpectomy and scaphoid-excision four-corner fusion, two common surgeries to treat wrist osteoarthritis. Using models of these surgeries, which are based on limited data and many assumptions, we perform simulations to formulate a hypothesis regarding how these wrist surgeries influence muscle moment arms. Importantly, the hypothesis is based on analysis of only the primary wrist muscles. We then test the simulation-based hypothesis using a cadaveric experiment that measures moment arms of both the primary wrist and extrinsic thumb muscles. The measured moment arms of the primary wrist muscles are used to verify the hypothesis, while those of the extrinsic thumb muscles are used as cross-validation to test whether the hypothesis is generalizable. The moment arms estimated by the models and measured in the cadaveric experiment both indicate that a critical difference between the surgeries is how they alter radial-ulnar deviation versus flexion-extension moment arms at the wrist. Thus, our results demonstrate that models based on limited quantitative data can provide novel insights. This work also highlights that synergistically utilizing simulation and experimental methods can aid the design of experiments and make it possible to test the predictive limits of current computer simulation techniques.

  18. NEXRAD quantitative precipitation estimates, data acquisition, and processing for the DuPage County, Illinois, streamflow-simulation modeling system

    Science.gov (United States)

    Ortel, Terry W.; Spies, Ryan R.

    2015-11-19

    Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).

  19. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  20. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    Science.gov (United States)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (Flow = 0.5, 1, 2, 3 ml/g/min, cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions with a range of scenarios including 1, 2, 3 sec sampling for 30 sec with 25, 70, 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated and multiple MBF estimation methods were applied to each of these. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 and ~1.2 ml/g/min, respectively). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on the MBF estimate fidelity. On average, half dose acquisitions increased the RMSE of estimates by only 18% suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
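
    To make the quantitative kinetic-modeling branch concrete, the sketch below fits a one-compartment uptake model to a synthetic time-attenuation curve. The arterial input function, sampling, noise level, and rate constants are assumptions; the paper's simulator and estimators are far more detailed.

      import numpy as np
      from scipy.optimize import curve_fit

      t = np.arange(0, 30, 2.0)                      # 2 s sampling for 30 s
      aif = 300.0 * (t / 8.0) * np.exp(1 - t / 8.0)  # gamma-variate arterial input

      def tissue_curve(t, k1, k2):
          """C_t(t) = k1 * convolution of the AIF with exp(-k2 t)."""
          dt = t[1] - t[0]
          kernel = np.exp(-k2 * t)
          return k1 * np.convolve(aif, kernel)[: len(t)] * dt

      true_k1 = 0.02                                 # flow-like uptake rate (illustrative)
      noise = np.random.default_rng(2).normal(0, 2, t.size)
      measured = tissue_curve(t, true_k1, 0.01) + noise

      (k1_hat, k2_hat), _ = curve_fit(tissue_curve, t, measured, p0=[0.01, 0.02])
      print(f"estimated k1 = {k1_hat:.4f} (true {true_k1})")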

  1. Modeling quantitative phase image formation under tilted illuminations.

    Science.gov (United States)

    Bon, Pierre; Wattellier, Benoit; Monneret, Serge

    2012-05-15

    A generalized product-of-convolution model for simulation of quantitative phase microscopy of thick heterogeneous specimen under tilted plane-wave illumination is presented. Actual simulations are checked against a much more time-consuming commercial finite-difference time-domain method. Then modeled data are compared with experimental measurements that were made with a quadriwave lateral shearing interferometer.
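
    The product-of-convolution idea is essentially a multislice scheme: a per-slice transmission alternates with a Fresnel propagation between slices. The sketch below illustrates this under a tilted plane wave; the geometry, pixel size, and refractive-index contrast are illustrative assumptions, not the paper's configuration.

      import numpy as np

      n_pix, dx, wavelength = 256, 0.1e-6, 0.55e-6
      dz, n_slices, dn = 0.5e-6, 20, 1e-3           # slice thickness, count, index contrast

      fx = np.fft.fftfreq(n_pix, dx)
      FX, FY = np.meshgrid(fx, fx)
      prop = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))  # Fresnel kernel

      x = (np.arange(n_pix) - n_pix / 2) * dx
      X, Y = np.meshgrid(x, x)
      bead = (X**2 + Y**2 < (4e-6) ** 2).astype(float)  # simple thick test object

      tilt = np.exp(2j * np.pi * np.sin(np.deg2rad(10)) * X / wavelength)
      field = tilt.astype(complex)                      # tilted plane-wave illumination
      for _ in range(n_slices):
          field *= np.exp(2j * np.pi * dn * dz * bead / wavelength)  # slice transmission
          field = np.fft.ifft2(np.fft.fft2(field) * prop)            # propagate by dz

      phase = np.angle(field * np.conj(tilt))           # phase relative to illumination
      print(phase[n_pix // 2, n_pix // 2])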

  2. Quantitative application of Monte Carlo simulation in Fire-PSA

    Energy Technology Data Exchange (ETDEWEB)

    Mangs, J.; Hostikka, S.; Korhonen, T. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland); Keski-Rahkonen, O.

    2007-05-15

    In a power plant a fire cell forms the basic subunit. Since the fire is initially located there, the full-scale time-dependent fire simulation and estimation of target response must be performed within the fire cell. Conditional, time-dependent damage probabilities in a fire cell can now be calculated for arbitrary targets (a component or a subsystem) by combining probabilistic (Monte Carlo) and deterministic simulation. For the latter, a spectrum from simple correlations up to the latest computational fluid dynamics models is available. Selection of the code is made according to the requirements from the target cell. Although the calculations are numerically heavy, it is now economically possible and feasible to carry out quantitative fire-PSA for a complete plant iteratively with the main PSA. Examples from real applications are shown on the assessment of fire spread possibility in a relay room, and the potential of fire spread on cables in a tunnel. (orig.)

  3. Quantitative comparison of semi- and fully-distributed hydrologic models in simulating flood hydrographs on a mountain watershed in southwest China

    Institute of Scientific and Technical Information of China (English)

    张会兰; 王玉杰; 王云琦; 李丹勋; 王兴奎

    2013-01-01

    To investigate the performance of fully- and semi-distributed hydrologic models in simulating the transformation from rainfall to runoff in mountain areas, the fully-distributed model Basin Pollution Calculation Center (BPCC) and the semi-distributed HEC-HMS are calibrated for the Zhenjiangguan watershed, located in the upper reaches of the Minjiang River in southwest China, using streamflow observations at the basin outlet. A semi-automatic optimization method is applied to both models to improve simulated results by removing artificial errors. Based on the consistency of the simulated hydrographs with the observed ones, statistical coefficients such as the relative error, the probability distribution and the correlation coefficient are further introduced to quantitatively evaluate the performance of the two models. The analyses indicate that the hydrographs simulated by BPCC are closer to the observed ones than those simulated by HEC-HMS, in view of the spatial heterogeneity in terrain, soil texture, land cover and meteorological conditions in mountain areas.
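
    The comparison statistics named above are simple to compute; the sketch below evaluates a relative peak error, a correlation coefficient, and (as an added, commonly used companion measure) the Nash-Sutcliffe efficiency on made-up flow series standing in for the observed and simulated hydrographs.

      import numpy as np

      observed = np.array([12., 18., 55., 140., 210., 160., 90., 50., 30., 20.])
      simulated = np.array([10., 20., 60., 150., 190., 150., 95., 55., 28., 18.])

      rel_peak_error = (simulated.max() - observed.max()) / observed.max()
      corr = np.corrcoef(observed, simulated)[0, 1]
      nse = 1 - np.sum((observed - simulated) ** 2) \
              / np.sum((observed - observed.mean()) ** 2)

      print(f"relative peak error {rel_peak_error:+.2%}, r = {corr:.3f}, NSE = {nse:.3f}")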

  4. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid-leached data; (2) the sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose-response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades the resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike's Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios 80 µm. Regar

  5. CAUSA - An Environment For Modeling And Simulation

    Science.gov (United States)

    Dilger, Werner; Moeller, Juergen

    1989-03-01

    CAUSA is an environment for modeling and simulation of dynamic systems on a quantitative level. The environment provides a conceptual framework including primitives like objects, processes and causal dependencies which allow the modeling of a broad class of complex systems. The facility of simulation allows the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.

  6. From provocative narrative scenarios to quantitative biophysical model results: Simulating plausible futures to 2070 in an urbanizing agricultural watershed in Wisconsin, USA

    Science.gov (United States)

    Booth, E.; Chen, X.; Motew, M.; Qiu, J.; Zipper, S. C.; Carpenter, S. R.; Kucharik, C. J.; Steven, L. I.

    2015-12-01

    Scenario analysis is a powerful tool for envisioning future social-ecological change and its consequences on human well-being. Scenarios that integrate qualitative storylines and quantitative biophysical models can create a vivid picture of these potential futures but the integration process is not straightforward. We present - using the Yahara Watershed in southern Wisconsin (USA) as a case study - a method for developing quantitative inputs (climate, land use/cover, and land management) to drive a biophysical modeling suite based on four provocative and contrasting narrative scenarios that describe plausible futures of the watershed to 2070. The modeling suite consists of an agroecosystem model (AgroIBIS-VSF), hydrologic routing model (THMB), and empirical lake water quality model and estimates several biophysical indicators to evaluate the watershed system under each scenario. These indicators include water supply, lake flooding, agricultural production, and lake water quality. Climate (daily precipitation and air temperature) for each scenario was determined using statistics from 210 different downscaled future climate projections for two 20-year time periods (2046-2065 and 2081-2100) and modified using a stochastic weather generator to allow flexibility for matching specific climate events within the scenario narratives. Land use/cover for each scenario was determined first by quantifying changes in areal extent every decade for 15 categories at the watershed scale to be consistent with the storyline events and theme. Next, these changes were spatially distributed using a rule-based framework based on land suitability metrics that determine transition probabilities. Finally, agricultural inputs including manure and fertilizer application rates were determined for each scenario based on the prevalence of livestock, water quality regulations, and technological innovations. Each scenario is compared using model inputs (maps and time-series of land use/cover and

  7. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  8. The mathematics of cancer: integrating quantitative models.

    Science.gov (United States)

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  9. Simulating Quantitative Cellular Responses Using Asynchronous Threshold Boolean Network Ensembles

    Directory of Open Access Journals (Sweden)

    Shah Imran

    2011-07-01

    results suggest that this approach is both quantitative, allowing statistical verification and calibration, and extensible, allowing modification and revision as guided by experimental evidence. The simulation methodology is part of the US EPA Virtual Liver, which is investigating the effects of everyday contaminants on living tissues. Future models will incorporate additional crosstalk surrounding proliferation as well as the putative effects of xenobiotics on these signaling cascades within hepatocytes.

  10. Quantitative Simulation Method Fusing Qualitative Information Based on Cloud Model

    Institute of Scientific and Technical Information of China (English)

    王洪利

    2013-01-01

    Because uncertain information lacks an effective descriptive representation, it is often underused in quantitative simulation. To address this, a representation of uncertain information and a qualitative modelling method based on the cloud model are proposed for digital simulation. First, a cloud-model-based description of the variable space of uncertain information is given. Then, a quantitative simulation method fusing qualitative information is developed on the basis of the cloud model; the principle and process of the improved quantitative simulation are presented, including solutions to key problems such as the fusion calculation of qualitative and quantitative information and the persistent and cyclic variation of qualitative information. Finally, the modelling method is applied to a complex supply chain management system simulation case to verify its feasibility. The results show that the proposed method objectively expresses uncertain information and can effectively fuse qualitative information into quantitative system simulation.
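
    The normal cloud generator at the heart of cloud-model representations is compact enough to sketch: a concept is described by an expectation Ex, entropy En, and hyper-entropy He, and each "drop" is a sample with a membership degree. The parameter values below are illustrative, not taken from the supply-chain case.

      import numpy as np

      rng = np.random.default_rng(3)

      def cloud_drops(ex, en, he, n=1000):
          """Generate n cloud drops (value, membership) for concept (Ex, En, He)."""
          en_prime = rng.normal(en, he, n)                   # perturbed entropy
          x = rng.normal(ex, np.abs(en_prime))               # drop positions
          mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))  # membership degrees
          return x, mu

      # Qualitative concept such as "demand is high" mapped onto a numeric scale.
      x, mu = cloud_drops(ex=80.0, en=5.0, he=0.8)
      print(x[:3], mu[:3])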

  11. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage ti

  12. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.

  13. Theory Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack [Los Alamos National Laboratory

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  14. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    on the existence of a quotient construction, allowing a property φ of a parallel system A‖B to be transformed into a sufficient and necessary quotient-property φ/A to be satisfied by the component B. Given a model checking problem involving a network and a property φ, the method gradually move (by...

  15. Simulation modeling of carcinogenesis.

    Science.gov (United States)

    Ellwein, L B; Cohen, S M

    1992-03-01

    A discrete-time simulation model of carcinogenesis is described mathematically using recursive relationships between time-varying model variables. The dynamics of cellular behavior is represented within a biological framework that encompasses two irreversible and heritable genetic changes. Empirical data and biological supposition dealing with both control and experimental animal groups are used together to establish values for model input variables. The estimation of these variables is integral to the simulation process as described in step-by-step detail. Hepatocarcinogenesis in male F344 rats provides the basis for seven modeling scenarios which illustrate the complexity of relationships among cell proliferation, genotoxicity, and tumor risk.
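
    A discrete-time recursion in the spirit of the described model can be written in a few lines: normal cells occasionally acquire a first heritable change, and initiated cells expand clonally while occasionally acquiring a second change. All rates below are illustrative assumptions, not the values estimated for F344 rat hepatocarcinogenesis.

      normal, initiated, transformed = 1e8, 0.0, 0.0
      mu1, mu2 = 1e-7, 1e-6        # per-division genetic-change probabilities (assumed)
      div_n, div_i = 0.01, 0.02    # per-step dividing fractions (assumed)

      for week in range(520):      # ten years of weekly time steps
          new_initiated = normal * div_n * mu1
          new_transformed = initiated * div_i * mu2
          # The normal pool is assumed to stay at steady state; initiated clones expand.
          initiated += initiated * div_i * (1 - mu2) + new_initiated - new_transformed
          transformed += new_transformed

      print(f"initiated: {initiated:.3g}, transformed: {transformed:.3g}")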

  16. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
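
    The core of the pseudo-predator construction (bootstrap-sampling prey signatures and mixing them according to a known diet) can be sketched as follows. The prey library, diet vector, and bootstrap sizes are placeholders, and the paper's algorithm for objectively choosing bootstrap sample sizes is not reproduced.

      import numpy as np

      rng = np.random.default_rng(4)
      n_fa = 10                                       # number of fatty acids

      def prey_sample(mean_sig, n_boot):
          """Bootstrap a mean signature from a synthetic prey library."""
          library = rng.dirichlet(mean_sig * 50, size=200)  # individual signatures
          idx = rng.integers(0, 200, size=n_boot)           # bootstrap draw
          return library[idx].mean(axis=0)

      prey_means = [rng.dirichlet(np.ones(n_fa)) for _ in range(3)]  # 3 prey types
      diet = np.array([0.5, 0.3, 0.2])                               # known mixture

      pseudo_predator = sum(w * prey_sample(m, n_boot=30)
                            for w, m in zip(diet, prey_means))
      print(pseudo_predator.round(3), pseudo_predator.sum())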

  17. Concluding Report: Quantitative Tomography Simulations and Reconstruction Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M B; Martz, H E; Slone, D M; Jackson, J A; Schach von Wittenau, A E; Goodman, D M; Logan, C M; Hall, J M

    2002-02-01

    In this report we describe the original goals and final achievements of this Laboratory Directed Research and Development project. The Quantitative Tomography Simulations and Reconstruction Algorithms project (99-ERD-015) was funded as a multi-directorate, three-year effort to advance the state of the art in radiographic simulation and tomographic reconstruction by improving simulation and including this simulation in the tomographic reconstruction process. Goals were to improve the accuracy of radiographic simulation, and to couple advanced radiographic simulation tools with a robust, many-variable optimization algorithm. In this project, we were able to demonstrate accuracy in X-Ray simulation at the 2% level, which is an improvement of roughly a factor of 5 in accuracy, and we have successfully coupled our simulation tools with the CCG (Constrained Conjugate Gradient) optimization algorithm, allowing reconstructions that include spectral effects and blurring in the reconstructions. Another result of the project was the assembly of a low-scatter X-Ray imaging facility for use in nondestructive evaluation applications. We conclude with a discussion of future work.

  18. Modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Casetti, E.; Vogt, W.G.; Mickle, M.H.

    1984-01-01

    This conference includes papers on the uses of supercomputers, multiprocessors, artificial intelligence and expert systems in various energy applications. Topics considered include knowledge-based expert systems for power engineering, a solar air conditioning laboratory computer system, multivariable control systems, the impact of power system disturbances on computer systems, simulating shared-memory parallel computers, real-time image processing with multiprocessors, and network modeling and simulation of greenhouse solar systems.

  19. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but the views have not succeeded in capturing the diversity of validation methods; the notion of validation has been somewhat narrow-minded, reduced to the establishment of truth. The wide variety of models with regards to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation.

  20. Toward quantitative modeling of silicon phononic thermocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Lacatena, V. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France); IEMN UMR CNRS 8520, Institut d'Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d'Ascq (France); Haras, M.; Robillard, J.-F., E-mail: jean-francois.robillard@isen.iemn.univ-lille1.fr; Dubois, E. [IEMN UMR CNRS 8520, Institut d'Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d'Ascq (France); Monfray, S.; Skotnicki, T. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France)

    2015-03-16

    The wealth of technological patterning technologies of deca-nanometer resolution brings opportunities to artificially modulate thermal transport properties. A promising example is given by the recent concepts of 'thermocrystals' or 'nanophononic crystals' that introduce regular nano-scale inclusions using a pitch scale in between the thermal phonons mean free path and the electron mean free path. In such structures, the lattice thermal conductivity is reduced down to two orders of magnitude with respect to its bulk value. Beyond the promise held by these materials to overcome the well-known “electron crystal-phonon glass” dilemma faced in thermoelectrics, the quantitative prediction of their thermal conductivity poses a challenge. This work paves the way toward understanding and designing silicon nanophononic membranes by means of molecular dynamics simulation. Several systems are studied in order to distinguish the shape contribution from bulk, ultra-thin membranes (8 to 15 nm), 2D phononic crystals, and finally 2D phononic membranes. After having discussed the equilibrium properties of these structures from 300 K to 400 K, the Green-Kubo methodology is used to quantify the thermal conductivity. The results account for several experimental trends and models. It is confirmed that the thin-film geometry as well as the phononic structure act towards a reduction of the thermal conductivity. The further decrease in the phononic engineered membrane clearly demonstrates that both phenomena are cumulative. Finally, limitations of the model and further perspectives are discussed.
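
    For reference, the Green-Kubo methodology mentioned here extracts the lattice thermal conductivity from the equilibrium heat-flux autocorrelation; in one common molecular dynamics convention (conventions differ on where the cell volume V enters, so this is a sketch rather than the authors' exact working formula):

      \kappa = \frac{V}{3 k_B T^2} \int_0^{\infty} \langle \mathbf{J}(0) \cdot \mathbf{J}(t) \rangle \, dt

    where J is the heat-flux density of the simulation cell, T the temperature, and the factor 1/3 averages over the three Cartesian directions.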

  1. Quantitative comparison between simulated and experimental FCC rolling textures

    DEFF Research Database (Denmark)

    Wronski, M.; Wierzbanowski, K.; Leffers, Torben

    2015-01-01

    The degree of similarity between simulated and experimental fcc rolling textures is characterized by a single scalar parameter. The textures are simulated with a relatively simple and efficient 1-point model which allows us to vary the strength of the interaction between the grains and the surroundings...

  2. Note on quantitatively correct simulations of the kinetic beam-plasma instability

    CERN Document Server

    Lotov, K V; Mesyats, E A; Snytnikov, A V; Vshivkov, V A

    2014-01-01

    A large number of model particles is shown to be necessary for quantitatively correct simulations of the kinetic beam-plasma instability with the clouds-in-cells method. The required number of particles scales inversely with the expected growth rate, as in the kinetic regime only a narrow interval of beam velocities is resonant with the wave.

  3. Note on quantitatively correct simulations of the kinetic beam-plasma instability

    Energy Technology Data Exchange (ETDEWEB)

    Lotov, K. V.; Timofeev, I. V. [Budker Institute of Nuclear Physics SB RAS, 630090 Novosibirsk (Russian Federation); Novosibirsk State University, 630090 Novosibirsk (Russian Federation); Mesyats, E. A.; Snytnikov, A. V.; Vshivkov, V. A. [Institute of Computational Mathematics and Mathematical Geophysics SB RAS, 630090 Novosibirsk (Russian Federation)

    2015-02-15

    A large number of model particles are shown to be necessary for quantitatively correct simulations of the kinetic beam-plasma instability with the clouds-in-cells method. The required number of particles scales inversely with the expected growth rate, as only a narrow interval of beam velocities is resonant with the wave in the kinetic regime.

  4. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  5. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    Science.gov (United States)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, the experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the coordinates of the intersection points of the ronchigram fringes with the x-axis. In this case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in the simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because these rulings better reproduce experimental ronchigram fringe profiles. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  6. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  7. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand the driving forces of social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  8. SIBSIM - quantitative phenotype simulation in extended pedigrees

    Directory of Open Access Journals (Sweden)

    Franke, Daniel

    2006-02-01

    Full Text Available A tool (SIBSIM) is described for quantitative phenotype simulation in extended pedigrees. Download and installation information are given, and the advantages and limitations of the tool are described. The input format is based on XML, and the different sections of an input file are explained. A short explanation of the algorithm is given. Links to the download site, the user manual, and related literature as well as a detailed example are included. Availability: The software is available at: http://www.imbs.uni-luebeck.de/pub/sibsim

  9. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model for the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  10. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...

  11. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not aid in linking the Basic Events to the data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
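
    A minimal sketch of such a spreadsheet-style database in code form; the field names and values are invented, and the unique `key` field plays the role of the metadata link between the model and the database:

    ```python
    import csv

    # One row per Basic Event, linked to its data source and manipulations.
    rows = [
        {"key": "BE-0001", "basic_event": "Valve fails to open",
         "source": "Handbook entry p.123", "base_rate": 1.2e-6,
         "stress_factor": 2.0, "dormancy_factor": 0.5},
    ]
    with open("basic_events.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

    # The model looks up each Basic Event's failure estimate by its key,
    # so every number in the model stays traceable to a source row.
    db = {r["key"]: r for r in rows}
    be = db["BE-0001"]
    print(be["base_rate"] * be["stress_factor"] * be["dormancy_factor"])
    ```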

  12. Quantitative isothermal phase-field simulations of peritectic phase transformation in FeMn system

    Directory of Open Access Journals (Sweden)

    Celso Luiz Moraes Alves

    2016-01-01

    Full Text Available The present investigation shows quantitative results for the peritectic phase transformation of FeMn alloys utilizing phase-field simulations in 1-D and 2-D. The phase-field method used was based on an adaptation of the proposal of Folch and Plapp [Phys. Rev. E, 2005, 72, 011602] for the eutectic reaction. The two stages of peritectic phase transformation, the peritectic reaction and the peritectic transformation, were investigated numerically utilizing this phase-field approach. The evolution of the phases was quantitatively analyzed during the peritectic transformation, and the fractions of the phases at the end of the solidification were compared with the thermodynamic equilibrium, defined by the phase diagram, for the case of the 1-D simulation with peritectic concentration. An assessment of the behavior of the concentration gradient in the γ-phase (the peritectic phase) through time was also carried out, and a mathematical function which describes the γ-phase thickness evolution was defined. Finally, 2-D simulations were performed to clearly identify the two stages of the peritectic phase transformation. The obtained results show two main facts: (1) the numerical model is able to simulate this phase transformation quantitatively; and (2) this numerical tool can be utilized for investigating quantitatively some aspects (normally determined indirectly) that are difficult to determine by direct measurements in experimental works.

  13. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.;

    , have the potential to include also mutual wake interaction phenomena. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large scale lateral- and vertical turbulence components. Based on this conjecture...... and trailed vorticity, has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach opens for a unifying description in the sense that turbine power- and load aspects can be treated simultaneously...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  14. Asynchronous adaptive time step in quantitative cellular automata modeling

    Directory of Open Access Journals (Sweden)

    Sun Yan

    2004-06-01

    Full Text Available Abstract Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to solve the heavy time consumption issue in simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed and adaptive time step is a practical solution in a cellular automata environment.
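
    A toy sketch of the asynchronous adaptive time-step idea (not the authors' language-based system): each cell integrates a simple decay ODE with a locally chosen step, and a priority queue always advances the cell whose local clock is furthest behind:

    ```python
    import heapq

    def step_size(x, base=0.01, tol=0.5):
        # take larger steps where the state changes slowly (small |x|)
        return base * (1.0 + tol / (abs(x) + tol))

    cells = {i: {"t": 0.0, "x": float(i + 1), "k": 0.5} for i in range(4)}
    queue = [(0.0, i) for i in cells]
    heapq.heapify(queue)

    T_END = 5.0
    while queue:
        t, i = heapq.heappop(queue)          # cell furthest behind in time
        c = cells[i]
        dt = step_size(c["x"])
        c["x"] += -c["k"] * c["x"] * dt      # explicit Euler step
        c["t"] = t + dt
        if c["t"] < T_END:
            heapq.heappush(queue, (c["t"], i))

    print({i: round(c["x"], 4) for i, c in cells.items()})
    ```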

  15. Qualitative and quantitative simulation of androgen receptor antagonists: A case study of polybrominated diphenyl ethers.

    Science.gov (United States)

    Wu, Yang; Shi, Wei; Xia, Pu; Zhang, Xiaowei; Yu, Hongxia

    2017-12-15

    Recently, great attention has been paid to the identification and prediction of the androgen-disrupting potencies of polybrominated diphenyl ethers (PBDEs). However, few existing models can discriminate between active and inactive compounds, which makes quantitative prediction processes such as the quantitative structure-activity relationship (QSAR) technique unreliable. In this study, different grouping methods were investigated and compared for qualitative identification, including molecular docking and molecular dynamics (MD) simulations. The results showed that qualitative identification based on MD, which is lab-independent, accurate and closer to the real transcriptional activation process, could separate 90.5% of active and inactive chemicals and was preferred. The 3D-QSAR models built as the quantitative simulation method showed r(2) and q(2) values of 0.513 and 0.980, respectively. Together, a novel workflow combining qualitative identification and quantitative simulation was generated, with processes including activeness discrimination and activity prediction. This workflow for analyzing the androgen receptor (AR) antagonism of PBDEs not only allows researchers to reduce their intensive laboratory experiments but also assists them in inspecting and adjusting their laboratory systems and results. Copyright © 2017. Published by Elsevier B.V.

  16. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...... by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for re- active systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  17. A simulation study of the effects of assignment of prior identity-by-descent probabilities to unselected sib pairs, in covariance-structure modeling of a quantitative-trait locus.

    Science.gov (United States)

    Dolan, C V; Boomsma, D I; Neale, M C

    1999-01-01

    Sib pair-selection strategies, designed to identify the most informative sib pairs in order to detect a quantitative-trait locus (QTL), give rise to a missing-data problem in genetic covariance-structure modeling of QTL effects. After selection, phenotypic data are available for all sibs, but marker data, and consequently the identity-by-descent (IBD) probabilities, are available only in selected sib pairs. One possible solution to this missing-data problem is to assign prior IBD probabilities (i.e., expected values) to the unselected sib pairs. The effect of this assignment in genetic covariance-structure modeling is investigated in the present paper. Two maximum-likelihood approaches to estimation are considered, the pi-hat approach and the IBD-mixture approach. In the simulations, sample size, selection criteria, QTL-increaser allele frequency, and gene action are manipulated. The results indicate that the assignment of prior IBD probabilities results in serious estimation bias in the pi-hat approach. Bias is also present in the IBD-mixture approach, although here the bias is generally much smaller. The null distribution of the log-likelihood ratio (i.e., in the absence of any QTL effect) does not follow the expected null distribution in the pi-hat approach after selection. In the IBD-mixture approach, the null distribution does agree with expectation.
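
    A minimal sketch of the prior-assignment step discussed here, assuming standard Mendelian expectations for full sibs (the function names are ours):

    ```python
    # For sib pairs without marker data, IBD probabilities are set to their
    # prior expectations, P(IBD = 0, 1, 2) = (1/4, 1/2, 1/4), so pi-hat = 0.5.
    def ibd_probabilities(genotyped, posterior=None):
        if genotyped:
            return posterior          # (p0, p1, p2) estimated from markers
        return (0.25, 0.50, 0.25)     # prior for unselected full-sib pairs

    def pi_hat(p):
        p0, p1, p2 = p
        return 0.5 * p1 + p2          # expected proportion of alleles IBD

    print(pi_hat(ibd_probabilities(False)))                  # 0.5
    print(pi_hat(ibd_probabilities(True, (0.1, 0.3, 0.6))))  # 0.75
    ```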

  18. Quantitative magnetospheric models: results and perspectives.

    Science.gov (United States)

    Kuznetsova, M.; Hesse, M.; Gombosi, T.; Csem Team

    Global magnetospheric models are an indispensable tool that allows multi-point measurements to be put into global context. Significant progress has been achieved in global MHD modeling of magnetosphere structure and dynamics. Medium-resolution simulations confirm the general topological picture suggested by Dungey. State-of-the-art global models with adaptive grids allow performing simulations with a highly resolved magnetopause and magnetotail current sheet. Advanced high-resolution models are capable of reproducing transient phenomena such as FTEs associated with the formation of flux ropes or plasma bubbles embedded into the magnetopause, and demonstrate the generation of vortices at magnetospheric flanks. On the other hand, there is still controversy about the global state of the magnetosphere predicted by MHD models, to the point of questioning the length of the magnetotail and the location of the reconnection sites within it. For example, for steady southward IMF driving conditions, resistive MHD simulations produce a steady configuration with an almost stationary near-Earth neutral line, while there is plenty of observational evidence of a periodic loading-unloading cycle during long periods of southward IMF. Successes and challenges in global modeling of magnetospheric dynamics will be addressed. One of the major challenges is to quantify the interaction between large-scale global magnetospheric dynamics and microphysical processes in diffusion regions near reconnection sites. Possible solutions to controversies will be discussed.

  19. Delay modeling in logic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Acken, J. M.; Goldstein, L. H.

    1980-01-01

    As digital integrated circuit size and complexity increases, the need for accurate and efficient computer simulation increases. Logic simulators such as SALOGS (SAndia LOGic Simulator), which utilize transition states in addition to the normal stable states, provide more accurate analysis than is possible with traditional logic simulators. Furthermore, the computational complexity of this analysis is far lower than that of circuit simulation such as SPICE. An eight-value logic simulation environment allows the use of accurate delay models that incorporate both element response and transition times. Thus, timing simulation with an accuracy approaching that of circuit simulation can be accomplished with an efficiency comparable to that of logic simulation. 4 figures.

  20. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided......, allowing verification procedures to quantify judgements, on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  1. Simulation Analysis of Quantitative Analysis Model of Personal Credit Risk Assessment in Commercial Banks

    Institute of Scientific and Technical Information of China (English)

    宋强; 刘洋

    2015-01-01

    Personal credit in commercial banks is affected by factors such as inadequate management and a lagging risk information management system, so a quantitative analysis model for personal credit risk assessment in commercial banks needs to be built in order to improve banks' credit risk management capability. A quantitative analysis model for personal credit risk assessment in commercial banks, based on multilayer step-by-step dynamic assessment, is therefore proposed. First, the types and influence factors of commercial banks' personal credit risk are analyzed, and a parameter evaluation system for personal credit risk assessment is constructed; next, based on preliminary data analysis, the rank gradient values in the personal credit risk standard of commercial banks are calculated, and the design of the quantitative analysis model for personal credit risk assessment is optimized. The simulation results show that quantitative analysis of personal credit risk assessment with this model yields accurate data-fitting results and can effectively resist credit risk, and that, by establishing an effective loss compensation mechanism through multiple channels, it improves the credit risk prevention and control capability of commercial banks.

  2. Rapid quantitative pharmacodynamic imaging by a novel method: theory, simulation testing and proof of principle

    Directory of Open Access Journals (Sweden)

    Kevin J. Black

    2013-08-01

    Full Text Available Pharmacological challenge imaging has mapped, but rarely quantified, the sensitivity of a biological system to a given drug. We describe a novel method called rapid quantitative pharmacodynamic imaging. This method combines pharmacokinetic-pharmacodynamic modeling, repeated small doses of a challenge drug over a short time scale, and functional imaging to rapidly provide quantitative estimates of drug sensitivity including EC50 (the concentration of drug that produces half the maximum possible effect). We first test the method with simulated data, assuming a typical sigmoidal dose-response curve and assuming imperfect imaging that includes artifactual baseline signal drift and random error. With these few assumptions, rapid quantitative pharmacodynamic imaging reliably estimates EC50 from the simulated data, except when noise overwhelms the drug effect or when the effect occurs only at high doses. In preliminary fMRI studies of primate brain using a dopamine agonist, the observed noise level is modest compared with observed drug effects, and a quantitative EC50 can be obtained from some regional time-signal curves. Taken together, these results suggest that research and clinical applications for rapid quantitative pharmacodynamic imaging are realistic.
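
    An illustrative re-implementation of the estimation step under simplified assumptions: a four-parameter Emax dose-response curve with a linear baseline drift term, fitted to noisy synthetic signal (all numbers invented):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def response(dose, e0, emax, ec50, drift):
        # sigmoidal Emax model plus artifactual baseline drift
        return e0 + drift * dose + emax * dose / (ec50 + dose)

    rng = np.random.default_rng(0)
    doses = np.cumsum(np.full(10, 0.2))    # repeated small doses
    truth = response(doses, e0=1.0, emax=2.0, ec50=0.8, drift=0.05)
    signal = truth + rng.normal(0.0, 0.05, doses.size)   # imaging noise

    params, _ = curve_fit(response, doses, signal, p0=(1.0, 1.0, 1.0, 0.0))
    print(f"estimated EC50 = {params[2]:.2f} (true value 0.8)")
    ```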

  3. SeqSIMLA2: simulating correlated quantitative traits accounting for shared environmental effects in user-specified pedigree structure.

    Science.gov (United States)

    Chung, Ren-Hua; Tsai, Wei-Yun; Hsieh, Chang-Hsun; Hung, Kuan-Yi; Hsiung, Chao A; Hauser, Elizabeth R

    2015-01-01

    Simulation tools that simulate sequence data in unrelated cases and controls or in families with quantitative traits or disease status are important for genetic studies. The simulation tools can be used to evaluate the statistical power for detecting the causal variants when planning a genetic epidemiology study, or to evaluate the statistical properties for new methods. We previously developed SeqSIMLA version 1 (SeqSIMLA1), which simulates family or case-control data with a disease or quantitative trait model. SeqSIMLA1, and several other tools that simulate quantitative traits, do not specifically model the shared environmental effects among relatives on a trait. However, shared environmental effects are commonly observed for some traits in families, such as body mass index. SeqSIMLA1 simulates a fixed three-generation family structure. However, it would be ideal to simulate prespecified pedigree structures for studies involving large pedigrees. Thus, we extended SeqSIMLA1 to create SeqSIMLA2, which can simulate correlated traits and considers the shared environmental effects. SeqSIMLA2 can also simulate prespecified large pedigree structures. There are no restrictions on the number of individuals that can be simulated in a pedigree. We used a blood pressure example to demonstrate that SeqSIMLA2 can simulate realistic correlation structures between the systolic and diastolic blood pressure among relatives. We also showed that SeqSIMLA2 can simulate large pedigrees with large chromosomal regions in a reasonable time frame. © 2014 WILEY PERIODICALS, INC.

  4. Simulation Modeling of a Facility Layout in Operations Management Classes

    Science.gov (United States)

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  6. Preparations, models, and simulations.

    Science.gov (United States)

    Rheinberger, Hans-Jörg

    2015-01-01

    This paper proposes an outline for a typology of the different forms that scientific objects can take in the life sciences. The first section discusses preparations (or specimens), a form of scientific object that accompanied the development of modern biology in different guises from the seventeenth century to the present: as anatomical-morphological specimens, as microscopic cuts, and as biochemical preparations. In the second section, the characteristics of models in biology are discussed. They became prominent from the end of the nineteenth century onwards. Some remarks on the role of simulations, characterising the life sciences at the turn from the twentieth to the twenty-first century, conclude the paper.

  7. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    Full Text Available In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis on the retrieved parameters from that approach suggests that further investigations are required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications on the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for values initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by German Aerospace Center’s (DLR’s) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of checking quantitatively the accuracy of model-based PolSAR techniques for a

  8. Quantitative bioluminescence imaging of mouse tumor models.

    Science.gov (United States)

    Tseng, Jen-Chieh; Kung, Andrew L

    2015-01-05

    Bioluminescence imaging (BLI) has become an essential technique for preclinical evaluation of anticancer therapeutics and provides sensitive and quantitative measurements of tumor burden in experimental cancer models. For light generation, a vector encoding firefly luciferase is introduced into human cancer cells that are grown as tumor xenografts in immunocompromised hosts, and the enzyme substrate luciferin is injected into the host. Alternatively, the reporter gene can be expressed in genetically engineered mouse models to determine the onset and progression of disease. In addition to expression of an ectopic luciferase enzyme, bioluminescence requires oxygen and ATP, thus only viable luciferase-expressing cells or tissues are capable of producing bioluminescence signals. Here, we summarize a BLI protocol that takes advantage of advances in hardware, especially the cooled charge-coupled device camera, to enable detection of bioluminescence in living animals with high sensitivity and a large dynamic range.

  9. Quantitative assessment model for gastric cancer screening

    Institute of Scientific and Technical Information of China (English)

    Kun Chen; Wei-Ping Yu; Liang Song; Yi-Min Zhu

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out in 66 patients and 198 normal people; the risk and protective factors of gastric cancer were then determined, including heavy manual work, foods such as small yellow-fin tuna, dried small shrimps, squills and crabs, mothers suffering from gastric diseases, spouse alive, and use of refrigerators and hot food, etc. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, we selected some factors significant in statistics and calculated a weight coefficient for each one by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model was established for each subset, and we obtained a mathematical expression of attribute degree (AD). RESULTS: Based on the data of 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. According to these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, a statistical test showed that the identification outcomes of the two different calculation methods were identical (P>0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible and economic, and can be used to determine individual and population risks of gastric cancer.
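
    A toy sketch of the attribute-degree (AD) screening rule; the weights, factor coding, and squashing below are invented for illustration, and only the 0.20/0.17 thresholds come from the abstract:

    ```python
    # Hypothetical weights for a few of the factors named above.
    weights = {"heavy_manual_work": 0.30, "dried_small_shrimps": 0.25,
               "mother_gastric_disease": 0.25, "uses_refrigerator": -0.20}

    def attribute_degree(subject):
        score = sum(w * subject.get(k, 0) for k, w in weights.items())
        return max(0.0, min(1.0, 0.2 + score))   # squash into [0, 1]

    def classify(ad, cancer_threshold=0.20, normal_threshold=0.17):
        if ad >= cancer_threshold:
            return "high risk"
        if ad <= normal_threshold:
            return "low risk"
        return "indeterminate"

    subject = {"heavy_manual_work": 1, "uses_refrigerator": 1}
    ad = attribute_degree(subject)
    print(ad, classify(ad))
    ```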

  10. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled making various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles — we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  11. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  12. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  13. Simulation of Gravity Currents Using VOF Model

    Institute of Scientific and Technical Information of China (English)

    邹建锋; 黄钰期; 应新亚; 任安禄

    2002-01-01

    Two-dimensional gravity currents with three phases, including air, are numerically simulated in this article using the Volume of Fluid (VOF) multiphase flow model. The necessity of considering turbulence effects at high Reynolds numbers is demonstrated quantitatively with an LES (Large Eddy Simulation) turbulence model. The gravity currents are simulated for h ≠ H as well as h = H, where h is the depth of the gravity current before the release and H is the depth of the intruded fluid. Uprising of a swell occurs when a current flows horizontally into another, lighter one for h ≠ H. The questions of under what condition the uprising of the swell occurs and how long it takes are considered in this article. All the simulated results are in reasonable agreement with the available experimental results.

  14. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  15. A quantitative model for integrating landscape evolution and soil formation

    Science.gov (United States)

    Vanwalleghem, T.; Stockmann, U.; Minasny, B.; McBratney, Alex B.

    2013-06-01

    Landscape evolution is closely related to soil formation. Quantitative modeling of the dynamics of soils and landscapes should therefore be integrated. This paper presents a model, named Model for Integrated Landscape Evolution and Soil Development (MILESD), which describes the interaction between pedogenetic and geomorphic processes. This mechanistic model includes the most significant soil formation processes, ranging from weathering to clay translocation, and combines these with the lateral redistribution of soil particles through erosion and deposition. The model is spatially explicit and simulates the vertical variation in soil horizon depth as well as basic soil properties such as texture and organic matter content. In addition, sediment export and its properties are recorded. This model is applied to a 6.25 km2 area in the Werrikimbe National Park, Australia, simulating soil development over a period of 60,000 years. Comparison with field observations shows how the model accurately predicts trends in total soil thickness along a catena. Soil texture and bulk density are predicted reasonably well, with errors of the order of 10%; however, field observations show a much higher organic carbon content than predicted. At the landscape scale, different scenarios with varying erosion intensity result in only small changes of landscape-averaged soil thickness, while the response of the total organic carbon stored in the system is higher. Rates of sediment export show a highly nonlinear response to soil development stage and the presence of a threshold, corresponding to the depletion of the soil reservoir, beyond which sediment export drops significantly.

  16. Simulating an Optimizing Model of Currency Substitution

    Directory of Open Access Journals (Sweden)

    Leonardo Leiderman

    1992-03-01

    Full Text Available This paper reports simulations based on the parameter estimates of an intertemporal model of currency substitution under nonexpected utility obtained by Bufman and Leiderman (1991). Here we first study the quantitative impact of changes in the degree of dollarization and in the elasticity of currency substitution on government seigniorage. Then we examine whether the model can account for the comovement of consumption growth and assets' returns after the 1985 stabilization program, and in particular for the consumption boom of 1986-87. The results are generally encouraging for future applications of optimizing models of currency substitution to policy and practical issues.

  17. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.

  18. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...... in case of such faults. The design of the controller is described and its performance assessed by simulations. The control strategies are explained and the behaviour of the turbine discussed....

  20. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    Science.gov (United States)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial Revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations are however very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  1. An infinitesimal model for quantitative trait genomic value prediction.

    Directory of Open Access Journals (Sweden)

    Zhiqiu Hu

    Full Text Available We developed a marker-based infinitesimal model for quantitative trait analysis. In contrast to the classical infinitesimal model, we now have new information about the segregation of every individual locus of the entire genome. Under this new model, we propose that the genetic effect of an individual locus is a function of the genome location (a continuous quantity). The overall genetic value of an individual is the weighted integral of the genetic effect function along the genome. Numerical integration is performed to find the integral, which requires partitioning the entire genome into a finite number of bins. Each bin may contain many markers. The integral is approximated by the weighted sum of all the bin effects. We now turn the problem of marker analysis into bin analysis so that the model dimension has decreased from a virtual infinity to a finite number of bins. This new approach can efficiently handle a virtually unlimited number of markers without marker selection. The marker-based infinitesimal model requires high linkage disequilibrium of all markers within a bin. For populations with low or no linkage disequilibrium, we develop an adaptive infinitesimal model. Both the original and the adaptive models are tested using simulated data as well as beef cattle data. The simulated data analysis shows that there is always an optimal number of bins at which the predictability of the bin model is much greater than that of the original marker analysis. Results of the beef cattle data analysis indicate that the bin model can increase the predictability from 10% (multiple marker analysis) to 33% (multiple bin analysis). The marker-based infinitesimal model paves the way toward the solution of genetic mapping and genomic selection using whole genome sequence data.
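
    A minimal sketch of the binning step described here, assuming 0/1/2 genotype coding and equal-size bins; ordinary least squares stands in for whatever estimator the authors actually use:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_ind, n_markers, n_bins = 200, 5000, 50
    genotypes = rng.integers(0, 3, size=(n_ind, n_markers))  # 0/1/2 coding

    # Partition the genome into bins and score each bin as the sum of its
    # marker genotypes, shrinking the model from 5000 to 50 dimensions.
    bins = np.array_split(np.arange(n_markers), n_bins)
    bin_scores = np.column_stack([genotypes[:, idx].sum(axis=1)
                                  for idx in bins])

    y = rng.normal(size=n_ind)                    # placeholder phenotypes
    effects, *_ = np.linalg.lstsq(bin_scores, y, rcond=None)
    print(effects.shape)   # (50,) bin effects instead of 5000 marker effects
    ```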

  2. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advanced status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  3. On-line simulations of models for backward masking.

    Science.gov (United States)

    Francis, Gregory

    2003-11-01

    Five simulations of quantitative models of visual backward masking are available on the Internet at http://www.psych.purdue.edu/~gfrancis/Publications/BackwardMasking/. The simulations can be run in a Web browser that supports the Java programming language. This article describes the motivation for making the simulations available and gives a brief introduction as to how the simulations are used. The source code is available on the Web page, and this article describes how the code is organized.

  4. IVOA Recommendation: Simulation Data Model

    CERN Document Server

    Lemson, Gerard; Cervino, Miguel; Gheller, Claudio; Gray, Norman; LePetit, Franck; Louys, Mireille; Ooghe, Benjamin; Wagner, Rick; Wozniak, Herve

    2014-01-01

    In this document and the accompanying documents we describe a data model (Simulation Data Model) describing numerical computer simulations of astrophysical systems. The primary goal of this standard is to support discovery of simulations by describing those aspects of them that scientists might wish to query on, i.e. it is a model for meta-data describing simulations. This document does not propose a protocol for using this model. IVOA protocols are being developed and are supposed to use the model, either in its original form or in a form derived from the model proposed here, but more suited to the particular protocol. The SimDM has been developed in the IVOA Theory Interest Group with assistance of representatives of relevant working groups, in particular DM and Semantics.

  5. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  6. Modeling and Simulation with INS.

    Science.gov (United States)

    Roberts, Stephen D.; And Others

    INS, the Integrated Network Simulation language, puts simulation modeling into a network framework and automatically performs such programming activities as placing the problem into a next event structure, coding events, collecting statistics, monitoring status, and formatting reports. To do this, INS provides a set of symbols (nodes and branches)…

  7. Simulation modeling of estuarine ecosystems

    Science.gov (United States)

    Johnson, R. W.

    1980-01-01

    A simulation model of the Galveston Bay, Texas, ecosystem has been developed. Secondary productivity, measured by harvestable species (such as shrimp and fish), is evaluated in terms of man-related and controllable factors, such as the quantity and quality of inlet fresh water and pollutants. This simulation model used information from an existing physical parameters model as well as pertinent biological measurements obtained by conventional sampling techniques. Predicted results from the model compared favorably with those from comparable investigations. In addition, this paper discusses remotely sensed and conventional measurements in the framework of prospective models that may be used to study estuarine processes and ecosystem productivity.

  8. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
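
    A simplified discrete-event sketch of the idea, using a plain event queue rather than a supply-chain package; the arrival and sale rates are invented. Storage time emerges from the queue dynamics instead of being drawn independently per step, which is what shapes the tails:

    ```python
    import heapq, random

    random.seed(0)
    events, clock, seq = [], 0.0, 0
    for lot in range(200):                    # lot arrivals: renewal process
        clock += random.expovariate(1 / 4.0)  # mean 4 h between deliveries
        heapq.heappush(events, (clock, seq, "arrival", lot)); seq += 1

    shelf, storage_times = [], []
    while events:
        t, _, kind, lot = heapq.heappop(events)
        if kind == "arrival":
            shelf.append((t, lot))
            heapq.heappush(events,
                           (t + random.expovariate(1 / 6.0), seq, "sale", None))
            seq += 1
        elif kind == "sale" and shelf:
            arrived, _ = shelf.pop(0)          # FIFO shelf rotation
            storage_times.append(t - arrived)  # input to microbial growth models

    print(f"max storage time: {max(storage_times):.1f} h")
    ```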

  9. Modeling and Simulating Environmental Effects

    OpenAIRE

    Guest, Peter S.; Murphree, Tom; Frederickson, Paul A.; Guest, Arlene A.

    2012-01-01

    MOVES Research & Education Systems Seminar: Presentation; Session 4: Collaborative NWDC/NPS M&S Research; Moderator: Curtis Blais; Modeling and Simulating Environmental Effects; speakers: Peter Guest, Paul Frederickson & Tom Murphree Environmental Effects Group

  10. Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations

    Directory of Open Access Journals (Sweden)

    M. Righi

    2015-03-01

    Full Text Available Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC) model have been evaluated with the Earth System Model Validation Tool (ESMValTool) to identify differences in simulated ozone and selected climate parameters that resulted from (i) different setups of the EMAC model (nudged vs. free-running) and (ii) different boundary conditions (emissions, sea surface temperatures (SSTs) and sea ice concentrations (SICs)). To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results, since biases in climate can impact on biases in chemistry and vice versa. The observational data sets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission data sets in free-running time-slice and nudged quasi chemistry-transport model (QCTM) mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g., the transport sector) that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups (nudged vs. free-running) of the EMAC simulations were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. The main difference between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapor concentrations, due to the improved

  11. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  12. Evaluation of a quantitative phosphorus transport model for potential improvement of southern phosphorus indices

    Science.gov (United States)

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  13. Development of probabilistic models for quantitative pathway analysis of plant pests introduction for the EU territory

    NARCIS (Netherlands)

    Douma, J.C.; Robinet, C.; Hemerik, L.; Mourits, M.C.M.; Roques, A.; Werf, van der W.

    2015-01-01

    The aim of this report is to provide EFSA with probabilistic models for quantitative pathway analysis of plant pest introduction for the EU territory through non-edible plant products or plants. We first provide a conceptualization of two types of pathway models. The individual based PM simulates an

  14. Quantitative modelling of the biomechanics of the avian syrinx

    NARCIS (Netherlands)

    Elemans, C.P.H.; Larsen, O.N.; Hoffmann, M.R.; Leeuwen, van J.L.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts

  15. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  16. Power analysis of artificial selection experiments using efficient whole genome simulation of quantitative traits.

    Science.gov (United States)

    Kessner, Darren; Novembre, John

    2015-04-01

    Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50-100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates.
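
    A drastically simplified forward simulation in the same spirit (a single QTL with an additive effect, environmental noise, and truncation selection; all parameters invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, GENERATIONS, SELECTED = 1000, 20, 0.2
    freq = 0.1                                   # initial QTL allele frequency

    for gen in range(GENERATIONS):
        genotypes = rng.binomial(2, freq, N)     # copies of the + allele
        trait = genotypes + rng.normal(0, 1, N)  # additive effect + noise
        cutoff = np.quantile(trait, 1 - SELECTED)
        parents = genotypes[trait >= cutoff]     # select the upper tail
        freq = parents.mean() / 2.0              # allele frequency in parents

    print(f"QTL allele frequency after selection: {freq:.2f}")
    ```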

  17. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  18. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models, and to take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the devi

  19. Goal relevance as a quantitative model of human task relevance.

    Science.gov (United States)

    Tanner, James; Itti, Laurent

    2017-03-01

The concept of relevance is used ubiquitously in everyday life. However, a general quantitative definition of relevance has been lacking, especially as pertains to quantifying the relevance of sensory observations to one's goals. We propose a theoretical definition for the information value of data observations with respect to a goal, which we call "goal relevance." We consider the probability distribution of an agent's subjective beliefs over how a goal can be achieved. When new data are observed, their goal relevance is measured as the Kullback-Leibler divergence between belief distributions before and after the observation. Theoretical predictions about the relevance of different obstacles in simulated environments agreed with the majority response of 38 human participants in 83.5% of trials, beating multiple machine-learning models. Our new definition of goal relevance is general, quantitative, explicit, and allows one to put a number onto the previously elusive notion of relevance of observations to a goal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
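
    Given the definition above, a minimal sketch of the computation is a discrete KL divergence between belief distributions before and after an observation. The direction shown (posterior relative to prior) and the example beliefs are our assumptions; the paper's exact convention may differ.

      import numpy as np

      def goal_relevance(prior, posterior):
          """KL divergence D(posterior || prior) between belief distributions.
          Assumes prior > 0 wherever posterior > 0."""
          prior, posterior = np.asarray(prior, float), np.asarray(posterior, float)
          mask = posterior > 0
          return float(np.sum(posterior[mask] * np.log(posterior[mask] / prior[mask])))

      # Beliefs over three hypothetical routes to a goal.
      prior     = [0.5, 0.3, 0.2]
      posterior = [0.1, 0.6, 0.3]   # after observing an obstacle on route 1
      print(f"goal relevance: {goal_relevance(prior, posterior):.3f} nats")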

  20. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard

  1. A VRLA battery simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Pascoe, P.E.; Anbuky, A.H. [Invensys Energy Systems NZ Limited, Christchurch (New Zealand)

    2004-05-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, either for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet. (author)
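
    The record does not spell out the model's equations, so as a hedged, generic starting point only: Peukert's law is a classical first approximation for rate-dependent battery capacity. It ignores temperature, SOH, recharge and the coup de fouet, all of which the described model handles; parameter values below are hypothetical.

      def peukert_runtime_hours(capacity_ah, rated_hours, current_a, k=1.15):
          """Estimate discharge time from Peukert's law:
          t = H * (C / (I * H)) ** k
          where H is the rating period (h), C the rated capacity (Ah),
          I the discharge current (A), and k the Peukert exponent."""
          return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

      # 100 Ah battery rated over 10 h, discharged at 25 A (hypothetical values).
      print(f"runtime: {peukert_runtime_hours(100.0, 10.0, 25.0):.2f} h")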

  2. A New Method for Quantitative Simulating Hydrocarbon Expulsion and Its Application

    Institute of Scientific and Technical Information of China (English)

    Mingcheng Li

    1994-01-01

Quantitative Simulation of Oil Expulsion. 1. Traditional method. The oil expelled (QE) is calculated by subtracting the oil retained as residuum (QR) from the oil generated (QG), and the expulsion efficiency (fE) is then QE divided by QG, as the following equations express: QE = QG - QR and fE = QE / QG.

  3. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are bor

  4. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field...

  5. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate...

  6. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

The results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems are described, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest-precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  7. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.
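
    As a sketch of the simplest rung on this model hierarchy, the kinematic single-track ("bicycle") model can be integrated in a few lines; the wheelbase and inputs below are illustrative, and the book's multi-body models go far beyond this.

      import math

      def step(state, accel, steer, wheelbase=2.7, dt=0.01):
          """One Euler step of the kinematic single-track model.
          state = (x, y, yaw, speed); steer is the front-wheel angle in rad."""
          x, y, yaw, v = state
          x   += v * math.cos(yaw) * dt
          y   += v * math.sin(yaw) * dt
          yaw += v / wheelbase * math.tan(steer) * dt
          v   += accel * dt
          return (x, y, yaw, v)

      state = (0.0, 0.0, 0.0, 20.0)          # 20 m/s, heading along x
      for _ in range(200):                    # 2 s of gentle left steering
          state = step(state, accel=0.0, steer=0.05)
      print(f"x={state[0]:.1f} m, y={state[1]:.1f} m, yaw={state[2]:.3f} rad")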

  8. Quantitative investigation of cellular growth in directional solidification by phase-field simulation.

    Science.gov (United States)

    Wang, Zhijun; Wang, Jincheng; Li, Junjie; Yang, Gencang; Zhou, Yaohe

    2011-10-01

    Using a quantitative phase-field model, a systematic investigation of cellular growth in directional solidification is carried out with emphasis on the selection of cellular tip undercooling, tip radius, and cellular spacing. Previous analytical models of cellular growth are evaluated according to the phase-field simulation results. The results show that cellular tip undercooling and tip radius not only depend on the pulling velocity and thermal gradient, but also depend on the cellular interaction related to the cellular spacing. The cellular interaction results in a finite stable range of cellular spacing. The lower limit is determined by the submerging mechanism while the upper limit comes from the tip splitting instability corresponding to the absence of the cellular growth solution, both of which can be obtained from phase-field simulation. Further discussions on the phase-field results also present an analytical method to predict the lower limit. Phase-field simulations on cell elimination between cells with equal spacing validate the finite range of cellular spacing and give deep insight into the cellular doublon and oscillatory instability between cell elimination and tip splitting.
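
    The quantitative model used in the study (with thin-interface corrections) is far more elaborate, but the basic phase-field machinery can be sketched as a plain 1D Allen-Cahn relaxation of a diffuse solid-liquid interface; all parameters here are illustrative.

      import numpy as np

      nx, dx, dt, eps = 200, 0.5, 0.05, 1.0   # grid, step sizes, interface width
      phi = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)  # sharp solid/liquid step

      def f_prime(p):
          # Derivative of the double-well potential f = p^2 (1 - p)^2.
          return 2.0 * p * (1.0 - p) * (1.0 - 2.0 * p)

      for _ in range(2000):
          lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
          phi += dt * (eps**2 * lap - f_prime(phi))   # dphi/dt = eps^2 lap(phi) - f'(phi)

      # The step relaxes into a smooth tanh-like profile of width ~eps.
      print("interface values:", np.round(phi[nx // 2 - 3 : nx // 2 + 3], 3))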

  9. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.

  10. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

Computational Modeling of Simulation Tests. G. Leigh, W. Chown, B. Harrison. Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque, June 1980. Only citation fragments survive OCR, including Kinney, G. F., Explosive Shocks in Air, Macmillan, p. 57, 1962, and Courant and Friedrichs.

  11. SIMULATION OF COLLECTIVE RISK MODEL

    Directory of Open Access Journals (Sweden)

    Viera Pacáková

    2007-12-01

Full Text Available The article focuses on providing brief theoretical definitions of the basic terms and methods of modeling and simulation of insurance risks in non-life insurance by means of mathematical and statistical methods using statistical software. While the risk assessment of an insurance company in connection with its solvency is a rather complex and comprehensive problem, its solution starts with statistical modeling of the number and amount of individual claims. Successful solution of these fundamental problems enables solving of crucial problems of insurance such as modeling and simulation of collective risk, premium and reinsurance premium calculation, estimation of probability of ruin, etc. The article also presents some essential ideas underlying Monte Carlo methods and their applications to modeling of insurance risk. The problem to solve is to find the probability distribution of the collective risk in a non-life insurance portfolio. Simulation of the compound distribution function of the aggregate claim amount can be carried out if the distribution functions of the claim number process and the claim size are assumed given. Monte Carlo simulation is a suitable method to confirm the results of other methods and for the treatment of catastrophic claims, when small collectives are studied. Analysis of insurance risks using risk theory is an important part of the Solvency II project. Risk theory is the analysis of the stochastic features of the non-life insurance process. The field of application of risk theory has grown rapidly. There is a need to develop the theory into a form suitable for practical purposes and to demonstrate its application. Modern computer simulation techniques open up a wide field of practical applications for risk theory concepts, without requiring restrictive assumptions and sophisticated mathematics. This article presents some comparisons of traditional actuarial methods and simulation methods for the collective risk model.
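
    Once claim-number and claim-size distributions are fixed, the simulation of the compound distribution described above reduces to a short Monte Carlo loop. The Poisson/lognormal choice and all parameter values below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(7)

      LAM = 120.0            # expected number of claims per year
      MU, SIGMA = 8.0, 1.2   # lognormal claim-size parameters (hypothetical)
      N_SIM = 20_000

      counts = rng.poisson(LAM, N_SIM)   # claim number process N
      # Aggregate claim amount S = X_1 + ... + X_N for each simulated year.
      totals = np.array([rng.lognormal(MU, SIGMA, n).sum() for n in counts])

      print(f"mean aggregate claims: {totals.mean():,.0f}")
      print(f"99.5% quantile (Solvency II style): {np.quantile(totals, 0.995):,.0f}")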

  12. Application of simulation modeling to lipid peroxidation processes.

    Science.gov (United States)

    Tappel, A L; Tappel, A A; Fraga, C G

    1989-01-01

A quantitative simulation model was developed that utilized present knowledge of lipid peroxidation in biological systems. The simulation model incorporated the following features: peroxidizability of polyunsaturated lipids, activation of inducers and their initiation of lipid peroxidation, concurrent autoxidation, inhibition of lipid peroxidation by vitamin E, reduction of some of the hydroperoxides by glutathione peroxidase, and formation of thiobarbituric acid-reactive substances. Simulation calculations were done using a computer spreadsheet program. When the simulation program was applied to tissue slice and microsomal peroxidizing systems, the results of the simulation were in agreement with the experimental data.

  13. A Quantitative Study of Simulated Bicuspid Aortic Valves

    Science.gov (United States)

    Szeto, Kai; Nguyen, Tran; Rodriguez, Javier; Pastuszko, Peter; Nigam, Vishal; Lasheras, Juan

    2010-11-01

Previous studies have shown that congenitally bicuspid aortic valves develop degenerative diseases earlier than the standard trileaflet valves, but the causes are not well understood. It has been hypothesized that the asymmetrical flow patterns and turbulence found in the bileaflet valves, together with abnormally high levels of strain, may result in early thickening and eventually calcification and stenosis. Central to this hypothesis is the need for a precise quantification of the differences in strain rate levels between bileaflet and trileaflet valves. We present here in-vitro dynamic measurements of the spatial variation of the strain rate in pig aortic valves conducted in a left ventricular heart flow simulator device. We measure the strain rate of each leaflet during the whole cardiac cycle using phase-locked stereoscopic three-dimensional image surface reconstruction techniques. The bicuspid case is simulated by surgically stitching two of the leaflets in a normal valve.

  14. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun;

    2013-01-01

    , comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  15. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

This paper describes the modelling, simulating and optimizing, including experimental verification, carried out as part of a Ph.D. project written and supervised by the authors. The work covers the dynamic performance of both water-tube boilers and fire-tube boilers. A detailed dynamic...... model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able...... to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...

  16. Intelligent Mobility Modeling and Simulation

    Science.gov (United States)

    2015-03-04

Briefing by Dr. P. Jayakumar and S. Arepally, U.S. Army Tank Automotive Research, Development and Engineering Center. Contents: 1. Mobility - Autonomy - Latency Relationship; 2. Machine - Human Partnership; 3. Development of Shared Control. The work includes ACT-R cognitive-architecture models (cog.cs.drexel.edu/act-r/index.html) of the sensory/motor performance of a human driver or teleoperator.

  17. A quantitative model for assessing community dynamics of pleistocene mammals.

    Science.gov (United States)

    Lyons, S Kathleen

    2005-06-01

    Previous studies have suggested that species responded individualistically to the climate change of the last glaciation, expanding and contracting their ranges independently. Consequently, many researchers have concluded that community composition is plastic over time. Here I quantitatively assess changes in community composition over broad timescales and assess the effect of range shifts on community composition. Data on Pleistocene mammal assemblages from the FAUNMAP database were divided into four time periods (preglacial, full glacial, postglacial, and modern). Simulation analyses were designed to determine whether the degree of change in community composition is consistent with independent range shifts, given the distribution of range shifts observed. Results indicate that many of the communities examined in the United States were more similar through time than expected if individual range shifts were completely independent. However, in each time transition examined, there were areas of nonanalogue communities. I conducted sensitivity analyses to explore how the results were affected by the assumptions of the null model. Conclusions about changes in mammalian distributions and community composition are robust with respect to the assumptions of the model. Thus, whether because of biotic interactions or because of common environmental requirements, community structure through time is more complex than previously thought.

  18. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM...... and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments...

  19. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... performance has been developed. Outputs from the simulations are shrinking and swelling of the water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum...... size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...

  20. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-08-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  1. Multiscale Stochastic Simulation and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    James Glimm; Xiaolin Li

    2006-01-10

Acceleration-driven instabilities of fluid mixing layers include the classical cases of Rayleigh-Taylor instability, driven by a steady acceleration, and Richtmyer-Meshkov instability, driven by an impulsive acceleration. Our program starts with high resolution methods of numerical simulation of two (or more) distinct fluids, continues with analytic analysis of these solutions, and the derivation of averaged equations. A striking achievement has been the systematic agreement we obtained between simulation and experiment by using a high resolution numerical method and improved physical modeling, with surface tension. Our study is accompanied by analysis using stochastic modeling and averaged equations for the multiphase problem. We have quantified the error and uncertainty using statistical modeling methods.

  2. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  3. Animal models for simulating weightlessness

    Science.gov (United States)

    Morey-Holton, E.; Wronski, T. J.

    1982-01-01

    NASA has developed a rat model to simulate on earth some aspects of the weightlessness alterations experienced in space, i.e., unloading and fluid shifts. Comparison of data collected from space flight and from the head-down rat suspension model suggests that this model system reproduces many of the physiological alterations induced by space flight. Data from various versions of the rat model are virtually identical for the same parameters; thus, modifications of the model for acute, chronic, or metabolic studies do not alter the results as long as the critical components of the model are maintained, i.e., a cephalad shift of fluids and/or unloading of the rear limbs.

  4. Spatial stochastic simulation offers potential as a quantitative method for pest risk analysis.

    Science.gov (United States)

    Rafoss, Trond

    2003-08-01

    Pest risk analysis represents an emerging field of risk analysis that evaluates the potential risks of the introduction and establishment of plant pests into a new geographic location and then assesses the management options to reduce those potential risks. Development of new and adapted methodology is required to answer questions concerning pest risk analysis of exotic plant pests. This research describes a new method for predicting the potential establishment and spread of a plant pest into new areas using a case study, Ralstonia solanacearum, a bacterial disease of potato. This method combines current quantitative methodologies, stochastic simulation, and geographic information systems with knowledge of pest biology and environmental data to derive new information about pest establishment potential in a geographical region where a pest had not been introduced. This proposed method extends an existing methodology for matching pest characteristics with environmental conditions by modeling and simulating dissemination behavior of a pest organism. Issues related to integrating spatial variables into risk analysis models are further discussed in this article.

  5. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  6. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

In this paper, an integrated simulation optimization model for inventory systems is developed, together with an effective algorithm to evaluate and analyze the stored simulation results. The paper proposes the simulation tool SIMIN (Inventory Simulation), which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement...

  7. Hazard Response Modeling Uncertainty (A Quantitative Method)

    Science.gov (United States)

    1988-10-01

C_p is the concentration predicted by some component or model. The variance of C_o/C_p is calculated and defined as var(Model I), where Model I could be

  8. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the space station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. This report further calls for systematic scientific investigations of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole to an individual enclosure or workstation. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the University of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  9. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    Science.gov (United States)

    Bruneton, Eric

    2016-10-27

We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model, and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer the simplifications and approximations used to solve the physical equations, the more accurate the results. We conclude with a discussion of the advantages and drawbacks of each model, and of how to further improve their accuracy.

  10. Simulation modeling for microbial risk assessment.

    Science.gov (United States)

    Cassin, M H; Paoli, G M; Lammerding, A M

    1998-11-01

    Quantitative microbial risk assessment implies an estimation of the probability and impact of adverse health outcomes due to microbial hazards. In the case of food safety, the probability of human illness is a complex function of the variability of many parameters that influence the microbial environment, from the production to the consumption of a food. The analytical integration required to estimate the probability of foodborne illness is intractable in all but the simplest of models. Monte Carlo simulation is an alternative to computing analytical solutions. In some cases, a risk assessment may be commissioned to serve a larger purpose than simply the estimation of risk. A Monte Carlo simulation can provide insights into complex processes that are invaluable, and otherwise unavailable, to those charged with the task of risk management. Using examples from a farm-to-fork model of the fate of Escherichia coli O157:H7 in ground beef hamburgers, this paper describes specifically how such goals as research prioritization, risk-based characterization of control points, and risk-based comparison of intervention strategies can be objectively achieved using Monte Carlo simulation.
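
    A farm-to-fork chain of the kind described can be caricatured in a few lines of Monte Carlo: variability distributions feed an exponential dose-response model, and the output is a risk distribution rather than a point estimate. All distributions and parameter values below are hypothetical, not those of the E. coli O157:H7 model.

      import numpy as np

      rng = np.random.default_rng(3)
      N = 100_000

      # Hypothetical variability from production to consumption (illustrative).
      log_conc = rng.normal(-2.0, 1.0, N)            # log10 CFU per gram at retail
      cook_log_reduction = rng.uniform(3.0, 6.0, N)  # log10 kill during cooking
      serving_g = rng.normal(100.0, 20.0, N).clip(min=10.0)

      dose = 10.0 ** (log_conc - cook_log_reduction) * serving_g  # ingested CFU
      R = 1e-3                           # exponential dose-response parameter
      p_ill = 1.0 - np.exp(-R * dose)    # P(illness | dose)

      print(f"mean risk per serving: {p_ill.mean():.2e}")
      print(f"95th percentile risk:  {np.quantile(p_ill, 0.95):.2e}")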

  11. A Solved Model to Show Insufficiency of Quantitative Adiabatic Condition

    Institute of Scientific and Technical Information of China (English)

    LIU Long-Jiang; LIU Yu-Zhen; TONG Dian-Min

    2009-01-01

The adiabatic theorem is a useful tool for treating slowly evolving quantum systems, but its practical application depends on the quantitative condition expressed in terms of the Hamiltonian's eigenvalues and eigenstates, which is usually taken as a sufficient condition. Recently, the sufficiency of the condition was questioned, and several counterexamples have been reported. Here we present a new solved model to show the insufficiency of the traditional quantitative adiabatic condition.

  12. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  13. Small animal positron emission tomography with gas detectors. Simulations, prototyping, and quantitative image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Vernekohl, Don

    2014-04-15

plain surfaces, predicted by simulations, was observed. Third, as the production of photon converters is time consuming and expensive, it was investigated whether or not thin gas detectors with single-lead-layer converters would be an alternative to the HIDAC converter design. According to simulations, those concepts potentially offer impressive coincidence sensitivities of up to 24% for plain lead foils and up to 40% for perforated lead foils. Fourth, compared to other PET scanner systems, the HIDAC concept suffers from missing energy information. Consequently, a substantial amount of scatter events can be found within the measured data. On the basis of image reconstruction and correction techniques, the influence of random and scatter events and their characteristics on several simulated phantoms was presented. It was validated with the HIDAC simulator that the applied correction technique results in perfectly corrected images. Moreover, it was shown that the simulator is a credible tool to provide quantitatively improved images. Fifth, a new model for the non-collinearity of the positronium annihilation was developed, since it was observed that the model implemented in the GATE simulator does not correspond to the measured observation. The input parameter of the new model was tuned to match a point-source measurement. The influence of both models on the spatial resolution was studied with three different reconstruction methods. Furthermore, it was demonstrated that the reduction of converter depth, proposed for increased sensitivity, also benefits the spatial resolution, and that a reduction of the FOV from 17 cm to 4 cm (with only 2 detector heads) results in a remarkable sensitivity increase of 150% and a substantial increase in spatial resolution. The presented simulations for the spatial resolution analysis used an intrinsic detector resolution of 0.125 x 0.125 x 3.2 mm³ and were able to reach fair resolutions down to 0.9-0.5 mm, which is an

  14. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  15. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  16. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance, a correct spectral shape, and non-Gaussian statistics, is selected in order to evaluate the model turbulence. An actual turbulence record is analyzed in detail, providing both a standard for comparison and input statistics for the generalized spectral analysis, which in turn produces a set of orthonormal... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence.
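
    A minimal sketch of the Karhunen-Loève machinery follows, under simplifying assumptions the paper does not make: one velocity component, Gaussian statistics, and an assumed exponential covariance in place of measured input statistics.

      import numpy as np

      rng = np.random.default_rng(0)

      n = 256
      t = np.linspace(0.0, 10.0, n)
      # Exponential covariance kernel (illustrative stand-in for measured statistics).
      C = np.exp(-np.abs(t[:, None] - t[None, :]) / 1.5)

      lam, phi = np.linalg.eigh(C)        # eigenpairs of the covariance matrix
      lam, phi = lam[::-1], phi[:, ::-1]  # sort descending
      m = 30                              # truncate the expansion

      # Sample paths: u(t) = sum_i sqrt(lambda_i) * xi_i * phi_i(t).
      xi = rng.standard_normal((m, 500))  # 500 independent realizations
      samples = phi[:, :m] @ (np.sqrt(lam[:m].clip(min=0.0))[:, None] * xi)
      print(f"ensemble variance at mid-record: {samples[n // 2].var():.2f} (target ~1)")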

  17. A Quantitative Phase-Field Simulation of Soft-Impingement in Austenite to Ferrite Transformation with Mixed-Mode

    Science.gov (United States)

    Bhattacharya, Avisor; Upadhyay, C. S.; Sangal, S.

    2017-10-01

    The present work simulates the transformation of austenite to ferrite with mixed-mode in a simplified austenitic grain geometry. The quantitative phase-field model, developed on the basis of non-local equilibrium at the interface, simulates the transformation. The present work reveals that the soft-impingement takes place much earlier than has been so far considered. The accumulation of carbon in and around the meeting point of the diffusion layers is not continuous, and a detailed mechanism of soft-impingement is presented here. The growth of ferrite under mixed-mode of transformation is analyzed and found to be consistent with the theory.

  18. Quantitative Verification of a Force-based Model for Pedestrian Dynamics

    CERN Document Server

    Chraibi, Mohcine; Schadschneider, Andreas; Mackens, Wolfgang

    2009-01-01

    This paper introduces a spatially continuous force-based model for simulating pedestrian dynamics. The main intention of this work is the quantitative description of pedestrian movement through bottlenecks and in corridors. Measurements of flow and density at bottlenecks will be presented and compared with empirical data. Furthermore the fundamental diagram for the movement in a corridor is reproduced. The results of the proposed model show a good agreement with empirical data.
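
    A generic force-based update of the kind this model family builds on combines a driving term toward the desired velocity with exponential repulsion between pedestrians; the coefficients below are illustrative and not the calibrated values of the proposed model.

      import numpy as np

      rng = np.random.default_rng(5)

      N, DT, TAU = 20, 0.05, 0.5          # pedestrians, time step, relaxation time
      V0, A, B = 1.34, 2.0, 0.3           # desired speed, repulsion strength/range
      pos = rng.uniform([0, 0], [4, 2], size=(N, 2))   # corridor section, 4 m x 2 m
      vel = np.zeros((N, 2))
      e = np.array([1.0, 0.0])            # everyone heads along the corridor

      for _ in range(400):                # 20 s of simulated walking
          drive = (V0 * e - vel) / TAU    # relax toward the desired velocity
          diff = pos[:, None, :] - pos[None, :, :]
          dist = np.linalg.norm(diff, axis=-1) + np.eye(N)  # self-distance -> 1
          dist = np.maximum(dist, 0.2)    # avoid the overlap singularity
          rep = (A * np.exp(-dist / B) / dist)[..., None] * diff  # pairwise push-away
          force = drive + rep.sum(axis=1)
          vel += force * DT
          pos += vel * DT

      print(f"mean speed after 20 s: {np.linalg.norm(vel, axis=1).mean():.2f} m/s")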

  19. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF
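
    The partial-volume flavor of the problem is easy to demonstrate in one dimension: blurring a small hot structure with Gaussian kernels of different FWHM changes its maximum-contrast recovery. This toy sketch (illustrative numbers, no reconstruction or noise) is not the paper's framework.

      import numpy as np

      def gaussian_kernel(fwhm_mm, dx_mm, half_width=60):
          sigma = fwhm_mm / 2.3548           # FWHM -> standard deviation
          x = np.arange(-half_width, half_width + 1) * dx_mm
          k = np.exp(-0.5 * (x / sigma) ** 2)
          return k / k.sum()

      dx = 1.0                                # 1 mm pixels
      profile = np.zeros(301)
      profile[145:156] = 4.0                  # 11 mm "tumour" with uptake 4
      profile += 1.0                          # background uptake 1

      for fwhm in (4.0, 6.0, 8.0):            # candidate PSF widths (mm)
          blurred = np.convolve(profile, gaussian_kernel(fwhm, dx), mode="same")
          crc = (blurred.max() - 1.0) / (4.0 - 1.0)   # max-contrast recovery
          print(f"FWHM {fwhm:.0f} mm -> contrast recovery {crc:.2f}")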

  20. Multiscale mathematical modeling and simulation of cellular dynamical process.

    Science.gov (United States)

    Nakaoka, Shinji

    2014-01-01

    Epidermal homeostasis is maintained by dynamic interactions among molecules and cells at different spatiotemporal scales. Mathematical modeling and simulation is expected to provide clear understanding and precise description of multiscaleness in tissue homeostasis under systems perspective. We introduce a stochastic process-based description of multiscale dynamics. Agent-based modeling as a framework of multiscale modeling to achieve consistent integration of definitive subsystems is proposed. A newly developed algorithm that particularly aims to perform stochastic simulations of cellular dynamical process is introduced. Finally we review applications of multiscale modeling and quantitative study to important aspects of epidermal and epithelial homeostasis.
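
    One standard workhorse for exact stochastic simulation of such cellular processes is Gillespie's direct method, shown here for a toy birth-death process of cell counts. This is a textbook algorithm, not necessarily the new algorithm the paper introduces.

      import random

      random.seed(11)

      def gillespie_birth_death(n0=50, birth=1.0, death=0.02, t_end=100.0):
          """Gillespie direct method for X -> X+1 (constant rate `birth`)
          and X -> X-1 (per-capita rate `death * X`)."""
          t, n, traj = 0.0, n0, [(0.0, n0)]
          while t < t_end:
              rates = (birth, death * n)
              total = sum(rates)
              if total == 0.0:
                  break
              t += random.expovariate(total)          # time to the next event
              if random.random() < rates[0] / total:  # choose which event fires
                  n += 1
              else:
                  n -= 1
              traj.append((t, n))
          return traj

      traj = gillespie_birth_death()
      print(f"events: {len(traj) - 1}, final count: {traj[-1][1]} (mean ~ {1.0 / 0.02:.0f})")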

  1. Refining the quantitative pathway of the Pathways to Mathematics model.

    Science.gov (United States)

    Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda

    2015-03-01

    In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  3. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  4. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study.

  5. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...... of the boiler is (with an acceptable accuracy) proportional with the volume of the boiler. For the dynamic operation capability a cost function penalizing limited dynamic operation capability and vice versa has been defined. The main idea is that by means of the parameters in this function it is possible to fit its...

  6. Computer simulation modeling of abnormal behavior: a program approach.

    Science.gov (United States)

    Reilly, K D; Freese, M R; Rowe, P B

    1984-07-01

    A need for modeling abnormal behavior on a comprehensive, systematic basis exists. Computer modeling and simulation tools offer especially good opportunities to establish such a program of studies. Issues concern deciding which modeling tools to use, how to relate models to behavioral data, what level of modeling to employ, and how to articulate theory to facilitate such modeling. Four levels or types of modeling, two qualitative and two quantitative, are identified. Their properties are examined and interrelated to include illustrative applications to the study of abnormal behavior, with an emphasis on schizophrenia.

  7. Quantitative phase-field modeling of nonisothermal solidification in dilute multicomponent alloys with arbitrary diffusivities.

    Science.gov (United States)

    Ohno, Munekazu

    2012-11-01

A quantitative phase-field model is developed for simulating microstructural pattern formation in nonisothermal solidification in dilute multicomponent alloys with arbitrary thermal and solutal diffusivities. By performing the matched asymptotic analysis, it is shown that the present model with antitrapping current terms reproduces the free-boundary problem of interest in the thin-interface limit. Convergence of the simulation outcome with decreasing interface thickness is demonstrated for nonisothermal free dendritic growth in binary alloys and for isothermal and nonisothermal free dendritic growth in a ternary alloy.

  8. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can perturb a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) than perturbation at non-singular points (placebo control points). This difference diminishes as the number of perturbed points increases, because the limited self-organizing activity is distributed more widely. The model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the results can be highly repeatable if the patients are relatively healthy and young, but are usually mixed if the patients are old, frail and have multiple concurrent disorders, since the number of local optima, or comorbidities, increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
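
    The annealing analogy can be made concrete with the standard algorithm. Below is a minimal, self-contained sketch (Python) in which the objective, the neighborhood move and the cooling schedule are illustrative choices, not part of the paper's model:

        import math
        import random

        def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.95, steps=1000):
            """Generic simulated annealing: a higher initial 'temperature' t0
            (analogous to the stronger excitation at a singular point) permits
            larger uphill moves and a better chance of escaping poor local optima."""
            x, e = x0, energy(x0)
            t = t0
            for _ in range(steps):
                cand = neighbor(x)
                e_cand = energy(cand)
                # Accept downhill moves always; uphill moves with Boltzmann probability.
                if e_cand < e or random.random() < math.exp((e - e_cand) / t):
                    x, e = cand, e_cand
                t *= cooling  # geometric cooling schedule (an assumption)
            return x, e

        # Toy objective with many local minima, standing in for "disorder severity".
        best, val = simulated_annealing(
            energy=lambda x: x**2 + 10 * math.sin(3 * x),
            neighbor=lambda x: x + random.uniform(-0.5, 0.5),
            x0=5.0,
        )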

  9. Simulation of Potential Production and Optimum Population Quantitative Indices for the Second Hybrid Rice

    Institute of Scientific and Technical Information of China (English)

    YAN Li-jiao; YAO Zhong; ZHENG Zhi-ming; LI Hua-bin

    2006-01-01

    The article establishes the HDRICE model by modifying the structure of the ORYZA1 model and revising its parameters with field experiments. The HDRICE model consists of modules for the morphological development of rice, daily dry matter accumulation and partitioning, daily CO2 assimilation of the canopy, leaf area, and tiller development. The model simulates the dynamics of rice development well because it thoroughly integrates the effects of temperature and light on the rates of rice development, photosynthesis, respiration, and other ecophysiological processes. In the test experiment, the attainable grain yield given by the model showed that the potential yield of cultivar Xieyou 46 ranged from 11 to 13 t ha⁻¹. The model was also used to optimize the combination of transplanting date, seedling age and density for cultivar Xieyou 46 in the Jinhua area, together with the population quantitative indices needed to attain the potential yield, such as maximum stems, effective panicles, and filled grain number per leaf area. The results showed that a transplanting date of July 25, a seedling age of 35 days and a base seedling density of 1.33 × 10⁶ ha⁻¹ is the optimum combination for second hybrid rice production in Jinhua County, China. The corresponding maximum stems, effective panicles, filled grains per panicle, peak optimum LAI, LAI in the later filling stage, and filled grain number per leaf were 6.03 × 10⁶ ha⁻¹, 3.99 × 10⁶ ha⁻¹, 119.2, 8.59, 5-6, and 0.64, respectively.

  10. Uterine Contraction Modeling and Simulation

    Science.gov (United States)

    Liu, Miao; Belfore, Lee A.; Shen, Yuzhong; Scerbo, Mark W.

    2010-01-01

    Building a training system for medical personnel to properly interpret fetal heart rate tracings requires developing accurate models that can relate various signal patterns to certain pathologies. In addition to modeling the fetal heart rate signal itself, the change in uterine pressure, which bears a strong relation to fetal heart rate and provides indications of maternal and fetal status, should also be considered. In this work, we have developed a group of parametric models to simulate uterine contractions during labor and delivery. Through analysis of real patient records, we propose to model uterine contraction signals by three major components: regular contractions, impulsive noise caused by fetal movements, and low-amplitude noise invoked by maternal breathing and the measuring apparatus. The regular contractions are modeled by an asymmetric generalized Gaussian function, and least squares estimation is used to compute the parameter values of this function from the uterine contractions of real patients. Regular contractions are detected by thresholding and derivative analysis of the uterine contraction signal. Impulsive noise caused by fetal movements and low-amplitude noise from maternal breathing and the measuring apparatus are modeled by rational polynomial functions and Perlin noise, respectively. Experimental results show that the synthesized uterine contractions can mimic real uterine contractions realistically, demonstrating the effectiveness of the proposed algorithm.
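
    The regular-contraction component lends itself to a short worked example. The following sketch (Python, NumPy/SciPy) fits a plausible asymmetric generalized Gaussian to synthetic data by least squares; the exact parameterization used by the authors is not reproduced here:

        import numpy as np
        from scipy.optimize import curve_fit

        def asym_gen_gaussian(t, a, t0, s1, s2, p1, p2):
            """Asymmetric generalized Gaussian: separate width (s) and shape (p)
            parameters on each side of the peak t0. A plausible form; the paper's
            exact parameterization may differ."""
            t = np.asarray(t, dtype=float)
            left = a * np.exp(-np.abs((t - t0) / s1) ** p1)
            right = a * np.exp(-np.abs((t - t0) / s2) ** p2)
            return np.where(t < t0, left, right)

        # Synthetic "contraction" with noise, then least-squares parameter estimation.
        t = np.linspace(0, 120, 400)
        y = asym_gen_gaussian(t, 60, 50, 12, 20, 2.0, 1.5) + np.random.normal(0, 1, t.size)
        popt, _ = curve_fit(asym_gen_gaussian, t, y, p0=[50, 45, 10, 15, 2, 2])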

  11. Applications of Joint Tactical Simulation Modeling

    Science.gov (United States)

    1997-12-01

    Naval Postgraduate School thesis (Monterey, California) by Steve VanLandingham, Lieutenant, United States Navy, December 1997. Approved for public release; distribution is unlimited.

  12. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013.

  13. Human judgment vs. quantitative models for the management of ecological resources.

    Science.gov (United States)

    Holden, Matthew H; Ellner, Stephen P

    2016-07-01

    Despite major advances in quantitative approaches to natural resource management, there has been resistance to using these tools in the actual practice of managing ecological populations. Given a managed system and a set of assumptions, translated into a model, optimization methods can be used to solve for the most cost-effective management actions. However, when the underlying assumptions are not met, such methods can potentially lead to decisions that harm the environment and economy. Managers who develop decisions based on past experience and judgment, without the aid of mathematical models, can potentially learn about the system and develop flexible management strategies. However, these strategies are often based on subjective criteria and equally invalid and often unstated assumptions. Given the drawbacks of both methods, it is unclear whether simple quantitative models improve environmental decision making over expert opinion. In this study, we explore how well students, using their experience and judgment, manage simulated fishery populations in an online computer game and compare their management outcomes to the performance of model-based decisions. We consider harvest decisions generated using four different quantitative models: (1) the model used to produce the simulated population dynamics observed in the game, with the values of all parameters known (as a control), (2) the same model, but with unknown parameter values that must be estimated during the game from observed data, (3) models that are structurally different from those used to simulate the population dynamics, and (4) a model that ignores age structure. Humans on average performed much worse than the models in cases 1-3, but in a small minority of scenarios, models produced worse outcomes than those resulting from students making decisions based on experience and judgment. When the models ignored age structure, they generated poorly performing management decisions, but still outperformed

  14. Quantitative modelling in design and operation of food supply systems

    NARCIS (Netherlands)

    Beek, van P.

    2004-01-01

    During the last two decades, food supply systems have attracted interest not only from food technologists but also from the field of Operations Research and Management Science. Operations Research (OR) is concerned with quantitative modelling and can be used to gain insight into the optimal configuration and operation of such systems.

  15. SWEEPOP a simulation model for Target Simulation Mode minesweeping

    NARCIS (Netherlands)

    Keus, H.E.; Beckers, A.L.D.; Cleophas, P.L.H.

    2005-01-01

    SWEEPOP is a flexible model that simulates the physical interaction between objects in a maritime underwater environment. The model was built to analyse the deployment and the performance of a Target Simulation Mode (TSM) minesweeping system for the Royal Netherlands Navy (RNLN) and to support its p

  16. Quantitative photoacoustic tomography using forward and adjoint Monte Carlo models of radiance

    CERN Document Server

    Hochuli, Roman; Arridge, Simon; Cox, Ben

    2016-01-01

    Forward and adjoint Monte Carlo (MC) models of radiance are proposed for use in model-based quantitative photoacoustic tomography. A 2D radiance MC model using a harmonic angular basis is introduced and validated against analytic solutions for the radiance in heterogeneous media. A gradient-based optimisation scheme is then used to recover 2D distributions of the absorption and scattering coefficients from simulated photoacoustic measurements. It is shown that the functional gradients, which are a challenge to compute efficiently using MC models, can be calculated directly from the coefficients of the harmonic angular basis used in the forward and adjoint models. This work establishes a framework for transport-based quantitative photoacoustic tomography that can fully exploit emerging highly parallel computing architectures.

  17. Modeling and simulation of cascading contingencies

    Science.gov (United States)

    Zhang, Jianfeng

    This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
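
    The calibration step described above, modeling trip probability as a function of power flow and flow changes via logistic regression, can be sketched generically (Python, scikit-learn; the features and data are hypothetical, not the industry data used in the dissertation):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical training data: each row is a transmission line after an
        # initial outage; features are per-unit power flow and the change in flow.
        X = np.array([[0.6, 0.05], [0.9, 0.30], [1.1, 0.45], [0.7, 0.10],
                      [1.2, 0.50], [0.5, 0.02], [1.0, 0.35], [0.8, 0.20]])
        y = np.array([0, 1, 1, 0, 1, 0, 1, 0])  # 1 = line tripped (cascade step)

        model = LogisticRegression().fit(X, y)
        # Probability that a line loaded at 0.95 p.u. with a 0.25 p.u. jump trips next.
        p_trip = model.predict_proba([[0.95, 0.25]])[0, 1]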

  18. Machine Learning and Cosmological Simulations I: Semi-Analytical Models

    OpenAIRE

    Kamdar, Harshil M.; Turk, Matthew J.; Brunner, Robert J.

    2015-01-01

    We present a new exploratory framework to model galaxy formation and evolution in a hierarchical universe by using machine learning (ML). Our motivations are two-fold: (1) presenting a new, promising technique to study galaxy formation, and (2) quantitatively analyzing the extent of the influence of dark matter halo properties on galaxies in the backdrop of semi-analytical models (SAMs). We use the influential Millennium Simulation and the corresponding Munich SAM to train and test various so...

  19. Predicting dislocation climb: Classical modeling versus atomistic simulations

    OpenAIRE

    Clouet, Emmanuel

    2011-01-01

    The classical modeling of dislocation climb based on a continuous description of vacancy diffusion is compared to recent atomistic simulations of dislocation climb in body-centered cubic iron under vacancy supersaturation [Phys. Rev. Lett. 105, 095501 (2010)]. A quantitative agreement is obtained, showing the ability of the classical approach to describe dislocation climb. The analytical model is then used to extrapolate dislocation climb velocities to lower dislocation...

  20. Validation of Compton Scattering Monte Carlo Simulation Models

    CERN Document Server

    Weidenspointner, Georg; Hauf, Steffen; Hoff, Gabriela; Kuster, Markus; Pia, Maria Grazia; Saracco, Paolo

    2014-01-01

    Several models for the Monte Carlo simulation of Compton scattering on electrons are quantitatively evaluated with respect to a large collection of experimental data retrieved from the literature. Some of these models are currently implemented in general purpose Monte Carlo systems; some have been implemented and evaluated for possible use in Monte Carlo particle transport for the first time in this study. Here we present first and preliminary results concerning total and differential Compton scattering cross sections.

  1. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    Science.gov (United States)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

    The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high-speed modeling environments, which allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special-purpose accelerators capable of high-speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment uses an inverse method for parameter tuning. The inverse method reduces the amount of parameter space that must be searched to produce a parameter set that describes the system well. Goodness of fit is determined by comparison to reference storm responses. To obtain reference storm responses, we collected data from a series of data-loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled responses to the reference responses, the model parameters can be tuned to quantitatively characterize the geometry, and thus the response, of the karst system.
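
    The inverse workflow the abstract describes, tuning parameters until simulated responses match reference storm responses, reduces to an optimization loop. A schematic sketch (Python, SciPy) follows; the stand-in model, parameter names and misfit metric are assumptions, not the GPGPU code:

        import numpy as np
        from scipy.optimize import minimize

        def simulate_storm_response(params, forcing):
            """Stand-in for the GPGPU finite-difference conduit model: returns a
            simulated depth time series for hypothetical geometry parameters."""
            diameter, roughness = params
            k = roughness / diameter
            # Toy response: damped, lagged version of the recharge forcing.
            out = np.zeros_like(forcing)
            for i in range(1, len(forcing)):
                out[i] = out[i - 1] + k * (forcing[i] - out[i - 1])
            return out

        forcing = np.clip(np.sin(np.linspace(0, 6, 200)), 0, None)   # synthetic storm
        observed = simulate_storm_response((1.2, 0.4), forcing)       # "reference" data

        misfit = lambda p: np.sum((simulate_storm_response(p, forcing) - observed) ** 2)
        result = minimize(misfit, x0=[1.0, 0.5], bounds=[(0.1, 5.0), (0.01, 1.0)])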

  2. Techniques and Simulation Models in Risk Management

    OpenAIRE

    Mirela GHEORGHE

    2012-01-01

    In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then continues in the setting of the practical reality, thus providing simulation models for a broad range of inherent risks specific to any organization and simulation of those models, using the informatics instrument @Risk (Palisade). The reason behind this research lies in the need for simulation models that will allow the person in charge with decision taking i...

  3. Reservoir Stochastic Modeling Constrained by Quantitative Geological Conceptual Patterns

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper discusses the principles of geologic constraints on reservoir stochastic modeling. By using the system science theory, two kinds of uncertainties, including random uncertainty and fuzzy uncertainty, are recognized. In order to improve the precision of stochastic modeling and reduce the uncertainty in realization, the fuzzy uncertainty should be stressed, and the "geological genesis-controlled modeling" is conducted under the guidance of a quantitative geological pattern. An example of the Pingqiao horizontal-well division of the Ansai Oilfield in the Ordos Basin is taken to expound the method of stochastic modeling.

  4. Lessons Learned from Quantitative Dynamical Modeling in Systems Biology

    Science.gov (United States)

    Bachmann, Julie; Matteson, Andrew; Schelke, Max; Kaschek, Daniel; Hug, Sabine; Kreutz, Clemens; Harms, Brian D.; Theis, Fabian J.; Klingmüller, Ursula; Timmer, Jens

    2013-01-01

    Due to the high complexity of biological data, it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and an increasing amount of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling, using two previously established examples for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach, data generated by different measurement techniques, and even single replicates, can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations, in combination with a multi-start strategy based on Latin hypercube sampling, outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here. PMID:24098642

  5. Lessons learned from quantitative dynamical modeling in systems biology.

    Directory of Open Access Journals (Sweden)

    Andreas Raue

    Full Text Available Due to the high complexity of biological data, it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and an increasing amount of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling, using two previously established examples for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach, data generated by different measurement techniques, and even single replicates, can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations, in combination with a multi-start strategy based on Latin hypercube sampling, outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here.
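
    The winning strategy described here, derivative-based local optimization launched from many Latin hypercube starting points, can be sketched generically (Python; SciPy >= 1.7 assumed for the qmc module; the objective is a stand-in, not one of the paper's models):

        import numpy as np
        from scipy.stats import qmc
        from scipy.optimize import minimize

        def objective(theta):
            """Placeholder least-squares objective; in practice this would be the
            negative log-likelihood of an ODE model against time-resolved data."""
            return np.sum((theta - np.array([1.0, -2.0, 0.5])) ** 2)

        bounds = np.array([[-5.0, 5.0]] * 3)
        sampler = qmc.LatinHypercube(d=3, seed=0)
        starts = qmc.scale(sampler.random(n=20), bounds[:, 0], bounds[:, 1])

        # Multi-start local optimization; keep the best of all runs.
        fits = [minimize(objective, x0, method="L-BFGS-B", bounds=bounds) for x0 in starts]
        best = min(fits, key=lambda r: r.fun)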

  6. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  7. COMPARISON OF RF CAVITY TRANSPORT MODELS FOR BBU SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Ilkyoung Shin, Byung Yunn, Todd Satogata, Shahid Ahmed

    2011-03-01

    The transverse focusing effect in RF cavities plays a considerable role in beam dynamics for low-energy beamline sections and can contribute to beam breakup (BBU) instability. The purpose of this analysis is to examine the RF cavity models in simulation codes that will be used for BBU experiments at Jefferson Lab and to improve BBU simulation results. We review two RF cavity models in the simulation codes elegant and TDBBU (a BBU simulation code developed at Jefferson Lab). elegant can include the Rosenzweig-Serafini (R-S) model for the RF focusing effect, whereas TDBBU uses a model from the code TRANSPORT that accounts for adiabatic damping but not for RF focusing. Quantitative comparisons are discussed for the CEBAF beamline. We also compare the R-S model with the results of numerical simulations for a CEBAF-type 5-cell superconducting cavity to validate the use of the R-S model as an improved low-energy RF cavity transport model in TDBBU. We have implemented the R-S model in TDBBU; this brings BBU simulation results into closer agreement with analytic calculations and experimental results.

  8. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    Science.gov (United States)

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway speed limits and signal locations. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  9. A quantitative comparison of Calvin-Benson cycle models.

    Science.gov (United States)

    Arnold, Anne; Nikoloski, Zoran

    2011-12-01

    The Calvin-Benson cycle (CBC) provides the precursors for biomass synthesis necessary for plant growth. The dynamic behavior and yield of the CBC depend on the environmental conditions and regulation of the cellular state. Accurate quantitative models hold the promise of identifying the key determinants of the tightly regulated CBC function and their effects on the responses in future climates. We provide an integrative analysis of the largest compendium of existing models for photosynthetic processes. Based on the proposed ranking, our framework facilitates the discovery of best-performing models with regard to metabolomics data and of candidates for metabolic engineering.

  10. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  11. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  12. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger

    OpenAIRE

    Moray, Neville; Groeger, John; Stanton, Neville

    2016-01-01

    This paper shows how to combine field observations, experimental data, and mathematical modeling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example we consider a major railway accident. In 1999 a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, "black box" data, and accident and engineering reports, to construct a case history of the accident. We show how t...

  13. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    Science.gov (United States)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-04-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml min⁻¹ g⁻¹; cardiac output = 3, 5, 8 L min⁻¹). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features, including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. This suggests that
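
    Of the models compared, the two-compartment exchange model is the most widely used. The sketch below (Python, SciPy) shows generic two-compartment kinetics driven by an arterial input function; the functional form is a standard textbook construction and the parameter values are illustrative, not those of the study:

        import numpy as np
        from scipy.integrate import odeint

        def two_compartment(C, t, Fp, PS, vp, ve, aif):
            """Generic two-compartment exchange kinetics: plasma (Cp) and
            extravascular (Ce) tracer concentrations driven by an arterial
            input function aif(t). Parameters are illustrative assumptions."""
            Cp, Ce = C
            dCp = (Fp * (aif(t) - Cp) + PS * (Ce - Cp)) / vp
            dCe = PS * (Cp - Ce) / ve
            return [dCp, dCe]

        aif = lambda t: np.exp(-((t - 15.0) / 5.0) ** 2)        # synthetic bolus
        t = np.linspace(0, 60, 240)
        C = odeint(two_compartment, [0.0, 0.0], t,
                   args=(1.5 / 60, 0.8 / 60, 0.08, 0.25, aif))   # flow in ml/(ml*s)
        tissue_curve = 0.08 * C[:, 0] + 0.25 * C[:, 1]           # what CT measures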

  14. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, environmental magnetic fields, and measurement noise make MMM data dispersive and difficult to evaluate quantitatively. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. It was found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K vs was investigated, showing that K vs obeys a Gaussian distribution. K vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and that the maximal error between the predicted reliability degree R 1 and the verified reliability degree R 2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
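
    The stress-strength interference idea underlying the reliability model is a textbook construction: if strength S and stress L are independent Gaussian random variables, the reliability degree is

        R = P(S > L) = \Phi\left( \frac{\mu_S - \mu_L}{\sqrt{\sigma_S^2 + \sigma_L^2}} \right),

    where \Phi is the standard normal CDF. The paper's improved version presumably modifies this baseline; it is shown here only for orientation.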

  15. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbances and human perturbations act upon ecosystems by changing the dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitudes. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interaction. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitudes explicit in a form usable for qualitative analysis is described in this paper, and it takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to its qualitative counterpart. Analyzed with the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary, and we must be careful in its application.

  16. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  17. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  18. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  19. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone app has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real-time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, the induced superconducting currents, and a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry, are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  20. A Quantitative Dynamic Simulation of Bremia lactucae Airborne Conidia Concentration above a Lettuce Canopy.

    Directory of Open Access Journals (Sweden)

    Mamadou Lamine Fall

    Full Text Available Lettuce downy mildew, caused by the oomycete Bremia lactucae Regel, is a major threat to lettuce production worldwide. Lettuce downy mildew is a polycyclic disease driven by airborne spores. A weather-based dynamic simulation model for B. lactucae airborne spores was developed to simulate the aerobiological characteristics of the pathogen. The model was built on the STELLA platform following the system dynamics methodology. It was developed using published equations describing disease subprocesses (e.g., sporulation) and assembled knowledge of the interactions among pathogen, host, and weather. The model was evaluated with four years of independent data by comparing model simulations with observations of hourly and daily airborne spore concentrations. The results show an accurate simulation of the trend and shape of the temporal dynamics of B. lactucae airborne spore concentration, including hourly and daily peaks. In more than 95% of the simulation runs, the daily simulated airborne conidia concentration was 0 when airborne conidia were not observed. Also, the relationship between the simulated and the observed airborne spores was linear. In more than 94% of the simulation runs, the proportion of the linear variation in the hourly observed values explained by the variation in the hourly simulated values was greater than 0.7 in all years except one. Most of the errors came from deviation from the 1:1 line, and the proportion of errors due to model bias was low. This model is the only dynamic model developed to mimic the dynamics of airborne inoculum and represents an initial step towards improved understanding, forecasting and management of lettuce downy mildew.

  1. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  2. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system, it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. It is also necessary that the description methods used do not require profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are in turn represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial-strength case study.

  3. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...

  4. Nonsmooth Modeling and Simulation for Switched Circuits

    CERN Document Server

    Acary, Vincent; Brogliato, Bernard

    2011-01-01

    "Nonsmooth Modeling and Simulation for Switched Circuits" concerns the modeling and the numerical simulation of switched circuits with the nonsmooth dynamical systems (NSDS) approach, using piecewise-linear and multivalued models of electronic devices like diodes, transistors, switches. Numerous examples (ranging from introductory academic circuits to various types of power converters) are analyzed and many simulation results obtained with the INRIA open-source SICONOS software package are presented. Comparisons with SPICE and hybrid methods demonstrate the power of the NSDS approach

  5. Juno model rheometry and simulation

    Science.gov (United States)

    Sampl, Manfred; Macher, Wolfgang; Oswald, Thomas; Plettemeier, Dirk; Rucker, Helmut O.; Kurth, William S.

    2016-10-01

    The experiment Waves aboard the Juno spacecraft, which will arrive at its target planet Jupiter in 2016, was devised to study the plasma and radio waves of the Jovian magnetosphere. We analyzed the Waves antennas, which consist of two nonparallel monopoles operated as a dipole. For this investigation we applied two independent methods: the experimental technique, rheometry, which is based on a downscaled model of the spacecraft to measure the antenna properties in an electrolytic tank and numerical simulations, based on commercial computer codes, from which the quantities of interest (antenna impedances and effective length vectors) are calculated. In this article we focus on the results for the low-frequency range up to about 4 MHz, where the antenna system is in the quasi-static regime. Our findings show that there is a significant deviation of the effective length vectors from the physical monopole directions, caused by the presence of the conducting spacecraft body. The effective axes of the antenna monopoles are offset from the mechanical axes by more than 30°, and effective lengths show a reduction to about 60% of the antenna rod lengths. The antennas' mutual capacitances are small compared to the self-capacitances, and the latter are almost the same for the two monopoles. The overall performance of the antennas in dipole configuration is very stable throughout the frequency range up to about 4-5 MHz and therefore can be regarded as the upper frequency bound below which the presented quasi-static results are applicable.

  6. Quantitative magnetospheric models derived from spacecraft magnetometer data

    Science.gov (United States)

    Mead, G. D.; Fairfield, D. H.

    1973-01-01

    Quantitative models of the external magnetospheric field were derived by making least-squares fits to magnetic field measurements from four IMP satellites. The data were fit to a power series expansion in the solar magnetic coordinates and the solar wind-dipole tilt angle, and thus the models contain the effects of seasonal north-south asymmetries. The expansions are divergence-free, but unlike the usual scalar potential expansions, the models contain a nonzero curl representing currents distributed within the magnetosphere. Characteristics of four models are presented, representing different degrees of magnetic disturbance as determined by the range of Kp values. The latitude at the earth separating open polar cap field lines from field lines closing on the dayside is about 5 deg lower than that determined by previous theoretically-derived models. At times of high Kp, additional high latitude field lines are drawn back into the tail.

  7. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  8. Multi-scale modelling and simulation in systems biology.

    Science.gov (United States)

    Dada, Joseph O; Mendes, Pedro

    2011-02-01

    The aim of systems biology is to describe and understand biology at a global scale, where biological functions are recognised as the result of complex mechanisms that operate at several scales, from the molecular to the ecosystem. Modelling and simulation are computational tools that are invaluable for describing, predicting and understanding these mechanisms in a quantitative and integrative way. The study of biological functions is therefore greatly aided by multi-scale methods that enable the coupling and simulation of models spanning several spatial and temporal scales. Various methods have been developed for solving multi-scale problems in many scientific disciplines. They are applicable to continuum-based modelling techniques, in which the relationships between system properties are expressed with continuous mathematical equations, and to discrete modelling techniques, which are based on individual units used to model heterogeneous microscopic elements such as individuals or cells. In this review, we survey these multi-scale methods and explore their application in systems biology.

  9. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  10. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  11. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  12. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single mat...

  13. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment, a success of modern biotechnology, has been used effectively in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be administered continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally available in Bulgaria, but after some time it was interrupted for 1-2 months, and the doses patients received were not optimal. The aim of our work is to find a mathematical model for quantitative evaluation of the enzyme replacement treatment of Gaucher disease. The model uses the software package "Statistika 6", taking as input the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The model output makes it possible to evaluate quantitatively the individual trends in the development of each child's disease and their correlation. On the basis of these results, we can recommend suitable changes in the enzyme replacement therapy.

  14. Quantitative modeling and data analysis of SELEX experiments

    Science.gov (United States)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure.
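
    In the biophysical picture used here, a common way to express per-round selection at fixed chemical potential mu is a Fermi-Dirac form; the notation below is an illustrative reconstruction, not copied from the paper. For a sequence s with binding energy E(s),

        p_{\mathrm{select}}(s) = \frac{1}{1 + e^{\beta (E(s) - \mu)}},

    so that if \mu is held fixed across rounds, as the modified experiment prescribes, the per-round selection factor is identical and the enrichment of s after n rounds scales as p_select(s)^n, which is what makes robust inference of the interaction parameters possible.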

  17. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  18. A Team Mental Model Perspective of Pre-Quantitative Risk

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  19. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    Science.gov (United States)

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels characterized by such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels, because the existing QRA models for road tunnels are inapplicable to assessing the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogeneous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. The article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
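
    The section-based aggregation can be illustrated with a short, hypothetical Python sketch; the sections, per-section accident rates (which the paper obtains by Poisson regression) and consequence probabilities below are invented for illustration.

        from dataclasses import dataclass

        @dataclass
        class Section:
            name: str
            length_km: float
            accidents_per_km_year: float   # in the paper, from Poisson regression
            p_fatal_given_accident: float  # event-tree outcome, hypothetical here

        def section_risk(s: Section) -> float:
            """Expected fatal accidents per year on one homogeneous section."""
            return s.length_km * s.accidents_per_km_year * s.p_fatal_given_accident

        sections = [
            Section("covered approach", 0.4, 2.1, 0.004),
            Section("main bore",        1.6, 1.3, 0.007),
            Section("exit ramp",        0.3, 3.0, 0.005),
        ]

        total = sum(section_risk(s) for s in sections)
        for s in sections:
            print(f"{s.name:16s} risk = {section_risk(s):.4f} fatalities/year")
        print(f"integrated tunnel risk = {total:.4f} fatalities/year")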

  20. Quantitative modeling of a gene's expression from its intergenic sequence.

    Directory of Open Access Journals (Sweden)

    Md Abul Hassan Samee

    2014-03-01

    Full Text Available Modeling a gene's expression from its intergenic locus and trans-regulatory context is a fundamental goal in computational biology. Owing to the distributed nature of cis-regulatory information and the poorly understood mechanisms that integrate such information, gene locus modeling is a more challenging task than modeling individual enhancers. Here we report the first quantitative model of a gene's expression pattern as a function of its locus. We model the expression readout of a locus in two tiers: (1) combinatorial regulation by transcription factors bound to each enhancer is predicted by a thermodynamics-based model, and (2) independent contributions from multiple enhancers are linearly combined to fit the gene expression pattern. The model does not require any prior knowledge about enhancers contributing toward a gene's expression. We demonstrate that the model captures the complex multi-domain expression patterns of anterior-posterior patterning genes in the early Drosophila embryo. Altogether, we model the expression patterns of 27 genes; these include several gap genes, pair-rule genes, and anterior, posterior, trunk, and terminal genes. We find that the model-selected enhancers for each gene overlap strongly with its experimentally characterized enhancers. Our findings also suggest the presence of sequence-segments in the locus that would contribute ectopic expression patterns and hence were "shut down" by the model. We applied our model to identify the transcription factors responsible for forming the stripe boundaries of the studied genes. The resulting network of regulatory interactions exhibits a high level of agreement with known regulatory influences on the target genes. Finally, we analyzed whether and why our assumption of enhancer independence was necessary for the genes we studied. We found a deterioration of expression when binding sites in one enhancer were allowed to influence the readout of another enhancer. Thus, interference
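
    A toy Python sketch of the two-tier structure described above (not the published model): tier 1 maps transcription factor levels to a per-enhancer readout through a Boltzmann-like saturating function, and tier 2 fits a linear combination of enhancer readouts to the expression pattern. All TF inputs, affinities, weights and the "observed" pattern are synthetic.

        import numpy as np

        rng = np.random.default_rng(8)
        n_positions, n_enhancers, n_tfs = 100, 3, 4

        tf = rng.random((n_positions, n_tfs))          # TF levels along the A-P axis
        affinity = rng.normal(0.0, 1.0, (n_tfs, n_enhancers))

        def enhancer_readout(tf, affinity):
            """Tier 1: Boltzmann-like weight of the 'bound/active' state."""
            w = np.exp(tf @ affinity)                  # statistical weight
            return w / (1.0 + w)                       # probability of activation

        E = enhancer_readout(tf, affinity)             # positions x enhancers

        # synthetic "observed" pattern from hidden weights; one enhancer is inert
        true_w = np.array([0.6, 0.0, 1.2])
        observed = E @ true_w + rng.normal(0.0, 0.02, n_positions)

        # tier 2: least-squares linear combination of enhancer readouts
        w_fit, *_ = np.linalg.lstsq(E, observed, rcond=None)
        print("recovered enhancer weights:", np.round(w_fit, 2))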

  1. Multicomponent ballistic transport in narrow single wall carbon nanotubes: Analytic model and molecular dynamics simulations

    Science.gov (United States)

    Mutat, T.; Adler, J.; Sheintuch, M.

    2011-01-01

    The transport of gas mixtures through molecular-sieve membranes such as narrow nanotubes has many potential applications, but there remain open questions and a paucity of quantitative predictions. Our model, based on extensive molecular dynamics simulations, proposes that ballistic motion, hindered by counter diffusion, is the dominant mechanism. Our simulations of transport of mixtures of molecules between control volumes at both ends of nanotubes give quantitative support to the model's predictions. The combination of simulation and model enable extrapolation to longer tubes and pore networks.

  2. Modeling Error in Quantitative Macro-Comparative Research

    Directory of Open Access Journals (Sweden)

    Salvatore J. Babones

    2015-08-01

    Full Text Available Much quantitative macro-comparative research (QMCR) relies on a common set of published data sources to answer similar research questions using a limited number of statistical tools. Since all researchers have access to much the same data, one might expect quick convergence of opinion on most topics. In reality, of course, differences of opinion abound and persist. Many of these differences can be traced, implicitly or explicitly, to the different ways researchers choose to model error in their analyses. Much careful attention has been paid in the political science literature to the error structures characteristic of time-series cross-sectional (TSCS) data, but much less attention has been paid to the modeling of error in broadly cross-national research involving large panels of countries observed at limited numbers of time points. Here, and especially in the sociology literature, multilevel modeling has become a hegemonic, but often poorly understood, research tool. I argue that widely used types of multilevel models, commonly known as fixed effects models (FEMs) and random effects models (REMs), can produce wildly spurious results when applied to trended data due to mis-specification of error. I suggest that in most commonly encountered scenarios, difference models are more appropriate for use in QMCR.
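
    The core point about trended data can be reproduced in a few lines of Python: two panels that share only a secular trend look strongly related in a within-country levels regression, while a difference model comes out near zero. This is an illustration of the argument, not the author's analysis; all data are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        n_countries, n_years = 50, 30
        trend = np.linspace(0.0, 3.0, n_years)            # shared secular trend

        # x and y share the trend but are otherwise unrelated random walks
        x = trend + np.cumsum(rng.normal(0, 0.3, (n_countries, n_years)), axis=1)
        y = trend + np.cumsum(rng.normal(0, 0.3, (n_countries, n_years)), axis=1)

        def slope(u, v):
            """OLS slope of v on u after removing the means."""
            ud, vd = u - u.mean(), v - v.mean()
            return (ud * vd).sum() / (ud * ud).sum()

        # within (fixed-effects style) estimator: demean by country, pool levels
        xw = x - x.mean(axis=1, keepdims=True)
        yw = y - y.mean(axis=1, keepdims=True)
        print(f"within-country levels slope: {slope(xw, yw):+.2f}")  # inflated by the trend

        # difference model: regress year-to-year changes instead
        print(f"first-difference slope:      {slope(np.diff(x), np.diff(y)):+.2f}")  # near 0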

  3. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  4. Simulation model of metallurgical production management

    Directory of Open Access Journals (Sweden)

    P. Šnapka

    2013-07-01

    Full Text Available This article focuses on the problems of intensifying the metallurgical production process. The aim is to explain a simulation model that represents a metallurgical production management system adapted to new requirements. Knowledge of the dynamic behavior and features of the metallurgical production system and its management is needed to create this model. The characteristics that determine the dynamics of the metallurgical production process are described. The simulation model is structured as functional blocks and their linkages with regard to the organizational and temporal hierarchy of their actions. The creation of the presented simulation model is based on theoretical findings from regulation theory, hierarchical systems and optimization.

  5. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    Science.gov (United States)

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
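
    A heavily simplified Python sketch in the spirit of such a framework (not the published model): resistance is a polygenic score z, survival under the herbicide increases with z, and offspring follow the infinitesimal model. All parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)

        pop = rng.normal(0.0, 1.0, size=5_000)   # initial polygenic resistance scores
        seg_var = 0.5                            # segregation variance (hypothetical)
        years, fecundity, capacity = 10, 3, 5_000

        def survival(z):
            """Probability of surviving the herbicide, increasing in resistance z."""
            return 1.0 / (1.0 + np.exp(-(z - 2.0)))

        for year in range(years):
            survivors = pop[rng.random(pop.size) < survival(pop)]
            if survivors.size == 0:
                print(f"year {year + 1}: population eradicated")
                break
            # infinitesimal model: offspring = midparent value + segregation noise
            mothers = rng.choice(survivors, size=min(capacity, survivors.size * fecundity))
            fathers = rng.choice(survivors, size=mothers.size)
            pop = 0.5 * (mothers + fathers) + rng.normal(0.0, seg_var ** 0.5, mothers.size)
            print(f"year {year + 1}: {survivors.size} survivors, "
                  f"mean resistance {pop.mean():+.2f}")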

  6. Development of models and methods for the molecular simulation of large systems and molecules

    CERN Document Server

    Walter, Jonathan; Horsch, Martin; Vrabec, Jadran; Hasse, Hans

    2010-01-01

    The most important factors for quantitative results in molecular dynamics simulation are well-developed force fields and models. In the present work, the development of new models and the usage of force fields from the literature in large systems are presented. Both tasks lead to time-consuming simulations that require massively parallel high-performance computing. In the present work, new models for carbon dioxide and cyclohexanol are discussed and a new method for model development is introduced. Force fields and models for the simulation of PNIPAAm hydrogel in pure water and in sodium chloride solution are tested, verified and applied to the simulation of nucleation processes.

  7. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
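
    The Monte Carlo core shared by these tools can be illustrated with a small Python sketch of a single-server clinic; the exponential interarrival and service distributions and their means are hypothetical.

        import random

        random.seed(3)

        def simulate_clinic(n_patients, mean_interarrival=10.0, mean_service=8.0):
            """Single-server queue; returns the average patient wait in minutes."""
            clock, server_free_at, total_wait = 0.0, 0.0, 0.0
            for _ in range(n_patients):
                clock += random.expovariate(1.0 / mean_interarrival)  # next arrival
                start = max(clock, server_free_at)                    # wait if busy
                total_wait += start - clock
                server_free_at = start + random.expovariate(1.0 / mean_service)
            return total_wait / n_patients

        waits = sorted(simulate_clinic(1_000) for _ in range(100))
        print(f"mean wait ~ {sum(waits) / len(waits):.1f} min; "
              f"90th percentile of replications ~ {waits[89]:.1f} min")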

  8. Pilot performance evaluation of simulated flight approach and landing manoeuvres using quantitative assessment tools

    Indian Academy of Sciences (India)

    P ARCHANA HEBBAR; ABHAY A PASHILKAR

    2017-03-01

    This research work examines the application of different statistical and empirical analysis methods to quantify pilot performance. A realistic approach and landing flight scenario is executed using the reconfigurable flight simulator at National Aerospace Laboratories, and both subjective and quantitative measures are applied to the pilot performance data. Simulations were repeated for different difficult landing conditions, like landing with degraded visibility, with crosswinds, with degraded aircraft handling qualities and with emergency conditions. Relative assessment of the different applicable metrics is made and the significance of task difficulties on pilot performance is investigated. Changes in the pilot's control strategy with respect to primary and secondary tasks are also discussed in detail. Results indicate that analysing the pilot's control strategy together with his/her deviations from a predetermined flight profile provides a means to quantify pilot performance.

  9. IMPROVED QUANTITATIVE FEEDBACK THEORY TECHNIQUE AND APPLICATION TO THREE-AXIS HYDRAULIC SIMULATOR

    Institute of Scientific and Technical Information of China (English)

    YU Jinying; ZHAO Keding; CAO Jian

    2006-01-01

    In order to meet the tracking performance index of a three-axis hydraulic simulator, an improved technique based on classical quantitative feedback theory (QFT) is used to synthesize a controller of low gain and bandwidth. By choosing a special nominal plant, the improved method apportions the relative magnitude and phase tracking error between the system uncertainty and the nominal control plant. The relative tracking error induced by system uncertainty is transformed into a sensitivity problem, and the relative tracking error induced by the nominal plant forms a region on the Nichols chart. The two constraints together form a combined bound suited to magnitude and phase loop shaping. Because the pre-filter of the classical QFT controller structure is omitted, tracking performance is greatly enhanced. Furthermore, a cascaded two-loop control strategy is proposed to heighten the control effect. The improved technique's efficacy is validated by simulation and experimental results.

  10. Warehouse Simulation Through Model Configuration

    NARCIS (Netherlands)

    Verriet, J.H.; Hamberg, R.; Caarls, J.; Wijngaarden, B. van

    2013-01-01

    The pre-build development of warehouse systems leads from a specific customer request to a specific customer quotation. This involves a process of configuring a warehouse system using a sequence of steps that contain increasingly more details. Simulation is a helpful tool in analyzing warehouse desi

  11. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions and reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single and double stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. Analysis of qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the
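
    A toy Python sketch of the stepwise idea (not either published model): the per-cycle efficiency is taken from an equilibrium of the annealing step, so it falls as primers are consumed, rather than being assumed constant. The binding constant, concentrations and threshold are invented.

        import numpy as np

        def qpcr_curve(target0, primer0=5e-7, k_half=1e-7, cycles=40):
            """Per-cycle target amounts (a.u.) with primer-limited efficiency."""
            target, primer, curve = target0, primer0, []
            for _ in range(cycles):
                eff = primer / (primer + k_half)   # annealing-equilibrium fraction
                new_strands = eff * target
                target += new_strands
                primer = max(primer - new_strands, 0.0)
                curve.append(target)
            return np.array(curve)

        for t0 in (1e-15, 1e-13, 1e-11):
            curve = qpcr_curve(t0)
            ct = np.argmax(curve > 1e-9) + 1       # crude threshold cycle
            print(f"initial target {t0:.0e} a.u. -> Ct ~ {ct}")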

  12. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the process of modeling is introduced...

  13. Simulation of collaborative studies for real-time PCR-based quantitation methods for genetically modified crops.

    Science.gov (United States)

    Watanabe, Satoshi; Sawada, Hiroshi; Naito, Shigehiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi; Hino, Akihiro

    2013-01-01

    To study the impacts of various random effects and parameters of collaborative studies on the precision of quantitation methods for genetically modified (GM) crops, we developed a set of random effects models for cycle time values of a standard curve-based relative real-time PCR that makes use of an endogenous gene sequence as the internal standard. The models and data from a published collaborative study for six GM lines at four concentration levels were used to simulate collaborative studies under various conditions. Results suggested that by reducing the number of well replications from three to two, and the number of standard levels of the endogenous sequence from five to three, the number of unknown samples analyzable on a 96-well PCR plate in routine analyses could be almost doubled, and still the acceptable repeatability RSD (RSDr) … crops by real-time PCR and their collaborative studies.
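
    A minimal Python sketch of the random-effects picture used in such simulation studies: each measured value is the true value perturbed by multiplicative laboratory, run and well effects, from which repeatability and reproducibility RSDs are computed. The layout and variance components below are hypothetical, not those of the study.

        import numpy as np

        rng = np.random.default_rng(4)
        true_gm_pct = 1.0                     # true GM content, %
        n_labs, n_runs, n_wells = 12, 2, 3    # collaborative-study layout
        sd_lab, sd_run, sd_well = 0.10, 0.05, 0.08   # log-scale components

        lab = rng.normal(0, sd_lab, (n_labs, 1, 1))
        run = rng.normal(0, sd_run, (n_labs, n_runs, 1))
        well = rng.normal(0, sd_well, (n_labs, n_runs, n_wells))
        measured = true_gm_pct * np.exp(lab + run + well)

        # repeatability: spread of well replicates within each lab x run cell
        within = measured.std(axis=2, ddof=1) / measured.mean(axis=2)
        # reproducibility: spread of lab means around the grand mean
        lab_means = measured.reshape(n_labs, -1).mean(axis=1)
        rsd_R = lab_means.std(ddof=1) / lab_means.mean()

        print(f"RSDr ~ {100 * within.mean():.1f}%   RSD_R ~ {100 * rsd_R:.1f}%")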

  14. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  15. Quantitative model studies for interfaces in organic electronic devices

    Science.gov (United States)

    Gottfried, J. Michael

    2016-11-01

    In organic light-emitting diodes and similar devices, organic semiconductors are typically contacted by metal electrodes. Because the resulting metal/organic interfaces have a large impact on the performance of these devices, their quantitative understanding is indispensable for the further rational development of organic electronics. A study by Kröger et al (2016 New J. Phys. 18 113022) of an important single-crystal based model interface provides detailed insight into its geometric and electronic structure and delivers valuable benchmark data for computational studies. In view of the differences between typical surface-science model systems and real devices, a ‘materials gap’ is identified that needs to be addressed by future research to make the knowledge obtained from fundamental studies even more beneficial for real-world applications.


  17. A Quantitative Theory Model of a Photobleaching Mechanism

    Institute of Scientific and Technical Information of China (English)

    陈同生; 曾绍群; 周炜; 骆清铭

    2003-01-01

    A photobleaching model, combining D-P (dye-photon interaction) and D-O (dye-oxygen oxidative reaction) photobleaching theory, is proposed. The quantitative power dependences of photobleaching rates under both one- and two-photon excitation (1PE and TPE) are obtained. This photobleaching model can be used to elucidate our own and other experimental results well. Experimental studies of the photobleaching rates for rhodamine B with TPE under unsaturated conditions reveal that the power dependence of the photobleaching rate increases with increasing dye concentration, and that the photobleaching rate of a single molecule increases as the second power of the excitation intensity, which differs from the high-order (>3) nonlinear dependence of ensemble molecules.
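
    The reported power dependences are the exponent n in rate = k·I^n; a short Python sketch shows how such an exponent is typically extracted by a log-log fit, here on synthetic data generated with n = 2 (the single-molecule TPE case above). The intensities, prefactor and noise level are invented.

        import numpy as np

        rng = np.random.default_rng(5)

        intensity = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # excitation power, a.u.
        true_n, true_k = 2.0, 0.01
        rate = true_k * intensity**true_n * rng.lognormal(0.0, 0.05, intensity.size)

        # linear regression in log space: log(rate) = log(k) + n * log(I)
        n_fit, log_k_fit = np.polyfit(np.log(intensity), np.log(rate), 1)
        print(f"fitted power dependence n = {n_fit:.2f} (true {true_n})")
        print(f"fitted prefactor k = {np.exp(log_k_fit):.4f} (true {true_k})")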

  18. Quantum simulation of the t- J model

    Science.gov (United States)

    Yamaguchi, Fumiko; Yamamoto, Yoshihisa

    2002-12-01

    Computer simulation of a many-particle quantum system is bound to reach the inevitable limits of its ability as the system size increases. The primary reason for this is that the memory size used in a classical simulator grows polynomially whereas the Hilbert space of the quantum system does so exponentially. Replacing the classical simulator by a quantum simulator would be an effective method of surmounting this obstacle. The prevailing techniques for simulating quantum systems on a quantum computer have been developed for purposes of computing numerical algorithms designed to obtain approximate physical quantities of interest. The method suggested here requires no numerical algorithms; it is a direct isomorphic translation between a quantum simulator and the quantum system to be simulated. In the quantum simulator, physical parameters of the system, which are the fixed parameters of the simulated quantum system, are under the control of the experimenter. A method of simulating a model for high-temperature superconducting oxides, the t- J model, by optical control, as an example of such a quantum simulation, is presented.

  19. FCC Rolling Textures Reviewed in the Light of Quantitative Comparisons between Simulated and Experimental Textures

    DEFF Research Database (Denmark)

    Wierzbanowski, Krzysztof; Wroński, Marcin; Leffers, Torben

    2014-01-01

    The development of the copper-type texture is best simulated with {111} slip combined with type CL/PR lattice rotation and relatively strong interaction between the grains, but not with the full-constraint Taylor model, nor with the classical relaxed-constraint models. The development of the brass-type texture is best...... investigations. © 2014 Taylor and Francis Group, LLC.

  20. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the process of modeling is introduced...... in detail. The results of simulations developed for different research purposes reveal that different models may be suitable for different purposes; thus the model should be chosen carefully. Some details and tricks in modeling are also introduced, which give a reference for further research.

  1. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has proved to be a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method, which combines the merits of traditional methods such as IDEF0 and Petri Net. A four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers, such as hybrid production control modeling and human resource dispatch modeling, is expounded. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  2. Quantitative comparisons of satellite observations and cloud models

    Science.gov (United States)

    Wang, Fang

    Microwave radiation interacts directly with precipitating particles and can therefore be used to compare microphysical properties found in models with those found in nature. Lower frequencies (minimization procedures but produce different CWP and RWP. The similarity in Tb can be attributed to comparable Total Water Path (TWP) between the two retrievals while the disagreement in the microphysics is caused by their different degrees of constraint of the cloud/rain ratio by the observations. This situation occurs frequently and takes up 46.9% in the one month 1D-Var retrievals examined. To attain better constrained cloud/rain ratios and improved retrieval quality, this study suggests the implementation of higher microwave frequency channels in the 1D-Var algorithm. Cloud Resolving Models (CRMs) offer an important pathway to interpret satellite observations of microphysical properties of storms. High frequency microwave brightness temperatures (Tbs) respond to precipitating-sized ice particles and can, therefore, be compared with simulated Tbs at the same frequencies. By clustering the Tb vectors at these frequencies, the scene can be classified into distinct microphysical regimes, in other words, cloud types. The properties for each cloud type in the simulated scene are compared to those in the observation scene to identify the discrepancies in microphysics within that cloud type. A convective storm over the Amazon observed by the Tropical Rainfall Measuring Mission (TRMM) is simulated using the Regional Atmospheric Modeling System (RAMS) in a semi-ideal setting, and four regimes are defined within the scene using cluster analysis: the 'clear sky/thin cirrus' cluster, the 'cloudy' cluster, the 'stratiform anvil' cluster and the 'convective' cluster. The relationship between Tb difference of 37 and 85 GHz and Tb at 85 GHz is found to contain important information of microphysical properties such as hydrometeor species and size distributions. Cluster

  3. Multiscale Model Approach for Magnetization Dynamics Simulations

    CERN Document Server

    De Lucia, Andrea; Tretiakov, Oleg A; Kläui, Mathias

    2016-01-01

    Simulations of magnetization dynamics in a multiscale environment enable rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample with nanoscopic accuracy in areas where such accuracy is required. We have developed a multiscale magnetization dynamics simulation approach that can be applied to large systems with spin structures that vary locally on small length scales. To implement this, the conventional micromagnetic simulation framework has been expanded to include a multiscale solving routine. The software selectively simulates different regions of a ferromagnetic sample according to the spin structures located within, in order to employ a suitable discretization and use either a micromagnetic or an atomistic model. To demonstrate the validity of the multiscale approach, we simulate the spin wave transmission across the regions simulated with the two different models and different discretizations. We find that the interface between the regions is fully transparent for spin waves with f...

  4. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  5. A Quantitative Model of Keyhole Instability Induced Porosity in Laser Welding of Titanium Alloy

    Science.gov (United States)

    Pang, Shengyong; Chen, Weidong; Wang, Wen

    2014-06-01

    Quantitative prediction of the porosity defects in deep penetration laser welding has generally been considered as a very challenging task. In this study, a quantitative model of porosity defects induced by keyhole instability in partial penetration CO2 laser welding of a titanium alloy is proposed. The three-dimensional keyhole instability, weld pool dynamics, and pore formation are determined by direct numerical simulation, and the results are compared to prior experimental results. It is shown that the simulated keyhole depth fluctuations could represent the variation trends in the number and average size of pores for the studied process conditions. Moreover, it is found that it is possible to use the predicted keyhole depth fluctuations as a quantitative measure of the average size of porosity. The results also suggest that due to the shadowing effect of keyhole wall humps, the rapid cooling of the surface of the keyhole tip before keyhole collapse could lead to a substantial decrease in vapor pressure inside the keyhole tip, which is suggested to be the mechanism by which shielding gas enters into the porosity.

  6. Quantitative Relative Comparison of CFD Simulation Uncertainties for a Transonic Diffuser Problem

    OpenAIRE

    Hosder, Serhat; Grossman, Bernard; Haftka, Raphael T.; Mason, William H.; Watson, Layne T.

    2004-01-01

    Different sources of uncertainty in CFD simulations are illustrated by a detailed study of two-dimensional, turbulent, transonic flow in a converging-diverging channel. Runs were performed with the commercial CFD code GASP using different turbulence models, grid levels, and flux-limiters to see the effect of each on the CFD simulation uncertainties. Two flow conditions were studied by changing the exit pressure ratio: the first is a complex case with a strong shock and a separated flow region...

  7. Quantitative model of the growth of floodplains by vertical accretion

    Science.gov (United States)

    Moody, J.A.; Troutman, B.M.

    2000-01-01

    A simple one-dimensional model is developed to quantitatively predict the change in elevation, over a period of decades, for vertically accreting floodplains. This unsteady model approximates the monotonic growth of a floodplain as an incremental but constant increase of net sediment deposition per flood for those floods of a partial duration series that exceed a threshold discharge corresponding to the elevation of the floodplain. Sediment deposition from each flood increases the elevation of the floodplain and consequently the magnitude of the threshold discharge resulting in a decrease in the number of floods and growth rate of the floodplain. Floodplain growth curves predicted by this model are compared to empirical growth curves based on dendrochronology and to direct field measurements at five floodplain sites. The model was used to predict the value of net sediment deposition per flood which best fits (in a least squares sense) the empirical and field measurements; these values fall within the range of independent estimates of the net sediment deposition per flood based on empirical equations. These empirical equations permit the application of the model to estimate of floodplain growth for other floodplains throughout the world which do not have detailed data of sediment deposition during individual floods. Copyright (C) 2000 John Wiley and Sons, Ltd.
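
    The growth rule lends itself to a compact Python sketch: each overtopping flood deposits a fixed depth, raising the threshold discharge and thinning future overtopping floods. The rating curve and flood statistics below are hypothetical stand-ins for the paper's calibrated values.

        import numpy as np

        rng = np.random.default_rng(6)

        dh = 0.02            # net deposition per overtopping flood, m (constant)
        lam0, q0 = 5.0, 50.0 # partial-duration series: base rate, exponential scale
        a, b = 0.5, 0.4      # hypothetical rating curve  h = a * Q**b

        elevation, history = a * 10.0**b, []
        for year in range(100):
            q_threshold = (elevation / a) ** (1.0 / b)   # stage -> discharge
            n_floods = rng.poisson(lam0 * np.exp(-q_threshold / q0))
            elevation += n_floods * dh
            history.append(elevation)

        print(f"elevation after 10 yr: {history[9]:.2f} m, "
              f"after 100 yr: {history[-1]:.2f} m")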

  8. Quantitative Model for Estimating Soil Erosion Rates Using 137Cs

    Institute of Scientific and Technical Information of China (English)

    YANGHAO; GHANGQING; et al.

    1998-01-01

    A quantitative model was developed to relate the amount of 137Cs loss from the soil profile to the rate of soil erosion. Following the mass balance model, the depth distribution pattern of 137Cs in the soil profile, the radioactive decay of 137Cs, the sampling year and the difference in 137Cs fallout amount among years were taken into consideration. By introducing typical depth distribution functions of 137Cs into the model, detailed equations for the model were obtained for different soils. The model shows that the rate of soil erosion is mainly controlled by the depth distribution pattern of 137Cs, the year of sampling, and the percentage reduction in total 137Cs. The relationship between the rate of soil loss and 137Cs depletion is neither linear nor logarithmic. The depth distribution pattern of 137Cs is a major factor for estimating the rate of soil loss, and the soil erosion rate is directly related to the fraction of the 137Cs content near the soil surface. The influences of the radioactive decay of 137Cs, the sampling year and the 137Cs input fraction are small compared with the others.
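
    For the simplest case such profile-based models cover, an undisturbed soil whose 137Cs depth profile is exponential, C(z) ∝ exp(-z/h0), the eroded depth d follows from 1 - X = exp(-d/h0), where X is the fractional loss of the reference inventory, a nonlinear depth-loss relation. A worked Python example with illustrative parameter values, not the paper's:

        import math

        def erosion_rate(x_loss, h0_cm=5.0, bulk_density=1.3, years=35):
            """Mean erosion rate (t/ha/yr) from fractional 137Cs loss x_loss.

            h0_cm: relaxation depth of the exponential profile (cm)
            bulk_density: soil bulk density (g/cm^3); 1 cm over 1 ha = 100*rho t
            years: time since the main fallout period
            """
            depth_cm = -h0_cm * math.log(1.0 - x_loss)   # total eroded depth
            return depth_cm * bulk_density * 100.0 / years

        for x in (0.1, 0.3, 0.5, 0.7):
            print(f"{100 * x:3.0f}% 137Cs loss -> {erosion_rate(x):6.1f} t/ha/yr")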

  9. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  10. Integration of CFD codes and advanced combustion models for quantitative burnout determination

    Energy Technology Data Exchange (ETDEWEB)

    Javier Pallares; Inmaculada Arauzo; Alan Williams [University of Zaragoza, Zaragoza (Spain). Centre of Research for Energy Resources and Consumption (CIRCE)

    2007-10-15

    CFD codes and advanced kinetics combustion models are extensively used to predict coal burnout in large utility boilers. Modelling approaches based on CFD codes can accurately solve the fluid dynamics equations involved in the problem, but this is usually achieved by including simple combustion models. On the other hand, advanced kinetics combustion models can give a detailed description of the coal combustion behaviour by using a simplified description of the flow field, this usually being obtained from a zone-method approach. Both approximations correctly describe general trends in coal burnout, but fail to predict quantitative values. In this paper a new methodology which takes advantage of both approximations is described. In the first instance, CFD solutions were obtained for the combustion conditions in the furnace of the Lamarmora power plant (ASM Brescia, Italy) for a number of different conditions and for three coals. Then, these furnace conditions were used as inputs for a more detailed chemical combustion model to predict coal burnout. In this, devolatilization was modelled using a commercial macromolecular network pyrolysis model (FG-DVC). For char oxidation an intrinsic reactivity approach, including thermal annealing, ash inhibition and maceral effects, was used. Results from the simulations were compared against plant experimental values, showing a reasonable agreement in trends and quantitative values. 28 refs., 4 figs., 4 tabs.

  11. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model is described, which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.


  13. Simulation of realistic abnormal SPECT brain perfusion images: application in semi-quantitative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ward, T [Department of Medical Physics and Bioengineering, Southampton University Hospitals Trust, Southampton, Hampshire, SO16 6YD (United Kingdom); Fleming, J S [Department of Medical Physics and Bioengineering, Southampton University Hospitals Trust, Southampton, Hampshire, SO16 6YD (United Kingdom); Hoffmann, S M A [Department of Medical Physics and Bioengineering, Southampton University Hospitals Trust, Southampton, Hampshire, SO16 6YD (United Kingdom); Kemp, P M [Department of Nuclear Medicine, Southampton University Hospitals Trust, Southampton, Hampshire, SO16 6YD (United Kingdom)

    2005-11-21

    Simulation is useful in the validation of functional image analysis methods, particularly when considering the number of analysis techniques currently available lacking thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF), and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows for the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to system resolution over the slightly larger kernel used routinely. Significant correlation was found between effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
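
    The simulation recipe (define a deficit in atlas space, blur it with the measured PSF, remove it from a normal scan) can be sketched in Python; here a sphere stands in for the atlas-defined region and a Gaussian for the measured PSF, and all sizes and count levels are hypothetical.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(7)
        shape = (64, 64, 64)
        normal = 100.0 + rng.normal(0.0, 2.0, shape)     # stand-in normal scan

        # spherical "abnormality": 30% count reduction within radius 6 voxels
        zz, yy, xx = np.indices(shape)
        mask = (zz - 32) ** 2 + (yy - 28) ** 2 + (xx - 40) ** 2 <= 6 ** 2
        deficit = 0.30 * mask.astype(float)

        fwhm_vox = 4.0                                   # system resolution
        sigma = fwhm_vox / 2.355                         # FWHM -> Gaussian sigma
        blurred_deficit = gaussian_filter(deficit, sigma)

        abnormal = normal * (1.0 - blurred_deficit)      # remove counts
        print(f"mean counts in region: {abnormal[mask].mean():.1f} "
              f"vs normal {normal[mask].mean():.1f}")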


  15. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  16. Contribution to the Development of Simulation Model of Ship Turbine

    Directory of Open Access Journals (Sweden)

    Božić Ratko

    2015-01-01

    Full Text Available Simulation modelling, performed with the System Dynamics Modelling Approach and intensive use of computers, is one of the most convenient and most successful scientific methods for analysing the performance dynamics of nonlinear and very complex natural, technical and organizational systems [1]. The purpose of this work is to demonstrate the successful application of system dynamics simulation modelling in analyzing the performance dynamics of a complex system, a ship's propulsion system. A gas turbine is a complex non-linear system, which needs to be systematically investigated as a unit consisting of a number of subsystems and elements, which are linked by cause-effect (UPV) feedback loops (KPD), both within the propulsion system and with the relevant surroundings. In this paper the authors present an efficient application of the scientific method for the study of complex dynamic systems called qualitative and quantitative simulation System Dynamics Methodology. The gas turbine is represented by a set of non-linear differential equations, after which mental-verbal structural models and flowcharts in System Dynamics symbols are produced, and the performance dynamics under load conditions are simulated in the POWERSIM simulation language.

  17. Simulation modeling and analysis with Arena

    Energy Technology Data Exchange (ETDEWEB)

    Tayfur Altiok; Benjamin Melamed [Rutgers University, NJ (United States). Department of Industrial and Systems Engineering

    2007-06-15

    This textbook treats the essentials of the Monte Carlo discrete-event simulation methodology in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Chapter 13.3.3 is on coal loading operations on barges/tugboats.

  18. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when conducting simulation experiments.

  19. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF specific modeling and simulation methods, and system and circuit level descriptions, this work contains application-oriented training material. Accompanied by a CD-ROM, it combines the presentation of a mixed-signal design flow, an introduction into VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  20. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  1. MEGACELL: a nanocrystal model construction software for HRTEM multislice simulation.

    Science.gov (United States)

    Stroppa, Daniel G; Righetto, Ricardo D; Montoro, Luciano A; Ramirez, Antonio J

    2011-07-01

    Image simulation has an invaluable importance for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist on the construction of nanocrystals models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve its atomic positions either as a plain text file or as an output compatible with EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool features, some construction examples and its application for scientific studies are presented. These studies show MEGACELL as a handy tool, which allows an easier construction of complex nanocrystal models and improves the quantitative information extraction from HRTEM images. Copyright © 2011 Elsevier B.V. All rights reserved.
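
    The construction idea is straightforward to sketch in Python: fill a box with lattice sites and keep those inside a convex polyhedron given as half-spaces. The FCC gold lattice and the {100}/{111} facet distances below are illustrative choices, and the tool's EMS-compatible output format is not reproduced here.

        import numpy as np
        from itertools import product

        a = 4.078  # FCC Au lattice parameter, angstrom (illustrative material)
        basis = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])

        n = 8  # half-width of the cell grid
        cells = np.array(list(product(range(-n, n + 1), repeat=3)), dtype=float)
        sites = a * (cells[:, None, :] + basis[None, :, :]).reshape(-1, 3)

        # convex polyhedron: keep atoms with n.r <= d for every (normal, distance)
        facets = [(np.array(v, float), d)
                  for v, d in [((1, 0, 0), 14.0), ((-1, 0, 0), 14.0),
                               ((0, 1, 0), 14.0), ((0, -1, 0), 14.0),
                               ((0, 0, 1), 14.0), ((0, 0, -1), 14.0)]]
        facets += [(np.array(v, float) / np.sqrt(3), 16.0)
                   for v in product((1, -1), repeat=3)]  # eight {111} facets

        keep = np.ones(len(sites), dtype=bool)
        for normal, dist in facets:
            keep &= sites @ normal <= dist
        print(f"nanocrystal with {keep.sum()} atoms")  # positions: sites[keep]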

  2. Phylogenetic ANOVA: The Expression Variance and Evolution Model for Quantitative Trait Evolution.

    Science.gov (United States)

    Rohlfs, Rori V; Nielsen, Rasmus

    2015-09-01

    A number of methods have been developed for modeling the evolution of a quantitative trait on a phylogeny. These methods have received renewed interest in the context of genome-wide studies of gene expression, in which the expression levels of many genes can be modeled as quantitative traits. We here develop a new method for joint analyses of quantitative traits within- and between species, the Expression Variance and Evolution (EVE) model. The model parameterizes the ratio of population to evolutionary expression variance, facilitating a wide variety of analyses, including a test for lineage-specific shifts in expression level, and a phylogenetic ANOVA that can detect genes with increased or decreased ratios of expression divergence to diversity, analogous to the famous Hudson Kreitman Aguadé (HKA) test used to detect selection at the DNA level. We use simulations to explore the properties of these tests under a variety of circumstances and show that the phylogenetic ANOVA is more accurate than the standard ANOVA (no accounting for phylogeny) sometimes used in transcriptomics. We then apply the EVE model to a mammalian phylogeny of 15 species typed for expression levels in liver tissue. We identify genes with high expression divergence between species as candidates for expression level adaptation, and genes with high expression diversity within species as candidates for expression level conservation and/or plasticity. Using the test for lineage-specific expression shifts, we identify several candidate genes for expression level adaptation on the catarrhine and human lineages, including genes putatively related to dietary changes in humans. We compare these results to those reported previously using a model which ignores expression variance within species, uncovering important differences in performance. We demonstrate the necessity for a phylogenetic model in comparative expression studies and show the utility of the EVE model to detect expression divergence

  3. A Quantitative Model to Estimate Drug Resistance in Pathogens

    Directory of Open Access Journals (Sweden)

    Frazier N. Baker

    2016-12-01

    Full Text Available Pneumocystis pneumonia (PCP) is an opportunistic infection that occurs in humans and other mammals with debilitated immune systems. These infections are caused by fungi in the genus Pneumocystis, which are not susceptible to standard antifungal agents. Despite decades of research and drug development, the primary treatment and prophylaxis for PCP remains a combination of trimethoprim (TMP) and sulfamethoxazole (SMX), which targets two enzymes in folic acid biosynthesis, dihydrofolate reductase (DHFR) and dihydropteroate synthase (DHPS), respectively. There is growing evidence of emerging resistance by Pneumocystis jirovecii (the species that infects humans) to TMP-SMX, associated with mutations in the targeted enzymes. In the present study, we report the development of an accurate quantitative model to predict changes in the binding affinity of inhibitors (Ki, IC50) to the mutated proteins. The model is based on evolutionary information and amino acid covariance analysis. Predicted changes in binding affinity upon mutation correlate highly with the experimentally measured data. While trained on Pneumocystis jirovecii DHFR/TMP data, the model shows similar or better performance when evaluated on the resistance data for a different inhibitor of PjDHFR, another drug/target pair (PjDHPS/SMX), and another organism (Staphylococcus aureus DHFR/TMP). Therefore, we anticipate that the developed prediction model will be useful in the evaluation of possible resistance of newly sequenced variants of the pathogen and can be extended to other drug targets and organisms.

  4. MEGACELL: A nanocrystal model construction software for HRTEM multislice simulation

    Energy Technology Data Exchange (ETDEWEB)

    Stroppa, Daniel G., E-mail: dstroppa@lnls.br [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Mechanical Engineering School, University of Campinas, 13083-860 Campinas, SP (Brazil); Righetto, Ricardo D. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); School of Electrical and Computer Engineering, University of Campinas, 13083-852 Campinas, SP (Brazil); Montoro, Luciano A. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Ramirez, Antonio J. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Mechanical Engineering School, University of Campinas, 13083-860 Campinas, SP (Brazil)

    2011-07-15

    Image simulation is invaluable for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to the non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process that includes the simulation of a nanocrystal model and its comparison with experimental images. However, most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist in the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve their atomic positions either as a plain text file or as an output compatible with the EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool's features, some construction examples and its application to scientific studies are presented. These studies show MEGACELL to be a handy tool, which allows an easier construction of complex nanocrystal models and improves quantitative information extraction from HRTEM images. -- Highlights: • A software tool to support the HRTEM image simulation of nanocrystals in actual size. • MEGACELL allows the construction of complex nanocrystal models for multislice image simulation. • Some examples of improved nanocrystalline system characterization are presented, including the analysis of 3D morphology and growth behavior.
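
    The convex-solid construction described above can be sketched in a few lines: generate candidate lattice sites and keep those inside an intersection of half-spaces n·r <= d. The sketch below is not MEGACELL itself and does not use its input format; the element, fcc lattice parameter and facet distances are illustrative assumptions (Python).

      import numpy as np
      from itertools import product

      a = 0.4086                                    # nm, fcc lattice parameter (Ag, illustrative)
      basis = np.array([[0, 0, 0], [0.5, 0.5, 0],
                        [0.5, 0, 0.5], [0, 0.5, 0.5]])

      def facet_family(n, d):
          """All symmetry-equivalent facets of the cubic family {n} at distance d (nm)."""
          variants = set()
          for p in ((0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)):
              for s in product((-1, 1), repeat=3):
                  variants.add(tuple(s[i] * n[p[i]] for i in range(3)))
          return [(np.array(v) / np.linalg.norm(v), d) for v in variants]

      # A truncated cube: {100} facets at 2.0 nm, {111} facets at 1.8 nm.
      facets = facet_family((1, 0, 0), 2.0) + facet_family((1, 1, 1), 1.8)

      m = int(np.ceil(2.2 / a))                     # grid half-width covering the solid
      cells = np.array(list(product(range(-m, m + 1), repeat=3)), dtype=float)
      sites = (cells[:, None, :] + basis[None, :, :]).reshape(-1, 3) * a

      inside = np.all([sites @ n <= d + 1e-9 for n, d in facets], axis=0)
      for x, y, z in sites[inside][:5]:             # plain-text output, one atom per line
          print(f"Ag {x:.4f} {y:.4f} {z:.4f}")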

  5. A sand wave simulation model

    NARCIS (Netherlands)

    Nemeth, A.A.; Hulscher, S.J.M.H.; Damme, van R.M.J.

    2003-01-01

    Sand waves form a prominent regular pattern in the offshore seabeds of sandy shallow seas. A two-dimensional vertical (2DV) flow and morphological numerical model describing the behaviour of these sand waves has been developed. The model contains the 2DV shallow water equations, with a free water surface.

  6. Modelling Reactive and Proactive Behaviour in Simulation

    CERN Document Server

    Majid, Mazlina Abdul; Aickelin, Uwe

    2010-01-01

    This research investigated the behaviour of traditional discrete event simulation models and combined discrete event and agent based simulation models when modelling human reactive and proactive behaviour in human centric complex systems. A department store was chosen as the human centric complex case study, where the operation of a fitting room in the WomensWear department was investigated. We have looked at ways to determine the efficiency of new management policies for the fitting room operation by simulating the reactive and proactive behaviour of staff towards customers. Once the simulation models had been developed and verified, we carried out a validation experiment in the form of a sensitivity analysis. Subsequently, we executed a statistical analysis in which the mixed reactive and proactive behaviour experimental results were compared with reactive experimental results from previously published work. Generally, this case study discovered that simple proactive individual behaviou...

  7. Challenges in SysML Model Simulation

    Directory of Open Access Journals (Sweden)

    Mara Nikolaidou

    2016-07-01

    Full Text Available Systems Modeling Language (SysML) is a standard proposed by the OMG for systems-of-systems (SoS) modeling and engineering. To this end, it provides the means to depict SoS components and their behavior in a hierarchical, multi-layer fashion, facilitating alternative engineering activities, such as system design. To explore the performance of systems modeled in SysML, simulation is one of the preferred methods. There are many efforts targeting simulation code generation from SysML models. Numerous simulation methodologies and tools are employed, and different SysML diagrams are utilized. Nevertheless, this process is not standardized, although most current approaches tend to follow the same steps, even if they employ different tools. The scope of this paper is to provide a comprehensive understanding of the similarities and differences of existing approaches and to identify current challenges in fully automating the SysML model simulation process.

  8. SIMULATION MODELING OF SLOW SPATIALLY HETEROGENEOUS COAGULATION

    Directory of Open Access Journals (Sweden)

    P. A. Zdorovtsev

    2013-01-01

    Full Text Available A new model of spatially inhomogeneous coagulation, i.e. the formation of larger clusters by the joint interaction of smaller ones, is studied. The results of simulation are compared with known analytical and numerical solutions.
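
    The record does not reproduce its equations, but the classical starting point for coagulation models is the Smoluchowski equation. A minimal spatially homogeneous sketch with an assumed constant kernel follows (the spatial inhomogeneity that is the article's actual subject is omitted):

      import numpy as np
      from scipy.integrate import solve_ivp

      K, kmax = 1.0, 50                       # constant kernel, largest tracked cluster size

      def rhs(t, n):
          """Smoluchowski: dn_k/dt = 1/2 sum_{i+j=k} K n_i n_j - n_k sum_j K n_j."""
          dn = np.zeros_like(n)
          for k in range(kmax):               # n[k] is the density of (k+1)-mers
              gain = 0.5 * sum(K * n[i] * n[k - 1 - i] for i in range(k))
              dn[k] = gain - n[k] * K * n.sum()
          return dn

      n0 = np.zeros(kmax); n0[0] = 1.0        # monomers only at t = 0
      sol = solve_ivp(rhs, (0.0, 5.0), n0, method="LSODA")
      print("total cluster density at t=5:", sol.y[:, -1].sum())   # decays with time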

  9. Quantitative Test of the Evolution of Geant4 Electron Backscattering Simulation

    CERN Document Server

    Basaglia, Tullio; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Pia, Maria Grazia; Saracco, Paolo

    2016-01-01

    Evolutions of the Geant4 code have affected the simulation of electron backscattering with respect to previously published results. Their effects are quantified by analyzing the compatibility of the simulated electron backscattering fraction with a large collection of experimental data for a wide set of physics configuration options available in Geant4. Special emphasis is placed on two electron scattering implementations first released in Geant4 version 10.2: the Goudsmit-Saunderson multiple scattering model and a single Coulomb scattering model based on Mott cross section calculation. The new Goudsmit-Saunderson multiple scattering model appears to perform equally well or less accurately than the model implemented in previous Geant4 versions, depending on the electron energy. The new Coulomb scattering model was flawed from a physics point of view, but computationally fast, in Geant4 version 10.2; the physics correction released in Geant4 version 10.2p01 severely degrades its computational performance. Evolutions in ...

  10. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  11. Quantitative modeling of Cerenkov light production efficiency from medical radionuclides.

    Science.gov (United States)

    Beattie, Bradley J; Thorek, Daniel L J; Schmidtlein, Charles R; Pentlow, Keith S; Humm, John L; Hielscher, Andreas H

    2012-01-01

    There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information, we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high-sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use.
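
    The Frank-Tamm part of such a production-efficiency model is compact enough to sketch. The constants below (water, a 400-700 nm detection band, a single electron energy) are illustrative assumptions, not the authors' implementation:

      import numpy as np

      alpha = 1 / 137.036            # fine-structure constant
      n = 1.33                       # refractive index of water
      me = 0.511                     # electron rest energy, MeV

      def photons_per_cm(E_kin_MeV, lam1_nm=400.0, lam2_nm=700.0):
          """Frank-Tamm yield: dN/dx = 2*pi*alpha*(1 - 1/(beta^2 n^2))*(1/lam1 - 1/lam2)."""
          gamma = 1 + E_kin_MeV / me
          beta = np.sqrt(1 - 1 / gamma**2)
          if beta * n <= 1:
              return 0.0             # below the Cerenkov threshold (~0.26 MeV in water)
          term = 1 - 1 / (beta**2 * n**2)
          return 2 * np.pi * alpha * term * (1/lam1_nm - 1/lam2_nm) * 1e7  # per cm of path

      print(photons_per_cm(1.0))     # roughly 180 photons/cm for a 1 MeV electron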

  12. Quantitative interpretation of molecular dynamics simulations for X-ray photoelectron spectroscopy of aqueous solutions

    Science.gov (United States)

    Olivieri, Giorgia; Parry, Krista M.; Powell, Cedric J.; Tobias, Douglas J.; Brown, Matthew A.

    2016-04-01

    Over the past decade, energy-dependent ambient pressure X-ray photoelectron spectroscopy (XPS) has emerged as a powerful analytical probe of ion spatial distributions at the vapor (vacuum)-aqueous electrolyte interface. These experiments are often paired with complementary molecular dynamics (MD) simulations in an attempt to provide a complete description of the liquid interface. There is, however, no systematic protocol that permits a straightforward comparison of the two sets of results. XPS is an integrated technique that averages signals from multiple layers in a solution, even at the lowest photoelectron kinetic energies routinely employed, whereas MD simulations provide a microscopic layer-by-layer description of the solution composition near the interface. Here, we use the National Institute of Standards and Technology database for the Simulation of Electron Spectra for Surface Analysis (SESSA) to quantitatively interpret atom-density profiles from MD simulations as XPS signal intensities, using sodium and potassium iodide solutions as examples. We show that electron inelastic mean free paths calculated from a semi-empirical formula depend strongly on solution composition, varying by up to 30% between pure water and concentrated NaI. The XPS signal thus arises from different information depths in different solutions for a fixed photoelectron kinetic energy. XPS signal intensities are calculated using SESSA as a function of photoelectron kinetic energy (probe depth) and compared with a widely employed ad hoc method. SESSA simulations illustrate the importance of accounting for elastic-scattering events at low photoelectron kinetic energies, where the ad hoc method systematically underestimates the preferential enhancement of anions over cations. Finally, some technical aspects of applying SESSA to liquid interfaces are discussed.
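
    The simplest ad hoc depth-weighting mentioned in the record (exponential attenuation of an MD density profile, with no elastic scattering) can be sketched as follows; the profiles and IMFP values are invented for illustration:

      import numpy as np

      def xps_intensity(z_nm, rho, imfp_nm, theta_deg=0.0):
          """Integrate rho(z) * exp(-z / (IMFP * cos(theta))) over depth z >= 0."""
          w = np.exp(-z_nm / (imfp_nm * np.cos(np.radians(theta_deg))))
          return np.trapz(rho * w, z_nm)

      z = np.linspace(0.0, 5.0, 501)              # depth below the liquid surface, nm
      rho_anion = np.exp(-(z - 0.3)**2 / 0.1)     # toy surface-enhanced anion profile
      rho_cation = np.full_like(z, 0.5)           # toy uniform cation profile
      for imfp in (0.5, 1.0, 2.0):                # IMFP grows with kinetic energy
          r = xps_intensity(z, rho_anion, imfp) / xps_intensity(z, rho_cation, imfp)
          print(f"IMFP {imfp} nm -> anion/cation intensity ratio {r:.2f}")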

  13. Application of Chebyshev Polynomial to simulated modeling

    Institute of Scientific and Technical Information of China (English)

    CHI Hai-hong; LI Dian-pu

    2006-01-01

    Chebyshev polynomials are widely used in many fields, usually for function approximation in numerical calculation. In this paper, the Chebyshev polynomial expression of propeller properties across four quadrants is given first; then the Chebyshev polynomial expression is transformed to an ordinary polynomial for the needs of simulating propeller dynamics. On this basis, the dynamical models of the propeller across four quadrants are given. The simulation results show the efficiency of the mathematical model.
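
    The conversion step named above (Chebyshev series to ordinary power-series polynomial) maps directly onto NumPy routines; the fitted "propeller property" below is a made-up stand-in curve:

      import numpy as np
      from numpy.polynomial import chebyshev as C

      beta = np.linspace(-1.0, 1.0, 200)            # normalized advance angle
      thrust = np.sin(2.1 * beta) + 0.3 * beta**2   # stand-in for measured data

      cheb_coef = C.chebfit(beta, thrust, deg=7)    # Chebyshev least-squares fit
      poly_coef = C.cheb2poly(cheb_coef)            # ordinary polynomial coefficients
      err = np.max(np.abs(np.polynomial.polynomial.polyval(beta, poly_coef)
                          - C.chebval(beta, cheb_coef)))
      print("max conversion error:", err)           # ~1e-14: the two forms agree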

  14. Collisionless Electrostatic Shock Modeling and Simulation

    Science.gov (United States)

    2016-10-21

    Air Force Research Laboratory briefing charts (30 September 2016 - 21 October 2016) on collisionless electrostatic shock modeling and simulation, presented by Daniel W. Crews, In-Space Propulsion Branch. Approved for public release; distribution unlimited (PA#16490). Overview: motivation and background; what is a collisionless shock wave?; features of the collisionless shock; the shock simulation.

  15. Modelling Activities in Kinematics: Understanding quantitative relations with the contribution of qualitative reasoning

    Science.gov (United States)

    Orfanos, Stelios

    2010-01-01

    In Greek traditional teaching, many significant concepts are introduced in a sequence that does not provide students with all the information necessary to comprehend them. We consider that understanding concepts and the relations among them is greatly facilitated by the use of modelling tools, taking into account that the modelling process forces students to change their vague, imprecise ideas into explicit causal relationships. It is not uncommon to find students who are able to solve problems by using complicated relations without getting a qualitative and in-depth grip on them. Researchers have already shown that students often have formal mathematical and physical knowledge without a qualitative understanding of basic concepts and relations. The aim of this communication is to present some of the results of our investigation into modelling activities related to kinematical concepts. For this purpose, we have used ModellingSpace, an environment that was especially designed to allow students from eleven to seventeen years old to express their ideas and gradually develop them. ModellingSpace enables students to build their own models and offers the choice of observing directly simulations of real objects and/or all the other alternative forms of representation (tables of values, graphic representations and bar-charts). In order to answer the questions, the students formulate hypotheses, create models, compare their hypotheses with the representations of their models, and modify or create other models when their hypotheses do not agree with the representations. In traditional ways of teaching, students are educated to utilize formulas as the most important strategy. Students often recall formulas in order to utilize them without an in-depth understanding of them. Students commonly use the quantitative type of reasoning, since it is primarily used in teaching, although it may not be fully understood by them

  16. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties and to provide graduate students and young researchers with information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and to develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  17. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  18. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...
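
    A sketch of the simulation idea shared by these two records, under stated assumptions: the wave particle velocity is approximated by a Gaussian Markov (Ornstein-Uhlenbeck) process and a Morison-type drag load is formed from it; the correlation time, variance and drag coefficient are illustrative:

      import numpy as np

      rng = np.random.default_rng(1)
      dt, T = 0.05, 600.0                  # time step and duration, s
      tau, sigma_u = 4.0, 1.0              # correlation time (s) and velocity std (m/s), assumed
      cd = 1.0                             # lumped drag coefficient (assumed units)

      n = int(T / dt)
      u = np.zeros(n)
      a = np.exp(-dt / tau)                # exact one-step OU update factor
      for i in range(1, n):
          u[i] = a * u[i-1] + sigma_u * np.sqrt(1 - a*a) * rng.standard_normal()

      load = cd * u * np.abs(u)            # Morison drag on a slender member
      print("99.9% load quantile:", np.quantile(load, 0.999))   # crude extreme estimate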

  19. Modeling and simulation of multiport RF switch

    Energy Technology Data Exchange (ETDEWEB)

    Vijay, J [Student, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Saha, Ivan [Scientist, Indian Space Research Organisation (ISRO) (India); Uma, G [Lecturer, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Umapathy, M [Assistant Professor, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India)

    2006-04-01

    This paper describes the modeling and simulation of a multiport RF switch in which the latching mechanism is realized with two hot-arm electrothermal actuators and the switching action is realized with electrostatic actuators. It can act as a single-pole single-throw as well as a single-pole multi-throw switch. The proposed structure is modeled analytically and the required parameters are simulated using MATLAB. The analytical simulation results are validated using Finite Element Analysis of the same structure in the COVENTORWARE software.

  20. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  1. Traffic Modeling in WCDMA System Level Simulations

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traffic modeling is a crucial element in WCDMA system level simulations. A clear understanding of the nature of traffic in the WCDMA system and the subsequent selection of an appropriate random traffic model are critical to the success of the modeling enterprise. The resulting performance will evidently be a function of how well the design has been adapted to the traffic, channel and user mobility models, and of how accurate these models are. In this article, our attention is focused on modeling voice and WWW data traffic with the SBBP model and the Victor model, respectively.
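
    As an illustration of the voice side, here is a two-state (talkspurt/silence) Markov source of the ON/OFF kind on which SBBP-style traffic models build; the sojourn times are assumptions, not parameters from the article:

      import numpy as np

      rng = np.random.default_rng(0)
      frame = 0.02                  # s, frame length
      mean_on, mean_off = 1.0, 1.5  # s, mean talkspurt and silence durations (assumed)

      p_on_off = frame / mean_on    # P(ON -> OFF) in one frame
      p_off_on = frame / mean_off   # P(OFF -> ON) in one frame

      state, active = 1, []
      for _ in range(50_000):
          active.append(state)
          if state == 1 and rng.random() < p_on_off:
              state = 0
          elif state == 0 and rng.random() < p_off_on:
              state = 1

      print("voice activity factor:", np.mean(active))  # ~ mean_on/(mean_on+mean_off) = 0.4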

  2. Mechanics of neutrophil phagocytosis: experiments and quantitative models.

    Science.gov (United States)

    Herant, Marc; Heinrich, Volkmar; Dembo, Micah

    2006-05-01

    To quantitatively characterize the mechanical processes that drive phagocytosis, we observed the FcγR-driven engulfment of antibody-coated beads of diameters 3 μm to 11 μm by initially spherical neutrophils. In particular, the time courses of cell morphology, bead motion and cortical tension were determined. Here, we introduce a number of mechanistic models for phagocytosis and test their validity by comparing the experimental data with finite element computations for multiple bead sizes. We find that the optimal models involve two key mechanical interactions: a repulsion or pressure between the cytoskeleton and the free membrane that drives protrusion, and an attraction between the cytoskeleton and the membrane newly adherent to the bead that flattens the cell into a thin lamella. Other models, such as cytoskeletal expansion or swelling, appear to be ruled out as main drivers of phagocytosis because of the characteristics of bead motion during engulfment. We finally show that the protrusive force necessary for the engulfment of large beads points towards storage of strain energy in the cytoskeleton over a large distance from the leading edge (approximately 0.5 μm), and that the flattening force can plausibly be generated by the known concentrations of unconventional myosins at the leading edge.

  3. A Quantitative Model for Assessing Visual Simulation Software Architecture

    Science.gov (United States)

    2011-09-01

    The assessed visual simulation software features and integrates a number of open source libraries, such as Open Scene Graph for rendering, Open Dynamics Engine for physics, and OpenAL for audio.

  4. Quantitative Modeling of the Alternative Pathway of the Complement System.

    Science.gov (United States)

    Zewde, Nehemiah; Gorham, Ronald D; Dorado, Angel; Morikis, Dimitrios

    2016-01-01

    The complement system is an integral part of innate immunity that detects and eliminates invading pathogens through a cascade of reactions. The destructive effects of complement activation on host cells are inhibited through versatile regulators that are present in plasma and bound to membranes. Impairment in the capacity of these regulators to function properly results in autoimmune diseases. To better understand the delicate balance between complement activation and regulation, we have developed a comprehensive quantitative model of the alternative pathway. Our model incorporates a system of ordinary differential equations that describes the dynamics of the four steps of the alternative pathway under physiological conditions: (i) initiation (fluid phase), (ii) amplification (surfaces), (iii) termination (pathogen), and (iv) regulation (host cell and fluid phase). We have examined complement activation and regulation on different surfaces, using the cellular dimensions of a characteristic bacterium (E. coli) and host cell (human erythrocyte). In addition, we have incorporated neutrophil-secreted properdin into the model, highlighting the cross talk of neutrophils with the alternative pathway in coordinating innate immunity. Our study yields a series of time-dependent response data for all alternative pathway proteins, fragments, and complexes. We demonstrate the robustness of the alternative pathway on the surface of pathogens, in which complement components were able to saturate the entire region in about 54 minutes, while occupying less than one percent of host cells over the same time period. Our model reveals that tight regulation of complement starts in the fluid phase, in which propagation of the alternative pathway was inhibited through the dismantlement of fluid-phase convertases. Our model also depicts the intricate role that properdin released from neutrophils plays in initiating and propagating the alternative pathway during bacterial infection.
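
    This is not the authors' four-step model, but a minimal sketch of its mathematical form, mass-action ODEs integrated with SciPy; the species and rate constants are invented for illustration:

      import numpy as np
      from scipy.integrate import solve_ivp

      k_act, k_reg, k_decay = 0.5, 2.0, 0.1   # illustrative rate constants, 1/min

      def rhs(t, y):
          c3, conv, reg = y                   # precursor, convertase, regulator
          return [-k_act * c3,
                  k_act * c3 - (k_decay + k_reg * reg) * conv,
                  0.0]                        # regulator held constant here

      sol = solve_ivp(rhs, (0, 54), [1.0, 0.0, 0.2], dense_output=True)
      t = np.linspace(0, 54, 7)
      print(np.round(sol.sol(t)[1], 4))       # convertase rises, then is dismantled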

  5. A hierarchical statistical model for estimating population properties of quantitative genes

    Directory of Open Access Journals (Sweden)

    Wu Rongling

    2002-06-01

    Full Text Available Background: Earlier methods for detecting major genes responsible for a quantitative trait rely critically upon a well-structured pedigree in which the segregation pattern of genes exactly follows Mendelian inheritance laws. However, for many outcrossing species, such pedigrees are not available and genes also display population properties. Results: In this paper, a hierarchical statistical model is proposed to monitor the existence of a major gene based on its segregation and transmission across two successive generations. The model is implemented with an EM algorithm to provide maximum likelihood estimates for the genetic parameters of the major locus. This new method is successfully applied to identify an additive gene having a large effect on stem height growth of aspen trees. The estimates of population genetic parameters for this major gene can be generalized to the original breeding population from which the parents were sampled. A simulation study is presented to evaluate the finite sample properties of the model. Conclusions: A hierarchical model was derived for detecting major genes affecting a quantitative trait based on progeny tests of outcrossing species. The new model takes into account the population genetic properties of genes and is expected to enhance the accuracy, precision and power of gene detection.
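
    The EM machinery referred to above can be sketched on a simplified version of the problem: progeny trait values treated as a two-component normal mixture (two major-locus genotype classes), omitting the hierarchical population-genetic layer of the actual model:

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      x = np.concatenate([rng.normal(10, 1, 300), rng.normal(14, 1, 100)])  # tree heights

      p, mu1, mu2, sd = 0.5, x.min(), x.max(), x.std()   # crude initial values
      for _ in range(200):
          # E-step: posterior probability that each tree carries genotype 2
          d1, d2 = (1 - p) * norm.pdf(x, mu1, sd), p * norm.pdf(x, mu2, sd)
          w = d2 / (d1 + d2)
          # M-step: reweighted maximum likelihood updates
          p = w.mean()
          mu1 = np.sum((1 - w) * x) / np.sum(1 - w)
          mu2 = np.sum(w * x) / np.sum(w)
          sd = np.sqrt(np.sum((1 - w) * (x - mu1)**2 + w * (x - mu2)**2) / len(x))

      print(round(p, 2), round(mu1, 2), round(mu2, 2))   # close to 0.25, 10, 14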

  6. Digital clocks: simple Boolean models can quantitatively describe circadian systems.

    Science.gov (United States)

    Akman, Ozgur E; Watterson, Steven; Parton, Andrew; Binns, Nigel; Millar, Andrew J; Ghazal, Peter

    2012-09-07

    The gene networks that comprise the circadian clock modulate biological function across a range of scales, from gene expression to performance and adaptive behaviour. The clock functions by generating endogenous rhythms that can be entrained to the external 24-h day-night cycle, enabling organisms to optimally time biochemical processes relative to dawn and dusk. In recent years, computational models based on differential equations have become useful tools for dissecting and quantifying the complex regulatory relationships underlying the clock's oscillatory dynamics. However, optimizing the large parameter sets characteristic of these models places intense demands on both computational and experimental resources, limiting the scope of in silico studies. Here, we develop an approach based on Boolean logic that dramatically reduces the parametrization, making the state and parameter spaces finite and tractable. We introduce efficient methods for fitting Boolean models to molecular data, successfully demonstrating their application to synthetic time courses generated by a number of established clock models, as well as experimental expression levels measured using luciferase imaging. Our results indicate that despite their relative simplicity, logic models can (i) simulate circadian oscillations with the correct, experimentally observed phase relationships among genes and (ii) flexibly entrain to light stimuli, reproducing the complex responses to variations in daylength generated by more detailed differential equation formulations. Our work also demonstrates that logic models have sufficient predictive power to identify optimal regulatory structures from experimental data. By presenting the first Boolean models of circadian circuits together with general techniques for their optimization, we hope to establish a new framework for the systematic modelling of more complex clocks, as well as other circuits with different qualitative dynamics. In particular, we anticipate
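
    A toy illustration of the Boolean-logic idea (not one of the paper's fitted clock circuits): a two-gene network updated synchronously, with gene A repressed by B and forced by light and gene B activated by A; it cycles autonomously with period 4 and its phase is shifted by the light input:

      def step(a, b, light):
          """Synchronous Boolean update: A' = (not B) or light, B' = A."""
          return (not b) or light, a

      a, b = True, False
      for t in range(24):
          light = (t % 12) < 6          # toy 6-on / 6-off light cycle
          print(t, int(a), int(b), int(light))
          a, b = step(a, b, light)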

  7. Modeling and simulation of luminescence detection platforms.

    Science.gov (United States)

    Salama, Khaled; Eltoukhy, Helmy; Hassibi, Arjang; El-Gamal, Abbas

    2004-06-15

    Motivated by the design of an integrated CMOS-based detection platform, a simulation model for CCD and CMOS imager-based luminescence detection systems is developed. The model comprises four parts. The first portion models the process of photon flux generation from luminescence probes using ATP-based and luciferase label-based assay kinetics. An optics simulator is then used to compute the incident photon flux on the imaging plane for a given photon flux and system geometry. Subsequently, the output image is computed using a detailed imaging sensor model that accounts for photodetector spectral response, dark current, conversion gain, and various noise sources. Finally, signal processing algorithms are applied to the image to enhance detection reliability and hence increase the overall system throughput. To validate the model, simulation results are compared to experimental results obtained from a CCD-based system that was built to emulate the integrated CMOS-based platform.

  8. SOFT MODELLING AND SIMULATION IN STRATEGY

    Directory of Open Access Journals (Sweden)

    Luciano Rossoni

    2006-06-01

    Full Text Available A certain resistance exists on the part of those responsible for strategy to using modeling and simulation techniques and tools. Many find them excessively complicated, while others see them as too rigid and mathematical for use in strategy under uncertain and turbulent environments. However, some interpretative approaches exist that meet, in part, the needs of these decision makers. The objective of this work is to demonstrate, in a clear and simple way, some of the most powerful interpretative (soft) approaches, methodologies and tools for modeling and simulation in the area of business strategy. We first define what models and simulations are, and discuss some aspects of modeling and simulation in the strategy area. We then review some soft modeling approaches, which see the modeling process as much more than a simply mechanical process, since, as observed by Simon, human beings are rationally limited and their decisions are influenced by a series of subjective questions related to the environment in which they are embedded. Keywords: strategy, modeling and simulation, soft systems methodology, cognitive map, systems dynamics.

  9. Modeling of microfluidic microbial fuel cells using quantitative bacterial transport parameters

    Science.gov (United States)

    Mardanpour, Mohammad Mahdi; Yaghmaei, Soheila; Kalantar, Mohammad

    2017-02-01

    The objective of the present study is to analyze the dynamic modeling of bioelectrochemical processes and to improve the performance of previous models using quantitative data on bacterial transport parameters. The main deficiency of previous MFC models concerning the spatial distribution of biocatalysts is the assumption of an initial distribution of attached/suspended bacteria on the electrode or in the anolyte bulk, which is the foundation for biofilm formation. To remedy this imperfection, the quantification of chemotactic motility, used to understand the mechanisms of the suspended microorganisms' distribution in the anolyte and/or their attachment to the anode surface to extend the biofilm, is implemented numerically. The spatial and temporal distributions of the bacteria, as well as the dynamic behavior of the anolyte and biofilm, are simulated. The performance of the microfluidic MFC as a chemotaxis assay is assessed by analyzing the bacterial activity, substrate variation, bioelectricity production rate, and the influence of external resistance on the biofilm and the anolyte's features.

  10. Modeling and Simulation of Hydraulic Engine Mounts

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanzhong; Marshall McNea

    2012-01-01

    Hydraulic engine mounts are widely used in automotive powertrains for vibration isolation. A lumped mechanical parameter model is a traditional approach to model and simulate such mounts. This paper presents a dynamical model of a passive hydraulic engine mount with a double chamber, an inertia track, a decoupler, and a plunger. The model is developed based on the analogy between electrical systems and mechanical-hydraulic systems. The model is established to capture both the low and high frequency dynamic behaviors of the hydraulic mount. The model will be further used to find the approximate pulse responses of the mounts in terms of the force transmission and top chamber pressure. The closed form solution from the simplified linear model may provide some insight into the highly nonlinear behavior of the mounts. Based on the model, computer simulation has been carried out to study the dynamic performance of the hydraulic mount.

  11. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation systems (DAE). Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full scale boiler plant.

  12. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus; Condra, Thomas Joseph;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation systems (DAE). Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, an experiment has been carried out on a full scale boiler plant.

  13. Modeling and simulation of normal and hemiparetic gait

    Science.gov (United States)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Constraint forces and a Rayleigh dissipation function, through which the effect of gait on the tissues is considered, are included. Depending on the value of the factor present in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; it is possible to use anthropometric data for children, but existing tables of anthropometric data must then be considered. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
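
    A drastically simplified illustration of the modelling idea (not the authors' multi-segment gait model): a single pendulum "leg" derived from the Lagrange equations with a Rayleigh dissipation factor c, where raising c mimics the pathological case:

      import numpy as np
      from scipy.integrate import solve_ivp

      g, l, m = 9.81, 0.9, 15.0                 # leg length and mass, illustrative

      def swing(c):
          def rhs(t, y):
              th, om = y                        # theta'' = -(g/l) sin(theta) - c/(m l^2) theta'
              return [om, -(g / l) * np.sin(th) - c / (m * l**2) * om]
          return solve_ivp(rhs, (0.0, 1.2), [0.4, 0.0], t_eval=[1.2])

      for c, label in [(0.5, "normal"), (8.0, "hemiparetic-like")]:
          th = swing(c).y[0, -1]
          print(f"{label}: swing angle after 1.2 s = {th:+.3f} rad")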

  14. Light scattering by neutrophils: model, simulation, and experiment.

    Science.gov (United States)

    Orlova, Darya Yu; Yurkin, Maxim A; Hoekstra, Alfons G; Maltsev, Valeri P

    2008-01-01

    We studied the elastic light-scattering properties of human blood neutrophils, both experimentally and theoretically. The experimental study was performed with a scanning flow cytometer measuring the light-scattering patterns (LSPs) of individual cells over an angular range of 5-60 deg. We determined the absolute differential light-scattering cross sections of neutrophils. We also proposed an optical model for a neutrophil as a sphere filled with small spheres and prolate spheroids that correspond to granules and a segmented nucleus, respectively. This model was used in simulations of LSPs using the discrete dipole approximation and different compositions of internal organelles. A comparison of experimentally measured and simulated LSPs gives good qualitative agreement in LSP shape and quantitative agreement in the overall magnitude of the differential light-scattering cross section.

  15. Simulation of Dam Break Flow Using Quasi-Molecular Modelling

    Directory of Open Access Journals (Sweden)

    Sitthichai KULSRI

    2007-01-01

    Full Text Available We developed a new method based on quasi-molecular modelling to simulate dam break flow. Each quasi-molecule was a group of particles that interacted in a fashion entirely analogous to classical Newtonian molecular interactions. The tank had a base length of 58.4 cm. A water column with a base length of 14.6 cm and a height of 29.2 cm was initially supported on the right side by a vertical plate that was drawn up rapidly at time t = 0.0 s. The water fell under the influence of gravity acting vertically downwards. The numerical results were validated by quantitative comparison with a previous study. The predicted height and leading edge of the water column corresponded very well with experimental measurements from that study. Therefore, our new method based on quasi-molecular modelling showed its ability to adequately simulate a free surface problem.

  16. Pushing the Frontier of Data-Oriented Geodynamic Modeling: from Qualitative to Quantitative to Predictive

    Science.gov (United States)

    Liu, L.; Hu, J.; Zhou, Q.

    2016-12-01

    The rapid accumulation of geophysical and geological data sets poses an increasing demand for the development of geodynamic models to better understand the evolution of the solid Earth. Consequently, the earlier qualitative physical models are no longer satisfying. Recent efforts are focusing on more quantitative simulations and more efficient numerical algorithms. Among these, a particular line of research is the implementation of data-oriented geodynamic modeling, with the purpose of building an observationally consistent and physically correct geodynamic framework. Such models can often catalyze new insights into the functioning mechanisms of the various aspects of plate tectonics, and their predictive nature can also guide future research in a deterministic fashion. Over the years, we have been working on constructing large-scale geodynamic models with both sequential and variational data assimilation techniques. These models act as a bridge between different observational records, and the superposition of the constraining power of different data sets helps reveal unknown processes and mechanisms of the dynamics of the mantle and lithosphere. We simulate the post-Cretaceous subduction history in South America using a forward (sequential) approach. The model is constrained using past subduction history, seafloor age evolution, tectonic architecture of continents, and present-day geophysical observations. Our results quantify the various driving forces shaping the present South American flat slabs, which we find are all internally torn. The 3-D geometry of these torn slabs further explains the abnormal seismicity pattern and enigmatic volcanic history. An inverse (variational) model simulating the late Cenozoic western U.S. mantle dynamics with similar constraints reveals a mechanism for the formation of Yellowstone-related volcanism that differs from the traditional understanding. Furthermore, important insights on the mantle density and viscosity structures

  17. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented, focusing on the universality of the ac response in the extreme disorder limit. Finally, some important unsolved problems relating to hopping models for ac conduction are listed....
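
    A minimal one-dimensional sketch of the random barrier (symmetric hopping) model, assuming uniformly distributed barriers and a fixed-step Monte Carlo walk; a faithful treatment of the ac response would also track the waiting times this sketch ignores:

      import numpy as np

      rng = np.random.default_rng(3)
      L, kT = 1000, 0.2
      E = rng.uniform(0.0, 1.0, L)        # E[i]: barrier between sites i and i+1 (periodic)
      hop = np.exp(-E / kT)               # thermally activated rate over each barrier

      msd, n_walk, n_step = 0.0, 200, 2000
      for _ in range(n_walk):
          pos, dx = int(rng.integers(L)), 0
          for _ in range(n_step):
              r_right, r_left = hop[pos], hop[(pos - 1) % L]
              if rng.random() < r_right / (r_left + r_right):
                  pos, dx = (pos + 1) % L, dx + 1
              else:
                  pos, dx = (pos - 1) % L, dx - 1
          msd += dx * dx / n_walk
      print("mean-square displacement after", n_step, "hops:", round(msd, 1))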

  18. Quantitative Circulatory Physiology: an integrative mathematical model of human physiology for medical education.

    Science.gov (United States)

    Abram, Sean R; Hodnett, Benjamin L; Summers, Richard L; Coleman, Thomas G; Hester, Robert L

    2007-06-01

    We have developed Quantitative Circulatory Physiology (QCP), a mathematical model of integrative human physiology containing over 4,000 variables of biological interactions. This model provides a teaching environment that mimics clinical problems encountered in the practice of medicine. The model structure is based on documented physiological responses within peer-reviewed literature and serves as a dynamic compendium of physiological knowledge. The model is solved using a desktop, Windows-based program, allowing students to calculate time-dependent solutions and interactively alter over 750 parameters that modify physiological function. The model can be used to understand proposed mechanisms of physiological function and the interactions among physiological variables that may not be otherwise intuitively evident. In addition to open-ended or unstructured simulations, we have developed 30 physiological simulations, including heart failure, anemia, diabetes, and hemorrhage. Additional simulations include 29 patients in which students are challenged to diagnose the pathophysiology based on their understanding of integrative physiology. In summary, QCP allows students to examine, integrate, and understand a host of physiological factors without causing harm to patients. This model is available as a free download for Windows computers at http://physiology.umc.edu/themodelingworkshop.

  19. Modeling and simulating of unloading welding transformer

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The simulation model of an unloading welding transformer was established on the basis of MATLAB software, and the modeling principle is described in detail in this paper. The model was made up of three sub-models, i.e. the linear inductor sub-model, the non-linear inductor sub-model and the current-controlled series connection sub-model, and these sub-models were joined together by means of segmented linearization. The simulation results showed that, under conditions of high converter frequency and a large cross section of the magnet core of a welding transformer, the non-linear inductor sub-model can be substituted by a linear inductor sub-model, and that the leakage reactance in the welding transformer is one of the main causes of over-current and over-voltage in the inverter. The simulation results demonstrate that the over-voltage produced by leakage reactance is nearly twice the input voltage supplied to the transformer, and that the duration of the over-voltage depends on the time constant τ1. As τ1 decreases, the amplitude of the over-current increases and its duration becomes shorter. Conversely, as τ1 increases, the amplitude of the over-current decreases and its duration becomes longer. The model has played an important role in the development of the inverter resistance welding machine.

  20. A quantitative model of technological catch-up

    Directory of Open Access Journals (Sweden)

    Hossein Gholizadeh

    2015-02-01

    Full Text Available This article presents a quantitative model for the analysis of the technological gap. The rates of development of technological leaders and followers in nanotechnology are expressed in terms of coupled equations. On the basis of this model (first step), the comparative technological gap and its rate of change are studied, and the dynamics of the gap between leader and follower can be calculated. In the second step, we estimate the technology gap using the metafrontier approach. We then test the relationship between the technology gap and the quality of the dimensions of technological catch-up identified in the previous step. The usefulness of this approach is demonstrated in the analysis of the technological gap in nanotechnology between Iran, the leader in the Middle East, and the world leader. We present the behaviors of the technological leader and followers. Finally, Iran's position is analyzed, effective dimensions of catch-up are identified, and suggestions are offered that could form a foundation for Iran's long-term policies.
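
    The record does not give its coupled equations, so the sketch below is a generic hedged stand-in for leader-follower dynamics: logistic growth for both parties plus a spillover term that lets the follower absorb from the leader; all parameters are invented:

      import numpy as np
      from scipy.integrate import solve_ivp

      aL, aF, K, spill = 0.30, 0.25, 1.0, 0.15    # illustrative parameters

      def rhs(t, y):
          TL, TF = y                              # leader and follower technology levels
          dTL = aL * TL * (1 - TL / K)
          dTF = aF * TF * (1 - TF / K) + spill * max(TL - TF, 0.0)
          return [dTL, dTF]

      sol = solve_ivp(rhs, (0, 40), [0.10, 0.02], t_eval=np.linspace(0, 40, 9))
      for t, TL, TF in zip(sol.t, *sol.y):
          print(f"t={t:4.0f}  gap={TL - TF:+.3f}")   # the gap widens, then narrows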

  1. Modeling neutron guides using Monte Carlo simulations

    CERN Document Server

    Wang, D Q; Crow, M L; Wang, X L; Lee, W T; Hubbard, C R

    2002-01-01

    Four neutron guide geometries, straight, converging, diverging and curved, were characterized using Monte Carlo ray-tracing simulations. The main areas of interest are the transmission of the guides at various neutron energies and the intrinsic time-of-flight (TOF) peak broadening. Use of a delta-function time pulse from a uniform Lambert neutron source allows one to quantitatively simulate the effect of guide geometry on the TOF peak broadening. With a converging guide, the intensity and the beam divergence increase while the TOF peak width decreases, compared with those of a straight guide. By contrast, use of a diverging guide decreases the intensity and the beam divergence, and broadens the width (in TOF) of the transmitted neutron pulse.

  2. Revolutions in energy through modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies for all aspects from generation to storage have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened the extent of their use, and the increased fidelity (i.e., accuracy) of models made possible by greatly enhanced computing power has shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower cost projects.

  3. Inventory Reduction Using Business Process Reengineering and Simulation Modeling.

    Science.gov (United States)

    1996-12-01

    center is analyzed using simulation modeling and business process reengineering (BPR) concepts. The two simulation models were designed and evaluated by ... Reengineering and simulation modeling offer powerful tools to aid the manager in reducing cycle time and inventory levels.

  4. Quantitative comparisons of analogue models of brittle wedge dynamics

    Science.gov (United States)

    Schreurs, Guido

    2010-05-01

    Analogue model experiments are widely used to gain insights into the evolution of geological structures. In this study, we present a direct comparison of experimental results of 14 analogue modelling laboratories using prescribed set-ups. A quantitative analysis of the results will document the variability among models and will allow an appraisal of reproducibility and limits of interpretation. This has direct implications for comparisons between structures in analogue models and natural field examples. All laboratories used the same frictional analogue materials (quartz and corundum sand) and prescribed model-building techniques (sieving and levelling). Although each laboratory used its own experimental apparatus, the same type of self-adhesive foil was used to cover the base and all the walls of the experimental apparatus in order to guarantee identical boundary conditions (i.e. identical shear stresses at the base and walls). Three experimental set-ups using only brittle frictional materials were examined. In each of the three set-ups the model was shortened by a vertical wall, which moved with respect to the fixed base and the three remaining sidewalls. The minimum width of the model (dimension parallel to mobile wall) was also prescribed. In the first experimental set-up, a quartz sand wedge with a surface slope of ˜20° was pushed by a mobile wall. All models conformed to the critical taper theory, maintained a stable surface slope and did not show internal deformation. In the next two experimental set-ups, a horizontal sand pack consisting of alternating quartz sand and corundum sand layers was shortened from one side by the mobile wall. In one of the set-ups a thin rigid sheet covered part of the model base and was attached to the mobile wall (i.e. a basal velocity discontinuity distant from the mobile wall). In the other set-up a basal rigid sheet was absent and the basal velocity discontinuity was located at the mobile wall. In both types of experiments

  5. Quantitative performance metrics for stratospheric-resolving chemistry-climate models

    Directory of Open Access Journals (Sweden)

    D. W. Waugh

    2008-06-01

    Full Text Available A set of performance metrics is applied to stratospheric-resolving chemistry-climate models (CCMs) to quantify their ability to reproduce key processes relevant for stratospheric ozone. The same metrics are used to assign a quantitative measure of performance (a "grade") to each model-observations comparison shown in Eyring et al. (2006). A wide range of grades is obtained, both for different diagnostics applied to a single model and for the same diagnostic applied to different models, highlighting the wide range in the ability of the CCMs to simulate key processes in the stratosphere. No model scores high or low on all tests, but differences in the performance of models can be seen, especially for transport processes, where several models get low grades on multiple tests. The grades are used to assign relative weights to the CCM projections of 21st century total ozone. However, only small differences are found between weighted and unweighted multi-model mean total ozone projections. This study raises several issues with the grading and weighting of CCMs that need further examination, but it does provide a framework that will enable quantification of model improvements and assignment of relative weights to model projections.
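
    The style of metric described (a scaled model-observation difference clipped to [0, 1]) and grade-based weighting can be sketched as follows; the exact normalization ng is an assumption patterned on common practice, and all numbers are invented:

      import numpy as np

      def grade(mu_model, mu_obs, sigma_obs, ng=3.0):
          """Grade in [0, 1]: zero once the model-observation gap exceeds ng*sigma."""
          return max(0.0, 1.0 - abs(mu_model - mu_obs) / (ng * sigma_obs))

      mu_obs, sigma_obs = 2.0, 0.4                   # toy observed diagnostic
      models = {"CCM-A": (1.8, 2.1), "CCM-B": (2.6, 2.4), "CCM-C": (4.9, 3.5)}
      grades = {name: grade(mu, mu_obs, sigma_obs) for name, (mu, _) in models.items()}

      proj = np.array([p for _, p in models.values()])   # toy ozone projections
      w = np.array([grades[name] for name in models])
      print(grades)
      print("unweighted mean:", proj.mean(),
            " weighted mean:", (w * proj).sum() / w.sum())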

  6. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogeneous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  7. Modeling & Simulation Executive Agent Panel

    Science.gov (United States)

    2007-11-02

    Office of the Oceanographer of the Navy briefing on the Modeling & Simulation Executive Agent (MSEA) panel, serving the acquisition and training communities. The MSEA role is described as facilitator in the project startup phase, catalyst during development, and certifier. Acoustic models listed: Parabolic Equation 5.0, ASTRAL 5.0, ASPM 4.3, Gaussian Ray Bundle 1.0, High Frequency Environmental Acoustics (HFEVA) 1.0, COLOSSUS II 1.0, and Low Frequency Bottom Loss.

  8. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including both the price of the plant and a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts

  9. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water/steam space that should allow for better dynamic performance, in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including both the price of the plant and a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts

  10. Simulation of daylight in digital models; Simulering af dagslys i digitale modeller

    DEFF Research Database (Denmark)

    Villaume, René Domine; Ørstrup, Finn Rude

    2004-01-01

    Through various daylight simulations, the project investigates the quality of visualizations of complex lighting conditions in digital models used for communicating architecture via the web. In a digital 3D model of Utzon Associates' Paustian House, natural daylight is simulated with different...... rendering methods such as "shaded render" / "ray tracing" / "Final Gather" / "Global Illumination"...

  11. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  12. Molecular simulation and modeling of complex I.

    Science.gov (United States)

    Hummer, Gerhard; Wikström, Mårten

    2016-07-01

    Molecular modeling and molecular dynamics simulations play an important role in the functional characterization of complex I. With its large size and complicated function, linking quinone reduction to proton pumping across a membrane, complex I poses unique modeling challenges. Nonetheless, simulations have already helped in the identification of possible proton transfer pathways. Simulations have also shed light on the coupling between electron and proton transfer, thus pointing the way in the search for the mechanistic principles underlying the proton pump. In addition to reviewing what has already been achieved in complex I modeling, we aim here to identify pressing issues and to provide guidance for future research to harness the power of modeling in the functional characterization of complex I. This article is part of a Special Issue entitled Respiratory complex I, edited by Volker Zickermann and Ulrich Brandt. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  14. Investigating Output Accuracy for a Discrete Event Simulation Model and an Agent Based Simulation Model

    CERN Document Server

    Majid, Mazlina Abdul; Siebers, Peer-Olaf

    2010-01-01

    In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.

  15. Flow assignment model for quantitative analysis of diverting bulk freight from road to railway.

    Science.gov (United States)

    Liu, Chang; Lin, Boliang; Wang, Jiaxi; Xiao, Jie; Liu, Siqi; Wu, Jianping; Li, Jian

    2017-01-01

    Since railway transport possesses the advantages of high volume and low carbon emissions, diverting some freight from road to railway will help reduce the negative environmental impacts associated with transport. This paper develops a flow assignment model for quantitative analysis of diverting truck freight to railway. First, a general network covering road transportation, railway transportation, handling and transferring is established according to all the steps in the whole transportation process. Then, generalized cost functions are formulated that capture the factors shippers consider when choosing a mode and path; these functions include the congestion cost on roads and the capacity constraints of railways and freight stations. Based on the general network and generalized cost functions, a user equilibrium flow assignment model is developed to simulate the flow distribution on the network under the condition that all shippers choose their transportation mode and path independently. Since the model is nonlinear and difficult to solve directly, we linearize it with tangent lines that constitute an envelope curve. Finally, a numerical example is presented to test the model and demonstrate quantitative analysis of bulk freight modal shift between road and railway.
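    At user equilibrium, no shipper can reduce its generalized cost by unilaterally switching mode or path. A minimal sketch of this condition for one origin-destination pair, with a BPR-type congestion cost on the road link and a fixed rail cost (all rates, costs, and capacities below are hypothetical, not the paper's calibrated values):

```python
# Hypothetical sketch of a user-equilibrium mode split between road and rail,
# assuming a BPR-type congestion cost on road and a fixed rail generalized cost.
def road_cost(flow, free_cost=10.0, capacity=800.0, alpha=0.15, beta=4.0):
    """BPR congestion function: cost grows with the volume/capacity ratio."""
    return free_cost * (1.0 + alpha * (flow / capacity) ** beta)

RAIL_COST = 14.0   # assumed constant generalized cost (incl. handling/transfer)
DEMAND = 1500.0    # total freight flow to assign

# At user equilibrium either all flow uses the cheaper mode, or costs equalize.
if road_cost(DEMAND) <= RAIL_COST:
    road_flow = DEMAND
else:
    lo, hi = 0.0, DEMAND
    for _ in range(60):                 # bisection on the road flow
        mid = 0.5 * (lo + hi)
        if road_cost(mid) < RAIL_COST:
            lo = mid
        else:
            hi = mid
    road_flow = 0.5 * (lo + hi)

print(f"road: {road_flow:.1f}, rail: {DEMAND - road_flow:.1f}")
```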

  16. Growth mixture modeling as an exploratory analysis tool in longitudinal quantitative trait loci analysis.

    Science.gov (United States)

    Chang, Su-Wei; Choi, Seung Hoan; Li, Ke; Fleur, Rose Saint; Huang, Chengrui; Shen, Tong; Ahn, Kwangmi; Gordon, Derek; Kim, Wonkuk; Wu, Rongling; Mendell, Nancy R; Finch, Stephen J

    2009-12-15

    We examined the properties of growth mixture modeling in finding longitudinal quantitative trait loci in a genome-wide association study. Two software packages are commonly used in these analyses: Mplus and the SAS TRAJ procedure. We analyzed the 200 replicates of the simulated data with these programs using three tests: the likelihood-ratio test statistic, a direct test of genetic model coefficients, and the chi-square test classifying subjects based on the trajectory model's posterior Bayesian probability. The Mplus program was not effective in this application due to its computational demands. The distributions of these tests applied to genes not related to the trait were sensitive to departures from Hardy-Weinberg equilibrium. The likelihood-ratio test statistic was not usable in this application because its distribution was far from the expected asymptotic distributions when applied to markers with no genetic relation to the quantitative trait. The other two tests were satisfactory. Power was still substantial when we used markers near the gene rather than the gene itself. That is, growth mixture modeling may be useful in genome-wide association studies. For markers near the actual gene, there was somewhat greater power for the direct test of the coefficients and lesser power for the posterior Bayesian probability chi-square test.
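    As a rough illustration of the third test described above, a chi-square test can be applied to a contingency table of genotype against the trajectory class assigned from the model's posterior Bayesian probability. The sketch below uses simulated null data and is not the study's code:

```python
# Illustrative sketch (not the study's code): chi-square test of association
# between genotype and trajectory class assigned by posterior probability.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
genotype = rng.choice([0, 1, 2], size=600, p=[0.49, 0.42, 0.09])  # HWE, p=0.7
trajectory_class = rng.choice([0, 1], size=600)                   # null: no link

table = np.zeros((3, 2))
for g, c in zip(genotype, trajectory_class):
    table[g, c] += 1

chi2, pval, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={pval:.3f}")
```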

  17. Experimental Research on Quantitative Inversion Models of Suspended Sediment Concentration Using Remote Sensing Technology

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Research on quantitative models of suspended sediment concentration (SSC) using remote sensing technology is very important for understanding scouring and siltation variation in harbors and water channels. Based on laboratory study of the relationship between different suspended sediment concentrations and synchronously measured reflectance spectra, quantitative inversion models of SSC based on a single factor, band ratios and a sediment parameter were developed, providing an effective method to retrieve the SSC from satellite images. Results show that bands b1 (430-500 nm) and b3 (670-735 nm) are the optimal wavelengths for estimating lower SSC, and band b4 (780-835 nm) is the optimal wavelength for estimating higher SSC. Furthermore, the band ratio B2/B3 can better simulate the variation of lower SSC, while B4/B1 estimates the higher SSC accurately. The inversion models built on sediment parameters for higher and lower SSCs also achieve higher accuracy than the single-factor and band-ratio models.
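    Band-ratio inversion models of this kind are commonly fitted as power laws, SSC = a*(B2/B3)^b, linearized by taking logarithms. A hedged sketch with synthetic reflectance and concentration values (not the paper's data):

```python
# Hedged sketch: fitting a power-law band-ratio inversion model
# SSC = a * (B2/B3)**b by linear least squares in log space.
import numpy as np

band_ratio = np.array([1.10, 1.25, 1.40, 1.60, 1.85, 2.10])  # B2/B3 (synthetic)
ssc = np.array([12.0, 18.0, 25.0, 38.0, 60.0, 85.0])         # mg/L (synthetic)

b, log_a = np.polyfit(np.log(band_ratio), np.log(ssc), 1)
a = np.exp(log_a)
predicted = a * band_ratio ** b
r2 = 1 - np.sum((ssc - predicted) ** 2) / np.sum((ssc - ssc.mean()) ** 2)
print(f"SSC = {a:.2f} * (B2/B3)^{b:.2f}, R2 = {r2:.3f}")
```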

  18. Quantitative Imaging of Turbulent Mixing Dynamics in High-Pressure Fuel Injection to Enable Predictive Simulations of Engine Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Jonathan H. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Reacting Flows Dept.; Pickett, Lyle M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Engine Combustion Dept.; Bisson, Scott E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Remote Sensing and Energetic Materials Dept.; Patterson, Brian D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Combustion Chemistry Dept.; Ruggles, Adam J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Reacting Flows Dept.; Skeen, Scott A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Engine Combustion Dept.; Manin, Julien Luc [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Engine Combustion Dept.; Huang, Erxiong [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Reacting Flows Dept.; Cicone, Dave J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Engine Combustion Dept.; Sphicas, Panos [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Engine Combustion Dept.

    2015-09-01

    In this LDRD project, we developed a capability for quantitative high-speed imaging measurements of high-pressure fuel injection dynamics to advance understanding of turbulent mixing in transcritical flows, ignition, and flame stabilization mechanisms, and to provide essential validation data for developing predictive tools for engine combustion simulations. Advanced, fuel-efficient engine technologies rely on fuel injection into a high-pressure, high-temperature environment for mixture preparation and combustion. However, the dynamics of fuel injection are not well understood and pose significant experimental and modeling challenges. To address the need for quantitative high-speed measurements, we developed a Nd:YAG laser that provides a 5 ms burst of pulses at 100 kHz on a robust mobile platform. Using this laser, we demonstrated spatially and temporally resolved Rayleigh scattering imaging and particle image velocimetry measurements of turbulent mixing in high-pressure gas-phase flows and vaporizing sprays. Quantitative interpretation of the high-pressure measurements was advanced by reducing and correcting interferences and imaging artifacts.

  19. Quantitative property-structural relation modeling on polymeric dielectric materials

    Science.gov (United States)

    Wu, Ke

    Nowadays, polymeric materials have attracted more and more attention in dielectric applications. But searching for a material with desired properties is still largely based on trial and error. To facilitate the development of new polymeric materials, heuristic models built using the Quantitative Structure Property Relationships (QSPR) techniques can provide reliable "working solutions". In this thesis, the application of QSPR on polymeric materials is studied from two angles: descriptors and algorithms. A novel set of descriptors, called infinite chain descriptors (ICD), are developed to encode the chemical features of pure polymers. ICD is designed to eliminate the uncertainty of polymer conformations and inconsistency of molecular representation of polymers. Models for the dielectric constant, band gap, dielectric loss tangent and glass transition temperatures of organic polymers are built with high prediction accuracy. Two new algorithms, the physics-enlightened learning method (PELM) and multi-mechanism detection, are designed to deal with two typical challenges in material QSPR. PELM is a meta-algorithm that utilizes the classic physical theory as guidance to construct the candidate learning function. It shows better out-of-domain prediction accuracy compared to the classic machine learning algorithm (support vector machine). Multi-mechanism detection is built based on a cluster-weighted mixing model similar to a Gaussian mixture model. The idea is to separate the data into subsets where each subset can be modeled by a much simpler model. The case study on glass transition temperature shows that this method can provide better overall prediction accuracy even though less data is available for each subset model. In addition, the techniques developed in this work are also applied to polymer nanocomposites (PNC). PNC are new materials with outstanding dielectric properties. As a key factor in determining the dispersion state of nanoparticles in the polymer matrix

  20. Power electronics system modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control system design software, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled by mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed either as computer algorithms or as digital circuits. After describing component models and control methods, computer programs are developed for complete system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripple, and speed response. Key computer programs and simulation results are demonstrated for educational purposes.

  1. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong

    2013-01-01

    Carbonate matrix acidization extends a well's effective drainage radius by dissolving rock and forming conductive channels (wormholes) from the wellbore. Wormholing is a dynamic process that involves a balance between the acid injection rate and the reaction rate. Generally, the injection rate is well defined where injection profiles can be controlled, whereas the reaction rate can be difficult to obtain due to its complex dependency on interstitial velocity, fluid composition, rock surface properties, etc. Conventional wormhole propagation models largely ignore the impact of reaction products; when such models are implemented in a job design, significant errors in treatment fluid schedule, rate, and volume can result. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics, and it is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures used to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates fluid flow through the rock coupled with reactions. Such a validated model can serve as a base for scaling up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  2. Development of NASA's Models and Simulations Standard

    Science.gov (United States)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    Several NASA-wide actions were initiated as a result of the Space Shuttle Columbia Accident Investigation. One of these actions was to develop a standard for the development, documentation, and operation of models and simulations. Over the course of two and a half years, a team of NASA engineers representing nine of the ten NASA Centers developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with eight key factors. This paper provides a historical account of the challenges faced by, and the processes used in, this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  3. Quantitative orientation preference and susceptibility to space motion sickness simulated in a virtual reality environment.

    Science.gov (United States)

    Chen, Wei; Chao, Jian-Gang; Chen, Xue-Wen; Wang, Jin-Kun; Tan, Cheng

    2015-04-01

    Orientation preference should appear when individuals apply different weightings to spatial orientation cues. It is possible that astronauts' orientation preferences could be a potential predictor for susceptibility to space motion sickness (SMS). The present study was conducted to confirm this relationship on Earth by quantifying orientation preferences and simulating SMS in a virtual reality environment. Two tests were carried out. The first quantitatively determined orientation preference: thirty-two participants' vision and body cue preferences were determined by measuring perceptual up (PU) orientations, and the ratio of the vision and body vectors (ROVB) was used as the indicator of orientation preference. The second test visually induced motion sickness symptoms representing sensory conflicts similar to SMS in a virtual reality environment. Relationships between ROVB values and motion sickness scores were analyzed; the optimal fits were cubic functions. According to ROVB level, participants were divided into three groups - body group, vision group, and confusion group - and gender was further considered as a covariate in the analysis. Consistent differences in motion sickness scores were observed between the three groups. Thus, orientation preference had a significant relationship with susceptibility to simulated SMS symptoms. This knowledge could assist with astronaut selection and might be a useful countermeasure when developing new preflight trainings. Copyright © 2015. Published by Elsevier Inc.
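    A minimal sketch of the cubic curve-fitting step, using hypothetical ROVB values and motion sickness scores rather than the study's data:

```python
# Minimal sketch (synthetic numbers): fitting a cubic relationship between the
# ratio-of-vision-and-body vector (ROVB) and a motion sickness score.
import numpy as np

rovb = np.array([0.2, 0.5, 0.8, 1.0, 1.3, 1.7, 2.2, 2.8])   # hypothetical
sickness = np.array([14, 9, 6, 5, 7, 12, 18, 22])            # hypothetical

coeffs = np.polyfit(rovb, sickness, deg=3)   # cubic "optimal fit"
model = np.poly1d(coeffs)
print(model)                                  # fitted cubic polynomial
print("predicted score at ROVB=1.5:", model(1.5))
```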

  4. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015.

    Directory of Open Access Journals (Sweden)

    Pawel Sobkowicz

    Full Text Available We present results of an abstract, agent-based model of opinion dynamics based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions), which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long-term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third-party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be 'invaded' by a newcomer third party very quickly, while the second remains immune to such invasion.

  5. A bivariate quantitative genetic model for a linear Gaussian trait and a survival trait

    Directory of Open Access Journals (Sweden)

    Damgaard Lars

    2005-12-01

    Full Text Available Abstract With the increasing use of survival models in animal breeding to address the genetic aspects of mainly longevity of livestock but also disease traits, the need for methods to infer genetic correlations and to do multivariate evaluations of survival traits and other types of traits has become increasingly important. In this study we derived and implemented a bivariate quantitative genetic model for a linear Gaussian and a survival trait that are genetically and environmentally correlated. For the survival trait, we considered the Weibull log-normal animal frailty model. A Bayesian approach using Gibbs sampling was adopted. Model parameters were inferred from their marginal posterior distributions. The required fully conditional posterior distributions were derived and issues on implementation are discussed. The two Weibull baseline parameters were updated jointly using a Metropolis-Hastings step. The remaining model parameters with non-normalized fully conditional distributions were updated univariately using adaptive rejection sampling. Simulation results showed that the estimated marginal posterior distributions covered well and placed high density to the true parameter values used in the simulation of data. In conclusion, the proposed method allows inferring additive genetic and environmental correlations, and doing multivariate genetic evaluation of a linear Gaussian trait and a survival trait.

  6. A bivariate quantitative genetic model for a linear Gaussian trait and a survival trait.

    Science.gov (United States)

    Damgaard, Lars Holm; Korsgaard, Inge Riis

    2006-01-01

    With the increasing use of survival models in animal breeding to address the genetic aspects of mainly longevity of livestock but also disease traits, the need for methods to infer genetic correlations and to do multivariate evaluations of survival traits and other types of traits has become increasingly important. In this study we derived and implemented a bivariate quantitative genetic model for a linear Gaussian and a survival trait that are genetically and environmentally correlated. For the survival trait, we considered the Weibull log-normal animal frailty model. A Bayesian approach using Gibbs sampling was adopted. Model parameters were inferred from their marginal posterior distributions. The required fully conditional posterior distributions were derived and issues on implementation are discussed. The two Weibull baseline parameters were updated jointly using a Metropolis-Hasting step. The remaining model parameters with non-normalized fully conditional distributions were updated univariately using adaptive rejection sampling. Simulation results showed that the estimated marginal posterior distributions covered well and placed high density to the true parameter values used in the simulation of data. In conclusion, the proposed method allows inferring additive genetic and environmental correlations, and doing multivariate genetic evaluation of a linear Gaussian trait and a survival trait.
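    To illustrate the joint update both versions of this record describe, the sketch below implements a generic Metropolis-Hastings step for two Weibull baseline parameters under a toy right-censored likelihood with flat priors; it is a schematic stand-in, not the authors' sampler:

```python
# Illustrative sketch of a joint Metropolis-Hastings update for two Weibull
# baseline parameters (rho, lam), under a toy censored-data likelihood.
import numpy as np

rng = np.random.default_rng(1)

def log_post(rho, lam, times, events):
    """Toy Weibull log-likelihood with flat priors (not the paper's frailty model)."""
    if rho <= 0 or lam <= 0:
        return -np.inf
    log_h = np.log(rho) + np.log(lam) + (rho - 1) * np.log(lam * times)  # hazard
    H = (lam * times) ** rho                                             # cum. hazard
    return np.sum(events * log_h - H)

def mh_step(rho, lam, times, events, step=0.05):
    # propose jointly on the log scale to keep both parameters positive
    rho_p, lam_p = np.exp(np.log([rho, lam]) + step * rng.standard_normal(2))
    log_ratio = (log_post(rho_p, lam_p, times, events)
                 - log_post(rho, lam, times, events)
                 + np.log(rho_p * lam_p) - np.log(rho * lam))  # log-scale Jacobian
    if np.log(rng.uniform()) < log_ratio:
        return rho_p, lam_p
    return rho, lam

# toy data: survival times with ~30% censoring, then a short chain
times = rng.gamma(2.0, 1.0, size=200)
events = (rng.uniform(size=200) < 0.7).astype(float)
rho, lam = 1.0, 1.0
for _ in range(5000):
    rho, lam = mh_step(rho, lam, times, events)
print(f"rho={rho:.2f}, lam={lam:.2f}")
```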

  7. Quantitative phase-field modeling for wetting phenomena.

    Science.gov (United States)

    Badillo, Arnoldo

    2015-03-01

    A new phase-field model is developed for studying partial wetting. The introduction of a third phase representing a solid wall allows for the derivation of a new surface tension force that accounts for energy changes at the contact line. In contrast to other multi-phase-field formulations, the present model does not need the introduction of surface energies for the fluid-wall interactions. Instead, all wetting properties are included in a unique parameter known as the equilibrium contact angle θ_eq. The model requires the solution of a single elliptic phase-field equation, which, coupled to conservation laws for mass and linear momentum, admits the existence of steady and unsteady compact solutions (compactons). The representation of the wall by an additional phase field allows for the study of wetting phenomena on flat, rough, or patterned surfaces in a straightforward manner. The model contains only two free parameters: a measure of interface thickness W, and β, which is used in the definition of the mixture viscosity μ = μ_l φ_l + μ_v φ_v + β μ_l φ_w. The former controls the convergence towards the sharp interface limit and the latter the energy dissipation at the contact line. Simulations on rough surfaces show that by taking values for β higher than 1, the model can reproduce, on average, the effects of pinning events of the contact line during its dynamic motion. The model is able to capture, in good agreement with experimental observations, many physical phenomena fundamental to wetting science, such as the wetting transition on micro-structured surfaces and droplet dynamics on solid substrates.
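    The mixture viscosity definition is simple enough to state directly in code; the sketch below restates the formula with illustrative water/vapor values:

```python
# Minimal sketch of the model's mixture viscosity,
# mu = mu_l*phi_l + mu_v*phi_v + beta*mu_l*phi_w,
# where phi_l, phi_v, phi_w are liquid/vapor/wall phase fields summing to 1.
def mixture_viscosity(phi_l, phi_v, phi_w, mu_l, mu_v, beta):
    """beta > 1 adds dissipation at the contact line (pinning-like behavior)."""
    assert abs(phi_l + phi_v + phi_w - 1.0) < 1e-9
    return mu_l * phi_l + mu_v * phi_v + beta * mu_l * phi_w

# e.g. near the contact line where all three phases overlap:
print(mixture_viscosity(0.45, 0.35, 0.20, mu_l=1.0e-3, mu_v=1.8e-5, beta=3.0))
```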

  8. Modelling and Simulation of Crude Oil Dispersion

    Directory of Open Access Journals (Sweden)

    Abdulfatai JIMOH

    2006-01-01

    Full Text Available This research work was carried out to develop a model equation for the dispersion of crude oil in water. Seven different crude oils (Bonny Light, Antan Terminal, Bonny Medium, Qua Iboe Light, Brass Light Mbede, Forcados Blend and Heavy H) were used as the subject crude oils. The developed model equation in this project is given as... It was developed starting from the equation for the oil dispersion rate in water, which is given as... The developed equation was then simulated with the aid of MathCAD 2000 Professional software. The experimental and model results obtained from the simulation of the model equation were plotted on the same axes against time of dispersion. The plots revealed close agreement between the experimental and model results: the correlation coefficients and r-square values, calculated using a spreadsheet program, were both found to be unity (1.00).

  9. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  10. Incorporation of RAM techniques into simulation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, S.C. Jr.; Haire, M.J.; Schryver, J.C.

    1995-07-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model that represents the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs - upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc. - is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment, and includes failure management subnetworks. RAM information and other performance measures are collected that have impact on design requirements. Design changes are evaluated through 'what if' questions, sensitivity studies, and battle scenario changes.
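    The task-based logic described above can be sketched as a small Monte Carlo simulation in which each mission task is interrupted by exponentially distributed failures and resumed after repair. Task names, durations, and rates below are illustrative, not the AFAS/FARV study's values:

```python
# Hedged sketch: a task-based mission simulation in which each task can fail
# (exponential failure times) and must be repaired before resuming.
import random

random.seed(42)
TASKS = [("upload", 2.0), ("travel_to_AFAS", 1.5), ("refuel", 0.5),
         ("tactical_move", 1.0), ("return_to_resupply", 1.5)]
MTBF, MTTR = 20.0, 0.8   # mean time between failures / mean time to repair (h)

def mission_time():
    clock = 0.0
    for _, duration in TASKS:
        remaining = duration
        while remaining > 0:
            ttf = random.expovariate(1.0 / MTBF)   # time to next failure
            if ttf >= remaining:
                clock += remaining
                remaining = 0.0
            else:                                   # failure mid-task: repair, resume
                clock += ttf + random.expovariate(1.0 / MTTR)
                remaining -= ttf
    return clock

times = [mission_time() for _ in range(10_000)]
print("mean mission time (h):", sum(times) / len(times))
```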

  11. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion and thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminium coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg-system are calculated with the Sauer-Freise method for the first time. To solve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
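    The dissertation's numerical model is not reproduced here, but the underlying computation, 1D interdiffusion with a concentration-dependent coefficient D(c) between two semi-infinite reservoirs, can be sketched with an explicit finite-difference scheme. All values are illustrative placeholders, not the fitted Al-Mg coefficients:

```python
# Sketch: explicit finite-difference solution of 1D interdiffusion with a
# concentration-dependent diffusion coefficient D(c) (values are illustrative).
import numpy as np

nx, dx, dt, steps = 200, 1e-7, 0.05, 20_000       # 20 um domain, t_end = 1000 s
c = np.zeros(nx); c[:nx // 2] = 1.0               # Al-rich half vs Mg-rich half

def D(c):                                          # hypothetical D(c), m^2/s
    return 1e-14 * (1.0 + 4.0 * c * (1.0 - c))    # enhanced in the mixed zone

for _ in range(steps):
    Dm = 0.5 * (D(c[1:]) + D(c[:-1]))             # D at cell interfaces
    flux = -Dm * np.diff(c) / dx                   # Fick's first law
    c[1:-1] -= dt * np.diff(flux) / dx             # conservative update;
                                                   # end cells fixed = reservoirs
print("concentrations around the interface:", np.round(c[95:105], 3))
```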

  12. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

    Full Text Available Currently, the intensity of enterprise competition has increased as a result of a greater diversity of customer needs and the persistence of a long-term recession. The results of competition are becoming severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives for achieving operational excellence, and its importance has become ever more apparent. However, few companies effectively manage their supply chains, and the greatest difficulty is in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods to increase supply chain visibility are still ambiguous. Based on the extant research on supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score from Six Sigma methodology to evaluate and improve the level of supply chain visibility.
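    A process-capability Z score of the kind proposed can be computed directly from a sample of a visibility-related performance measure. A minimal sketch with hypothetical lead-time data:

```python
# Minimal sketch of a Six Sigma style process-capability Z score, the kind of
# quantification the paper proposes for supply chain visibility metrics.
import statistics

lead_times = [4.1, 3.8, 4.4, 5.0, 4.2, 3.9, 4.6, 4.3]   # hypothetical days
USL = 6.0                                                # upper spec limit

mu = statistics.mean(lead_times)
sigma = statistics.stdev(lead_times)
z_score = (USL - mu) / sigma                             # capability in sigmas
print(f"Z = {z_score:.2f}")
```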

  13. An integrated qualitative and quantitative modeling framework for computer‐assisted HAZOP studies

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Hu, Jinqiu

    2014-01-01

    and validated on a case study concerning a three‐phase separation process. The multilevel flow modeling (MFM) methodology is used to represent the plant goals and functions. First, means‐end analysis is used to identify and formulate the intention of the process design in terms of components, functions...... safety critical operations, its causes and consequences. The outcome is a qualitative hazard analysis of selected process deviations from normal operations and their consequences as input to a traditional HAZOP table. The list of unacceptable high risk deviations identified by the qualitative HAZOP...... analysis is used as input for rigorous analysis and evaluation by the quantitative analysis part of the framework. To this end, dynamic first‐principles modeling is used to simulate the system behavior and thereby complement the results of the qualitative analysis part. The practical framework for computer...

  14. A system for quantitative morphological measurement and electronic modelling of neurons: three-dimensional reconstruction.

    Science.gov (United States)

    Stockley, E W; Cole, H M; Brown, A D; Wheal, H V

    1993-04-01

    A system for accurately reconstructing neurones from optical sections taken at high magnification is described. Cells are digitised on a 68000-based microcomputer to form a database consisting of a series of linked nodes each consisting of x, y, z coordinates and an estimate of dendritic diameter. This database is used to generate three-dimensional (3-D) displays of the neurone and allows quantitative analysis of the cell volume, surface area and dendritic length. Images of the cell can be manipulated locally or transferred to an IBM 3090 mainframe where a wireframe model can be displayed on an IBM 5080 graphics terminal and rotated interactively in real time, allowing visualisation of the cell from all angles. Space-filling models can also be produced. Reconstructions can also provide morphological data for passive electrical simulations of hippocampal pyramidal cells.
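    Given a node database of the kind described, linked nodes carrying x, y, z coordinates and a dendritic diameter estimate, the quantitative measures mentioned (dendritic length, surface area, and volume) can be computed by treating each parent-child link as a conical frustum. A small sketch with a hypothetical four-node dendrite:

```python
# Illustrative sketch: morphometry from a linked-node reconstruction, where each
# node holds (x, y, z, diameter, parent_index), as in the described database.
import math

nodes = [(0, 0, 0, 4.0, None),    # a tiny hypothetical dendrite
         (5, 0, 0, 3.0, 0),
         (9, 3, 0, 2.0, 1),
         (9, -4, 1, 1.5, 1)]

length = area = volume = 0.0
for x, y, z, d, parent in nodes:
    if parent is None:
        continue
    px, py, pz, pd, _ = nodes[parent]
    h = math.dist((x, y, z), (px, py, pz))
    r1, r2 = pd / 2, d / 2
    length += h
    area += math.pi * (r1 + r2) * math.hypot(h, r1 - r2)   # frustum lateral area
    volume += math.pi * h * (r1**2 + r1 * r2 + r2**2) / 3  # frustum volume

print(f"length={length:.2f}, surface area={area:.2f}, volume={volume:.2f}")
```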

  15. Formal modeling and quantitative evaluation for information system survivability based on PEPA

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Hui-qiang; ZHAO Guo-sheng

    2008-01-01

    Survivability should be considered beyond security for information systems. To assess system survivability accurately, for improvement, a formal modeling and analysis method based on stochastic process algebra is proposed in this article. By abstracting the interactive behaviors between intruders and the information system, a transfer graph of system states oriented to survivability is constructed. On that basis, parameters are defined and system behaviors are characterized precisely with performance evaluation process algebra (PEPA), simultaneously considering the influence of different attack modes. Ultimately, the formal model for survivability is established and quantitative analysis results are obtained with the PEPA Workbench tool. Simulation experiments show the effectiveness and feasibility of the developed method, which can help direct the design of survivable systems.

  16. Quantitative modeling of Cerenkov light production efficiency from medical radionuclides.

    Directory of Open Access Journals (Sweden)

    Bradley J Beattie

    Full Text Available There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information, we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high-sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use.
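    The production-efficiency side of such models follows from the Frank-Tamm relation; the sketch below gives the photon yield per unit path length for a β particle in water over a visible wavelength band. This is the textbook formula only, not the paper's full transport model:

```python
# Hedged sketch: Cerenkov photon yield per unit path length from Frank-Tamm,
# dN/dx = 2*pi*alpha*(1 - 1/(beta^2 n^2))*(1/lam1 - 1/lam2).
import math

ALPHA = 1 / 137.036          # fine-structure constant
N_WATER = 1.33               # refractive index of water
LAM1, LAM2 = 400e-9, 700e-9  # wavelength band (m)

def photons_per_metre(kinetic_mev, rest_mev=0.511):
    gamma = 1 + kinetic_mev / rest_mev
    beta = math.sqrt(1 - 1 / gamma**2)
    if beta * N_WATER <= 1:
        return 0.0          # below the Cerenkov threshold: no light produced
    return (2 * math.pi * ALPHA * (1 - 1 / (beta**2 * N_WATER**2))
            * (1 / LAM1 - 1 / LAM2))

print(photons_per_metre(0.1))   # below threshold in water -> 0
print(photons_per_metre(1.0))   # well above threshold
```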

  17. Testing turbulent closure models with convection simulations

    CERN Document Server

    Snellman, J E; Mantere, M J; Rheinhardt, M; Dintrans, B

    2012-01-01

    Aims: To compare simple analytical closure models of turbulent Boussinesq convection for stellar applications with direct three-dimensional simulations both in homogeneous and inhomogeneous (bounded) setups. Methods: We use simple analytical closure models to compute the fluxes of angular momentum and heat as a function of rotation rate measured by the Taylor number. We also investigate cases with varying angles between the angular velocity and gravity vectors, corresponding to locating the computational domain at different latitudes ranging from the pole to the equator of the star. We perform three-dimensional numerical simulations in the same parameter regimes for comparison. The free parameters appearing in the closure models are calibrated by two fit methods using simulation data. Unique determination of the closure parameters is possible only in the non-rotating case and when the system is placed at the pole. In the other cases the fit procedures yield somewhat differing results. The quality of the closu...

  18. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  19. Comparison of recorded rainfall with quantitative precipitation forecast in a rainfall-runoff simulation for the Langat River Basin, Malaysia

    Science.gov (United States)

    Billa, Lawal; Assilzadeh, Hamid; Mansor, Shattri; Mahmud, Ahmed; Ghazali, Abdul

    2011-09-01

    Observed rainfall is used for runoff modeling in flood forecasting where possible; however, in cases where the response time of the watershed is too short for flood warning activities, a deterministic quantitative precipitation forecast (QPF) can be used. This is based on a limited-area meteorological model and can provide a forecasting horizon on the order of six hours or less. This study applies the results of a previously developed QPF based on a 1D cloud model using hourly NOAA-AVHRR (Advanced Very High Resolution Radiometer) and GMS (Geostationary Meteorological Satellite) datasets. Rainfall intensity values in the range of 3-12 mm/hr were extracted from these datasets based on the relation between cloud top temperature (CTT), cloud reflectance (CTR) and cloud height (CTH) using defined thresholds. The QPF, prepared for the rainstorm event of 27 September to 8 October 2000, was tested for rainfall runoff on the Langat River Basin, Malaysia, using a suitable NAM rainfall-runoff model. The basin's responses to the rainfall-runoff simulation driven by the QPF estimate and by the recorded observed rainfall are compared here on the basis of the corresponding discharge hydrographs. The comparison of the QPF and recorded rainfall showed R2 = 0.9028 for the entire basin. The runoff hydrograph for the recorded rainfall in the Kajang sub-catchment showed R2 = 0.9263 between the observed and the simulated, while that of the QPF rainfall gave R2 = 0.819. This similarity in runoff suggests a high level of accuracy in the improved QPF, and significant improvement of flood forecasting can be achieved through 'nowcasting', thus increasing the response time for flood early warnings.
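    The R2 values quoted above are coefficients of determination between observed and simulated discharge hydrographs, which can be sketched as follows (synthetic flows, not the Langat Basin data):

```python
# Small sketch: coefficient of determination (R2) between observed and
# simulated discharge hydrographs, the comparison metric quoted above.
import numpy as np

observed = np.array([12.0, 20.0, 45.0, 80.0, 65.0, 40.0, 22.0])   # m^3/s, synthetic
simulated = np.array([10.5, 22.0, 42.0, 84.0, 61.0, 43.0, 24.0])

ss_res = np.sum((observed - simulated) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R2 = {r2:.4f}")
```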

  20. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies.
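    The core idea, expanding the genetic effect function over variant positions in a smooth basis and testing it against a null model with an F statistic, can be sketched on simulated data as follows; this is a schematic illustration, not the authors' software:

```python
# Hedged sketch: functional linear model for a quantitative trait, with the
# genetic effect function beta(t) expanded in a small polynomial basis over
# variant positions, tested by a nested-model F statistic. Data are simulated.
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(3)
n, m, k = 500, 30, 4                       # subjects, variants, basis functions
pos = np.sort(rng.uniform(0, 1, m))        # variant positions scaled to [0, 1]
G = rng.binomial(2, 0.2, size=(n, m)).astype(float)
y = G[:, :5].sum(axis=1) * 0.3 + rng.standard_normal(n)   # trait with signal

B = np.vander(pos, k, increasing=True)     # basis evaluated at variant positions
X0 = np.ones((n, 1))                       # null model: intercept only
X1 = np.hstack([X0, G @ B])                # full model: functional genetic effect

def rss(X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

rss0, rss1 = rss(X0), rss(X1)
df1, df2 = k, n - 1 - k
F = ((rss0 - rss1) / df1) / (rss1 / df2)
print(f"F = {F:.2f}, p = {f_dist.sf(F, df1, df2):.2e}")
```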

  1. Modeling and simulation with operator scaling

    CERN Document Server

    Cohen, Serge; Rosinski, Jan

    2009-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical applications. A classification of operator stable Levy processes in two dimensions is provided according to their exponents and symmetry groups. We conclude with some remarks and extensions to general operator self-similar processes.

  2. Hemispherical sky simulator for daylighting model studies

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1981-07-01

    The design of a 24-foot-diameter hemispherical sky simulator recently completed at LBL is described. The goal was to produce a facility in which large models could be tested; which was suitable for research, teaching, and design; which could provide a uniform sky, an overcast sky, and several clear-sky luminance distributions, as well as accommodating an artificial sun. Initial operating experience with the facility is described, the sky simulator capabilities are reviewed, and its strengths and weaknesses relative to outdoor modeling tests are discussed.

  3. Wind Shear Target Echo Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Xiaoyang Liu

    2015-01-01

    Full Text Available Wind shear, a sudden change in wind speed or direction, is a dangerous atmospheric phenomenon in aviation. In order to analyze the influence of wind shear on aircraft, this paper proposes a mathematical model of the point-target rain echo and the weather-target signal echo based on the Doppler effect. A wind field model is developed, and the antenna model is studied using Bessel functions. The spectrum distributions of symmetric and asymmetric wind fields are investigated with the proposed mathematical model. The simulation results are consistent with the radial velocity component and confirm the correctness of the established antenna model.

  4. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis on this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  5. Qualitative analysis and quantitative simulation on Yin-Huang water salinization mechanism in Bei-Da-Gang Reservoir

    Institute of Scientific and Technical Information of China (English)

    ZHAO Wen-yu; WANG Qi-shan; WU Li-bo; ZHANG Bin; WANG Xiao-qin

    2005-01-01

    Yellow River water transfer to Tianjin has been important in solving the city's water shortage, facilitating economic development and social progress for many years. Fresh water drawn from the Yellow River (i.e., Yin-Huang water) becomes progressively saltier while stored in the Bei-Da-Gang reservoir. We qualitatively analyze the water salinization mechanism based on mass transfer theory; the main factors are salinity transfer from saline soil, evaporative concentration, and agitation by wind. A simulative experimental pond and an evaporation pond were built beside the Bei-Da-Gang reservoir to quantitatively investigate the salinization based on water and solute balances in the simulative pond. 80% of the increase in [Cl-] is due to salinity transfer from the saline soil and the other 20% is due to evaporative concentration, so the former is the most important factor. We found that the salinization of Yin-Huang water can be described with a zero-dimensional linear model.

  6. Battery thermal models for hybrid vehicle simulations

    Science.gov (United States)

    Pesaran, Ahmad A.

    This paper summarizes battery thermal modeling capabilities for: (1) an advanced vehicle simulator (ADVISOR); and (2) battery module and pack thermal design. The National Renewable Energy Laboratory's (NREL's) ADVISOR is developed in the Matlab/Simulink environment. There are several battery models in ADVISOR for various chemistry types. Each of these models requires a thermal model to predict the temperature change that could affect battery performance parameters, such as resistance, capacity, and state of charge. A lumped-capacitance battery thermal model that incorporates the ADVISOR battery performance models was developed in the Matlab/Simulink environment. For thermal evaluation and design of battery modules and packs, NREL has been using various computer-aided engineering tools, including commercial finite element analysis software. This paper discusses the ADVISOR battery thermal model and its results, along with the results of finite element modeling that were presented at the workshop on "Development of Advanced Battery Engineering Models" in August 2001.
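    A lumped-capacitance battery thermal model of the type summarized here reduces to a single energy balance: heat generated by ohmic losses minus heat rejected by convection. A minimal sketch with assumed parameter values (not NREL's):

```python
# Minimal sketch of a lumped-capacitance battery thermal model: one thermal mass
# heated by I^2*R losses and cooled by convection. All parameters are assumed.
m_c = 900.0        # module heat capacity, J/K (assumed)
hA = 1.5           # convective conductance to cooling air, W/K (assumed)
R = 0.01           # effective internal resistance, ohm (assumed)
T_air = 25.0       # cooling air temperature, deg C
dt = 1.0           # time step, s

T = 25.0
for t in range(3600):                                  # one hour of a crude cycle
    current = 80.0 if (t // 300) % 2 == 0 else 20.0    # alternating load, A
    q_gen = current**2 * R                             # ohmic heat generation, W
    q_out = hA * (T - T_air)                           # convective rejection, W
    T += dt * (q_gen - q_out) / m_c                    # forward-Euler energy balance

print(f"module temperature after 1 h: {T:.1f} deg C")
```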

  7. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation, and analysis capability was set up and operated by Aerospace to provide qualitative and quantitative magnetic assessments of magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields and particles measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation, and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation, and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  8. OPSMODEL, an on-orbit operations simulation modeling tool for Space Station

    Science.gov (United States)

    Davis, William T.; Wright, Robert L.

    1988-01-01

    The 'OPSMODEL' operations-analysis and planning tool simulates on-orbit crew operations for the NASA Space Station, furnishing a quantitative measure of the effectiveness of crew activities in various alternative Station configurations while supporting engineering and cost analyses. OPSMODEL is entirely data-driven; the top-down modeling structure of the software allows the user to control both the content and the complexity level of model definition during data base population. Illustrative simulation samples are given.

  9. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for improving it through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented with emphasis on information and material flow, together with a methodology for implementing a KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example showing that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created that serves as a basis for a variety of experiments conducted within a short period of time, resulting in production process optimization.

  10. Modeling the Effect of Polychromatic Light in Quantitative Absorbance Spectroscopy

    Science.gov (United States)

    Smith, Rachel; Cantrell, Kevin

    2007-01-01

    A laboratory experiment is conducted to give students practical experience with the principles of electronic absorbance spectroscopy. This straightforward approach creates a powerful tool for exploring many aspects of quantitative absorbance spectroscopy.
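    The effect in the title can be stated compactly: with polychromatic light, the measured absorbance comes from the band-averaged transmittance and falls below the ideal Beer-Lambert line whenever the molar absorptivity varies across the instrument bandpass. A hedged sketch with made-up spectral values:

```python
# Hedged sketch of the modeled effect: with a polychromatic source, the measured
# absorbance A = -log10( sum(I*10^(-eps*b*c)) / sum(I) ) deviates from the ideal
# Beer-Lambert line when eps varies across the bandpass. Values are synthetic.
import numpy as np

wavelengths = np.linspace(490, 510, 21)                 # nm, instrument bandpass
I = np.exp(-0.5 * ((wavelengths - 500) / 5) ** 2)       # source intensity profile
eps = 1000 + 40 * (wavelengths - 500)                   # L/(mol*cm), lambda-dependent
b = 1.0                                                 # path length, cm

for c in [1e-4, 5e-4, 1e-3]:                            # mol/L
    T = np.sum(I * 10.0 ** (-eps * b * c)) / np.sum(I)  # band-averaged transmittance
    print(f"c={c:.0e}: A_measured={-np.log10(T):.3f}  A_ideal={1000*b*c:.3f}")
```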

  11. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  12. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains. The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...

  13. Quantitative computational models of molecular self-assembly in systems biology

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-06-01

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  14. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early-stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functions to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The TRANSFORM tool, based on the Modelica programming language, (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  15. Workshop on quantitative dynamic stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  16. Herd immunity and pneumococcal conjugate vaccine: a quantitative model.

    Science.gov (United States)

    Haber, Michael; Barskey, Albert; Baughman, Wendy; Barker, Lawrence; Whitney, Cynthia G; Shaw, Kate M; Orenstein, Walter; Stephens, David S

    2007-07-20

    Invasive pneumococcal disease in older children and adults declined markedly after introduction in 2000 of the pneumococcal conjugate vaccine for young children. An empirical quantitative model was developed to estimate the herd (indirect) effects on the incidence of invasive disease among persons ≥5 years of age induced by vaccination of young children with 1, 2, or ≥3 doses of the pneumococcal conjugate vaccine, Prevnar (PCV7), containing serotypes 4, 6B, 9V, 14, 18C, 19F and 23F. From 1994 to 2003, cases of invasive pneumococcal disease were prospectively identified in Georgia Health District-3 (eight metropolitan Atlanta counties) by Active Bacterial Core surveillance (ABCs). From 2000 to 2003, vaccine coverage levels of PCV7 for children aged 19-35 months in Fulton and DeKalb counties (of Atlanta) were estimated from the National Immunization Survey (NIS). Based on incidence data and the estimated average number of doses received by 15 months of age, a Poisson regression model was fit, describing the trend in invasive pneumococcal disease in groups not targeted for vaccination (i.e., adults and older children) before and after the introduction of PCV7. Highly significant declines in all the serotypes contained in PCV7 in all unvaccinated populations (5-19, 20-39, 40-64, and >64 years) from 2000 to 2003 were found under the model. No significant change in incidence was seen from 1994 to 1999, indicating rates were stable prior to vaccine introduction. Among unvaccinated persons 5+ years of age, the modeled incidence of disease caused by PCV7 serotypes as a group dropped 38.4%, 62.0%, and 76.6% for 1, 2, and 3 doses, respectively, received on average by the population of children by the time they are 15 months of age. Incidence of serotypes 14 and 23F had consistent significant declines in all unvaccinated age groups. In contrast, the herd immunity effects on vaccine-related serotype 6A incidence were inconsistent. Increasing trends of non...
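
    A sketch of the kind of Poisson regression described above: cases in an unvaccinated age group are modelled against the average number of doses received by young children, with population size as an exposure offset. All numbers below are synthetic illustration values, not the ABCs surveillance data.

```python
# Poisson regression of incidence on average doses, with a log-population
# offset, using statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1994, 2004)
avg_doses = np.where(years < 2000, 0.0, (years - 1999) * 0.7)  # coverage ramps up
population = np.full(years.size, 2_500_000)
true_rate = 40e-6 * np.exp(-0.25 * avg_doses)   # incidence per person-year
cases = rng.poisson(true_rate * population)

X = sm.add_constant(avg_doses)
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(population)).fit()
# exp(slope) is the multiplicative change in incidence per average dose.
print("rate ratio per dose:", np.exp(fit.params[1]))
```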

  17. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge; in this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov Chain Monte Carlo sampling algorithm is adopted, sampling from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including data from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
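
    The reservoir idea can be sketched as follows: each candidate edge is replicated in proportion to its prior likelihood of functional linkage, and MCMC proposals draw from the reservoir. The scores below are placeholders for the Naïve Bayes combination of co-citation and GO similarity evidence, and the accept/reject step of a full MCMC is omitted.

```python
# Candidate-edge reservoir: copy number proportional to prior likelihood.
import random

prior_score = {("A", "B"): 0.9, ("A", "C"): 0.5, ("B", "C"): 0.2,
               ("A", "D"): 0.1, ("B", "D"): 0.1, ("C", "D"): 0.6}

reservoir = []
for edge, score in prior_score.items():
    reservoir.extend([edge] * max(1, round(score * 10)))

def propose(current_edges):
    """One MCMC proposal: toggle an edge drawn from the reservoir."""
    edge = random.choice(reservoir)          # high-prior edges drawn more often
    new_edges = set(current_edges)
    new_edges.symmetric_difference_update({edge})  # add if absent, else drop
    return new_edges

network = set()
for _ in range(5):
    network = propose(network)
    print(sorted(network))
```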

  18. EXACT SIMULATION OF A BOOLEAN MODEL

    Directory of Open Access Journals (Sweden)

    Christian Lantuéjoul

    2013-06-01

    Full Text Available A Boolean model is a union of independent objects (compact random subsets) located at Poisson points. Two algorithms are proposed for simulating a Boolean model in a bounded domain. The first one applies only to stationary models. It generates the objects prior to their Poisson locations. Two examples illustrate its applicability. The second algorithm applies to stationary and non-stationary models. It generates the Poisson points prior to the objects. Its practical difficulties of implementation are discussed. Both algorithms are based on importance sampling techniques, and the generated objects are weighted.
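
    A sketch of the points-then-objects ordering (the second algorithm's order, without its importance-sampling weights) for a stationary Boolean model of random discs; the sampling window is dilated by the maximal radius so that discs centred just outside the target window still contribute. All parameters are illustrative.

```python
# Stationary Boolean model of random discs in [0,1]^2.
import numpy as np

rng = np.random.default_rng(0)
intensity = 50.0                 # Poisson intensity (points per unit area)
r_max = 0.1                      # upper bound on disc radius
lo, hi = 0.0, 1.0                # target window

area = (hi - lo + 2 * r_max) ** 2
n = rng.poisson(intensity * area)
centres = rng.uniform(lo - r_max, hi + r_max, size=(n, 2))
radii = rng.uniform(0.02, r_max, size=n)

# Estimate the volume fraction of the union on a pixel grid.
xs, ys = np.meshgrid(np.linspace(lo, hi, 200), np.linspace(lo, hi, 200))
d2 = (xs[..., None] - centres[:, 0]) ** 2 + (ys[..., None] - centres[:, 1]) ** 2
mask = np.any(d2 <= radii ** 2, axis=-1)
print("estimated volume fraction:", mask.mean())
```

    The theoretical volume fraction of a Boolean disc model, 1 - exp(-intensity * E[pi R^2]), gives a quick sanity check on the estimate.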

  19. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  20. Modeling and Simulation of Nuclear Fuel Materials

    Energy Technology Data Exchange (ETDEWEB)

    Devanathan, Ramaswami; Van Brutzel, Laurent; Chartier, Alan; Gueneau, Christine; Mattsson, Ann E.; Tikare, Veena; Bartel, Timothy; Besmann, T. M.; Stan, Marius; Van Uffelen, Paul

    2010-10-01

    We review the state of modeling and simulation of nuclear fuels with emphasis on the most widely used nuclear fuel, UO2. The hierarchical scheme presented represents a science-based approach to modeling nuclear fuels by progressively passing information in several stages from ab initio to continuum levels. Such an approach is essential to overcome the challenges posed by radioactive materials handling, experimental limitations in modeling extreme conditions and accident scenarios, and the small time and distance scales of fundamental defect processes. When used in conjunction with experimental validation, this multiscale modeling scheme can provide valuable guidance to development of fuel for advanced reactors to meet rising global energy demand.

  1. Simulation modeling of health care policy.

    Science.gov (United States)

    Glied, Sherry; Tilipman, Nicholas

    2010-01-01

    Simulation modeling of health reform is a standard part of policy development and, in the United States, a required element in enacting health reform legislation. Modelers use three types of basic structures to build models of the health system: microsimulation, individual choice, and cell-based. These frameworks are filled in with data on baseline characteristics of the system and parameters describing individual behavior. Available data on baseline characteristics are imprecise, and estimates of key empirical parameters vary widely. A comparison of estimated and realized consequences of several health reform proposals suggests that models provided reasonably accurate estimates, with confidence bounds of approximately 30%.

  2. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with the building area and number of floors.
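
    A sketch of the final aggregation implied above: expected annual loss per building as the sum over intensity classes of occurrence probability x vulnerability(flow depth) x building value, with value = unit market value x floor area x number of floors. All numbers are illustrative, not the OMI data, and the vulnerability curve is a toy stand-in for the literature curves.

```python
# Expected annual loss for one building over ten flow-depth classes.
import numpy as np

depth_m = np.linspace(0.25, 2.5, 10)        # intensity (flow depth) classes
p_annual = np.geomspace(1e-2, 1e-4, 10)     # annual occurrence probabilities

def vulnerability(depth):
    """Toy vulnerability curve: damage fraction rising with flow depth."""
    return np.clip(depth / 3.0, 0.0, 1.0) ** 1.5

unit_value = 1800.0   # EUR per square metre (cadastral-zone market value)
floor_area = 120.0    # square metres per floor
n_floors = 3
building_value = unit_value * floor_area * n_floors

expected_annual_loss = np.sum(p_annual * vulnerability(depth_m) * building_value)
print(f"expected annual loss: EUR {expected_annual_loss:,.0f}")
```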

  3. Modeling and simulation of epidemic spread

    DEFF Research Database (Denmark)

    Shatnawi, Maad; Lazarova-Molnar, Sanja; Zaki, Nazar

    2013-01-01

    ...and control such epidemics. This paper presents an overview of epidemic spread modeling and simulation, and summarizes the main technical challenges in this field. It further investigates the most relevant recent approaches carried out towards this perspective and provides a comparison and classification...

  4. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master's project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...

  5. Modeling and Simulating Virtual Anatomical Humans

    NARCIS (Netherlands)

    Madehkhaksar, Forough; Luo, Zhiping; Pronost, Nicolas; Egges, Arjan

    2014-01-01

    This chapter presents human musculoskeletal modeling and simulation as a challenging field that lies between biomechanics and computer animation. One of the main goals of computer animation research is to develop algorithms and systems that produce plausible motion. On the other hand, the main challenge...

  6. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    Presentation fragment (garbled in extraction); recoverable points: quantify performance (competency-based); simulate before practice (digital libraries); classic education and examination contrasted with cost figures for actor patients ($250,000-$400,000/yr) versus digital libraries or synthetic tissue models (subscription vs. up-front costs).

  7. Simulation Versus Models: Which One and When?

    Science.gov (United States)

    Dorn, William S.

    1975-01-01

    Describes two types of computer-based experiments: simulation (which assumes no student knowledge of the workings of the computer program) is recommended for experiments aimed at inductive reasoning; and modeling (which assumes student understanding of the computer program) is recommended for deductive processes. (MLH)

  8. Love Kills: Simulations in Penna Ageing Model

    Science.gov (United States)

    Stauffer, Dietrich; Cebrat, Stanisław; Penna, T. J. P.; Sousa, A. O.

    The standard Penna ageing model with sexual reproduction is enlarged by adding additional bit-strings for love: Marriage happens only if the male love strings are sufficiently different from the female ones. We simulate at what level of required difference the population dies out.

  9. Inverse modeling for Large-Eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.

    1998-01-01

    Approximate higher-order polynomial inversion of the top-hat filter is developed, with which the turbulent stress tensor in Large-Eddy Simulation can be consistently represented using the filtered field. Generalized (mixed) similarity models are proposed which improved the agreement with the kinetic...

  10. Microdata Simulation Modeling After Twenty Years.

    Science.gov (United States)

    Haveman, Robert H.

    1986-01-01

    This article describes the method and the development of microdata simulation modeling over the past two decades. After tracing a brief history of this evaluation method, its problems and prospects are assessed. The effects of this research method on the development of the social sciences are examined. (JAZ)

  11. Simulation Modeling on the Macintosh using STELLA.

    Science.gov (United States)

    Costanza, Robert

    1987-01-01

    Describes a new software package for the Apple Macintosh computer which can be used to create elaborate simulation models in a fraction of the time usually required, without using a programming language. Illustrates the use of the software with an example relating to water usage. (TW)

  12. Simulation Modeling of Radio Direction Finding Results

    Directory of Open Access Journals (Sweden)

    K. Pelikan

    1994-12-01

    Full Text Available It is sometimes difficult to determine analytically the error probabilities of direction finding results when evaluating algorithms of practical interest. Probabilistic simulation models are described in this paper that can be used to study the error performance of new direction finding systems or of geographical modifications to existing configurations.

  13. A Prison/Parole System Simulation Model,

    Science.gov (United States)

    ...parole system on future prison and parole populations. A simulation model is presented, viewing a prison/parole system as a feedback process for criminal offenders. Transitions among the states in which an offender might be located (imprisoned, paroled, and discharged) are assumed to be in accordance with a discrete-time semi-Markov process. Projected prison and parole populations for sample data and applications of the model are discussed. (Author)
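
    A minimal sketch of the projection idea, simplified from a semi-Markov to a plain discrete-time Markov chain; the yearly transition probabilities among the three states are invented for illustration.

```python
# Project prison/parole populations forward with a Markov transition matrix.
import numpy as np

#                to: prison  parole  discharged
P = np.array([[0.60, 0.35, 0.05],   # from prison
              [0.20, 0.55, 0.25],   # from parole (revocation feeds back)
              [0.05, 0.00, 0.95]])  # from discharged (some reoffend)

pop = np.array([10_000.0, 6_000.0, 0.0])  # initial head counts
for year in range(1, 11):
    pop = pop @ P                          # one yearly transition
    print(f"year {year:2d}: prison={pop[0]:8.0f}  parole={pop[1]:8.0f}")
```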

  14. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    Science.gov (United States)

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment.
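
    As a small illustration of the quantitative criticality factor named above, the sketch below computes the coefficient of variation (CV) of per-tablet coating mass from synthetic simulation output and maps it to a coarse risk ranking; the CV thresholds are invented for illustration, not taken from the paper or ICH Q9.

```python
# CV of coating mass as a quantitative risk-ranking factor.
import numpy as np

rng = np.random.default_rng(1)
coating_mass_mg = rng.normal(loc=12.0, scale=0.9, size=5000)  # per tablet

cv = coating_mass_mg.std(ddof=1) / coating_mass_mg.mean()
ranking = "low" if cv < 0.05 else "medium" if cv < 0.10 else "high"
print(f"CV = {cv:.3f} -> risk ranking: {ranking}")
```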

  15. Understanding responder neurobiology in schizophrenia using a quantitative systems pharmacology model: application to iloperidone.

    Science.gov (United States)

    Geerts, Hugo; Roberts, Patrick; Spiros, Athan; Potkin, Steven

    2015-04-01

    The concept of targeted therapies remains a holy grail for the pharmaceutical drug industry for identifying responder populations or new drug targets. Here we provide quantitative systems pharmacology as an alternative to the more traditional approach of retrospective responder pharmacogenomics analysis, and applied this to the case of iloperidone in schizophrenia. This approach implements the actual neurophysiological effect of genotypes in a computer-based, biophysically realistic model of human neuronal circuits, is parameterized with human imaging and pathology, and is calibrated by clinical data. We keep the drug pharmacology constant, but allow the biological model coupling values to fluctuate in a restricted range around their calibrated values, thereby simulating random genetic mutations and representing variability in patient response. Using hypothesis-free Design of Experiments methods, the dopamine D4 receptor-AMPA (alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptor coupling in cortical neurons was found to drive the beneficial effect of iloperidone, likely corresponding to the rs2513265 polymorphism upstream of the GRIA4 gene identified in a traditional pharmacogenomics analysis. The serotonin 5-HT3 receptor-mediated effect on interneuron gamma-aminobutyric acid conductance was identified as the process that moderately drove the differentiation of iloperidone versus ziprasidone. This paper suggests that reverse-engineered quantitative systems pharmacology is a powerful alternative tool to characterize the underlying neurobiology of a responder population and possibly identify new targets. © The Author(s) 2015.

  16. Quantitative evaluation and modeling of two-dimensional neovascular network complexity: the surface fractal dimension

    Directory of Open Access Journals (Sweden)

    Franceschini Barbara

    2005-02-01

    Full Text Available Abstract Background Modeling the complex development and growth of tumor angiogenesis using mathematics and biological data is a burgeoning area of cancer research. Architectural complexity is the main feature of every anatomical system, including organs, tissues, cells and sub-cellular entities. The vascular system is a complex network whose geometrical characteristics cannot be properly defined using the principles of Euclidean geometry, which is only capable of interpreting regular and smooth objects that are almost impossible to find in Nature. However, fractal geometry is a more powerful means of quantifying the spatial complexity of real objects. Methods This paper introduces the surface fractal dimension (Ds) as a numerical index of the two-dimensional (2-D) geometrical complexity of tumor vascular networks, and their behavior during computer-simulated changes in vessel density and distribution. Results We show that Ds significantly depends on the number of vessels and their pattern of distribution. This demonstrates that the quantitative evaluation of the 2-D geometrical complexity of tumor vascular systems can be useful not only to measure its complex architecture, but also to model its development and growth. Conclusions Studying the fractal properties of neovascularity induces reflections upon the real significance of the complex form of branched anatomical structures, in an attempt to define more appropriate methods of describing them quantitatively. This knowledge can be used to predict the aggressiveness of malignant tumors and design compounds that can halt the process of angiogenesis and influence tumor growth.
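
    Although the abstract does not spell out the algorithm, surface fractal dimensions of this kind are commonly estimated by box counting; a minimal sketch on a toy binary mask (standing in for a segmented 2-D vessel image):

```python
# Box-counting estimate of a surface fractal dimension on a binary mask.
import numpy as np

rng = np.random.default_rng(2)
img = rng.random((256, 256)) < 0.05     # toy stand-in for a vessel mask
N = img.shape[0]

sizes, counts = [2, 4, 8, 16, 32], []
for s in sizes:
    view = img.reshape(N // s, s, N // s, s)    # partition into s x s boxes
    counts.append(view.any(axis=(1, 3)).sum())  # boxes containing any pixel

# Slope of log N(boxes) vs log(1/size) estimates the dimension Ds.
slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
print("estimated box-counting dimension:", round(float(slope), 3))
```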

  17. Kinetics simulation of luminol chemiluminescence based on quantitative analysis of photons generated in electrochemical oxidation.

    Science.gov (United States)

    Koizumi, Yozo; Nosaka, Yoshio

    2013-08-22

    The kinetics of electrogenerated chemiluminescence (ECL) of luminol at a gold electrode in alkaline solution was investigated by measuring the absolute number of photons emitted in an integrating sphere. The ECL efficiency, as the ratio of photons to electric charge, was 0.0004 in cyclic voltammetry and 0.0005 in chronoamperometry. By numerically solving the rate equations based on a diffusion layer model, the observed time profile of the luminescence intensity could be successfully simulated from the oxidation current of luminol in the chronoamperometry. In the simulation, the rate constant for the oxidation of luminol by superoxide radicals in alkaline solution was determined to be 6 × 10^5 M^-1 s^-1. The present methodology and results could be widely applicable to various analytical techniques using chemiluminescence.
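
    A sketch of the rate-equation approach, using a deliberately simplified two-species scheme rather than the paper's full diffusion-layer model: electro-oxidation feeds luminol radicals and superoxide at an assumed constant rate, both decay first-order, and light output is taken proportional to their reaction flux with the reported k = 6 × 10^5 M^-1 s^-1. The feed and decay constants are invented.

```python
# Simplified two-species ECL kinetics integrated with scipy.
import numpy as np
from scipy.integrate import solve_ivp

k = 6e5        # M^-1 s^-1, luminol radical + superoxide (reported value)
k_feed = 1e-6  # M s^-1, production from the electrode current (assumed)
k_loss = 5.0   # s^-1, first-order radical decay (assumed)

def rates(t, y):
    L, O2 = y                      # [luminol radical], [superoxide]
    r = k * L * O2                 # light-producing reaction flux
    return [k_feed - r - k_loss * L, k_feed - r - k_loss * O2]

sol = solve_ivp(rates, (0.0, 2.0), [0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 2.0, 5)
L, O2 = sol.sol(t)
print("ECL intensity ~ k*[L][O2]:", k * L * O2)
```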

  18. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata, and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
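
    The simulated-annealing component of such an algorithm can be sketched as follows, with a toy birth-death process standing in for the glucose-insulin model; the Monte Carlo scoring loosely mimics the statistical estimation step, and all constants are invented.

```python
# Simulated annealing over one unknown rate of a toy stochastic model,
# scored by repeated simulation against an "observed" target mean.
import math
import random

def simulate_mean(rate, n_runs=200, horizon=50):
    """Monte Carlo estimate of the final population mean for a given rate."""
    total = 0
    for _ in range(n_runs):
        x = 10
        for _ in range(horizon):
            x += (random.random() < rate) - (random.random() < 0.3)
            x = max(x, 0)
        total += x
    return total / n_runs

target = 20.0                      # experimentally observed fact
theta = 0.8                        # initial guess for the unknown rate
score = abs(simulate_mean(theta) - target)
T = 1.0
for _ in range(100):
    cand = min(max(theta + random.gauss(0.0, 0.05), 0.0), 1.0)
    cand_score = abs(simulate_mean(cand) - target)
    # Accept improvements always, worse candidates with probability exp(-d/T).
    if cand_score < score or random.random() < math.exp(-(cand_score - score) / T):
        theta, score = cand, cand_score
    T *= 0.95                      # geometric cooling schedule
print(f"estimated rate: {theta:.3f} (score {score:.2f})")
```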

  19. Linkage disequilibrium fine mapping of quantitative trait loci: A simulation study

    Directory of Open Access Journals (Sweden)

    Pérez-Enciso Miguel

    2003-09-01

    Full Text Available Abstract Recently, the use of linkage disequilibrium (LD) to locate genes which affect quantitative traits (QTL) has received increasing interest, but the plausibility of fine mapping QTL using linkage disequilibrium techniques has not been well studied. The main objectives of this work were to (1) measure the extent and pattern of LD between a putative QTL and nearby markers in finite populations, and (2) investigate the usefulness of LD in fine mapping QTL in simulated populations using a dense map of multiallelic or biallelic marker loci. The test of association between a marker and QTL and the power of the test were calculated based on single-marker regression analysis. The results show the presence of substantial linkage disequilibrium with closely linked marker loci after 100 to 200 generations of random mating. Although the power to test the association with a frequent QTL of large effect was satisfactory, the power was low for a QTL with a small effect and/or low frequency. More powerful multi-locus methods may be required to map low-frequency QTL with small genetic effects, as well as methods combining both linkage and linkage disequilibrium information. The results also showed that multiallelic markers are more useful than biallelic markers for detecting linkage disequilibrium and association at an equal distance.
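
    The single-marker regression test can be sketched as follows: the quantitative trait is regressed on allele dosage at each marker, and the regression p-value measures marker-QTL association. The genotypes, LD structure and effect size below are invented for illustration.

```python
# Single-marker regression association test on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 500
qtl = rng.binomial(2, 0.3, size=n)               # causal QTL allele dosage
marker_linked = np.where(rng.random(n) < 0.9,    # 90% in phase with the QTL
                         qtl, rng.binomial(2, 0.3, size=n))
marker_null = rng.binomial(2, 0.3, size=n)       # unlinked marker
y = 0.5 * qtl + rng.normal(0, 1, size=n)         # quantitative trait

for name, g in [("linked", marker_linked), ("unlinked", marker_null)]:
    res = stats.linregress(g, y)
    print(f"{name:9s} marker: beta={res.slope:+.3f}  p={res.pvalue:.2e}")
```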

  20. A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model

    Science.gov (United States)

    2007-06-01

    12th ICCRTS, "Adapting C2 to the 21st Century": A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model. Lenahan identified metrics and techniques for adversarial C2 process modeling. We intend to further that work by developing a set of adversarial process...

  1. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances with real numbers. These models were finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The paper also addresses the methods that best suit organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. The paper implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  2. Quantitative Modelling of Multiphase Lithospheric Stretching and Deep Thermal History of Some Tertiary Rift Basins in Eastern China

    Institute of Scientific and Technical Information of China (English)

    林畅松; 张燕梅; 李思田; 刘景彦; 仝志刚; 丁孝忠; 李喜臣

    2002-01-01

    The stretching process of some Tertiary rift basins in eastern China is characterized by multiphase rifting. A multiple instantaneous uniform stretching model is proposed in this paper to simulate the formation of the basins as the rifting process cannot be accurately described by a simple (one episode) stretching model. The study shows that the multiphase stretching model, combined with the back-stripping technique, can be used to reconstruct the subsidence history and the stretching process of the lithosphere, and to evaluate the depth to the top of the asthenosphere and the deep thermal evolution of the basins. The calculated results obtained by applying the quantitative model to the episodic rifting process of the Tertiary Qiongdongnan and Yinggehai basins in the South China Sea are in agreement with geophysical data and geological observations. This provides a new method for quantitative evaluation of the geodynamic process of multiphase rifting occurring during the Tertiary in eastern China.

  3. Quantitative Modeling of Membrane Transport and Anisogamy by Small Groups Within a Large-Enrollment Organismal Biology Course

    Directory of Open Access Journals (Sweden)

    Eric S. Haag

    2016-12-01

    Full Text Available Quantitative modeling is not a standard part of undergraduate biology education, yet is routine in the physical sciences. Because of the obvious biophysical aspects, classes in anatomy and physiology offer an opportunity to introduce modeling approaches to the introductory curriculum. Here, we describe two in-class exercises for small groups working within a large-enrollment introductory course in organismal biology. Both build and derive biological insights from quantitative models, implemented using spreadsheets. One exercise models the evolution of anisogamy (i.e., small sperm and large eggs) from an initial state of isogamy. Groups of four students work on Excel spreadsheets (from one to four laptops per group). The other exercise uses an online simulator to generate data related to membrane transport of a solute, and a cloud-based spreadsheet to analyze them. We provide tips for implementing these exercises gleaned from two years of experience.
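
    A minimal sketch of the membrane-transport half of the exercise, assuming simple passive diffusion, dCin/dt = (PA/V)(Cout - Cin), so the internal concentration relaxes exponentially toward the external one; the parameter values are illustrative stand-ins for the online simulator's output, and the spreadsheet analysis is rendered here in Python.

```python
# Passive membrane transport: forward-Euler integration vs closed form.
import numpy as np

k = 0.05             # P*A/V, s^-1 (assumed)
C_out = 10.0         # external concentration, mM (held constant)
C_in, dt = 0.0, 1.0  # initial internal concentration; Euler step, s

for t in range(0, 61, 10):
    exact = C_out * (1.0 - np.exp(-k * t))   # closed-form solution
    print(f"t={t:3d}s  Euler={C_in:6.3f}  exact={exact:6.3f}")
    for _ in range(10):                      # advance 10 s
        C_in += dt * k * (C_out - C_in)
```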

  4. Simulating Chemical Kinetics Without Differential Equations: A Quantitative Theory Based on Chemical Pathways.

    Science.gov (United States)

    Bai, Shirong; Skodje, Rex T

    2017-08-17

    A new approach is presented for simulating the time-evolution of chemically reactive systems. This method provides an alternative to conventional modeling of mass-action kinetics that involves solving differential equations for the species concentrations. The method presented here avoids the need to solve the rate equations by switching to a representation based on chemical pathways. In the Sum Over Histories Representation (or SOHR) method, any time-dependent kinetic observable, such as concentration, is written as a linear combination of probabilities for chemical pathways leading to a desired outcome. In this work, an iterative method is introduced that allows the time-dependent pathway probabilities to be generated from a knowledge of the elementary rate coefficients, thus avoiding the pitfalls involved in solving the differential equations of kinetics. The method is successfully applied to the model Lotka-Volterra system and to a realistic H2 combustion model.

  5. A quantitative confidence signal detection model: 1. Fitting psychometric functions.

    Science.gov (United States)

    Yi, Yongwoo; Merfeld, Daniel M

    2016-04-01

    Perceptual thresholds are commonly assayed in the laboratory and clinic. When precision and accuracy are required, thresholds are quantified by fitting a psychometric function to forced-choice data. The primary shortcoming of this approach is that it typically requires 100 trials or more to yield accurate (i.e., small bias) and precise (i.e., small variance) psychometric parameter estimates. We show that confidence probability judgments combined with a model of confidence can yield psychometric parameter estimates that are markedly more precise and/or markedly more efficient than conventional methods. Specifically, both human data and simulations show that including confidence probability judgments for just 20 trials can yield psychometric parameter estimates that match the precision of those obtained from 100 trials using conventional analyses. Such an efficiency advantage would be especially beneficial for tasks (e.g., taste, smell, and vestibular assays) that require more than a few seconds for each trial, but this potential benefit could accrue for many other tasks. Copyright © 2016 the American Physiological Society.
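
    For context, the conventional baseline that the confidence model improves upon can be sketched as follows: binary forced-choice responses are fit with a cumulative Gaussian whose location and spread are the psychometric parameters. The data and the simple proportion-based fit below are illustrative, not the paper's confidence-based method.

```python
# Conventional psychometric-function fit to simulated forced-choice data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(4)
true_mu, true_sigma = 0.5, 1.2
stim = np.repeat(np.linspace(-4, 4, 9), 25)            # 225 trials
resp = rng.random(stim.size) < norm.cdf((stim - true_mu) / true_sigma)

def psychometric(x, mu, sigma):
    return norm.cdf((x - mu) / sigma)

# Fit the proportion of "positive" responses at each stimulus level.
levels = np.unique(stim)
p_hat = np.array([resp[stim == s].mean() for s in levels])
(mu, sigma), _ = curve_fit(psychometric, levels, p_hat, p0=[0.0, 1.0])
print(f"fitted mu={mu:.2f} (true {true_mu}), sigma={sigma:.2f} (true {true_sigma})")
```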

  6. Toward a quantitative model of metamorphic nucleation and growth

    Science.gov (United States)

    Gaidies, F.; Pattison, D. R. M.; de Capitani, C.

    2011-11-01

    The formation of metamorphic garnet during isobaric heating is simulated on the basis of classical nucleation and reaction rate theories and Gibbs free energy dissipation in a multi-component model system. The relative influences of interfacial energy, chemical mobility at the surface of garnet clusters, heating rate and pressure on interface-controlled garnet nucleation and growth kinetics are studied. It is found that the interfacial energy controls the departure from equilibrium required to nucleate garnet if attachment and detachment processes at the surface of garnet limit the overall crystallization rate. The interfacial energy for nucleation of garnet in a metapelite of the aureole of the Nelson Batholith, BC, is estimated to range between 0.03 and 0.3 J/m² at a pressure of ca. 3,500 bar. This corresponds to a thermal overstep of the garnet-forming reaction of ca. 30°C. The influence of the heating rate on thermal overstepping is negligible. A significant feedback is predicted between the chemical fractionation associated with garnet formation and the kinetics of nucleation and crystal growth of garnet, giving rise to its lognormal-shaped crystal size distribution.

  7. A bivariate quantitative genetic model for a threshold trait and a survival trait

    Directory of Open Access Journals (Sweden)

    Damgaard Lars

    2006-11-01

    Full Text Available Abstract Many of the functional traits considered in animal breeding can be analyzed as threshold traits or survival traits with examples including disease traits, conformation scores, calving difficulty and longevity. In this paper we derive and implement a bivariate quantitative genetic model for a threshold character and a survival trait that are genetically and environmentally correlated. For the survival trait, we considered the Weibull log-normal animal frailty model. A Bayesian approach using Gibbs sampling was adopted in which model parameters were augmented with unobserved liabilities associated with the threshold trait. The fully conditional posterior distributions associated with parameters of the threshold trait reduced to well known distributions. For the survival trait the two baseline Weibull parameters were updated jointly by a Metropolis-Hastings step. The remaining model parameters with non-normalized fully conditional distributions were updated univariately using adaptive rejection sampling. The Gibbs sampler was tested in a simulation study and illustrated in a joint analysis of calving difficulty and longevity of dairy cattle. The simulation study showed that the estimated marginal posterior distributions covered well and placed high density to the true values used in the simulation of data. The data analysis of calving difficulty and longevity showed that genetic variation exists for both traits. The additive genetic correlation was moderately favorable with marginal posterior mean equal to 0.37 and 95% central posterior credibility interval ranging between 0.11 and 0.61. Therefore, this study suggests that selection for improving one of the two traits will be beneficial for the other trait as well.

  8. A bivariate quantitative genetic model for a threshold trait and a survival trait.

    Science.gov (United States)

    Damgaard, Lars Holm; Korsgaard, Inge Riis

    2006-01-01

    Many of the functional traits considered in animal breeding can be analyzed as threshold traits or survival traits with examples including disease traits, conformation scores, calving difficulty and longevity. In this paper we derive and implement a bivariate quantitative genetic model for a threshold character and a survival trait that are genetically and environmentally correlated. For the survival trait, we considered the Weibull log-normal animal frailty model. A Bayesian approach using Gibbs sampling was adopted in which model parameters were augmented with unobserved liabilities associated with the threshold trait. The fully conditional posterior distributions associated with parameters of the threshold trait reduced to well known distributions. For the survival trait the two baseline Weibull parameters were updated jointly by a Metropolis-Hastings step. The remaining model parameters with non-normalized fully conditional distributions were updated univariately using adaptive rejection sampling. The Gibbs sampler was tested in a simulation study and illustrated in a joint analysis of calving difficulty and longevity of dairy cattle. The simulation study showed that the estimated marginal posterior distributions covered well and placed high density to the true values used in the simulation of data. The data analysis of calving difficulty and longevity showed that genetic variation exists for both traits. The additive genetic correlation was moderately favorable with marginal posterior mean equal to 0.37 and 95% central posterior credibility interval ranging between 0.11 and 0.61. Therefore, this study suggests that selection for improving one of the two traits will be beneficial for the other trait as well.

  9. Modelling and simulation of affinity membrane adsorption.

    Science.gov (United States)

    Boi, Cristiana; Dimartino, Simone; Sarti, Giulio C

    2007-08-24

    A mathematical model for the adsorption of biomolecules on affinity membranes is presented. The model considers convection, diffusion and adsorption kinetics in the membrane module as well as the influence of dead-end volumes and lag times; an analysis of the flow distribution in the whole system is also included. The parameters used in the simulations were obtained from equilibrium and dynamic experimental data measured for the adsorption of human IgG on A2P-Sartoepoxy affinity membranes. The identification of a bi-Langmuir kinetic mechanism for the experimental system investigated was paramount for a correct process description, and the simulated breakthrough curves were in good agreement with the experimental data. The proposed model provides new insight into the phenomena involved in adsorption on affinity membranes and is a valuable tool for assessing the use of membrane adsorbers in large-scale processes.
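
    At equilibrium, the bi-Langmuir mechanism identified above corresponds to two independent Langmuir site classes. A minimal statement of the isotherm, with symbol names assumed rather than taken from the paper:

```latex
% Bi-Langmuir isotherm: two independent classes of binding sites with
% capacities q_{m,i} and association constants b_i (symbols assumed).
q(c) = \frac{q_{m,1}\, b_1 c}{1 + b_1 c} + \frac{q_{m,2}\, b_2 c}{1 + b_2 c}
```

    The kinetic form replaces each term with its adsorption-desorption rate balance; resolving both site classes is what distinguishes a bi-Langmuir description from a single-site Langmuir fit.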

  10. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based methods...

  11. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running, which are of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; and biologically inspired con...

  12. A Superbubble Feedback Model for Galaxy Simulations

    CERN Document Server

    Keller, B W; Benincasa, S M; Couchman, H M P

    2014-01-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, sub-grid evaporation and a sub-grid multi-phase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multi-phase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot ISM below $10^6$ K and susceptible to rapid cooling unless ad-hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star cluster...

  13. Advancing Material Models for Automotive Forming Simulations

    Science.gov (United States)

    Vegter, H.; An, Y.; ten Horn, C. H. L. J.; Atzema, E. H.; Roelofsen, M. E.

    2005-08-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation codes are not capable of describing the plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high strength steels in the automotive industry requires a description of the plastic material behaviour of multiphase steels. The simplest variant is dual phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials may also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based only on the formed cell structures. A new method proposed at Corus to predict the plastic behaviour of multiphase materials has to take into account hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional micro-structural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations through large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations.

  14. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly more complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  15. Dynamics modeling and simulation of flexible airships

    Science.gov (United States)

    Li, Yuwen

    The resurgence of airships has created a need for dynamics models and simulation capabilities of these lighter-than-air vehicles. The focus of this thesis is a theoretical framework that integrates the flight dynamics, structural dynamics, aerostatics and aerodynamics of flexible airships. The study begins with a dynamics model based on a rigid-body assumption. A comprehensive computation of aerodynamic effects is presented, where the aerodynamic forces and moments are categorized into various terms based on different physical effects. A series of prediction approaches for different aerodynamic effects are unified and applied to airships. The numerical results of aerodynamic derivatives and the simulated responses to control surface deflection inputs are verified by comparing to existing wind-tunnel and flight test data. With the validated aerodynamics and rigid-body modeling, the equations of motion of an elastic airship are derived by the Lagrangian formulation. The airship is modeled as a free-free Euler-Bernoulli beam and the bending deformations are represented by shape functions chosen as the free-free normal modes. In order to capture the coupling between the aerodynamic forces and the structural elasticity, local velocity on the deformed vehicle is used in the computation of aerodynamic forces. Finally, with the inertial, gravity, aerostatic and control forces incorporated, the dynamics model of a flexible airship is represented by a single set of nonlinear ordinary differential equations. The proposed model is implemented as a dynamics simulation program to analyze the dynamics characteristics of the Skyship-500 airship. Simulation results are presented to demonstrate the influence of structural deformation on the aerodynamic forces and the dynamics behavior of the airship. The nonlinear equations of motion are linearized numerically for the purpose of frequency domain analysis and for aeroelastic stability analysis. The results from the latter for the...

  16. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is that the reader knows the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the free simulation software MaxFEM.

  17. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires longer-term action that is often characterized by a degree of uncertainty and insecurity in terms of the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, it is considered that the simulation technique is the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  18. Deep Drawing Simulations With Different Polycrystalline Models

    Science.gov (United States)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. A first part of this paper describes the main concepts of the 'Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full-constraints Taylor model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This 'local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically-based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, which affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution thanks to deep drawing simulations.

  19. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, is oriented to the Facebook platform, and has been tested in real circumstances. The model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to bring in new customers, keep the interest of existing customers, and deliver traffic to its website.

  20. Towards Better Coupling of Hydrological Simulation Models

    Science.gov (United States)

    Penton, D.; Stenson, M.; Leighton, B.; Bridgart, R.

    2012-12-01

    Standards for model interoperability and scientific workflow software provide techniques and tools for coupling hydrological simulation models. However, model builders are yet to realize the benefits of these and continue to write ad hoc implementations and scripts. Three case studies demonstrate different approaches to coupling models, the first using tight interfaces (OpenMI), the second using a scientific workflow system (Trident) and the third using a tailored execution engine (Delft Flood Early Warning System - Delft-FEWS). No approach was objectively better than any other approach. The foremost standard for coupling hydrological models is the Open Modeling Interface (OpenMI), which defines interfaces for models to interact. An implementation of the OpenMI standard involves defining interchange terms and writing a .NET/Java wrapper around the model. An execution wrapper such as OatC.GUI or Pipistrelle executes the models. The team built two OpenMI implementations for eWater Source river system models. Once built, it was easy to swap river system models. The team encountered technical challenges with versions of the .Net framework (3.5 calling 4.0) and with the performance of the execution wrappers when running daily simulations. By design, the OpenMI interfaces are general, leaving significant decisions around the semantics of the interfaces to the implementer. Increasingly, scientific workflow tools such as Kepler, Taverna and Trident are able to replace custom scripts. These tools aim to improve the provenance and reproducibility of processing tasks. In particular, Taverna and the myExperiment website have had success making many bioinformatics workflows reusable and sharable. The team constructed Trident activities for hydrological software including IQQM, REALM and eWater Source. They built an activity generator for model builders to build activities for particular river systems. The models were linked at a simulation level, without any daily time...

  1. Modeling and simulation of the human eye

    Science.gov (United States)

    Duran, R.; Ventura, L.; Nonato, L.; Bruno, O.

    2007-02-01

    The computational modeling of the human eye has been widely studied by different sectors of the scientific and technological community. One of the main reasons for this increasing interest is the possibility of reproducing the optical properties of the eye by means of computational simulations, making possible the development of efficient devices to treat and correct vision problems. This work explores a still little-investigated aspect of visual system modeling, proposing a computational framework that makes it possible to use real data in the modeling and simulation of the human visual system. This new approach enables investigation of the individual optical system, assisting in the construction of new techniques used to infer vital data in medical investigations. Using corneal topography to collect real data from patients, a computational model of the cornea is constructed and a set of simulations was built to ensure the correctness of the system and to investigate the effect of corneal abnormalities on retinal image formation, such as Plácido discs, the point spread function, the wavefront, and the projection of a real image and its visualization on the retina.
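
    One standard way to link measured corneal data to retinal image formation is Fourier optics: the point spread function is the squared magnitude of the Fourier transform of the complex pupil function built from the wavefront error. A minimal numpy sketch follows; the pupil size and the defocus aberration are illustrative assumptions, not patient data.

```python
import numpy as np

# PSF from a wavefront aberration map via Fourier optics.
N = 256
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)            # circular pupil mask

wavelength = 550e-9                          # green light [m]
wavefront = 0.25e-6 * (2.0 * r2 - 1.0)       # illustrative defocus term [m]
field = pupil * np.exp(1j * 2.0 * np.pi * wavefront / wavelength)

psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
psf /= psf.sum()                             # normalize total energy to 1
print("peak normalized PSF value:", psf.max())
```

    In a real pipeline, a measured corneal topography map would replace the analytic defocus term.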

  2. A superbubble feedback model for galaxy simulations

    Science.gov (United States)

    Keller, B. W.; Wadsley, J.; Benincasa, S. M.; Couchman, H. M. P.

    2014-08-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, subgrid evaporation and a subgrid multiphase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multiphase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot interstellar medium (ISM) below 10^6 K and susceptible to rapid cooling unless ad hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star cluster to show the model is insensitive to numerical resolution, unresolved ISM structure and suppression of conduction by magnetic fields. We also simulate a Milky Way analogue and a dwarf galaxy. Both galaxies show regulated star formation and produce strong outflows.

  3. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  4. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    Full Text Available In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a footwear industry case study, to ascertain the usefulness of this approach. Value networks were used to identify the participants, both tangible and intangible deliverables and endogenous and exogenous assets, and the analysis of their interactions as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables a discussion of how the CMDVC can be applied and used in the enterprise environment, and provided new relevant relations between perceived benefits (PBs).
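
    For readers unfamiliar with the AHP machinery behind the quantitative model, the sketch below computes classical (crisp) AHP priority weights as the principal eigenvector of a pairwise comparison matrix; the paper's Fuzzy AHP variant extends this with fuzzy comparison values. The matrix entries here are invented for illustration.

```python
import numpy as np

# Classical AHP: priorities = normalized principal eigenvector of the
# pairwise comparison matrix. Matrix values are illustrative only.
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()
print("priority weights:", np.round(w, 3))

# Saaty consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
RI = 0.58                 # Saaty's random index for n = 3
print("consistency ratio:", round(CI / RI, 3))
```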

  5. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available This report outlines progress with the development of computer-based dynamic simulation models for ecosystems in the fynbos biome. The models are planned to run on a portable desktop computer with 500 kbytes of memory, extended BASIC language...

  6. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of a system design and reducing the time-to-market. In most embedded control system designs, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  7. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Science.gov (United States)

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical models based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: (i) excessive CPU time requirements and (ii) a loosely constrained optimization problem due to the lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem, and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms.
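
    To make the reformulation concrete on a toy scale: in constrained fuzzy logic models, each pathway edge carries a normalized transfer function whose parameters are fitted to data, and the NLP casts that fitting as a smooth optimization. The sketch below fits a single normalized Hill-type transfer function with SciPy on synthetic data; it is a stand-in for the idea only, not the authors' formulation or code.

```python
import numpy as np
from scipy.optimize import minimize

# Fit one normalized Hill transfer function (hill(1) == 1) to noisy
# activation data -- a one-edge toy version of logic-model training.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)                    # upstream node activity

def hill(x, k, n):
    return x**n / (k**n + x**n) * (k**n + 1.0)   # normalized: hill(1) = 1

y = hill(x, 0.3, 2.5) + rng.normal(0.0, 0.03, x.size)  # synthetic data

def sse(theta):
    k, n = theta
    return np.sum((hill(x, k, n) - y) ** 2)

res = minimize(sse, x0=[0.5, 2.0], bounds=[(0.01, 1.0), (1.0, 6.0)])
print("fitted k, n:", np.round(res.x, 3))
```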

  8. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals or, in other words, how we model the customer. The paper is devoted to customer modeling and simulation, with a focus on modeling an individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for online direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating customer features.
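
    As one concrete piece of such a model, inter-purchase times are often captured with a simple renewal process. The sketch below simulates gamma-distributed gaps and a point forecast of the next visit; the distribution choice and all parameters are illustrative assumptions, since the paper does not publish its exact stochastic models.

```python
import numpy as np

# Toy inter-purchase timing model: gaps between visits drawn from a
# gamma distribution whose parameters would be estimated per customer
# from transaction history. Values below are assumptions.
rng = np.random.default_rng(42)
shape, scale = 2.0, 7.0                 # mean gap = shape * scale = 14 days

gaps = rng.gamma(shape, scale, size=10)
visit_days = np.cumsum(gaps)
print("simulated purchase days:", np.round(visit_days, 1))

# Naive prediction of the next visit: last purchase plus expected gap.
print("expected next visit on day:", round(visit_days[-1] + shape * scale, 1))
```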

  9. NUMERICAL MODEL APPLICATION IN ROWING SIMULATOR DESIGN

    Directory of Open Access Journals (Sweden)

    Petr Chmátal

    2016-04-01

    Full Text Available The aim of the research was to carry out a hydraulic design of a rowing/sculling and paddling simulator. Nowadays there are two main approaches to simulator design. The first one uses static water with no artificial movement and counts on specially cut oars to provide the same resistance as in moving water. The second approach, on the other hand, uses pumps or similar devices to force the water to circulate, but both designs share many problems. Such problems affect already-built facilities and can be summarized as an unrealistic feeling, unwanted turbulent flow and a bad velocity profile. Therefore, the goal was to design a new rowing simulator that would provide nature-like conditions for the racers and an unmatched experience. In order to accomplish this challenge, it was decided to use in-depth numerical modeling to solve the hydraulic problems. The general measures for the design were taken in accordance with the space available in the simulator's housing. The entire research effort was coordinated with other stages of the construction using BIM. The detailed geometry was designed using a numerical model in Ansys Fluent and parametric auto-optimization tools, which led to minimal negative hydraulic phenomena and decreased investment and operational costs due to the reduced hydraulic losses in the system.

  10. The simulation and prediction of spatio-temporal urban growth trends using cellular automata models: A review

    Science.gov (United States)

    Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan

    2016-10-01

    In recent years, several types of simulation and prediction models have been used within a GIS environment to determine a realistic future for urban growth patterns. These models include quantitative and spatio-temporal techniques that are implemented to monitor urban growth. The results derived through these techniques are used to create future policies that take into account sustainable development and the demands of future generations. The aim of this paper is to provide a basis for a literature review of urban Cellular Automata (CA) models to find the most suitable approach for a realistic simulation of land use changes. The general characteristics of simulation models of urban growth and urban CA models are described, and the different techniques used in the design of these models are classified. The strengths and weaknesses of the various models are identified based on the analysis and discussion of the characteristics of these models. The results of the review confirm that the CA model is one of the strongest models for simulating urban growth patterns owing to its structure, simplicity, and possibility of evolution. Limitations of the CA model, namely weaknesses in the quantitative aspect, and the inability to include the driving forces of urban growth in the simulation process, may be minimized by integrating it with other quantitative models, such as via the Analytic Hierarchy Process (AHP), Markov Chain and frequency ratio models. Realistic simulation can be achieved when socioeconomic factors and spatial and temporal dimensions are integrated in the simulation process.
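
    The core CA transition rule the review discusses can be stated in a few lines: a non-urban cell urbanizes when enough neighbours are urban and a suitability condition holds. The following numpy sketch implements one such rule; the neighbourhood threshold, suitability layer and toroidal boundary are simplifying assumptions for illustration.

```python
import numpy as np

# Minimal CA urban-growth step: a cell urbanizes if >= k_min of its
# 8 neighbours are urban and its suitability exceeds a threshold.
rng = np.random.default_rng(1)
grid = (rng.random((50, 50)) < 0.05).astype(int)   # 1 = urban seed cells
suitability = rng.random((50, 50))                 # stand-in for slope/roads

def step(grid, k_min=2, p=0.3):
    # neighbour count via shifted copies (toroidal boundary for brevity)
    nb = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)
             if (dy, dx) != (0, 0))
    grow = (grid == 0) & (nb >= k_min) & (suitability > 1.0 - p)
    return grid | grow.astype(int)

for _ in range(10):
    grid = step(grid)
print("urban fraction after 10 steps:", round(grid.mean(), 3))
```

    Integrations with AHP, Markov chains or frequency ratios, as suggested above, typically enter through the suitability layer.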

  11. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  12. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  13. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case.
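
    The multivariate statistics named above (Pillai's trace, Hotelling-Lawley trace, Wilks' lambda) are available off the shelf; the sketch below runs them on synthetic data with statsmodels' MANOVA, as a plain multivariate linear model rather than the authors' functional linear models. All data are simulated for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Two correlated quantitative traits regressed on one genotype dosage.
rng = np.random.default_rng(7)
n = 300
g = rng.integers(0, 3, n)                                   # dosage 0/1/2
e = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], n)
df = pd.DataFrame({"g": g,
                   "t1": 0.3 * g + e[:, 0],   # both traits share the signal
                   "t2": 0.2 * g + e[:, 1]})

fit = MANOVA.from_formula("t1 + t2 ~ g", data=df)
print(fit.mv_test())   # reports Wilks' lambda, Pillai's trace, etc.
```

    Testing the traits jointly exploits their correlation, which is the intuition behind the power gain reported above.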

  14. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.
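
    A drastically reduced version of such a simulation conveys the mechanism: genotypes in a short epitope sequence space replicate, mutate, and suffer extra clearance when the immune response recognizes them. The sketch below is a toy discrete-time version with invented rates, far simpler than the paper's model, shown only to make the setup concrete.

```python
import numpy as np

# Toy mutation-selection dynamics on a 3-site binary epitope space.
# The wild type (0,0,0) is recognized by the immune response and
# suffers an extra kill rate; mutants escape. All rates are invented.
rng = np.random.default_rng(3)
L = 3
pop = {(0,) * L: 50}                      # wild-type founder population
b, d, kill, mu = 1.5, 1.0, 1.0, 1e-2      # birth, death, immune kill, mutation

for t in range(40):
    new = {}
    for seq, n in pop.items():
        death = d + (kill if seq == (0,) * L else 0.0)
        offspring = rng.poisson(n * max(0.0, 1.0 + b - death))
        muts = min(offspring, rng.poisson(offspring * mu * L))
        for _ in range(muts):             # each mutation flips one site
            i = rng.integers(L)
            m = tuple(1 - s if j == i else s for j, s in enumerate(seq))
            new[m] = new.get(m, 0) + 1
        if offspring > muts:
            new[seq] = new.get(seq, 0) + offspring - muts
    if not new:
        print("infection cleared at generation", t)
        break
    pop = new

print("genotype counts after simulation:", pop)
```

    Depending on the random draws, the infection either dies out (immune clearance) or an escape mutant takes over, mirroring the two regimes described above.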

  15. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish summary: This Ph.D. dissertation deals with the modelling and simulation of human motion. The topics in this dissertation have at least two things in common. First, they deal with human motion. Although the developed models can also be used for other things, it is... Furthermore, it can be used with any soft-body simulation model, such as finite elements or mass-spring systems. • A control method for deformable bodies based on space-time optimization; the approach can be used to control the contraction of muscles in a muscle simulation...

  16. Computer Modelling and Simulation for Inventory Control

    Directory of Open Access Journals (Sweden)

    G.K. Adegoke

    2012-07-01

    Full Text Available This study concerns the role of computer simulation as a device for conducting scientific experiments on inventory control. The stores function ties up a large share of the physical assets and financial resources of a manufacturing outfit; there is therefore a need for efficient inventory control, since inventory control reduces the cost of production and thereby facilitates the effective and efficient accomplishment of an organization's production objectives. Some mathematical and statistical models were used to compute the Economic Order Quantity (EOQ). Test data were obtained from a manufacturing company and simulated. The results generated were used to predict a real-life situation and have been presented and discussed. The language of implementation for the three models is Turbo Pascal, due to its capability, generality and flexibility as a scientific programming language.
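
    The study's central quantity, the Economic Order Quantity, is a one-line formula; the original implementation was in Turbo Pascal, so the sketch below restates it in Python with invented inputs.

```python
import math

# EOQ = sqrt(2 * D * S / H)
#   D: annual demand [units], S: cost per order, H: holding cost per
#   unit per year. The figures are illustrative, not the study's data.
D, S, H = 12_000, 150.0, 4.0

eoq = math.sqrt(2 * D * S / H)
orders_per_year = D / eoq
total_cost = orders_per_year * S + (eoq / 2) * H   # ordering + holding

print(f"EOQ = {eoq:.0f} units, {orders_per_year:.1f} orders/year, "
      f"annual ordering+holding cost = {total_cost:.2f}")
```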

  17. Model parameters for simulation of physiological lipids

    Science.gov (United States)

    McGlinchey, Nicholas

    2016-01-01

    Coarse-grain simulation of proteins in their physiological membrane environment can offer insight across timescales, but requires a comprehensive force field. Parameters are explored for multicomponent bilayers composed of the unsaturated lipids DOPC and DOPE, the mixed-chain-saturation lipids POPC and POPE, and anionic lipids found in bacteria: POPG and cardiolipin. A nonbond representation obtained from multiscale force matching is adapted for these lipids and combined with an improved bonding description of cholesterol. Equilibrating the area per lipid yields robust bilayer simulations and properties for common lipid mixtures, with the exception of pure DOPE, which has a known tendency to form a nonlamellar phase. The models maintain consistency with an existing lipid-protein interaction model, making the force field of general utility for studying membrane proteins in physiologically representative bilayers.

  18. Simulation modeling of wheeled vehicle dynamics on the stand "Roller"

    Directory of Open Access Journals (Sweden)

    G. O. Kotiev

    2014-01-01

    Full Text Available Tests are an integral part of wheeled vehicle design, manufacturing, and operation. The need to conduct them arises from research and experimental activities that assess the qualitative and quantitative characteristics of vehicles as a whole, as well as of individual components and assemblies. Clearly, the variety of design features of wheeled vehicles calls for the development of methods both for experimental studies and for creating original bench equipment for these purposes. The main advantage of bench tests of automotive engineering is the broad capability to control combinations of traction loads, speed rates, and external input conditions. Steady-state conditions can be maintained for a long time, allowing all the necessary measurements to be made, including video and photo recording of the experiment. The benefits of type "M" tests (using a roller dynamometer) include a wide range of test modes that do not depend on climatic conditions, as well as the capability to use computer-aided testing programs. At the same time, the main drawback of bench tests of a full-size vehicle is that the tire rolling conditions on the drum do not match real road pavements, which are difficult to reproduce on the drum surface. This problem can be solved by testing wheeled vehicles on "Roller" benches, which is, in efficiency, the most preferable research method. The article gives a detailed presentation of the approach to solving this problem developed at BMSTU. The simulation problem was solved for a vehicle with an 8×8 wheel formula and individual wheel drives. The simulation results lead to the conclusion that the proposed principle of simulating a vehicle rolling on a smooth, non-deformable support base using a "Roller" bench is efficient.

  19. Theory, Modeling and Simulation Annual Report 2000

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  20. Theory, Modeling and Simulation Annual Report 2000

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  1. Catalog of Wargaming and Military Simulation Models

    Science.gov (United States)

    1992-02-07

    PROPONENT: USAF ASD, McDonnell Douglas Corp. POINT OF CONTACT: Photon Research Associates (Alias): Mr. Jeff Johnson, (619) 455-9741; McDonnell Douglas... POINT OF CONTACT: Dr. R. Johnson, (DSN) 295-1593 or (301) 295-1593. PURPOSE: The model provides simulation of airland activities in a theater of operations... training, and education. PROPONENT: J-8 Political Military Affairs Directorate. POINT OF CONTACT: LTC Steven G. Stainer. PURPOSE: RDSS is a system...

  2. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    ...Organization (NATO) Sensors Electronics Technology (SET)-227 Panel on Cognitive Radar. The FAR M&S architecture developed in Phase I allows for... the Air Force's previously developed radar M&S tools. This report is organized as follows: in Chapter 3, we provide an overview of the FAR framework... (From report AFRL-RY-WP-TR-2017-0074, "Fully Adaptive Radar Modeling and Simulation Development," Kristine L. Bell and Anthony Kellems, Metron, Inc.)

  3. Difficulties with True Interoperability in Modeling & Simulation

    Science.gov (United States)

    2011-12-01

    Standards in M&S cover multiple layers of technical abstraction. There are middleware specifications, such as the High Level Architecture (HLA)... (IEEE Xplore Digital Library. 2010. 1516-2010 IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA) – Framework and Rules)... using different communication protocols being able to allow da...

  4. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  5. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish summary: This Ph.D. dissertation deals with the modelling and simulation of human motion. The topics in this dissertation have at least two things in common. First, they deal with human motion. Although the developed models can also be used for other things, it is... human joints, which exhibit both non-convexity and several degrees of freedom • A general and versatile model for the activation of soft bodies. The model can be used as an animation tool, but is equally well suited to simulating human muscles, since it satisfies the fundamental physical principles... primary focus on modelling the human body. Second, they all deal with simulation as a tool for synthesizing motion and thereby creating animations. This is an important point, since it means that we are not only creating tools that animators can use to make amusing...

  6. WiBro Mobility Simulation Model

    Directory of Open Access Journals (Sweden)

    Junaid Qayyum

    2011-09-01

    Full Text Available WiBro, or Wireless Broadband, is the newest variety of mobile wireless broadband access. WiBro technology has been developed by the Korean telecoms industry. It is based on the IEEE 802.16e (Mobile WiMAX) international standard. The Korea-based fixed-line operators KT and SK Telecom were the first to be licensed by the South Korean government to provide WiBro commercially. Samsung demonstrated WiBro mobile phones and systems at the APEC IT Exhibition 2006. WiBro comprises two phases, namely WiBro Phase I and WiBro Phase II. Samsung Electronics has contributed extensively to Korea's WiBro (Wireless Broadband) initiative as well as to the IEEE 802.16 standards. WiBro is a specific subset of the 802.16 standards, specifically focusing on supporting full mobility of wireless access systems with an OFDMA PHY interface. In this work, we have developed a simulation model of the WiBro system, consisting of a set of base stations and mobile subscriber stations, using the OPNET Modeler. The simulation model has been utilized to evaluate effective MAC-layer throughput, resource usage efficiency, QoS class differentiation, and system capacity and performance under various simulation scenarios.

  7. Progress in Modeling and Simulation of Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Turner, John A [ORNL

    2016-01-01

    Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies critical to electric vehicles. The data developed from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover:
    * Thermal behavior and characteristics
    * Battery management system design and analysis
    * Moderately high-fidelity 3D capabilities
    * Optimization techniques and durability
    As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as the adoption of alternative fuel sources to meet government mandates, are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  8. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.

    Directory of Open Access Journals (Sweden)

    Alexey A Gritsenko

    2015-08-01

    Full Text Available Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP, a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.
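
    The TASEP picture referred to above is easy to state in code: ribosomes are particles hopping along a codon lattice, each blocked by the one ahead, with an initiation rate at the 5' end. The Monte Carlo sketch below uses point particles and invented rates (real ribosomes occupy roughly ten codons, and the paper fits per-codon rates to RP data), so it illustrates the model class only.

```python
import numpy as np

# Random-sequential TASEP with open boundaries on a codon lattice.
rng = np.random.default_rng(0)
L = 100
alpha = 0.1                          # initiation rate (site 0)
rates = rng.uniform(0.5, 1.5, L)     # per-codon elongation rates
lattice = np.zeros(L, dtype=bool)
completed = 0

for _ in range(200_000):
    i = rng.integers(-1, L)          # -1 encodes an initiation attempt
    if i == -1:
        if not lattice[0] and rng.random() < alpha:
            lattice[0] = True
    elif lattice[i] and rng.random() < rates[i]:
        if i == L - 1:
            lattice[i] = False       # termination: one protein finished
            completed += 1
        elif not lattice[i + 1]:
            lattice[i] = False       # hop forward if the next site is free
            lattice[i + 1] = True

print("proteins completed:", completed)
print("mean ribosome density:", round(lattice.mean(), 3))
```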

  9. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.

    Science.gov (United States)

    Gritsenko, Alexey A; Hulsman, Marc; Reinders, Marcel J T; de Ridder, Dick

    2015-08-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.

  10. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    Directory of Open Access Journals (Sweden)

    Toru Higaki

    2017-08-01

    Full Text Available This article describes a quantitative evaluation of the visualization of small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm, made with a 3D printer, were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were used for the image reconstruction.

  11. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  12. Consolidation modelling for thermoplastic composites forming simulation

    Science.gov (United States)

    Xiong, H.; Rusanov, A.; Hamila, N.; Boisse, P.

    2016-10-01

    Pre-impregnated thermoplastic composites are widely used in the aerospace industry for their excellent mechanical properties. Thermoforming thermoplastic prepregs is a fast manufacturing process, and the automotive industry has shown increasing interest in it; reconsolidation is an essential stage of this process. The intimate contact model is investigated as the consolidation model; compression experiments were carried out to identify the material parameters, and several numerical tests show the influence of the temperature and pressure applied during processing. Finally, a new solid-shell prismatic element is presented for the simulation of the consolidation step in thermoplastic composite forming processes.

  13. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  14. Solar Electric Bicycle Body Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Zhikun Wang

    2013-10-01

    Full Text Available A new solar electric bicycle design and study were carried out in this paper. CAD technology was applied to establish a three-dimensional geometric model, and kinetic analysis of the frame and other parts was used for numerical simulation and static strength analysis of the vehicle model design and virtual assembly, together with complete frame dynamics and vibration analyses. Taking other factors into account, the frame structure was first improved, the safety of the design was then verified through calculation, analysis and comparison, and finally an ideal body design was obtained.

  15. Derivation of a quantitative minimal model from a detailed elementary-step mechanism supported by mathematical coupling analysis

    Science.gov (United States)

    Shaik, O. S.; Kammerer, J.; Gorecki, J.; Lebiedz, D.

    2005-12-01

    Accurate experimental data increasingly allow the development of detailed elementary-step mechanisms for complex chemical and biochemical reaction systems. Model reduction techniques are widely applied to obtain representations in lower-dimensional phase space which are more suitable for mathematical analysis, efficient numerical simulation, and model-based control tasks. Here, we exploit a recently implemented numerical algorithm for error-controlled computation of the minimum dimension required for a still accurate reduced mechanism based on automatic time scale decomposition and relaxation of fast modes. We determine species contributions to the active (slow) dynamical modes of the reaction system and exploit this information in combination with quasi-steady-state and partial-equilibrium approximations for explicit model reduction of a novel detailed chemical mechanism for the Ru-catalyzed light-sensitive Belousov-Zhabotinsky reaction. The existence of a minimum dimension of seven is demonstrated to be mandatory for the reduced model to show good quantitative consistency with the full model in numerical simulations. We derive such a maximally reduced seven-variable model from the detailed elementary-step mechanism and demonstrate that it reproduces quantitatively accurately the dynamical features of the full model within a given accuracy tolerance.
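
    The quasi-steady-state step used in the reduction has a classic textbook form: set the time derivative of a fast intermediate to zero and solve for it symbolically. The sympy sketch below does this for the Michaelis-Menten mechanism, which is far simpler than the Ru-catalysed BZ mechanism but shows the same operation.

```python
import sympy as sp

# QSSA on E + S <-> ES -> E + P: eliminate the fast intermediate ES.
k1, km1, k2, E0, S, ES = sp.symbols("k1 k_m1 k2 E0 S ES", positive=True)

# d[ES]/dt = k1*(E0 - ES)*S - (km1 + k2)*ES = 0
qssa = sp.Eq(k1 * (E0 - ES) * S - (km1 + k2) * ES, 0)
ES_qss = sp.solve(qssa, ES)[0]

rate = sp.simplify(k2 * ES_qss)      # reduced rate law for d[P]/dt
print(rate)                          # k1*k2*E0*S/(k1*S + k_m1 + k2)
```

    In the paper, the choice of which species to eliminate is made automatically from the error-controlled time-scale decomposition rather than by hand.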

  16. Bayesian model choice and search strategies for mapping interacting quantitative trait Loci.

    Science.gov (United States)

    Yi, Nengjun; Xu, Shizhong; Allison, David B

    2003-01-01

    Most complex traits of animals, plants, and humans are influenced by multiple genetic and environmental factors. Interactions among multiple genes play fundamental roles in the genetic control and evolution of complex traits. Statistical modeling of interaction effects in quantitative trait loci (QTL) analysis must accommodate a very large number of potential genetic effects, which presents a major challenge to determining the genetic model with respect to the number of QTL, their positions, and their genetic effects. In this study, we use the methodology of Bayesian model and variable selection to develop strategies for identifying multiple QTL with complex epistatic patterns in experimental designs with two segregating genotypes. Specifically, we develop a reversible jump Markov chain Monte Carlo algorithm to determine the number of QTL and to select main and epistatic effects. With the proposed method, we can jointly infer the genetic model of a complex trait and the associated genetic parameters, including the number, positions, and main and epistatic effects of the identified QTL. Our method can map a large number of QTL with any combination of main and epistatic effects. Utility and flexibility of the method are demonstrated using both simulated data and a real data set. Sensitivity of posterior inference to prior specifications of the number and genetic effects of QTL is investigated.

  17. Viscoelastic flow simulations in model porous media

    Science.gov (United States)

    De, S.; Kuipers, J. A. M.; Peters, E. A. J. F.; Padding, J. T.

    2017-05-01

    We investigate the unsteady three-dimensional flow of a viscoelastic fluid through an array of symmetric and asymmetric sets of cylinders constituting a model porous medium. The simulations are performed using a finite-volume methodology with a staggered grid. The solid-fluid interfaces of the porous structure are modeled using a second-order immersed boundary method [S. De et al., J. Non-Newtonian Fluid Mech. 232, 67 (2016), 10.1016/j.jnnfm.2016.04.002]. A finitely extensible nonlinear elastic constitutive model with Peterlin closure is used to model the viscoelastic part. By means of periodic boundary conditions, we model the flow behavior for a Newtonian as well as a viscoelastic fluid through successive contractions and expansions. We observe the presence of counterrotating vortices in the dead ends of our geometry. The simulations provide detailed insight into how flow structure, viscoelastic stresses, and viscoelastic work change with increasing Deborah number De. We observe completely different flow structures and different distributions of the viscoelastic work at high De in the symmetric and asymmetric configurations, even though they have the exact same porosity. Moreover, we find that even for the symmetric contraction-expansion flow, most energy dissipation occurs in shear-dominated regions of the flow domain, not in extensional-flow-dominated regions.

  18. Improving quantitative precipitation nowcasting with a local ensemble transform Kalman filter radar data assimilation system: observing system simulation experiments

    Directory of Open Access Journals (Sweden)

    Chih-Chien Tsai

    2014-03-01

    Full Text Available This study develops a Doppler radar data assimilation system, which couples the local ensemble transform Kalman filter with the Weather Research and Forecasting model. The benefits of this system to quantitative precipitation nowcasting (QPN are evaluated with observing system simulation experiments on Typhoon Morakot (2009, which brought record-breaking rainfall and extensive damage to central and southern Taiwan. The results indicate that the assimilation of radial velocity and reflectivity observations improves the three-dimensional winds and rain-mixing ratio most significantly because of the direct relations in the observation operator. The patterns of spiral rainbands become more consistent between different ensemble members after radar data assimilation. The rainfall intensity and distribution during the 6-hour deterministic nowcast are also improved, especially for the first 3 hours. The nowcasts with and without radar data assimilation have similar evolution trends driven by synoptic-scale conditions. Furthermore, we carry out a series of sensitivity experiments to develop proper assimilation strategies, in which a mixed localisation method is proposed for the first time and found to give further QPN improvement in this typhoon case.
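
    The ensemble analysis step at the heart of such a system can be written compactly. The sketch below is a stochastic (perturbed-observation) EnKF update on a toy state, not the deterministic, localised LETKF used in the study, but it shows how the ensemble spread supplies the flow-dependent error covariance; all dimensions and values are invented.

```python
import numpy as np

# Stochastic EnKF analysis step on a toy 5-variable state with 2 obs.
rng = np.random.default_rng(0)
n, m, N = 5, 2, 20                         # state dim, obs dim, ensemble size
H = np.zeros((m, n)); H[0, 0] = H[1, 3] = 1.0   # observe variables 0 and 3
R = 0.1 * np.eye(m)                        # observation error covariance

X = rng.normal(1.0, 0.5, (n, N))           # forecast ensemble (columns)
y = np.array([1.8, 0.4])                   # observation vector

A = X - X.mean(axis=1, keepdims=True)      # ensemble anomalies
Pf = A @ A.T / (N - 1)                     # sample forecast covariance
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain

# Perturbed observations keep the analysis spread statistically consistent.
Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, N).T
Xa = X + K @ (Y - H @ X)                   # analysis ensemble
print("analysis mean:", np.round(Xa.mean(axis=1), 3))
```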

  19. Can simulations quantitatively predict peptide transfer free energies to urea solutions? Thermodynamic concepts and force field limitations.

    Science.gov (United States)

    Horinek, Dominik; Netz, Roland R

    2011-06-16

    Many proteins denature when they are transferred to concentrated urea solutions. Three mechanisms for urea's denaturing ability have been proposed: (i) direct binding to polar parts of the protein surface, (ii) direct binding to nonpolar parts of the protein surface, and (iii) an indirect effect mediated by modifications of the bulk water properties. The disentanglement of these three processes has been the goal of many experimental and computational studies, yet there is no final agreement on the relative importance of the three contributions. The separation of the two direct mechanisms, albeit conceptually clear, is difficult in experimental studies and in simulations depends subtly on how the discrimination between polar and nonpolar groups is accomplished. Indirect effects, embodied in the change of solution activity as urea is added, are rarely monitored in urea/peptide simulations and thus have remained elusive in numerical studies. In this paper we establish a rigorous separation of all three contributions to the solvation thermodynamics of stretched peptide chains. We contrast this scenario with two commonly used model systems: the air/water interface and the interface between water and a hydrophobic alkane self-assembled monolayer. Together with bulk thermodynamic properties of urea/water mixed solvents, a complete thermodynamic description of the urea/water/peptide system is obtained: urea avoids the air/water interface but readily adsorbs at the oil-water interface and at hydrophobic as well as hydrophilic peptide chains, in accordance with experimental results. Simple thermodynamic arguments show that the indirect contribution to urea's denaturing capability is negligibly small, although urea strongly changes the water bulk properties as judged by the number of hydrogen bonds formed. Urea's tendency to bind to proteins is correctly reproduced with several force field combinations, but the quantitative binding strength as well as the relative importance

  20. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  1. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are developed that explore the noise propagation mechanisms associated with noise attenuation and the transmission paths created when hearing protectors such as earplugs and headsets are used in high-noise environments. Biomedical finite element (FE) models are developed based on volume computed tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as the prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  2. Simulation Model of Brushless Excitation System

    Directory of Open Access Journals (Sweden)

    Ahmed N.A.  Alla

    2007-01-01

    Full Text Available The excitation system is a key element in the dynamic performance of electric power systems; accurate excitation models are of great importance in simulating and investigating power system transient phenomena. Parameter identification of a brushless excitation system is presented. First, a block diagram for the EXS parameters was proposed, based on the documents and maps in the power station. To identify the parameters of this model, a test procedure to obtain the step response was presented. Using a genetic algorithm with the Matlab software, it was possible to identify all the necessary parameters of the model. Using the same measured input signals, the response from the standard model showed nearly the same behavior as the excitation system.
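
    The identification loop described above (perturb parameters, simulate the step response, compare with the measurement) can be reproduced with any evolutionary optimizer. The sketch below uses SciPy's differential evolution (a relative of the genetic algorithm used in the paper) to recover the gain and time constant of a first-order model from a synthetic step response; the model structure and all values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Fit y(t) = K * (1 - exp(-t/T)) to a noisy measured step response.
t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(5)
y_meas = 2.0 * (1 - np.exp(-t / 1.5)) + rng.normal(0.0, 0.02, t.size)

def sse(theta):
    K, T = theta
    return np.sum((K * (1 - np.exp(-t / T)) - y_meas) ** 2)

res = differential_evolution(sse, bounds=[(0.1, 10.0), (0.1, 10.0)], seed=1)
print("identified K, T:", np.round(res.x, 3))   # should recover ~2.0, ~1.5
```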

  3. Modeling and simulation of direct contact evaporators

    Directory of Open Access Journals (Sweden)

    F.B. Campos

    2001-09-01

    Full Text Available A dynamic model of a direct contact evaporator was developed and coupled to a recently developed superheated bubble model. The latter model takes into account heat and mass transfer during the bubble formation and ascension stages and is able to predict gas holdup in nonisothermal systems. The results of the coupled model, which does not have any adjustable parameter, were compared with experimental data. The transient behavior of the liquid-phase temperature and the vaporization rate under quasi-steady-state conditions were in very good agreement with experimental data. The transient behavior of liquid height was only reasonably simulated. In order to explain this partial disagreement, some possible causes were analyzed.

  4. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    ...that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds-stresses in the wake. In the current work, nonlinear eddy... viscosity models (NLEVM) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient, that delays the wake recovery. Unfortunately, all tested NLEVMs show... numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled as the k-ε-fp EVM, that has a linear stress-strain relation, but still has a variable eddy viscosity coefficient. The k-ε-fp EVM is numerically...

  5. Physiological role of Kv1.3 channel in T lymphocyte cell investigated quantitatively by kinetic modeling.

    Directory of Open Access Journals (Sweden)

    Panpan Hou

    Full Text Available The Kv1.3 channel is a delayed-rectifier channel abundant in human T lymphocytes. Chronic inflammatory and autoimmune disorders lead to the over-expression of Kv1.3 in T cells. To quantitatively study the regulatory mechanism and physiological function of Kv1.3 in T cells, it is necessary to have a precise kinetic model of the channel. In this study, we first established a kinetic model capable of precisely replicating all the kinetic features of Kv1.3 channels, and then constructed a T-cell model composed of ion channels, including the Ca2+-release-activated calcium (CRAC) channel, the intermediate-conductance K+ (IK) channel, the TASK channel and the Kv1.3 channel, for quantitatively simulating the changes in membrane potential and local Ca2+ signaling messengers during the activation of T cells. Based on experimental data from current-clamp recordings, we successfully demonstrated that Kv1.3 dominates the membrane potential of T cells and thereby controls the Ca2+ influx via the CRAC channel. Our results revealed that deficient expression of the Kv1.3 channel weakens the Ca2+ signal, leading to less efficient secretion. This was the first successful attempt to simulate membrane potential in non-excitable cells, laying a solid basis for quantitatively studying the regulatory mechanism and physiological role of ion channels in non-excitable cells.
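
    For readers unfamiliar with delayed-rectifier kinetics, the simplest model of this channel class is the Hodgkin-Huxley n-gate, integrated below with forward Euler. The rate constants are the classic squid-axon formulas, not a fitted Kv1.3 scheme, so this is a structural illustration only.

```python
import numpy as np

# Hodgkin-Huxley style delayed-rectifier K+ current: I_K = g*n^4*(V - E_K).
def alpha_n(V):
    return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))

def beta_n(V):
    return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, EK, gmax = 0.01, -77.0, 36.0       # ms, mV, mS/cm^2
V, n = -65.0, 0.32                     # holding potential, resting gate value
current = []
for k in range(20_000):                # 200 ms of simulated time
    if k * dt > 10.0:
        V = 20.0                       # depolarising voltage step at 10 ms
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    current.append(gmax * n**4 * (V - EK))

print(f"steady-state I_K at +20 mV: {current[-1]:.1f} uA/cm^2")
```

    A full Kv1.3 model replaces the single gate with a multi-state Markov scheme fitted to voltage-clamp data, as done in the paper.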

  6. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  7. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian eNowke

    2015-12-01

    Full Text Available Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate different data modalities in order to investigate them effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools which run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  8. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of a visual model of production and distribution processes.

  9. Raytracing simulations of coupled dark energy models

    CERN Document Server

    Pace, Francesco; Moscardini, Lauro; Bacon, David; Crittenden, Robert

    2014-01-01

    Dark matter and dark energy are usually assumed to be independent, coupling only gravitationally. An extension to this simple picture is to model dark energy as a scalar field which is directly coupled to the cold dark matter fluid. Such a non-trivial coupling in the dark sector leads to a fifth force and a time-dependent dark matter particle mass. In this work we examine the impact that dark energy-dark matter couplings have on weak lensing statistics by constructing realistic simulated weak-lensing maps using raytracing techniques through a suite of N-body cosmological simulations. We construct maps for an array of different lensing quantities, covering a range of scales from a few arcminutes to several degrees. The concordance $\\Lambda$CDM model is compared to different coupled dark energy models, described either by an exponential scalar field potential (standard coupled dark energy scenario) or by a SUGRA potential (bouncing model). We analyse several statistical quantities, in particular the power spect...

  10. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    Science.gov (United States)

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    Process simulators can contribute to quantitative risk assessment (QRA) by reducing expert time and the large volume of data required, and their use is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation with QRA to investigate the acceptability of a new technology for a methanol production plant in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and societal risks. Compared to existing tolerance levels, the risks were considered tolerable at nominal operating conditions of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant.
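
    As a schematic of the risk-aggregation step in such a QRA (not the study's actual event tree or frequencies), individual risk at a receptor is typically computed as the sum over accident scenarios of the scenario frequency times the conditional probability of fatality at that location:

    ```python
    # Schematic individual-risk aggregation used in QRA studies. The scenario
    # list and all numbers are invented for illustration, not taken from the
    # methanol-plant study.
    scenarios = [
        # (label, frequency per year, probability of fatality at the receptor)
        ("flange leak + jet fire",   1e-4, 0.05),
        ("vessel rupture + flash",   5e-6, 0.60),
        ("pipe break + toxic cloud", 2e-5, 0.20),
    ]
    individual_risk = sum(f * p for _, f, p in scenarios)
    # Compare against a tolerability line, e.g. 1e-5 /yr in many jurisdictions.
    print(f"individual risk = {individual_risk:.2e} /yr")
    ```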

  11. Quantitative, comprehensive, analytical model for magnetic reconnection in Hall magnetohydrodynamics.

    Science.gov (United States)

    Simakov, Andrei N; Chacón, L

    2008-09-05

    Dissipation-independent, or "fast", magnetic reconnection has been observed computationally in Hall magnetohydrodynamics (MHD) and predicted analytically in electron MHD. However, a quantitative analytical theory of reconnection valid for arbitrary ion inertial lengths, $d_i$, has been lacking and is proposed here for the first time. The theory describes a two-dimensional reconnection diffusion region, provides expressions for reconnection rates, and derives a formal criterion for fast reconnection in terms of dissipation parameters and $d_i$. It also confirms the electron MHD prediction that both open and elongated diffusion regions allow fast reconnection, and reveals strong dependence of the reconnection rates on $d_i$.

  12. Implementing a Simulation Study Using Multiple Software Packages for Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Sunbok Lee

    2015-07-01

    Full Text Available A Monte Carlo simulation study is an essential tool for evaluating the behavior of various quantitative methods, including structural equation modeling (SEM), under various conditions. Typically, a large number of replications is recommended for a Monte Carlo simulation study, and therefore automating the study is important for obtaining the desired number of replications. This article provides concrete examples of automating a Monte Carlo simulation study using standard software packages for SEM: Mplus, LISREL, SAS PROC CALIS, and the R package lavaan. Also, the equivalence between multilevel SEM and hierarchical linear modeling (HLM) is discussed, and relevant examples are provided. It is hoped that the code in this article can provide building blocks for researchers to write their own code to automate simulation procedures.
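
    A minimal sketch of the replication loop that such automation wraps around a model fitter is shown below; a closed-form regression estimate stands in for the Mplus/LISREL/lavaan call, since those drivers are package-specific, and all population values are arbitrary.

    ```python
    # Generic Monte Carlo replication loop of the kind the article automates.
    # A simple linear-regression slope stands in for the SEM fit; the
    # population slope, sample size, and replication count are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    true_beta, n, reps = 0.5, 200, 1000
    estimates = np.empty(reps)
    for r in range(reps):
        x = rng.normal(size=n)
        y = true_beta * x + rng.normal(size=n)  # data from the population model
        estimates[r] = (x @ y) / (x @ x)        # OLS slope through the origin
    print(f"bias = {estimates.mean() - true_beta:+.4f}, "
          f"empirical SE = {estimates.std(ddof=1):.4f}")
    ```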

  13. Quantitative models of hydrothermal fluid-mineral reaction: The Ischia case

    Science.gov (United States)

    Di Napoli, Rossella; Federico, Cinzia; Aiuppa, Alessandro; D'Antonio, Massimo; Valenza, Mariano

    2013-03-01

    The intricate pathways of fluid-mineral reactions occurring underneath active hydrothermal systems are explored in this study by applying reaction path modelling to the Ischia case study. Ischia Island, in Southern Italy, hosts a well-developed and structurally complex hydrothermal system which, because of its heterogeneity in chemical and physical properties, is an ideal test site for evaluating the potential and limitations of quantitative geochemical models of hydrothermal reactions. We used the EQ3/6 software package, version 7.2b, to model the reaction of infiltrating waters (mixtures of meteoric water and seawater in variable proportions) with Ischia's reservoir rocks (the Mount Epomeo Green Tuff units; MEGT). The mineral assemblage and composition of the MEGT units were initially characterised by ad hoc designed optical microscopy and electron microprobe analysis, showing that phenocrysts (dominantly alkali-feldspars and plagioclase) are set in a pervasively altered groundmass (with abundant clay minerals and zeolites). Reaction of infiltrating waters with MEGT minerals was simulated over a range of temperatures (95-260 °C) and CO2 fugacities (10^-0.2 to 10^0.5 bar) realistic for Ischia. During the model runs, a set of secondary minerals (selected based on independent information from alteration mineral studies) was allowed to precipitate from model solutions when saturation was achieved. The compositional evolution of the model solutions obtained in the 95-260 °C runs was finally compared with the compositions of Ischia's thermal groundwaters, demonstrating an overall agreement. Our simulations, in particular, well reproduce the Mg-depleting maturation path of hydrothermal solutions, and have end-of-run model solutions whose Na-K-Mg compositions reflect attainment of full-equilibrium conditions at run temperature. High-temperature (180-260 °C) model runs are those best matching the Na-K-Mg compositions of Ischia's most chemically mature water samples
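
    As a pointer to the mechanism that gates secondary-mineral precipitation in reaction-path codes of the EQ3/6 family, the sketch below evaluates a saturation index, SI = log10(IAP/Ksp); the ion activities and solubility product used are placeholders, not values from the Ischia runs.

    ```python
    # The precipitation test in reaction-path modelling can be summarized by
    # the saturation index SI = log10(IAP / Ksp): a secondary mineral is
    # allowed to precipitate once SI >= 0. Activities and Ksp below are
    # placeholders, not values from the Ischia study.
    import math

    def saturation_index(ion_activity_product, k_sp):
        return math.log10(ion_activity_product / k_sp)

    # e.g. a calcite-like phase: IAP = a(Ca2+) * a(CO3 2-)
    iap = 1e-4 * 1e-5
    si = saturation_index(iap, k_sp=10**-8.48)
    print(f"SI = {si:+.2f} ->", "precipitates" if si >= 0 else "undersaturated")
    ```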

  14. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Science.gov (United States)

    2011-05-18

    ... COMMISSION NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection... issued for public comment a document entitled: NUREG/CR-XXXX, ``Development of Quantitative Software... development of regulatory guidance for using risk information related to digital systems in the...

  15. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills) the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and, then, obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  16. Galaxy alignments: Theory, modelling and simulations

    CERN Document Server

    Kiessling, Alina; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L; Rassat, Anais

    2015-01-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in large-scale structure tend to align the shapes and angular momenta of nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both $N$-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the ...

  17. A Simulation Model for Component Commonality

    Institute of Scientific and Technical Information of China (English)

    ZHU Xiao-chi; ZHANG Zi-gang

    2002-01-01

    Component commonality has been cited as a powerful approach for manufacturers to cope with increased component proliferation and to control inventory costs. To fully realize its potential benefits, one needs a clear understanding of its impacts on the system. In this paper, the feasibility of using a simulation model to provide a systematic perspective for manufacturing firms implementing a commonality strategy is demonstrated. Alternative commonality strategies, including the stage at which commonality is employed and the allocation policies, are simulated. Several interesting results on the effects of commonality, allocation policies, and optimal solutions are obtained. We then summarize qualitative insights and managerial implications for component commonality design and implementation, and for inventory management in a general multi-stage assembly system.

  18. Assumed PDF modeling in rocket combustor simulations

    Science.gov (United States)

    Lempke, M.; Gerlinger, P.; Aigner, M.

    2013-03-01

    In order to account for the interaction between turbulence and chemistry, a multivariate assumed PDF (Probability Density Function) approach is used to simulate a model rocket combustor with finite-rate chemistry. The reported test case is the PennState preburner combustor with a single shear coaxial injector. Experimental data for the wall heat flux are available for this configuration. Unsteady RANS (Reynolds-averaged Navier-Stokes) simulation results with and without the assumed PDF approach are analyzed and compared with the experimental data. Both calculations show good agreement with the experimental wall heat flux data. Significant changes due to the utilization of the assumed PDF approach can be observed in the radicals, e.g., the OH mass fraction distribution, while the effect on the wall heat flux is insignificant.
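
    The core of the assumed-PDF idea can be sketched as follows: a temperature-sensitive rate is averaged over an assumed temperature PDF instead of being evaluated at the mean temperature. The one-dimensional Gaussian PDF and the Arrhenius constants below are illustrative stand-ins for the study's multivariate formulation.

    ```python
    # Sketch of the assumed-PDF idea: the mean of a temperature-sensitive
    # reaction rate is obtained by integrating the instantaneous rate over an
    # assumed (here Gaussian) temperature PDF rather than evaluating it at
    # the mean temperature. All constants are illustrative.
    import numpy as np

    A, Ta = 1e9, 15000.0                  # pre-exponential factor, activation temperature [K]
    T_mean, T_rms = 1500.0, 150.0         # assumed mean and rms temperature fluctuation [K]

    rate = lambda T: A * np.exp(-Ta / T)  # Arrhenius-type instantaneous rate

    T = np.linspace(T_mean - 4 * T_rms, T_mean + 4 * T_rms, 2001)
    pdf = np.exp(-0.5 * ((T - T_mean) / T_rms) ** 2) / (T_rms * np.sqrt(2 * np.pi))
    mean_rate = np.sum(rate(T) * pdf) * (T[1] - T[0])   # quadrature of the weighted rate
    print(f"rate at mean T: {rate(T_mean):.3e}  PDF-averaged rate: {mean_rate:.3e}")
    ```

    Because the rate is convex in temperature, the PDF-averaged value exceeds the rate evaluated at the mean, which is exactly the effect such closures are meant to capture.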

  19. Automated simulation of areal bone mineral density assessment in the distal radius from high-resolution peripheral quantitative computed tomography

    OpenAIRE

    Burghardt, A. J.; Kazakia, G. J.; Link, T.M.; Majumdar, S

    2009-01-01

    Summary An automated image processing method is presented for simulating areal bone mineral density measures using high-resolution peripheral quantitative computed tomography (HR-pQCT) in the ultra-distal radius. The accuracy of the method is validated against clinical dual X-ray absorptiometry (DXA). This technique represents a useful reference to gauge the utility of novel 3D quantification methods applied to HR-pQCT in multi-center clinical studies and potentially negates the need for sepa...

  20. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin

    2013-01-01

    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...
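
    As a taste of the book's microscopic modeling strand, the sketch below evaluates the acceleration law of the Intelligent Driver Model, a car-following model associated with the book's author; the parameter values are typical textbook choices, not prescriptions.

    ```python
    # Illustration of a microscopic (car-following) traffic model: the
    # Intelligent Driver Model (IDM) acceleration law. Parameter values are
    # typical textbook choices, not calibrated results.
    import math

    def idm_acceleration(v, s, dv, v0=33.3, T=1.6, a=1.0, b=1.5, s0=2.0, delta=4):
        """v: own speed [m/s], s: gap to the leader [m], dv: approach rate v - v_leader."""
        s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired dynamical gap
        return a * (1 - (v / v0) ** delta - (s_star / s) ** 2)

    # ~ -0.4 m/s^2 here: mild braking because the 40 m gap is below the desired gap.
    print(idm_acceleration(v=25.0, s=40.0, dv=0.0))
    ```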

  1. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray

    2017-01-01

    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  2. Hierarchical Boltzmann simulations and model error estimation

    Science.gov (United States)

    Torrilhon, Manuel; Sarna, Neeraj

    2017-08-01

    A hierarchical simulation approach for Boltzmann's equation should provide a single numerical framework in which a coarse representation can be used to compute gas flows as accurately and efficiently as in computational fluid dynamics, but a subsequent refinement allows the result to be successively improved up to the complete Boltzmann result. We use Hermite discretization, or moment equations, for the steady linearized Boltzmann equation as a proof-of-concept of such a framework. All representations of the hierarchy are rotationally invariant and the numerical method is formulated on fully unstructured triangular and quadrilateral meshes using an implicit discontinuous Galerkin formulation. We demonstrate the performance of the numerical method on model problems, which in particular highlights the relevance of stability of boundary conditions on curved domains. The hierarchical nature of the method also makes it possible to provide model error estimates by comparing subsequent representations. We present various model errors for a flow through a curved channel with obstacles.

  3. Modelling and Simulation for Major Incidents

    Directory of Open Access Journals (Sweden)

    Eleonora Pacciani

    2015-11-01

    Full Text Available In recent years, there has been a rise in Major Incidents with a large impact on citizens' health and on society. Without the possibility of conducting live experiments when it comes to physical and/or toxic trauma, only an accurate in silico reconstruction allows us to identify organizational solutions with the best possible chance of success, in correlation with the limitations on available resources (e.g. medical teams, first responders, treatments, transports, and hospital availability) and with the variability of the characteristics of the event (e.g. type of incident, severity of the event and type of lesions). Utilizing modelling and simulation techniques, a simplified mathematical model of the physiological evolution of patients involved in physical and toxic trauma incident scenarios has been developed and implemented. The model formalizes the dynamics, operating standards and practices of the medical response and the main emergency services in the chain of emergency management during a Major Incident.

  4. Vertical eddy heat fluxes from model simulations

    Science.gov (United States)

    Stone, Peter H.; Yao, Mao-Sung

    1991-01-01

    Vertical eddy fluxes of heat are calculated from simulations with a variety of climate models, ranging from three-dimensional GCMs to a one-dimensional radiative-convective model. The models' total eddy flux in the lower troposphere is found to agree well with Hantel's analysis from observations, but in the mid and upper troposphere the models' values are systematically 30 percent to 50 percent smaller than Hantel's. The models nevertheless give very good results for the global temperature profile, and the reason for the discrepancy is unclear. The model results show that the manner in which the vertical eddy flux is carried is very sensitive to the parameterization of moist convection. When a moist adiabatic adjustment scheme with a critical value for the relative humidity of 100 percent is used, the vertical transports by large-scale eddies and small-scale convection on a global basis are equal; but when a penetrative convection scheme is used, the large-scale flux on a global basis is only about one-fifth to one-fourth the small-scale flux. Comparison of the model results with observations indicates that the results with the latter scheme are more realistic. However, even in this case, in mid and high latitudes the large and small-scale vertical eddy fluxes of heat are comparable in magnitude above the planetary boundary layer.

  5. NetLand: quantitative modeling and visualization of Waddington's epigenetic landscape using probabilistic potential.

    Science.gov (United States)

    Guo, Jing; Lin, Feng; Zhang, Xiaomeng; Tanavde, Vivek; Zheng, Jie

    2017-05-15

    Waddington's epigenetic landscape is a powerful metaphor for cellular dynamics driven by gene regulatory networks (GRNs). Its quantitative modeling and visualization, however, remain a challenge, especially when there are more than two genes in the network. A software tool for Waddington's landscape has not been available in the literature. We present NetLand, an open-source software tool for modeling and simulating the kinetic dynamics of GRNs, and visualizing the corresponding Waddington's epigenetic landscape in three dimensions without restriction on the number of genes in a GRN. With an interactive graphical user interface, NetLand can facilitate knowledge discovery and experimental design in the study of cell fate regulation (e.g. stem cell differentiation and reprogramming). NetLand can run under operating systems including Windows, Linux and OS X. The executable files and source code of NetLand as well as a user manual, example models etc. can be downloaded from http://netland-ntu.github.io/NetLand/ . zhengjie@ntu.edu.sg. Supplementary data are available at Bioinformatics online.
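
    A minimal sketch of the underlying idea, a probabilistic potential taken as the negative log of a steady-state distribution, is given below for a two-gene toggle switch; the kinetics and noise level are illustrative and unrelated to NetLand's actual implementation.

    ```python
    # Minimal sketch of a probabilistic quasi-potential of the kind NetLand
    # visualizes: simulate a noisy two-gene toggle switch, histogram the
    # end states, and take U = -log(P). All kinetic constants are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    a, k, n, noise, dt = 2.0, 1.0, 4, 0.05, 0.01

    def step(x, y):
        dx = a / (1 + y**n) - k * x      # mutual repression (Hill kinetics)
        dy = a / (1 + x**n) - k * y
        x += dx * dt + noise * rng.normal() * np.sqrt(dt)
        y += dy * dt + noise * rng.normal() * np.sqrt(dt)
        return max(x, 0.0), max(y, 0.0)

    samples = []
    for _ in range(200):                 # many short stochastic runs
        x, y = rng.uniform(0, 1.5, 2)
        for _ in range(2000):
            x, y = step(x, y)
        samples.append((x, y))
    H, xe, ye = np.histogram2d(*zip(*samples), bins=20, density=True)
    U = -np.log(H + 1e-9)                # quasi-potential surface (up to a constant)
    print("deepest potential bin:", np.unravel_index(np.argmin(U), U.shape))
    ```

    With these constants the switch is bistable, so the estimated surface shows two basins, the landscape picture of two alternative cell fates.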

  6. The Comparison of Distributed P2P Trust Models Based on Quantitative Parameters in the File Downloading Scenarios

    Directory of Open Access Journals (Sweden)

    Jingpei Wang

    2016-01-01

    Full Text Available Varied P2P trust models have been proposed recently; it is necessary to develop an effective method to evaluate these trust models to resolve both commonality issues (guiding newly generated trust models in theory) and individuality issues (assisting a decision maker in choosing an optimal trust model to implement in a specific context). A new method for analyzing and comparing P2P trust models based on hierarchical parameter quantization in file downloading scenarios is proposed in this paper. Several parameters are extracted from the functional attributes and quality features of the trust relationship, as well as from the requirements of the specific network context and the evaluators. Several distributed P2P trust models are analyzed quantitatively, with the extracted parameters modeled into a hierarchical model. The fuzzy inference method is applied to the hierarchical model of parameters to fuse the evaluated values of the candidate trust models, and then the relatively optimal one is selected based on the sorted overall quantitative values. Finally, analyses and simulation are performed. The results show that the proposed method is reasonable and effective compared with the previous algorithms.

  7. Quantitative modeling of the accuracy in registering preoperative patient-specific anatomic models into left atrial cardiac ablation procedures

    Energy Technology Data Exchange (ETDEWEB)

    Rettmann, Maryam E., E-mail: rettmann.maryam@mayo.edu; Holmes, David R.; Camp, Jon J.; Cameron, Bruce M.; Robb, Richard A. [Biomedical Imaging Resource, Mayo Clinic College of Medicine, Rochester, Minnesota 55905 (United States); Kwartowitz, David M. [Department of Bioengineering, Clemson University, Clemson, South Carolina 29634 (United States); Gunawan, Mia [Department of Biochemistry and Molecular and Cellular Biology, Georgetown University, Washington D.C. 20057 (United States); Johnson, Susan B.; Packer, Douglas L. [Division of Cardiovascular Diseases, Mayo Clinic, Rochester, Minnesota 55905 (United States); Dalegrave, Charles [Clinical Cardiac Electrophysiology, Cardiology Division Hospital Sao Paulo, Federal University of Sao Paulo, 04024-002 Brazil (Brazil); Kolasa, Mark W. [David Grant Medical Center, Fairfield, California 94535 (United States)

    2014-02-15

    Purpose: In cardiac ablation therapy, accurate anatomic guidance is necessary to create effective tissue lesions for elimination of left atrial fibrillation. While fluoroscopy, ultrasound, and electroanatomic maps are important guidance tools, they lack information regarding detailed patient anatomy which can be obtained from high resolution imaging techniques. For this reason, there has been significant effort in incorporating detailed, patient-specific models generated from preoperative imaging datasets into the procedure. Both clinical and animal studies have investigated registration and targeting accuracy when using preoperative models; however, the effect of various error sources on registration accuracy has not been quantitatively evaluated. Methods: Data from phantom, canine, and patient studies are used to model and evaluate registration accuracy. In the phantom studies, data are collected using a magnetically tracked catheter on a static phantom model. Monte Carlo simulation studies were run to evaluate both baseline errors as well as the effect of different sources of error that would be present in a dynamic in vivo setting. Error is simulated by varying the variance parameters on the landmark fiducial, physical target, and surface point locations in the phantom simulation studies. In vivo validation studies were undertaken in six canines in which metal clips were placed in the left atrium to serve as ground truth points. A small clinical evaluation was completed in three patients. Landmark-based and combined landmark and surface-based registration algorithms were evaluated in all studies. In the phantom and canine studies, both target registration error and point-to-surface error are used to assess accuracy. In the patient studies, no ground truth is available and registration accuracy is quantified using point-to-surface error only. Results: The phantom simulation studies demonstrated that combined landmark and surface-based registration improved
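
    A toy version of such a Monte Carlo study is sketched below: landmark fiducials are perturbed with isotropic Gaussian noise, a rigid registration is re-estimated with the standard SVD (Kabsch) solution, and the target registration error is accumulated. The geometry and noise level are invented, not those of the phantom or canine studies.

    ```python
    # Sketch of a Monte Carlo probe of landmark registration accuracy:
    # perturb fiducials with Gaussian noise, refit a rigid transform by the
    # SVD (Kabsch) solution, and record target registration error (TRE).
    # Geometry and noise levels are invented.
    import numpy as np

    rng = np.random.default_rng(42)
    fiducials = rng.uniform(-40, 40, (6, 3))   # landmark positions [mm]
    target = np.array([0.0, 0.0, 0.0])         # ablation target [mm]

    def rigid_fit(P, Q):
        """Least-squares rotation R and translation t with Q ~ P @ R.T + t."""
        Pc, Qc = P - P.mean(0), Q - Q.mean(0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        return R, Q.mean(0) - P.mean(0) @ R.T

    tre = []
    for _ in range(2000):
        noisy = fiducials + rng.normal(scale=1.0, size=fiducials.shape)  # 1 mm noise
        R, t = rigid_fit(fiducials, noisy)     # transform estimated from noisy picks
        tre.append(np.linalg.norm(target @ R.T + t - target))
    print(f"mean TRE = {np.mean(tre):.2f} mm at 1.0 mm fiducial noise")
    ```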

  8. Heinrich events modeled in transient glacial simulations

    Science.gov (United States)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled framework consisting of an ice sheet model (ISM) and an atmosphere-ocean-vegetation general circulation model (AOVGCM), where this variability occurs as part of the model-generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  9. Modeling and Simulation. III. Simulation of a Model for Development of Visual Cortical Specificity.

    Science.gov (United States)

    1986-12-15

    of parameter values. Experiment, model, and simulation: The simulations we consider mimic, in form, classic deprivation experiments. Kittens are ... The second paper of the series (ref. 8) reviews the results of numerous experiments on the neuronal development of kitten visual cortex. We have ... restricted to a very limited range of oriented contours (see citations in ref. 8). Kittens were raised, for example, viewing only horizontal or only vertical

  10. Testing quantitative pollen dispersal models in animal-pollinated vegetation mosaics: An example from temperate Tasmania, Australia

    Science.gov (United States)

    Mariani, M.; Connor, S. E.; Theuerkauf, M.; Kuneš, P.; Fletcher, M.-S.

    2016-12-01

    Reconstructing past vegetation abundance and land-cover changes through time has important implications in land management and climate modelling. To date palaeovegetation reconstructions in Australia have been limited to qualitative or semi-quantitative inferences from pollen data. Testing pollen dispersal models constitutes a crucial step in developing quantitative past vegetation and land cover reconstructions. Thus far, the application of quantitative pollen dispersal models has been restricted to regions dominated by wind-pollinated plants (e.g. Europe) and their performance in a landscape dominated by animal-pollinated plant taxa is still unexplored. Here we test, for the first time in Australia, two well-known pollen dispersal models to assess their performance in the wind- and animal-pollinated vegetation mosaics of western Tasmania. We focus on a mix of wind- (6 taxa) and animal- (7 taxa) pollinated species that comprise the most common pollen types and key representatives of the dominant vegetation formations. Pollen Productivity Estimates and Relevant Source Area of Pollen obtained using Lagrangian Stochastic turbulent simulations appear to be more realistic when compared to the results from the widely used Gaussian Plume Model.

  11. Photon-tissue interaction model for quantitative assessment of biological tissues

    Science.gov (United States)

    Lee, Seung Yup; Lloyd, William R.; Wilson, Robert H.; Chandra, Malavika; McKenna, Barbara; Simeone, Diane; Scheiman, James; Mycek, Mary-Ann

    2014-02-01

    In this study, we describe a direct fit photon-tissue interaction model to quantitatively analyze reflectance spectra of biological tissue samples. The model rapidly extracts biologically-relevant parameters associated with tissue optical scattering and absorption. This model was employed to analyze reflectance spectra acquired from freshly excised human pancreatic pre-cancerous tissues (intraductal papillary mucinous neoplasm (IPMN), a common precursor lesion to pancreatic cancer). Compared to previously reported models, the direct fit model improved fit accuracy and speed. Thus, these results suggest that such models could serve as real-time, quantitative tools to characterize biological tissues assessed with reflectance spectroscopy.

  12. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  13. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  14. The simulation model of planar electrochemical transducer

    Science.gov (United States)

    Zhevnenko, D. A.; Vergeles, S. S.; Krishtop, T. V.; Tereshonok, D. V.; Gornev, E. S.; Krishtop, V. G.

    2016-12-01

    Planar electrochemical systems are very promising for building modern motion and pressure sensors. Planar microelectronic technology has been used successfully for electrochemical transducers of motion parameters. These systems are characterized by an exceptionally high sensitivity to mechanical excitation due to the high rate of conversion of the mechanical signal into electric current. In this work, we have developed a mathematical model of this planar electrochemical system, which detects mechanical signals. We simulate the processes of mass and charge transfer in the planar electrochemical transducer and calculate its transfer function for different geometrical parameters of the system.

  15. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  16. Petroleum reservoir data for testing simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, J.M.; Harrison, W.

    1980-09-01

    This report consists of reservoir pressure and production data for 25 petroleum reservoirs. Included are 5 data sets for single-phase (liquid) reservoirs, 1 data set for a single-phase (liquid) reservoir with pressure maintenance, 13 data sets for two-phase (liquid/gas) reservoirs and 6 for two-phase reservoirs with pressure maintenance. Also given are ancillary data for each reservoir that could be of value in the development and validation of simulation models. A bibliography is included that lists the publications from which the data were obtained.

  17. Schwinger model simulations with dynamical overlap fermions

    CERN Document Server

    Bietenholz, W; Volkholz, J

    2007-01-01

    We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate $\Sigma$ vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum, which have not been addressed yet by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain -- for the very light fermion masses -- values for $\Sigma$ that follow closely the analytical predictions in the continuum.

  18. Schwinger model simulations with dynamical overlap fermions

    Energy Technology Data Exchange (ETDEWEB)

    Bietenholz, W. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Shcheredin, S. [Bielefeld Univ. (Germany). Fakultaet fuer Physik; Volkholz, J. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2007-11-15

    We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate Σ vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum, which have not been addressed yet by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain - for the very light fermion masses - values for Σ that follow closely the analytical predictions in the continuum. (orig.)

  19. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills) the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and, then, obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  20. CASTOR detector Model, objectives and simulated performance

    CERN Document Server

    Angelis, Aris L S; Bartke, Jerzy; Bogolyubsky, M Yu; Chileev, K; Erine, S; Gladysz-Dziadus, E; Kharlov, Yu V; Kurepin, A B; Lobanov, M O; Maevskaya, A I; Mavromanolakis, G; Nicolis, N G; Panagiotou, A D; Sadovsky, S A; Wlodarczyk, Z

    2001-01-01

    We present a phenomenological model describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and imbalance of electromagnetic and hadronic content characterizing a Centauro event and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them can be naturally explained. We describe the CASTOR calorimeter, a subdetector of the ALICE experiment dedicated to the search for Centauro in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented. (22 refs).

  1. CASTOR detector. Model, objectives and simulated performance

    Energy Technology Data Exchange (ETDEWEB)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D. [University of Athens, Nuclear and Particle Physics Division, Athens (Greece); Aslanoglou, X.; Nicolis, N. [Ioannina Univ., Ioannina (Greece). Dept. of Physics; Bartke, J.; Gladysz-Dziadus, E. [Institute of Nuclear Physics, Cracow (Poland); Lobanov, M.; Erine, S.; Kharlov, Y.V.; Bogolyubsky, M.Y. [Institute for High Energy Physics, Protvino (Russian Federation); Kurepin, A.B.; Chileev, K. [Institute for Nuclear Research, Moscow (Russian Federation); Wlodarczyk, Z. [Pedagogical University, Institute of Physics, Kielce (Poland)

    2001-10-01

    A phenomenological model is presented describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and the imbalance of electromagnetic and hadronic content characterizing a Centauro event, and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauros in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  2. Modelling and simulations of controlled release fertilizer

    Science.gov (United States)

    Irfan, Sayed Ameenuddin; Razali, Radzuan; Shaari, Ku Zilati Ku; Mansor, Nurlidia

    2016-11-01

    The recent advancement in controlled release fertilizer has provided an alternative to conventional urea: controlled release fertilizers give good plant nutrient uptake and are environmentally friendly. To obtain optimal plant uptake of nutrients from a controlled release fertilizer, it is essential to understand its release characteristics. A mathematical model is developed to predict the release characteristics of a polymer coated granule. Numerical simulations are performed by varying the granule radius, soil water content and soil porosity to study their effect on fertilizer release. Understanding these parameters helps in better design and improves the efficiency of controlled release fertilizer.
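
    For a diffusion-controlled coating, the kind of release curve such models predict can be sketched with Crank's classical series solution for a sphere; the diffusivity and granule radius below are assumed round numbers, not the paper's parameters.

    ```python
    # Sketch of a diffusion-controlled release curve: Crank's series solution
    # for the fractional release from a sphere of radius r with effective
    # diffusivity D. D and r are illustrative values, not fitted parameters.
    import math

    def fractional_release(t, D=1e-13, r=1.5e-3, terms=200):
        """Mt/Minf for Fickian diffusion out of a sphere (Crank, 1975)."""
        s = sum(math.exp(-(n * math.pi / r) ** 2 * D * t) / n**2
                for n in range(1, terms + 1))
        return 1.0 - (6.0 / math.pi**2) * s

    for day in (1, 7, 30, 90):
        print(f"day {day:3d}: {100 * fractional_release(day * 86400):.1f}% released")
    ```

    Shrinking the radius or raising the diffusivity steepens the curve, which is the qualitative sensitivity the abstract's parameter study explores.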

  3. Near infrared spectroscopy for body fat sensing in neonates: quantitative analysis by GAMOS simulations.

    Science.gov (United States)

    Mustafa, Fatin Hamimi; Jones, Peter W; McEwan, Alistair L

    2017-01-11

    Under-nutrition in neonates is closely linked to a low body fat percentage. Undernourished neonates are exposed to immediate mortality risk as well as unwanted health impacts later in life, including obesity and hypertension. One potential low-cost approach for obtaining direct measurements of body fat is near-infrared (NIR) interactance. The aims of this study were to model the effect of varying volume fractions of melanin and water in skin on NIR spectra, and to define the sensitivity of NIR reflection to changes in the thickness of subcutaneous fat. GAMOS simulations were used to develop two single-fat-layer models and four complete skin models over a range of skin colour (only for the four skin models) and hydration within a spectrum of 800-1100 nm. The thickness of the subcutaneous fat was set from 1 to 15 mm in 1 mm intervals in each model. Varying the volume fraction of water in skin resulted in minimal changes of NIR intensity at wavelengths from 890 to 940 nm and from 1010 to 1100 nm. Variation of the melanin volume in skin, meanwhile, was found to strongly influence the NIR intensity and sensitivity. The NIR sensitivities and NIR intensity over thickness of fat decreased from Caucasian skin to African skin throughout the range of wavelengths. For the relationship between the NIR reflection and the thickness of subcutaneous fat, a logarithmic relationship was obtained. The minimal changes of NIR intensity at wavelengths within the ranges from 890 to 940 nm and from 1010 to 1100 nm under variation of the volume fraction of water suggest that wavelengths within these two ranges should be considered for use in the measurement of body fat, to overcome the variation of hydration in neonates. The stronger influence of skin colour on NIR shows that the melanin effect needs to be corrected by an independent measurement or by a modeling approach. The logarithmic response, with higher sensitivity at the lower range of fat thickness, suggests that implementation of NIRS

  4. Modular System Modeling for Quantitative Reliability Evaluation of Technical Systems

    Directory of Open Access Journals (Sweden)

    Stephan Neumann

    2016-01-01

    Full Text Available In modern times, it is necessary to offer reliable products to match the statutory directives concerning product liability and the high expectations of customers for durable devices. Furthermore, to maintain a high competitiveness, engineers need to know as accurately as possible how long their product will last and how to influence the life expectancy without expensive and time-consuming testing. As the components of a system are responsible for the system reliability, this paper introduces and evaluates calculation methods for life expectancy of common machine elements in technical systems. Subsequently, a method for the quantitative evaluation of the reliability of technical systems is proposed and applied to a heavy-duty power shift transmission.
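
    The quantitative step such a modular method rests on can be sketched as follows: given per-component lifetime models (two-parameter Weibulls are a common choice for machine elements), a series system survives only if every component survives, so component reliabilities multiply. All shape and scale values below are illustrative.

    ```python
    # Series-system reliability from per-component Weibull lifetime models.
    # Shape/scale figures are illustrative machine-element values, not the
    # paper's transmission data.
    import math

    def weibull_reliability(t, beta, eta):
        """R(t) = exp(-(t/eta)^beta) for a two-parameter Weibull lifetime."""
        return math.exp(-((t / eta) ** beta))

    components = {               # (shape beta, characteristic life eta [h])
        "gear stage": (1.5, 20000.0),
        "bearing":    (1.3, 15000.0),
        "shaft":      (2.0, 50000.0),
    }
    t = 5000.0                   # mission time [h]
    r_system = math.prod(weibull_reliability(t, b, e) for b, e in components.values())
    print(f"series-system reliability at {t:.0f} h: {r_system:.3f}")
    ```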

  5. Improvement of the ID model for quantitative network data

    DEFF Research Database (Denmark)

    Sørensen, Peter Borgen; Damgaard, Christian Frølund; Dupont, Yoko Luise

    2015-01-01

    Many interactions are often poorly registered or even unobserved in empirical quantitative networks. Hence, the output of the statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks(1). This presentation will illustrate the application of the ID method based on a data set which consists of counts of visits by 152 pollinator species to 16 plant species. The method is based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1), pi... reproduce the high number of zero-valued cells in the data set and mimic the sampling distribution. (1) Sørensen et al., Journal of Pollination Ecology, 6(18), 2011, pp. 129-139.

  6. Monte Carlo Simulation of River Meander Modelling

    Science.gov (United States)

    Posner, A. J.; Duan, J. G.

    2010-12-01

    This study first compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johanesson and Parker (1989b). Ikeda et al.'s (1981) linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with the physical properties of the bank (e.g. cohesiveness, stratigraphy, vegetation density). The developed model was used to predict the evolution of meandering planforms. Then, the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotation, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model. Quasi-2D Ikeda (1989) flow solution with Monte Carlo simulation of the bank erosion coefficient.
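
    The stochastic treatment described above can be sketched in a few lines: in Ikeda-type models the bank migration rate is proportional to the near-bank excess velocity, so drawing the erosion coefficient from a distribution turns the predicted migration into an ensemble. The numbers below are illustrative, not calibrated values.

    ```python
    # Sketch of a stochastic erosion coefficient in an Ikeda-type model:
    # migration rate = E * u_b, with E drawn from a distribution, giving an
    # ensemble of migration predictions. All numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(7)
    u_b = 0.3                           # near-bank excess velocity [m/s], assumed
    seconds = 50 * 3.15e7               # 50 years in seconds
    E = rng.lognormal(mean=np.log(2e-8), sigma=0.5, size=10000)  # erosion coeff. [-]
    migration = E * u_b * seconds       # lateral migration after 50 years [m]
    print(f"median {np.median(migration):.0f} m, "
          f"5-95% range {np.percentile(migration, 5):.0f}-"
          f"{np.percentile(migration, 95):.0f} m")
    ```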

  7. Axisymmetric Vortex Simulations with Various Turbulence Models

    Directory of Open Access Journals (Sweden)

    Brian Howard Fiedler

    2010-10-01

    Full Text Available The CFD code FLUENT(TM) has been applied to a vortex within an updraft above a frictional lower boundary. The sensitivity of vortex intensity and structure to the choice of turbulence model is explored. A high Reynolds number of 10^8 is employed to make the investigation relevant to the atmospheric vortex known as a tornado. The simulations are axisymmetric and are integrated forward in time to equilibrium. Of the variety of turbulence models tested, the Reynolds Stress Model allows for the greatest intensification of the vortex, with the azimuthal wind speed near the surface being 2.4 times the speed of the updraft, consistent with the destructive nature of tornadoes. The Standard k-ε Model, which is simpler than the Reynolds Stress Model but still more detailed than what is commonly available in numerical weather prediction models, produces an azimuthal wind speed near the surface of at most 0.6 times the updraft speed.

  8. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes, and shunting yards. However, the specificity of port shunting yards raises several problems, such as limited access (since these are terminus stations for the rail network), the input-output of large transit flows of cargo relative to the scarcity of ship departures/arrivals, as well as limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that lead to an answer to these problems. The paper proposes a simulation model developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. The principal aspects of shunting yards and adequate measures to increase their transit capacity are investigated. The operating capacity of the shunting-yard subsystem is assessed taking into consideration the required operating standards, and measures of performance (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) of the railway station are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  9. The mathematical model of a LUNG simulator

    Directory of Open Access Journals (Sweden)

    František Šolc

    2014-12-01

    Full Text Available The paper discusses the design, modelling, implementation and testing of a specific LUNG simulator. The described research was performed as a part of the project AlveoPic - Advanced Lung Research for Veterinary Medicine of Particles for Inhalation. The simulator was designed to establish a combined study programme comprising Biomedical Engineering Sciences (FEEC BUT) and Healthcare and Rehabilitation Technology (FH Technikum Wien). The simulator is supposed to be advanced laboratory equipment that should raise the standard of the existing research activities within the above-mentioned study programmes to the required level. Thus, the proposed paper introduces significant technical equipment for the laboratory education of students at both FH Technikum Wien and the Faculty of Electrical Engineering and Communication, Brno University of Technology. The apparatuses described here will also be used to support cooperative research activities. In the given context, the authors specify certain technical solutions and parameters related to artificial lungs, present the electrical equipment of the system, and point out the results of the PC-based measurement and control.

  10. Quantitative assessment of meteorological and tropospheric Zenith Hydrostatic Delay models

    Science.gov (United States)

    Zhang, Di; Guo, Jiming; Chen, Ming; Shi, Junbo; Zhou, Lv

    2016-09-01

    Tropospheric delay has always been an important issue in GNSS/DORIS/VLBI/InSAR processing. The most commonly used empirical models for the determination of the tropospheric Zenith Hydrostatic Delay (ZHD), comprising three meteorological models and two empirical ZHD models, are carefully analyzed in this paper. The meteorological models are UNB3m, GPT2 and GPT2w, while the ZHD models are Hopfield and Saastamoinen. By reference to in-situ meteorological measurements and ray-traced ZHD values of 91 globally distributed radiosonde sites over a four-year period from 2010 to 2013, it is found that there is a strong correlation between the errors of model-derived values and latitude. Specifically, the Saastamoinen model shows a systematic error of about -3 mm. Therefore a modified Saastamoinen model is developed based on the "best average" refractivity constant, and is validated by radiosonde data. Among the different models, the GPT2w and the modified Saastamoinen model perform best. ZHD values derived from their combination have a mean bias of -0.1 mm and a mean RMS of 13.9 mm. Limitations of the present models are discussed and suggestions for further improvements are given.
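
    For reference, the standard Saastamoinen hydrostatic model discussed above has a compact closed form, ZHD = 0.0022768 P / f(φ, h), with P the surface pressure in hPa and f the usual gravity correction in latitude and height; a direct transcription is sketched below (without the paper's proposed bias correction).

    ```python
    # The standard Saastamoinen zenith hydrostatic delay, one of the two ZHD
    # models compared above: ZHD [m] from surface pressure [hPa], latitude
    # [deg] and station height [m], with the usual gravity correction term.
    import math

    def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
        f = (1 - 0.00266 * math.cos(2 * math.radians(lat_deg))
               - 0.00000028 * height_m)
        return 0.0022768 * pressure_hpa / f

    # e.g. sea-level standard pressure at mid-latitude gives roughly 2.3 m:
    print(f"ZHD = {saastamoinen_zhd(1013.25, 45.0, 0.0):.4f} m")
    ```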

  11. Three-dimensional conceptual model for service-oriented simulation

    Institute of Scientific and Technical Information of China (English)

    Wen-guang WANG; Wei-ping WANG; Justyna ZANDER; Yi-fan ZHU

    2009-01-01

    In this letter, we propose a novel three-dimensional conceptual model for an emerging service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modeling and simulation, service-orientation, and software/systems engineering. Finally, two specific simulation frameworks are studied as examples.

  12. Three-dimensional conceptual model for service-oriented simulation

    CERN Document Server

    Wang, Wenguang; Zander, Justyna; Zhu, Yifan; 10.1631/jzus.A0920258

    2009-01-01

    In this letter, we propose a novel three-dimensional conceptual model for an emerging service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modeling and simulation, service-orientation, and software/systems engineering. Finally, two specific simulation frameworks are studied as examples.

  13. Results from modeling and simulation of chemical downstream etch systems

    Energy Technology Data Exchange (ETDEWEB)

    Meeks, E.; Vosen, S.R.; Shon, J.W.; Larson, R.S.; Fox, C.A.; Buchenauer

    1996-05-01

    This report summarizes modeling work performed at Sandia in support of Chemical Downstream Etch (CDE) benchmark and tool development programs under a Cooperative Research and Development Agreement (CRADA) with SEMATECH. The Chemical Downstream Etch (CDE) Modeling Project supports SEMATECH Joint Development Projects (JDPs) with Matrix Integrated Systems, Applied Materials, and Astex Corporation in the development of new CDE reactors for wafer cleaning and stripping processes. These dry-etch reactors replace wet-etch steps in microelectronics fabrication, enabling compatibility with other process steps and reducing the use of hazardous chemicals. Models were developed at Sandia to simulate the gas flow, chemistry and transport in CDE reactors. These models address the essential components of the CDE system: a microwave source, a transport tube, a showerhead/gas inlet, and a downstream etch chamber. The models have been used in tandem to determine the evolution of reactive species throughout the system, and to make recommendations for process and tool optimization. A significant part of this task has been in the assembly of a reasonable set of chemical rate constants and species data necessary for successful use of the models. Often the kinetic parameters were uncertain or unknown. For this reason, a significant effort was placed on model validation to obtain industry confidence in the model predictions. Data for model validation were obtained from the Sandia Molecular Beam Mass Spectrometry (MBMS) experiments, from the literature, from the CDE Benchmark Project (also part of the Sandia/SEMATECH CRADA), and from the JDP partners. The validated models were used to evaluate process behavior as a function of microwave-source operating parameters, transport-tube geometry, system pressure, and downstream chamber geometry. In addition, quantitative correlations were developed between CDE tool performance and operation set points.

  14. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  15. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    Science.gov (United States)

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models-conditioned on pose-for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.
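
    As a rough illustration of the many-small-samples regime (and not the authors' SAR models), the sketch below tests each of many small samples against a fully specified Gaussian with a Kolmogorov-Smirnov test and reports the fraction failing at a chosen significance level; under a correct model that fraction should stay close to the level itself. Sample sizes and the distribution here are hypothetical choices.

        # Sketch: fraction of many small samples failing a KS test at level alpha.
        # Hypothetical stand-in for the paper's setup: each sample is tested
        # against a fully specified Gaussian (known mean and variance).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_samples, sample_size = 5000, 8          # many small samples
        samples = rng.normal(0.0, 1.0, size=(n_samples, sample_size))

        alpha = 0.05
        pvals = np.array([stats.kstest(s, "norm", args=(0.0, 1.0)).pvalue
                          for s in samples])
        print(f"fraction failing at alpha={alpha}: {(pvals < alpha).mean():.3f}")
        # Under a correct, fully specified model this fraction is close to alpha.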

  16. Global Solar Dynamo Models: Simulations and Predictions

    Indian Academy of Sciences (India)

    Mausumi Dikpati; Peter A. Gilman

    2008-03-01

    Flux-transport type solar dynamos have achieved considerable success in correctly simulating many solar cycle features, and are now being used for prediction of solar cycle timing and amplitude. We first define flux-transport dynamos and demonstrate how they work. The essential added ingredient in this class of models is meridional circulation, which governs the dynamo period and also plays a crucial role in determining the Sun’s memory about its past magnetic fields. We show that flux-transport dynamo models can explain many key features of solar cycles. Then we show that a predictive tool can be built from this class of dynamo that can be used to predict mean solar cycle features by assimilating magnetic field data from previous cycles.

  17. Design and Simulation of Toroidal Twister Model

    Institute of Scientific and Technical Information of China (English)

    TIAN Huifang; LIN Xizhen; ZENG Qinqin

    2006-01-01

    Filament-wound toroidal composite vessels are a new type of structural pressure vessel: they offer the high structural efficiency and safety of composite pressure vessels, while their special shape makes use of toroidal space, so the application prospects of fiber-wound toroidal composite vessels are extremely broad. After introducing the parameter setup of the toroidal vessel and elaborating the principle of filament winding for toroidal vessels, the design model of a filament winding machine for toroidal vessels is introduced, and the design model is dynamically simulated with the ADAMS software, providing a reference for the design of a real toroidal vessel twister.

  18. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  19. Modeling and visual simulation of Microalgae photobioreactor

    Science.gov (United States)

    Zhao, Ming; Hou, Dapeng; Hu, Dawei

    Microalgae are nutritious autotrophic organisms with high photosynthetic efficiency, widely distributed on land and in the sea. They are used extensively in medicine, food, aerospace, biotechnology, environmental protection and other fields. Photobioreactors are the principal equipment for cultivating microalgae at large scale and high density. In this paper, based on a mathematical model of microalgae grown under different light intensities, a three-dimensional visualization model was built and implemented in 3ds max, Virtools and other three-dimensional software. Because microalgae are photosynthetic organisms that efficiently produce oxygen and absorb carbon dioxide, the goal of the visual simulation is to display these changes and their effect on oxygen and carbon dioxide intuitively. Different temperatures and light intensities were selected to control the photobioreactor, and the dynamic changes of microalgal biomass, oxygen and carbon dioxide were observed, with the aim of providing visualization support for microalgal and photobioreactor research.

  20. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman

    2000-01-01

    Full Text Available A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount that is observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between the actual and simulated seasonal, annual and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in the rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
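
    A minimal sketch of such a first-order Markov chain rainfall generator is given below: wet/dry occurrence follows the chain, and wet-day amounts are drawn from an exponential distribution. The transition probabilities and mean wet-day amount are hypothetical placeholders, not the parameters fitted for the Barind Tract.

        # Minimal first-order Markov chain rainfall simulator (illustrative).
        # p01 = P(wet | previous day dry), p11 = P(wet | previous day wet);
        # values below are hypothetical, not fitted Barind Tract parameters.
        import numpy as np

        rng = np.random.default_rng(42)
        p01, p11 = 0.25, 0.60
        mean_wet_amount = 12.0              # mm per wet day, hypothetical

        def simulate_year(n_days=365):
            wet, rain = False, np.zeros(n_days)
            for d in range(n_days):
                wet = rng.random() < (p11 if wet else p01)
                if wet:
                    rain[d] = rng.exponential(mean_wet_amount)
            return rain

        annual = [simulate_year().sum() for _ in range(100)]
        print(f"simulated annual rainfall: mean={np.mean(annual):.0f} mm, "
              f"sd={np.std(annual):.0f} mm")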

  1. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
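
    To make the contrast concrete, the toy loop below simulates one way iteration-level cost can accumulate in a spiral process, with each cycle absorbing a random amount of requirement churn discovered during evaluation. This is a hedged illustration only; the actual PATT-based model, its inputs and its cost drivers are far richer, and every number below is hypothetical.

        # Toy spiral-process cost simulation (not the PATT/IEEE 12207 model):
        # each iteration implements a slice of requirements and may absorb
        # requirement changes discovered during evaluation.
        import random

        random.seed(1)
        remaining = 100.0          # requirement units, hypothetical
        effort_per_unit = 1.0      # person-weeks per unit, hypothetical
        cost, iteration = 0.0, 0
        while remaining > 1e-6:
            iteration += 1
            slice_done = min(remaining, 25.0)           # capacity per iteration
            cost += 5.0 + effort_per_unit * slice_done  # fixed overhead + work
            remaining -= slice_done
            churn = random.uniform(0.0, 0.15) * slice_done  # new/changed reqs
            remaining += churn
        print(f"{iteration} iterations, total cost {cost:.1f} person-weeks")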

  2. Toy Models for Galaxy Formation versus Simulations

    CERN Document Server

    Dekel, A; Tweed, D; Cacciato, M; Ceverino, D; Primack, J R

    2013-01-01

    We describe simple useful toy models for key processes of galaxy formation in its most active phase, at z > 1, and test the approximate expressions against the typical behaviour in a suite of high-resolution hydro-cosmological simulations of massive galaxies at z = 4-1. We address in particular the evolution of (a) the total mass inflow rate from the cosmic web into galactic haloes based on the EPS approximation, (b) the penetration of baryonic streams into the inner galaxy, (c) the disc size, (d) the implied steady-state gas content and star-formation rate (SFR) in the galaxy subject to mass conservation and a universal star-formation law, (e) the inflow rate within the disc to a central bulge and black hole as derived using energy conservation and self-regulated Q ~ 1 violent disc instability (VDI), and (f) the implied steady state in the disc and bulge. The toy models provide useful approximations for the behaviour of the simulated galaxies. We find that (a) the inflow rate is proportional to mass and to (...

  3. Modelling and simulation of multitechnological machine systems

    Energy Technology Data Exchange (ETDEWEB)

    Holopainen, T. (ed.) [VTT Manufacturing Technology, Espoo (Finland)

    2001-07-01

    The Smart Machines and Systems 2010 (SMART) technology programme 1997-2000 aimed at supporting the machine and electromechanical industries in incorporating modern technology into their products and processes. The public research projects in this programme were planned to accumulate the latest research results and transfer them for the benefit of industrial product development. The major research topic in the SMART programme was called Modelling and Simulation of Multitechnological Mechatronic Systems. The behaviour of modern machine systems and subsystems involves many different types of physical phenomena and their mutual interactions: mechanical behaviour of structures, electromagnetic effects, hydraulics, vibrations and acoustics etc., together with associated control systems and software. The actual research was carried out in three separate projects called Modelling and Simulation of Mechatronic Machine Systems for Product Development and Condition Monitoring Purposes (MASI), Virtual Testing of Hydraulically Driven Machines (HYSI), and Control of Low Frequency Vibration of a Mobile Machine (AKSUS). This publication contains the papers presented at the final seminar of these three research projects, held on November 30th at Otaniemi, Espoo. (orig.)

  4. Simulated evaluation of an intraoperative surface modeling method for catheter ablation by a real phantom simulation experiment

    Science.gov (United States)

    Sun, Deyu; Rettmann, Maryam E.; Packer, Douglas; Robb, Richard A.; Holmes, David R.

    2015-03-01

    In this work, we propose a phantom experiment method to quantitatively evaluate an intraoperative left-atrial modeling update method. In prior work, we proposed an update procedure which updates the preoperative surface model with information from real-time tracked 2D ultrasound. Prior studies did not evaluate the reconstruction using an anthropomorphic phantom. In this approach, a silicone heart phantom (based on a high-resolution human atrial surface model reconstructed from CT images) was fabricated to serve as the simulated atria. A surface model of the left atrium of the phantom was deformed by a morphological operation, simulating the shape difference caused by organ deformation between pre-operative scanning and intra-operative guidance. During the simulated procedure, a tracked ultrasound catheter was inserted into the right atrial phantom, scanning the left atrial phantom in a manner mimicking the cardiac ablation procedure. By merging the preoperative model and the intraoperative ultrasound images, an intraoperative left atrial model was reconstructed. According to the results, the reconstruction error of the modeling method is smaller than the initial geometric difference caused by organ deformation. As the area of the left atrial phantom scanned by ultrasound increases, the reconstruction error of the intraoperative surface model decreases. The study validated the efficacy of the modeling method.

  5. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    Science.gov (United States)

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches, including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing a generalized tissue model with multiple exchanging water and macromolecular proton pools, rather than the system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of biologically relevant tissue models in large 3D objects is gained using parallelized execution on the GPU. Three simulated MRI experiments and one actual MRI experiment were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and to demonstrate the detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed a ∼200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer tissue composition and microstructure quantitatively.
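
    The core of a multi-pool exchange tissue model can be illustrated with the longitudinal Bloch-McConnell equations for two exchanging pools, integrated exactly with a matrix exponential. This is a minimal sketch under hypothetical relaxation and exchange parameters, not MRiLab's GPU implementation.

        # Longitudinal two-pool Bloch-McConnell exchange (illustrative only;
        # parameters are hypothetical, not a validated tissue model).
        # dMa/dt = (M0a - Ma)/T1a - kab*Ma + kba*Mb, and symmetrically for Mb.
        import numpy as np
        from scipy.linalg import expm

        T1a, T1b = 1.0, 0.3          # s, free-water / macromolecular pool
        M0a, M0b = 0.9, 0.1          # equilibrium magnetizations
        kab = 2.0                    # exchange rate a -> b (1/s)
        kba = kab * M0a / M0b        # detailed balance at equilibrium

        # Augmented affine system: d[Ma, Mb, 1]/dt = A @ [Ma, Mb, 1]
        A = np.array([[-1/T1a - kab,  kba,          M0a/T1a],
                      [ kab,         -1/T1b - kba,  M0b/T1b],
                      [ 0.0,          0.0,          0.0    ]])

        M = np.array([0.0, 0.0, 1.0])    # both pools saturated at t = 0
        for t in (0.1, 0.5, 2.0):
            Ma, Mb, _ = expm(A * t) @ M
            print(f"t={t:4.1f}s  Ma={Ma:.3f}  Mb={Mb:.3f}")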

  6. A Quantitative Causal Model Theory of Conditional Reasoning

    Science.gov (United States)

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  7. Deficiencies in quantitative precipitation forecasts. Sensitivity studies using the COSMO model

    Energy Technology Data Exchange (ETDEWEB)

    Dierer, Silke [Federal Office of Meteorology and Climatology, MeteoSwiss, Zurich (Switzerland); Meteotest, Bern (Switzerland); Arpagaus, Marco [Federal Office of Meteorology and Climatology, MeteoSwiss, Zurich (Switzerland); Seifert, Axel [Deutscher Wetterdienst, Offenbach (Germany); Avgoustoglou, Euripides [Hellenic National Meteorological Service, Hellinikon (Greece); Dumitrache, Rodica [National Meteorological Administration, Bucharest (Romania); Grazzini, Federico [Agenzia Regionale per la Protezione Ambientale Emilia Romagna, Bologna (Italy); Mercogliano, Paola [Italian Aerospace Research Center, Capua (Italy); Milelli, Massimo [Agenzia Regionale per la Protezione Ambientale Piemonte, Torino (Italy); Starosta, Katarzyna [Inst. of Meteorology and Water Management, Warsaw (Poland)

    2009-12-15

    The quantitative precipitation forecast (QPF) of the COSMO model, like that of other models, reveals some deficiencies. The aim of this study is to investigate which physical and numerical schemes have the strongest impact on QPF and, thus, have the highest potential for improving QPF. Test cases are selected that are meant to reflect typical forecast errors in different countries. The 13 test cases fall into two main groups: overestimation of stratiform precipitation (6 cases) and underestimation of convective precipitation (5 cases). 22 sensitivity experiments, predominantly regarding numerical and physical schemes, are performed. The area-averaged 24 h precipitation sums are evaluated. The results show that the strongest impact on QPF is caused by changes of the initial atmospheric humidity and by using the Kain-Fritsch/Bechtold convection scheme instead of the Tiedtke scheme. Both sensitivity experiments change the area-averaged precipitation in the range of 30-35%. This clearly shows that improved simulation of atmospheric water vapour is of utmost importance to achieve better precipitation forecasts. Significant changes are also caused by using the Runge-Kutta time integration scheme instead of the Leapfrog scheme, and by applying a modified warm rain and snow physics scheme or a modified Tiedtke convection scheme. The aforementioned changes result in differences of area-averaged precipitation of roughly 20%. Only for the Greek test cases, which all have a strong influence from the sea, is the heat and moisture exchange between surface and atmosphere of great importance; it can cause changes of up to 20%. (orig.)

  8. Physiologically Based Pharmacokinetic Modeling Framework for Quantitative Prediction of an Herb–Drug Interaction

    Science.gov (United States)

    Brantley, S J; Gufford, B T; Dua, R; Fediuk, D J; Graf, T N; Scarlett, Y V; Frederick, K S; Fisher, M B; Oberlies, N H; Paine, M F

    2014-01-01

    Herb–drug interaction predictions remain challenging. Physiologically based pharmacokinetic (PBPK) modeling was used to improve prediction accuracy of potential herb–drug interactions using the semipurified milk thistle preparation, silibinin, as an exemplar herbal product. Interactions between silibinin constituents and the probe substrates warfarin (CYP2C9) and midazolam (CYP3A) were simulated. A low silibinin dose (160 mg/day × 14 days) was predicted to increase midazolam area under the curve (AUC) by 1%, which was corroborated with external data; a higher dose (1,650 mg/day × 7 days) was predicted to increase midazolam and (S)-warfarin AUC by 5% and 4%, respectively. A proof-of-concept clinical study confirmed minimal interaction between high-dose silibinin and both midazolam and (S)-warfarin (9 and 13% increase in AUC, respectively). Unexpectedly, (R)-warfarin AUC decreased (by 15%), but this is unlikely to be clinically important. Application of this PBPK modeling framework to other herb–drug interactions could facilitate development of guidelines for quantitative prediction of clinically relevant interactions. PMID:24670388

  9. Towards the quantitative evaluation of visual attention models.

    Science.gov (United States)

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations.

  10. Assessment of Quantitative Precipitation Forecasts from Operational NWP Models (Invited)

    Science.gov (United States)

    Sapiano, M. R.

    2010-12-01

    Previous work has shown that satellite and numerical model estimates of precipitation have complementary strengths, with satellites having greater skill at detecting convective precipitation events and model estimates having greater skill at detecting stratiform precipitation. This is due in part to the challenges associated with retrieving stratiform precipitation from satellites and the difficulty in resolving sub-grid scale processes in models. These complementary strengths can be exploited to obtain new merged satellite/model datasets, and several such datasets have been constructed using reanalysis data. Whilst reanalysis data are stable in a climate sense, they also have relatively coarse resolution compared to the satellite estimates (many of which are now commonly available at quarter degree resolution) and they necessarily use fixed forecast systems that are not state-of-the-art. An alternative to reanalysis data is to use Operational Numerical Weather Prediction (NWP) model estimates, which routinely produce precipitation with higher resolution and using the most modern techniques. Such estimates have not been combined with satellite precipitation and their relative skill has not been sufficiently assessed beyond model validation. The aim of this work is to assess the information content of the models relative to satellite estimates with the goal of improving techniques for merging these data types. To that end, several operational NWP precipitation forecasts have been compared to satellite and in situ data and their relative skill in forecasting precipitation has been assessed. In particular, the relationship between precipitation forecast skill and other model variables will be explored to see if these other model variables can be used to estimate the skill of the model at a particular time. Such relationships would provide a basis for determining weights and errors of any merged products.
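
    The abstract does not list the verification metrics used, but forecast skill against in situ observations is typically summarized from a 2x2 contingency table; the sketch below computes three standard categorical scores (POD, FAR, ETS) on synthetic forecast/observation pairs, as one hypothetical way such an assessment could be carried out.

        # Sketch: categorical verification of a precipitation forecast from a
        # 2x2 contingency table; POD, FAR and ETS are standard choices.
        import numpy as np

        rng = np.random.default_rng(7)
        obs = rng.random(10000) < 0.2                  # synthetic rain events
        fcst = obs ^ (rng.random(10000) < 0.15)        # forecast with errors

        hits = np.sum(fcst & obs)
        misses = np.sum(~fcst & obs)
        false_alarms = np.sum(fcst & ~obs)
        hits_random = (hits + misses) * (hits + false_alarms) / obs.size

        pod = hits / (hits + misses)                   # probability of detection
        far = false_alarms / (hits + false_alarms)     # false alarm ratio
        ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
        print(f"POD={pod:.2f}  FAR={far:.2f}  ETS={ets:.2f}")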

  11. Quantitative Methods for Comparing Different Polyline Stream Network Models

    Energy Technology Data Exchange (ETDEWEB)

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
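
    The sinuosity half of the comparison reduces to a short computation: path length divided by the straight-line distance between endpoints, with the relative value obtained against the reference network. A minimal sketch with hypothetical coordinates:

        # Sketch of the relative-sinuosity idea: sinuosity of a polyline is
        # its path length over the chord between its endpoints; the ratio of
        # derived to reference sinuosity gives the relative value.
        import numpy as np

        def sinuosity(points):
            pts = np.asarray(points, dtype=float)
            path = np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
            chord = np.linalg.norm(pts[-1] - pts[0])
            return path / chord

        derived   = [(0, 0), (1, 0.4), (2, -0.3), (3, 0)]    # hypothetical
        reference = [(0, 0), (1, 0.6), (1.5, -0.5), (2.2, 0.4), (3, 0)]
        print(f"relative sinuosity: {sinuosity(derived)/sinuosity(reference):.3f}")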

  12. Modeling and Simulation of Relocation of a Production in SIMPRO-Q Web Based Educational Environment

    Directory of Open Access Journals (Sweden)

    Lubomir Lengyel

    2012-02-01

    Full Text Available The aim of this paper is to show how new knowledge and skills can be gained by solving production relocation situations in a continuously changing global environment. The methods used model and simulate the related risks in the web-based learning environment of the Quality Management Role Play Simulation (SIMPRO-Q). The presented methods are also applicable in engineering education. During simulations, the role-players make both quantitative and qualitative decisions regarding the management of critical situations during production relocation. Experiences from a production relocation project in industry are discussed at the end of the paper.

  13. Towards Modelling and Simulation of Crowded Environments in Cell Biology

    Science.gov (United States)

    Bittig, Arne T.; Jeschke, Matthias; Uhrmacher, Adelinde M.

    2010-09-01

    In modelling and simulation of cell biological processes, spatial homogeneity in the distribution of components is a common but not always valid assumption. Spatial simulation methods differ in computational effort and accuracy, and usually rely on tool-specific input formats for model specification. A clear separation between modelling and simulation allows a declarative model specification thereby facilitating reuse of models and exploiting different simulators. We outline a modelling formalism covering both stochastic spatial simulation at the population level and simulation of individual entities moving in continuous space as well as the combination thereof. A multi-level spatial simulator is presented that combines populations of small particles simulated according to the Next Subvolume Method with individually represented large particles following Brownian motion. This approach entails several challenges that need to be overcome, but nicely balances between calculation effort and required levels of detail.
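
    The individually represented half of such a multi-level scheme can be sketched as isotropic Brownian steps in continuous space, with per-coordinate step standard deviation sqrt(2*D*dt); the population half (the Next Subvolume Method) is omitted here, and all parameters are hypothetical.

        # Sketch: large particles following Brownian motion in 3D, the
        # individual-based component of such a hybrid spatial simulator.
        import numpy as np

        rng = np.random.default_rng(3)
        D = 1.0e-12              # diffusion coefficient, m^2/s (hypothetical)
        dt = 1.0e-3              # time step, s
        pos = np.zeros((50, 3))  # 50 particles starting at the origin
        for _ in range(1000):    # simulate 1 s in total
            pos += rng.normal(0.0, np.sqrt(2 * D * dt), size=pos.shape)

        rms = np.sqrt((np.linalg.norm(pos, axis=1) ** 2).mean())
        print(f"RMS displacement after 1 s: {rms:.2e} m "
              f"(theory sqrt(6*D*t): {np.sqrt(6 * D * 1.0):.2e} m)")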

  14. Probabilistic Quantitative Precipitation Forecasting Using Ensemble Model Output Statistics

    CERN Document Server

    Scheuerer, Michael

    2013-01-01

    Statistical post-processing of dynamical forecast ensembles is an essential component of weather forecasting. In this article, we present a post-processing method that generates full predictive probability distributions for precipitation accumulations based on ensemble model output statistics (EMOS). We model precipitation amounts by a generalized extreme value distribution that is left-censored at zero. This distribution permits modelling precipitation on the original scale without prior transformation of the data. A closed form expression for its continuous rank probability score can be derived and permits computationally efficient model fitting. We discuss an extension of our approach that incorporates further statistics characterizing the spatial variability of precipitation amounts in the vicinity of the location of interest. The proposed EMOS method is applied to daily 18-h forecasts of 6-h accumulated precipitation over Germany in 2011 using the COSMO-DE ensemble prediction system operated by the Germa...
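
    The predictive form described here, a generalized extreme value distribution left-censored at zero, assigns all probability mass at or below zero to "no precipitation". The sketch below evaluates such a censored predictive distribution with hand-picked parameters; in the actual EMOS method these would be linked to ensemble statistics and fitted by minimizing the CRPS.

        # Sketch of a censored-GEV predictive distribution for precipitation:
        # P(no rain) = F(0); positive amounts follow the GEV above zero.
        # Parameters are illustrative, not EMOS-fitted values.
        from scipy.stats import genextreme

        shape, loc, scale = -0.2, 1.5, 2.0     # scipy's c = -xi convention
        dist = genextreme(shape, loc=loc, scale=scale)

        p_dry = dist.cdf(0.0)                  # mass assigned to "no rain"
        print(f"P(zero precipitation) = {p_dry:.3f}")
        for q in (0.5, 0.9, 0.99):
            # quantiles of the censored distribution: zero below the dry mass
            amount = 0.0 if q <= p_dry else dist.ppf(q)
            print(f"{q:.2f}-quantile: {amount:.2f} mm")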

  15. Quantitative modeling of degree-degree correlation in complex networks

    CERN Document Server

    Niño, Alfonso

    2013-01-01

    This paper presents an approach to the modeling of degree-degree correlation in complex networks. Thus, a simple function, Δ(k', k), describing specific degree-to-degree correlations is considered. The function is well suited to graphically depict assortative and disassortative variations within networks. To quantify degree correlation variations, the joint probability distribution between nodes with arbitrary degrees, P(k', k), is used. Introduction of the end-degree probability function as a basic variable allows using group theory to derive mathematical models for P(k', k). In this form, an expression, representing a family of seven models, is constructed with the needed normalization conditions. Applied to Δ(k', k), this expression predicts a nonuniform distribution of degree correlation in networks, organized in two assortative and two disassortative zones. This structure is actually observed in a set of four modeled, technological, social, and biological networks. A regression study performed...
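
    The empirical counterpart of P(k', k) is readily computed; for instance, networkx exposes both the normalized degree mixing matrix and the scalar assortativity coefficient. A small sketch on a synthetic graph (any graph of interest could be substituted):

        # Sketch: empirical degree-degree correlation, the quantity the
        # paper's P(k', k) models analytically.
        import networkx as nx

        G = nx.barabasi_albert_graph(2000, 3, seed=1)
        r = nx.degree_assortativity_coefficient(G)
        print(f"assortativity coefficient r = {r:.3f}")  # r < 0: disassortative

        M = nx.degree_mixing_matrix(G)  # normalized joint degree distribution
        print("mixing matrix shape:", M.shape)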

  16. Quantitative modeling of selective lysosomal targeting for drug design

    DEFF Research Database (Denmark)

    Trapp, Stefan; Rosania, G.; Horobin, R.W.;

    2008-01-01

    Lysosomes are acidic organelles and are involved in various diseases, the most prominent being malaria. Accumulation of molecules in the cell by diffusion from the external solution into cytosol, lysosome and mitochondrion was calculated with the Fick–Nernst–Planck equation. The cell model considers... This demonstrates that the cell model can be a useful tool for the design of effective lysosome-targeting drugs with minimal off-target interactions.
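
    A back-of-the-envelope version of the ion-trapping effect that the full Fick–Nernst–Planck cell model computes: for a monoprotic weak base whose neutral form alone crosses membranes, neutral concentrations equalize across compartments, so the total-concentration ratio follows the Henderson–Hasselbalch ionization ratio. The pH values and pKa grid below are illustrative assumptions, not the paper's parameterization.

        # Simplified ion-trapping estimate for a monoprotic weak base:
        # only the neutral species permeates, so total/neutral = 1 + 10^(pKa-pH)
        # sets the lysosome/cytosol accumulation ratio. Illustrative only.
        def accumulation_ratio(pKa, pH_organelle, pH_cytosol=7.2):
            ionized = lambda pH: 1.0 + 10.0 ** (pKa - pH)   # total/neutral
            return ionized(pH_organelle) / ionized(pH_cytosol)

        for pKa in (6.0, 8.0, 10.0):
            r = accumulation_ratio(pKa, pH_organelle=4.8)   # lysosomal pH ~4.8
            print(f"pKa={pKa:4.1f}: lysosome/cytosol ratio ~ {r:,.0f}")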

  17. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, is not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
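
    One of the techniques mentioned, optimal scaling, has a simple closed form when model output and data differ only by an unknown multiplicative unit: the least-squares scale factor aligns the prediction with the measurement before the fitness residual is computed. A minimal sketch with made-up numbers:

        # Sketch of optimal scaling for relative (arbitrary-unit) data:
        # s* = argmin_s ||s*model - data||^2 = (model . data) / (model . model)
        import numpy as np

        model = np.array([0.2, 0.9, 2.1, 3.8, 5.0])     # model prediction (a.u.)
        data  = np.array([1.1, 4.8, 10.5, 19.0, 26.2])  # measurement, other units

        s = model @ data / (model @ model)              # closed-form scale factor
        residual = np.linalg.norm(s * model - data)
        print(f"optimal scale s = {s:.2f}, residual = {residual:.2f}")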

  18. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics, the study conducts a statistical comparison to evaluate the explanatory power of the models.

  20. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...