WorldWideScience

Sample records for mechanistic simulation model

  1. Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation: Report of an FDA Public Workshop.

    Science.gov (United States)

    Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R

    2017-08-01

    On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  2. Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model

    Science.gov (United States)

    Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten

    2016-04-01

    Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK, and have been often attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.

  3. Numerical simulation in steam injection process by a mechanistic approach

    Energy Technology Data Exchange (ETDEWEB)

    De Souza, J.C.Jr.; Campos, W.; Lopes, D.; Moura, L.S.S. [Petrobras, Rio de Janeiro (Brazil)

    2008-10-15

    Steam injection is a common thermal recovery method used in very viscous oil reservoirs. The method involves the injection of heat to reduce viscosity and mobilize oil. A steam generation and injection system consists primarily of a steam source, distribution lines, injection wells and a discarding tank. In order to optimize injection and improve the oil recovery factor, one must determine the parameters of steam flow such as pressure, temperature and steam quality. This study focused on developing a unified mathematical model by means of a mechanistic approach for two-phase steam flow in pipelines and wells. The hydrodynamic and heat transfer mechanistic model was implemented in a computer simulator to model the parameters of steam injection while trying to avoid the use of empirical correlations. A marching algorithm was used to determine the distribution of pressure and temperature along the pipelines and wellbores. The mathematical model for steam flow in injection systems, developed by a mechanistic approach (VapMec) performed well when the simulated values of pressures and temperatures were compared with the values measured during field tests. The newly developed VapMec model was incorporated in the LinVap-3 simulator that constitutes an engineering supporting tool for steam injection wells operated by Petrobras. 23 refs., 7 tabs., 6 figs.

  4. Requirements on mechanistic NPP models used in CSS for diagnostics and predictions

    International Nuclear Information System (INIS)

    Juslin, K.

    1996-01-01

    Mechanistic models have for several years with good experience been used for operators' support in electric power dispatching centres. Some models of limited scope have already been in use at nuclear power plants. It is considered that also advanced mechanistic models in combination with present computer technology with preference could be used in Computerized Support Systems (CSS) for the assistance of Nuclear Power Plant (NPP) operators. Requirements with respect to accuracy, validity range, speed flexibility and level of detail on the models used for such purposes are discussed. Quality Assurance, Verification and Validation efforts are considered. A long term commitment in the field of mechanistic modelling and real time simulation is considered as the key to successful implementations. The Advanced PROcess Simulation (APROS) code system and simulation environment developed at the Technical Research Centre of Finland (VTT) is intended also for CSS applications in NPP control rooms. (author). 4 refs

  5. Simulating soil C stability with mechanistic systems models: a multisite comparison of measured fractions and modelled pools

    Science.gov (United States)

    Robertson, Andy; Schipanski, Meagan; Sherrod, Lucretia; Ma, Liwang; Ahuja, Lajpat; McNamara, Niall; Smith, Pete; Davies, Christian

    2016-04-01

    Agriculture, covering more than 30% of global land area, has an exciting opportunity to help combat climate change by effectively managing its soil to promote increased C sequestration. Further, newly sequestered soil carbon (C) through agriculture needs to be stored in more stable forms in order to have a lasting impact on reducing atmospheric CO2 concentrations. While land uses in different climates and soils require different management strategies, the fundamental mechanisms that regulate C sequestration and stabilisation remain the same. These mechanisms are used by a number of different systems models to simulate C dynamics, and thus assess the impacts of change in management or climate. To evaluate the accuracy of these model simulations, our research uses a multidirectional approach to compare C stocks of physicochemical soil fractions collected at two long-term agricultural sites. Carbon stocks for a number of soil fractions were measured at two sites (Lincoln, UK; Colorado, USA) over 8 and 12 years, respectively. Both sites represent managed agricultural land but have notably different climates and levels of disturbance. The measured soil fractions act as proxies for varying degrees of stability, with C contained within these fractions relatable to the C simulated within the soil pools of mechanistic systems models1. Using stable isotope techniques at the UK site, specific turnover times of C within the different fractions were determined and compared with those simulated in the pools of 3 different models of varying complexity (RothC, DayCent and RZWQM2). Further, C dynamics and N-mineralisation rates of the measured fractions at the US site were assessed and compared to results of the same three models. The UK site saw a significant increase in C stocks within the most stable fractions, with topsoil (0-30cm) sequestration rates of just over 0.3 tC ha-1 yr-1 after only 8 years. Further, the sum of all fractions reported C sequestration rates of nearly 1

  6. Development and Implementation of Mechanistic Terry Turbine Models in RELAP-7 to Simulate RCIC Normal Operation Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Haihua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Hongbin [Idaho National Lab. (INL), Idaho Falls, ID (United States); O' Brien, James Edward [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    As part of the efforts to understand the unexpected “self-regulating” mode of the RCIC (Reactor Core Isolation Cooling) systems in Fukushima accidents and extend BWR RCIC and PWR AFW (Auxiliary Feed Water) operational range and flexibility, mechanistic models for the Terry turbine, based on Sandia’s original work [1], have been developed and implemented in the RELAP-7 code to simulate the RCIC system. In 2016, our effort has been focused on normal working conditions of the RCIC system. More complex off-design conditions will be pursued in later years when more data are available. In the Sandia model, the turbine stator inlet velocity is provided according to a reduced-order model which was obtained from a large number of CFD (computational fluid dynamics) simulations. In this work, we propose an alternative method, using an under-expanded jet model to obtain the velocity and thermodynamic conditions for the turbine stator inlet. The models include both an adiabatic expansion process inside the nozzle and a free expansion process outside of the nozzle to ambient pressure. The combined models are able to predict the steam mass flow rate and supersonic velocity to the Terry turbine bucket entrance, which are the necessary input information for the Terry turbine rotor model. The analytical models for the nozzle were validated with experimental data and benchmarked with CFD simulations. The analytical models generally agree well with the experimental data and CFD simulations. The analytical models are suitable for implementation into a reactor system analysis code or severe accident code as part of mechanistic and dynamical models to understand the RCIC behaviors. The newly developed nozzle models and modified turbine rotor model according to the Sandia’s original work have been implemented into RELAP-7, along with the original Sandia Terry turbine model. A new pump model has also been developed and implemented to couple with the Terry turbine model. An input

  7. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  8. A mechanistic diagnosis of the simulation of soil CO2 efflux of the ACME Land Model

    Science.gov (United States)

    Liang, J.; Ricciuto, D. M.; Wang, G.; Gu, L.; Hanson, P. J.; Mayes, M. A.

    2017-12-01

    Accurate simulation of the CO2 efflux from soils (i.e., soil respiration) to the atmosphere is critical to project global biogeochemical cycles and the magnitude of climate change in Earth system models (ESMs). Currently, the simulated soil respiration by ESMs still have a large uncertainty. In this study, a mechanistic diagnosis of soil respiration in the Accelerated Climate Model for Energy (ACME) Land Model (ALM) was conducted using long-term observations at the Missouri Ozark AmeriFlux (MOFLUX) forest site in the central U.S. The results showed that the ALM default run significantly underestimated annual soil respiration and gross primary production (GPP), while incorrectly estimating soil water potential. Improved simulations of soil water potential with site-specific data significantly improved the modeled annual soil respiration, primarily because annual GPP was simultaneously improved. Therefore, accurate simulations of soil water potential must be carefully calibrated in ESMs. Despite improved annual soil respiration, the ALM continued to underestimate soil respiration during peak growing seasons, and to overestimate soil respiration during non-peak growing seasons. Simulations involving increased GPP during peak growing seasons increased soil respiration, while neither improved plant phenology nor increased temperature sensitivity affected the simulation of soil respiration during non-peak growing seasons. One potential reason for the overestimation of the soil respiration during non-peak growing seasons may be that the current model structure is substrate-limited, while microbial dormancy under stress may cause the system to become decomposer-limited. Further studies with more microbial data are required to provide adequate representation of soil respiration and to understand the underlying reasons for inaccurate model simulations.

  9. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    Science.gov (United States)

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird and highly-mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.

  10. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    Science.gov (United States)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.

  11. Development of a mechanistically based computer simulation of nitrogen oxide absorption in packed towers

    International Nuclear Information System (INIS)

    Counce, R.M.

    1981-01-01

    A computer simulation for nitrogen oxide (NO/sub x/) scrubbing in packed towers was developed for use in process design and process control. This simulation implements a mechanistically based mathematical model, which was formulated from (1) an exhaustive literature review; (2) previous NO/sub x/ scrubbing experience with sieve-plate towers; and (3) comparisons of sequential sets of experiments. Nitrogen oxide scrubbing is characterized by simultaneous absorption and desorption phenomena: the model development is based on experiments designed to feature these two phenomena. The model was then successfully tested in experiments designed to put it in jeopardy

  12. In Vitro–In Vivo Correlation for Gliclazide Immediate-Release Tablets Based on Mechanistic Absorption Simulation

    OpenAIRE

    Grbic, Sandra; Parojcic, Jelena; Ibric, Svetlana; Djuric, Zorica

    2010-01-01

    The aim of this study was to develop a drug-specific absorption model for gliclazide (GLK) using mechanistic gastrointestinal simulation technology (GIST) implemented in GastroPlusTM software package. A range of experimentally determined, in silico predicted or literature data were used as input parameters. Experimentally determined pH-solubility profile was used for all simulations. The human jejunum effective permeability (Peff) value was estimated on the basis of in vitro measured Caco-2 p...

  13. Fuel swelling importance in PCI mechanistic modelling

    International Nuclear Information System (INIS)

    Arimescu, V.I.

    2005-01-01

    Under certain conditions, fuel pellet swelling is the most important factor in determining the intensity of the pellet-to-cladding mechanical interaction (PCMI). This is especially true during power ramps, which lead to a temperature increase to a higher terminal plateau that is maintained for hours. The time-dependent gaseous swelling is proportional to temperature and is also enhanced by the increased gas atom migration to the grain boundary during the power ramp. On the other hand, gaseous swelling is inhibited by a compressive hydrostatic stress in the pellet. Therefore, PCMI is the net result of combining gaseous swelling and pellet thermal expansion with the opposing feedback from the cladding mechanical reaction. The coupling of the thermal and mechanical processes, mentioned above, with various feedback loops is best simulated by a mechanistic fuel code. This paper discusses a mechanistic swelling model that is coupled with a fission gas release model as well as a mechanical model of the fuel pellet. The role of fuel swelling is demonstrated for typical power ramps at different burn-ups. Also, fuel swelling plays a significant role in avoiding the thermal instability for larger gap fuel rods, by limiting the potentially exponentially increasing gap due to the positive feedback loop effect of increasing fission gas release and the associated over-pressure inside the cladding. (author)

  14. Testing mechanistic models of growth in insects.

    Science.gov (United States)

    Maino, James L; Kearney, Michael R

    2015-11-22

    Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg(-1)) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).

  15. FOAM3D: A numerical simulator for mechanistic prediciton of foam displacement in multidimensions

    Energy Technology Data Exchange (ETDEWEB)

    Kovscek, A.R.; Patzek, T.W. [Lawrence Berkeley Laboratory, Berkeley, CA (United States); Radke, C.J. [Univ. of California, Berkeley, CA (United States)

    1995-03-01

    Field application of foam is a technically viable enhanced oil recovery process (EOR) as demonstrated by recent steam-foam field studies. Traditional gas-displacement processes, such as steam drive, are improved substantially by controlling gas mobility and thereby improving volumetric displacement efficiency. For instance, Patzek and Koinis showed major oil-recovery response after about two years of foam injection in two different pilot studies at the Kern River field. They report increased production of 5.5 to 14% of the original oil in place over a five year period. Because reservoir-scale simulation is a vital component of the engineering and economic evaluation of any EOR project, efficient application of foam as a displacement fluid requires a predictive numerical model of foam displacement. A mechanistic model would also expedite scale-up of the process from the laboratory to the field scale. No general, mechanistic, field-scale model for foam displacement is currently in use.

  16. Comparative approaches from empirical to mechanistic simulation modelling in Land Evaluation studies

    Science.gov (United States)

    Manna, P.; Basile, A.; Bonfante, A.; Terribile, F.

    2009-04-01

    The Land Evaluation (LE) comprise the evaluation procedures to asses the attitudes of the land to a generic or specific use (e.g. biomass production). From local to regional and national scale the approach to the land use planning should requires a deep knowledge of the processes that drive the functioning of the soil-plant-atmosphere system. According to the classical approaches the assessment of attitudes is the result of a qualitative comparison between the land/soil physical properties and the land use requirements. These approaches have a quick and inexpensive applicability; however, they are based on empirical and qualitative models with a basic knowledge structure specifically built for a specific landscape and for the specific object of the evaluation (e.g. crop). The outcome from this situation is the huge difficulties in the spatial extrapolation of the LE results and the rigidity of the system. Modern techniques instead, rely on the application of mechanistic and quantitative simulation modelling that allow a dynamic characterisation of the interrelated physical and chemical processes taking place in the soil landscape. Moreover, the insertion of physical based rules in the LE procedure may make it less difficult in terms of both extending spatially the results and changing the object (e.g. crop species, nitrate dynamics, etc.) of the evaluation. On the other side these modern approaches require high quality and quantity of input data that cause a significant increase in costs. In this scenario nowadays the LE expert is asked to choose the best LE methodology considering costs, complexity of the procedure and benefits in handling a specific land evaluation. In this work we performed a forage maize land suitability study by comparing 9 different methods having increasing complexity and costs. The study area, of about 2000 ha, is located in North Italy in the Lodi plain (Po valley). The range of the 9 employed methods ranged from standard LE approaches to

  17. Inferring the Impact of Regulatory Mechanisms that Underpin CD8+ T Cell Control of B16 Tumor Growth In vivo Using Mechanistic Models and Simulation.

    Science.gov (United States)

    Klinke, David J; Wang, Qing

    2016-01-01

    A major barrier for broadening the efficacy of immunotherapies for cancer is identifying key mechanisms that limit the efficacy of tumor infiltrating lymphocytes. Yet, identifying these mechanisms using human samples and mouse models for cancer remains a challenge. While interactions between cancer and the immune system are dynamic and non-linear, identifying the relative roles that biological components play in regulating anti-tumor immunity commonly relies on human intuition alone, which can be limited by cognitive biases. To assist natural intuition, modeling and simulation play an emerging role in identifying therapeutic mechanisms. To illustrate the approach, we developed a multi-scale mechanistic model to describe the control of tumor growth by a primary response of CD8+ T cells against defined tumor antigens using the B16 C57Bl/6 mouse model for malignant melanoma. The mechanistic model was calibrated to data obtained following adenovirus-based immunization and validated to data obtained following adoptive transfer of transgenic CD8+ T cells. More importantly, we use simulation to test whether the postulated network topology, that is the modeled biological components and their associated interactions, is sufficient to capture the observed anti-tumor immune response. Given the available data, the simulation results also provided a statistical basis for quantifying the relative importance of different mechanisms that underpin CD8+ T cell control of B16F10 growth. By identifying conditions where the postulated network topology is incomplete, we illustrate how this approach can be used as part of an iterative design-build-test cycle to expand the predictive power of the model.

  18. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).

  19. Modeling Bird Migration under Climate Change: A Mechanistic Approach

    Science.gov (United States)

    Smith, James A.

    2009-01-01

    How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios as forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individually based, biophysical, mechanistic, bird migration model to simulate the movement of shorebirds in North America as a tool to study how such factors as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but presently predation is ignored. Results/Conclusions We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs that is, stop-over frequency with latitude, that are in agreement with the literature. Mean stop-over durations predicted from our model for nominal cases also are consistent with the limited, but available data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions. However, the question remains as to whether this

  20. Appropriateness of mechanistic and non-mechanistic models for the application of ultrafiltration to mixed waste

    International Nuclear Information System (INIS)

    Foust, Henry; Ghosehajra, Malay

    2007-01-01

    This study asks two questions: (1) How appropriate is the use of a basic filtration equation to the application of ultrafiltration of mixed waste, and (2) How appropriate are non-parametric models for permeate rates (volumes)? To answer these questions, mechanistic and non-mechanistic approaches are developed for permeate rates and volumes associated with an ultrafiltration/mixed waste system in dia-filtration mode. The mechanistic approach is based on a filtration equation which states that t/V vs. V is a linear relationship. The coefficients associated with this linear regression are composed of physical/chemical parameters of the system and based the mass balance equation associated with the membrane and associated developing cake layer. For several sets of data, a high correlation is shown that supports the assertion that t/V vs. V is a linear relationship. It is also shown that non-mechanistic approaches, i.e., the use of regression models to are not appropriate. One models considered is Q(p) = a*ln(Cb)+b. Regression models are inappropriate because the scale-up from a bench scale (pilot scale) study to full-scale for permeate rates (volumes) is not simply the ratio of the two membrane surface areas. (authors)

  1. Application of mechanistic models to fermentation and biocatalysis for next-generation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Eliasson Lantz, Anna; Tufvesson, Pär

    2010-01-01

    of variables required for measurement, control and process design. In the near future, mechanistic models with a higher degree of detail will play key roles in the development of efficient next-generation fermentation and biocatalytic processes. Moreover, mechanistic models will be used increasingly......Mechanistic models are based on deterministic principles, and recently, interest in them has grown substantially. Herein we present an overview of mechanistic models and their applications in biotechnology, including future perspectives. Model utility is highlighted with respect to selection...

  2. Mechanistic species distribution modelling as a link between physiology and conservation.

    Science.gov (United States)

    Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W

    2015-01-01

    Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and

  3. A dynamic and mechanistic model of PCB bioaccumulation in the European hake ( Merluccius merluccius)

    Science.gov (United States)

    Bodiguel, Xavier; Maury, Olivier; Mellon-Duval, Capucine; Roupsard, François; Le Guellec, Anne-Marie; Loizeau, Véronique

    2009-08-01

    Bioaccumulation is difficult to document because responses differ among chemical compounds, with environmental conditions, and physiological processes characteristic of each species. We use a mechanistic model, based on the Dynamic Energy Budget (DEB) theory, to take into account this complexity and study factors impacting accumulation of organic pollutants in fish through ontogeny. The bioaccumulation model proposed is a comprehensive approach that relates evolution of hake PCB contamination to physiological information about the fish, such as diet, metabolism, reserve and reproduction status. The species studied is the European hake ( Merluccius merluccius, L. 1758). The model is applied to study the total concentration and the lipid normalised concentration of 4 PCB congeners in male and female hakes from the Gulf of Lions (NW Mediterranean sea) and the Bay of Biscay (NE Atlantic ocean). Outputs of the model compare consistently to measurements over the life span of fish. Simulation results clearly demonstrate the relative effects of food contamination, growth and reproduction on the PCB bioaccumulation in hake. The same species living in different habitats and exposed to different PCB prey concentrations exhibit marked difference in the body accumulation of PCBs. At the adult stage, female hakes have a lower PCB concentration compared to males for a given length. We successfully simulated these sex-specific PCB concentrations by considering two mechanisms: a higher energy allocation to growth for females and a transfer of PCBs from the female to its eggs when allocating lipids from reserve to eggs. Finally, by its mechanistic description of physiological processes, the model is relevant for other species and sets the stage for a mechanistic understanding of toxicity and ecological effects of organic contaminants in marine organisms.

  4. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    OpenAIRE

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; Boer, de, I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical d...

  5. Phenomenological and mechanistic modeling of melt-structure-water interactions in a light water reactor severe accident

    International Nuclear Information System (INIS)

    Bui, V.A.

    1998-01-01

    The objective of this work is to address the modeling of the thermal hydrodynamic phenomena and interactions occurring during the progression of reactor severe accidents. Integrated phenomenological models are developed to describe the accident scenarios, which consist of many processes, while mechanistic modeling, including direct numerical simulation, is carried out to describe separate effects and selected physical phenomena of particular importance

  6. Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models

    Science.gov (United States)

    Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.

    2011-01-01

    We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) Simulation and (b) lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.

  7. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.

  8. Phenomenological and mechanistic modeling of melt-structure-water interactions in a light water reactor severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Bui, V.A

    1998-10-01

    The objective of this work is to address the modeling of the thermal hydrodynamic phenomena and interactions occurring during the progression of reactor severe accidents. Integrated phenomenological models are developed to describe the accident scenarios, which consist of many processes, while mechanistic modeling, including direct numerical simulation, is carried out to describe separate effects and selected physical phenomena of particular importance 88 refs, 54 figs, 7 tabs

  9. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    Directory of Open Access Journals (Sweden)

    Rasmus Magnusson

    2017-06-01

    Full Text Available Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM, which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODE for gene regulatory networks (GRNs. LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2 specific bindings, time-series together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings. In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models

  10. NEAMS FPL M2 Milestone Report: Development of a UO₂ Grain Size Model using Multicale Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, Michael R [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, Xianming [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-06-01

    This report summarizes development work funded by the Nuclear Energy Advanced Modeling Simulation program's Fuels Product Line (FPL) to develop a mechanistic model for the average grain size in UO₂ fuel. The model is developed using a multiscale modeling and simulation approach involving atomistic simulations, as well as mesoscale simulations using INL's MARMOT code.

  11. SITE-94. Adaptation of mechanistic sorption models for performance assessment calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.

    1996-10-01

    Sorption is considered in most predictive models of radionuclide transport in geologic systems. Most models simulate the effects of sorption in terms of empirical parameters, which however can be criticized because the data are only strictly valid under the experimental conditions at which they were measured. An alternative is to adopt a more mechanistic modeling framework based on recent advances in understanding the electrical properties of oxide mineral-water interfaces. It has recently been proposed that these 'surface-complexation' models may be directly applicable to natural systems. A possible approach for adapting mechanistic sorption models for use in performance assessments, using this 'surface-film' concept, is described in this report. Surface-acidity parameters in the Generalized Two-Layer surface complexation model are combined with surface-complexation constants for Np(V) sorption ob hydrous ferric oxide to derive an analytical model enabling direct calculation of corresponding intrinsic distribution coefficients as a function of pH, and Ca 2+ , Cl - , and HCO 3 - concentrations. The surface film concept is then used to calculate whole-rock distribution coefficients for Np(V) sorption by altered granitic rocks coexisting with a hypothetical, oxidized Aespoe groundwater. The calculated results suggest that the distribution coefficients for Np adsorption on these rocks could range from 10 to 100 ml/g. Independent estimates of K d for Np sorption in similar systems, based on an extensive review of experimental data, are consistent, though slightly conservative, with respect to the calculated values. 31 refs

  12. Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model

    Science.gov (United States)

    Shuard, Adrian M.; Mahmud, Hisham B.; King, Andrew J.

    2016-03-01

    Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of vast importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver, interFoam is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations have been performed at three separate pipe inclinations of 0°, +10° and -10° respectively. A three dimensional, 0.052m diameter pipe of 4m length is used with the Shear Stress Transport (SST) k - ɷ turbulence model to solve the turbulent mixtures of air and water. Results show that the flow pattern behaviour and numerical values of liquid holdup and pressure drop compare reasonably well to the mechanistic model.

  13. Evaluation of mechanistic DNB models using HCLWR CHF data

    International Nuclear Information System (INIS)

    Iwamura, Takamichi; Watanabe, Hironori; Okubo, Tsutomu; Araya, Fumimasa; Murao, Yoshio.

    1992-03-01

    An onset of departure from nucleate boiling (DNB) in light water reactor (LWR) has been generally predicted with empirical correlations. Since these correlations have less physical bases and contain adjustable empirical constants determined by best fitting of test data, applicable geometries and flow conditions are limited within the original experiment ranges. In order to obtain more universal prediction method, several mechanistic DNB models based on physical approaches have been proposed in recent years. However, the predictive capabilities of mechanistic DNB models have not been verified successfully especially for advanced LWR design purposes. In this report, typical DNB mechanistic models are reviewed and compared with critical heat flux (CHF) data for high conversion light water reactor (HCLWR). The experiments were performed using triangular 7-rods array with non-uniform axial heat flux distribution. Test pressure was 16 MPa, mass velocities ranged from 800 t0 3100 kg/s·m 2 and exit qualities from -0.07 to 0.19. The evaluated models are: 1) Wisman-Pei, 2) Chang-Lee, 3) Lee-Mudawwar, 4) Lin-Lee-Pei, and 5) Katto. The first two models are based on near-wall bubble crowding model and the other three models on sublayer dryout model. The comparison with experimental data indicated that the Weisman-Pei model agreed relatively well with the CHF data. Effects of empirical constants in each model on CHF calculation were clarified by sensitivity studies. It was also found that the magnitudes of physical quantities obtained in the course of calculation were significantly different for each model. Therefore, microscopic observation of the onset of DNB on heated surface is essential to clarify the DNB mechanism and establish a general DNB mechanistic model based on physical phenomenon. (author)

  14. Specialists without spirit: limitations of the mechanistic biomedical model.

    Science.gov (United States)

    Hewa, S; Hetherington, R W

    1995-06-01

    This paper examines the origin and the development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is a part of the broad process of rationalization that began in sixteenth century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control-the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, a leading advocate of general systems theory, is one of the leading proponents of a new medical model which includes the general quality of life, clean environment, and psychological, or spiritual stability of life. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.

  15. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted...... for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool...... for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding....

  16. Mechanistic model for microbial growth on hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Mallee, F M; Blanch, H W

    1977-12-01

    Based on available information describing the transport and consumption of insoluble alkanes, a mechanistic model is proposed for microbial growth on hydrocarbons. The model describes the atypical growth kinetics observed, and has implications in the design of large scale equipment for single cell protein (SCP) manufacture from hydrocarbons. The model presents a framework for comparison of the previously published experimental kinetic data.

  17. Mechanistic-empirical subgrade design model based on heavy vehicle simulator test results

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-06-01

    Full Text Available Although Accelerated Pavement Testing (APT) is often done with specific objectives, valuable pavement performance data is generated over the long-term that may be used to investigate pavement behaviour in general and calibrate mechanistic...

  18. Patterns and causes of species richness: a general simulation model for macroecology

    DEFF Research Database (Denmark)

    Gotelli, Nicholas J; Anderson, Marti J; Arita, Hector T

    2009-01-01

    to a mechanistic understanding of the patterns. During the past two decades, macroecologists have successfully addressed technical problems posed by spatial autocorrelation, intercorrelation of predictor variables and non-linearity. However, curve-fitting approaches are problematic because most theoretical models...... in macroecology do not make quantitative predictions, and they do not incorporate interactions among multiple forces. As an alternative, we propose a mechanistic modelling approach. We describe computer simulation models of the stochastic origin, spread, and extinction of species' geographical ranges...... in an environmentally heterogeneous, gridded domain and describe progress to date regarding their implementation. The output from such a general simulation model (GSM) would, at a minimum, consist of the simulated distribution of species ranges on a map, yielding the predicted number of species in each grid cell...

  19. Conceptual models for waste tank mechanistic analysis

    International Nuclear Information System (INIS)

    Allemann, R.T.; Antoniak, Z.I.; Eyler, L.L.; Liljegren, L.M.; Roberts, J.S.

    1992-02-01

    Pacific Northwest Laboratory (PNL) is conducting a study for Westinghouse Hanford Company (Westinghouse Hanford), a contractor for the US Department of Energy (DOE). The purpose of the work is to study possible mechanisms and fluid dynamics contributing to the periodic release of gases from double-shell waste storage tanks at the Hanford Site in Richland, Washington. This interim report emphasizing the modeling work follows two other interim reports, Mechanistic Analysis of Double-Shell Tank Gas Release Progress Report -- November 1990 and Collection and Analysis of Existing Data for Waste Tank Mechanistic Analysis Progress Report -- December 1990, that emphasized data correlation and mechanisms. The approach in this study has been to assemble and compile data that are pertinent to the mechanisms, analyze the data, evaluate physical properties and parameters, evaluate hypothetical mechanisms, and develop mathematical models of mechanisms

  20. Mechanistic modeling of aberrant energy metabolism in human disease

    Directory of Open Access Journals (Sweden)

    Vineet eSangar

    2012-10-01

    Full Text Available Dysfunction in energy metabolism—including in pathways localized to the mitochondria—has been implicated in the pathogenesis of a wide array of disorders, ranging from cancer to neurodegenerative diseases to type II diabetes. The inherent complexities of energy and mitochondrial metabolism present a significant obstacle in the effort to understand the role that these molecular processes play in the development of disease. To help unravel these complexities, systems biology methods have been applied to develop an array of computational metabolic models, ranging from mitochondria-specific processes to genome-scale cellular networks. These constraint-based models can efficiently simulate aspects of normal and aberrant metabolism in various genetic and environmental conditions. Development of these models leverages—and also provides a powerful means to integrate and interpret—information from a wide range of sources including genomics, proteomics, metabolomics, and enzyme kinetics. Here, we review a variety of mechanistic modeling studies that explore metabolic functions, deficiency disorders, and aberrant biochemical pathways in mitochondria and related regions in the cell.

  1. Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model

    International Nuclear Information System (INIS)

    Shuard, Adrian M; Mahmud, Hisham B; King, Andrew J

    2016-01-01

    Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of vast importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver, interFoam is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations have been performed at three separate pipe inclinations of 0°, +10° and -10° respectively. A three dimensional, 0.052m diameter pipe of 4m length is used with the Shear Stress Transport (SST) k - ω turbulence model to solve the turbulent mixtures of air and water. Results show that the flow pattern behaviour and numerical values of liquid holdup and pressure drop compare reasonably well to the mechanistic model. (paper)

  2. Exploring BSEP Inhibition-Mediated Toxicity with a Mechanistic Model of Drug-Induced Liver Injury

    Directory of Open Access Journals (Sweden)

    Jeffrey L Woodhead

    2014-11-01

    Full Text Available Inhibition of the bile salt export pump (BSEP has been linked to incidence of drug-induced liver injury (DILI, presumably by the accumulation of toxic bile acids in the liver. We have previously constructed and validated a model of bile acid disposition within DILIsym®, a mechanistic model of DILI. In this paper, we use DILIsym® to simulate the DILI response of the hepatotoxic BSEP inhibitors bosentan and CP-724,714 and the non-hepatotoxic BSEP inhibitor telmisartan in humans in order to explore whether we can predict that hepatotoxic BSEP inhibitors can cause bile acid accumulation to reach toxic levels. We also simulate bosentan in rats in order to illuminate potential reasons behind the lack of toxicity in rats compared to the toxicity observed in humans. DILIsym® predicts that bosentan, but not telmisartan, will cause mild hepatocellular ATP decline and serum ALT elevation in a simulated population of humans. The difference in hepatotoxic potential between bosentan and telmisartan is consistent with clinical observations. However, DILIsym® underpredicts the incidence of bosentan toxicity. DILIsym® also predicts that bosentan will not cause toxicity in a simulated population of rats, and that the difference between the response to bosentan in rats and in humans is primarily due to the less toxic bile acid pool in rats. Our simulations also suggest a potential synergistic role for bile acid accumulation and mitochondrial electron transport chain inhibition in producing the observed toxicity in CP-724,714, and suggest that CP-724,714 metabolites may also play a role in the observed toxicity. Our work also compares the impact of competitive and noncompetitive BSEP inhibition for CP-724,714 and demonstrates that noncompetitive inhibition leads to much greater bile acid accumulation and potential toxicity. Our research demonstrates the potential for mechanistic modeling to contribute to the understanding of how bile acid transport inhibitors

  3. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Directory of Open Access Journals (Sweden)

    Paul D Mathewson

    Full Text Available In this study we tested the ability of a mechanistic model (Niche Mapper™ to accurately model adult, non-denning polar bear (Ursus maritimus energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.

  4. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Science.gov (United States)

    Mathewson, Paul D; Porter, Warren P

    2013-01-01

    In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.

  5. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    Science.gov (United States)

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses are comprised of a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model capable of capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compared its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model and provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.

  6. Virtual Systems Pharmacology (ViSP software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey eErmakov

    2014-10-01

    Full Text Available Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools that could increase model development time, IT costs and so on. Therefore it is desirable to have a single platform that allows setting up and running large-scale simulations for the models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model agnostic, therapeutic area agnostic and web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user’s particular needs and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  7. Multiscale mechanistic modeling in pharmaceutical research and development.

    Science.gov (United States)

    Kuepfer, Lars; Lippert, Jörg; Eissing, Thomas

    2012-01-01

    Discontinuation of drug development projects due to lack of efficacy or adverse events is one of the main cost drivers in pharmaceutical research and development (R&D). Investments have to be written-off and contribute to the total costs of a successful drug candidate receiving marketing authorization and allowing return on invest. A vital risk for pharmaceutical innovator companies is late stage clinical failure since costs for individual clinical trials may exceed the one billion Euro threshold. To guide investment decisions and to safeguard maximum medical benefit and safety for patients recruited in clinical trials, it is therefore essential to understand the clinical consequences of all information and data generated. The complexity of the physiological and pathophysiological processes and the sheer amount of information available overcharge the mental capacity of any human being and prevent a prediction of the success in clinical development. A rigorous integration of knowledge, assumption, and experimental data into computational models promises a significant improvement of the rationalization of decision making in pharmaceutical industry. We here give an overview of the current status of modeling and simulation in pharmaceutical R&D and outline the perspectives of more recent developments in mechanistic modeling. Specific modeling approaches for different biological scales ranging from intracellular processes to whole organism physiology are introduced and an example for integrative multiscale modeling of therapeutic efficiency in clinical oncology trials is showcased.

  8. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  9. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    Science.gov (United States)

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.

  10. Descriptive and mechanistic models of crop–weed competition

    NARCIS (Netherlands)

    Bastiaans, L.; Storkey, J.

    2017-01-01

    Crop-weed competitive relations are an important element of agroecosystems. Quantifying and understanding them helps to design appropriate weed management at operational, tactical and strategic level. This chapter presents and discusses simple descriptive and more mechanistic models for crop-weed

  11. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentration....... This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied offline as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking...... into account the oxygen transfer conditions, as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development...

  12. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  13. Coupling machine learning with mechanistic models to study runoff production and river flow at the hillslope scale

    Science.gov (United States)

    Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.

    2016-12-01

    Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff responses. The diversity of hillslopes (morphological shapes and geological structures) on one hand, and the highly non linear runoff mechanism response on the other hand, make it difficult to transpose what has been learnt at one specific hillslope to another. Therefore, making reliable predictions on runoff appearance or river flow for a given hillslope is a challenge. Applying a classic model calibration (based on inverse problems technique) requires doing it for each specific hillslope and having some data available for calibration. When applied to thousands of cases it cannot always be promoted. Here we propose a novel modeling framework based on coupling process based models with data based approach. First we develop a mechanistic model, based on hillslope storage Boussinesq equations (Troch et al. 2003), able to model non linear runoff responses to rainfall at the hillslope scale. Second we set up a model database, representing thousands of non calibrated simulations. These simulations investigate different hillslope shapes (real ones obtained by analyzing 5m digital elevation model of Brittany and synthetic ones), different hillslope geological structures (i.e. different parametrizations) and different hydrologic forcing terms (i.e. different infiltration chronicles). Then, we use this model library to train a machine learning model on this physically based database. Machine learning model performance is then assessed by a classic validating phase (testing it on new hillslopes and comparing machine learning with mechanistic outputs). Finally we use this machine learning model to learn what are the hillslope properties controlling runoffs. This methodology will be further tested combining synthetic datasets with real ones.

  14. Development of a mechanistic model for prediction of CO2 capture from gas mixtures by amine solutions in porous membranes.

    Science.gov (United States)

    Ghadiri, Mehdi; Marjani, Azam; Shirazian, Saeed

    2017-06-01

    A mechanistic model was developed in order to predict capture and removal of CO 2 from air using membrane technology. The considered membrane was a hollow-fiber contactor module in which gas mixture containing CO 2 was assumed as feed while 2-amino-2-metyl-1-propanol (AMP) was used as an absorbent. The mechanistic model was developed according to transport phenomena taking into account mass transfer and chemical reaction between CO 2 and amine in the contactor module. The main aim of modeling was to track the composition and flux of CO 2 and AMP in the membrane module for process optimization. For modeling of the process, the governing equations were computed using finite element approach in which the whole model domain was discretized into small cells. To confirm the simulation findings, model outcomes were compared with experimental data and good consistency was revealed. The results showed that increasing temperature of AMP solution increases CO 2 removal in the hollow-fiber membrane contactor.

  15. In silico, experimental, mechanistic model for extended-release felodipine disposition exhibiting complex absorption and a highly variable food interaction.

    Directory of Open Access Journals (Sweden)

    Sean H J Kim

    Full Text Available The objective of this study was to develop and explore new, in silico experimental methods for deciphering complex, highly variable absorption and food interaction pharmacokinetics observed for a modified-release drug product. Toward that aim, we constructed an executable software analog of study participants to whom product was administered orally. The analog is an object- and agent-oriented, discrete event system, which consists of grid spaces and event mechanisms that map abstractly to different physiological features and processes. Analog mechanisms were made sufficiently complicated to achieve prespecified similarity criteria. An equation-based gastrointestinal transit model with nonlinear mixed effects analysis provided a standard for comparison. Subject-specific parameterizations enabled each executed analog's plasma profile to mimic features of the corresponding six individual pairs of subject plasma profiles. All achieved prespecified, quantitative similarity criteria, and outperformed the gastrointestinal transit model estimations. We observed important subject-specific interactions within the simulation and mechanistic differences between the two models. We hypothesize that mechanisms, events, and their causes occurring during simulations had counterparts within the food interaction study: they are working, evolvable, concrete theories of dynamic interactions occurring within individual subjects. The approach presented provides new, experimental strategies for unraveling the mechanistic basis of complex pharmacological interactions and observed variability.

  16. Profiling the biological activity of oxide nanomaterials with mechanistic models

    NARCIS (Netherlands)

    Burello, E.

    2013-01-01

    In this study we present three mechanistic models for profiling the potential biological and toxicological effects of oxide nanomaterials. The models attempt to describe the reactivity, protein adsorption and membrane adhesion processes of a large range of oxide materials and are based on properties

  17. A mechanistic model on methane oxidation in the rice rhizosphere

    NARCIS (Netherlands)

    Bodegom, van P.M.; Leffelaar, P.A.; Goudriaan, J.

    2001-01-01

    A mechanistic model is presented on the processes leading to methane oxidation in rice rhizosphere. The model is driven by oxygen release from a rice root into anaerobic rice soil. Oxygen is consumed by heterotrophic and methanotrophic respiration, described by double Monod kinetics, and by iron

  18. An improved mechanistic critical heat flux model for subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Young Min [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Chang, Soon Heung [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Based on the bubble coalescence adjacent to the heated wall as a flow structure for CHF condition, Chang and Lee developed a mechanistic critical heat flux (CHF) model for subcooled flow boiling. In this paper, improvements of Chang-Lee model are implemented with more solid theoretical bases for subcooled and low-quality flow boiling in tubes. Nedderman-Shearer`s equations for the skin friction factor and universal velocity profile models are employed. Slip effect of movable bubbly layer is implemented to improve the predictability of low mass flow. Also, mechanistic subcooled flow boiling model is used to predict the flow quality and void fraction. The performance of the present model is verified using the KAIST CHF database of water in uniformly heated tubes. It is found that the present model can give a satisfactory agreement with experimental data within less than 9% RMS error. 9 refs., 5 figs. (Author)

  19. An improved mechanistic critical heat flux model for subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Young Min [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Chang, Soon Heung [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Based on the bubble coalescence adjacent to the heated wall as a flow structure for CHF condition, Chang and Lee developed a mechanistic critical heat flux (CHF) model for subcooled flow boiling. In this paper, improvements of Chang-Lee model are implemented with more solid theoretical bases for subcooled and low-quality flow boiling in tubes. Nedderman-Shearer`s equations for the skin friction factor and universal velocity profile models are employed. Slip effect of movable bubbly layer is implemented to improve the predictability of low mass flow. Also, mechanistic subcooled flow boiling model is used to predict the flow quality and void fraction. The performance of the present model is verified using the KAIST CHF database of water in uniformly heated tubes. It is found that the present model can give a satisfactory agreement with experimental data within less than 9% RMS error. 9 refs., 5 figs. (Author)

  20. A mechanistic nitrogen limitation model for CLM(ED)

    Science.gov (United States)

    Ali, A. A.; Xu, C.; McDowell, N. G.; Rogers, A.; Wullschleger, S. D.; Fisher, R.; Vrugt, J. A.

    2014-12-01

    Photosynthetic capacity is a key plant trait that determines the rate of photosynthesis; however, in Earth System Models it is either a fixed value or derived from a linear function of leaf nitrogen content. A mechanistic leaf nitrogen allocation model have been developed for a DOE-sponsored Community Land Model coupled to the Ecosystem Demography model (CLM-ED) to predict the photosynthetic capacity [Vc,max25 (μmol CO2 m-2 s-1)] under different environmental conditions at the global scale. We collected more than 800 data points of photosynthetic capacity (Vc,max25) for 124 species from 57 studies with the corresponding leaf nitrogen content and environmental conditions (temperature, radiation, humidity and day length) from literature and the NGEE arctic site (Barrow). Based on the data, we found that environmental control of Vc,max25 is about 4 times stronger than the leaf nitrogen content. Using the Markov-Chain Monte Carlo simulation approach, we fitted the collected data to our newly developed nitrogen allocation model, which predict the leaf nitrogen investment in different components including structure, storage, respiration, light capture, carboxylation and electron transport at different environmental conditions. Our results showed that our nitrogen allocation model explained 52% of variance in observed Vc,max25 and 65% variance in observed Jmax25 using a single set of fitted model parameters for all species. Across the growing season, we found that the modeled Vc,max25 explained 49% of the variability in measured Vc,max25. In the context of future global warming, our model predicts that a temperature increase by 5oC and the doubling of atmospheric carbon dioxide reduced the Vc,max25 by 5%, 11%, respectively.

  1. Global scale analysis and evaluation of an improved mechanistic representation of plant nitrogen and carbon dynamics in the Community Land Model (CLM)

    Science.gov (United States)

    Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.

    2014-12-01

    In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However mechanistic representation of nitrogen uptake linked to root traits, and functional nitrogen allocation among different leaf enzymes involved in respiration and photosynthesis is currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is simplistically represented by potential photosynthesis rates, and is subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This type of potential photosynthesis rate calculation is problematic for several reasons. Firstly, plants do not photosynthesize at potential rates and then downregulate. Secondly, there is considerable subjectivity on the meaning of potential photosynthesis rates. Thirdly, there exists lack of understanding on modeling these potential photosynthesis rates in a changing climate. In addition to model structural issues in representing photosynthesis rates, the role of plant roots in nutrient acquisition have been largely ignored in Earth System models. For example, in CLM4.5, nitrogen uptake is linked to leaf level processes (e.g., primarily productivity) rather than root scale process involved in nitrogen uptake. We present a new plant model for CLM with an improved mechanistic presentation of plant nitrogen uptake based on root scale Michaelis Menten kinetics, and stronger linkages between leaf nitrogen and plant productivity by inferring relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate improved representation of plant nitrogen leaf allocation, especially in tropical regions where significant over-prediction of plant growth and productivity in CLM4.5 simulations exist. We evaluate our improved global model simulations using the International Land Model Benchmarking (ILAMB) framework. We conclude that

  2. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    Science.gov (United States)

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Mechanistic effect modeling for ecological risk assessment: where to go from here?

    Science.gov (United States)

    Grimm, Volker; Martin, Benjamin T

    2013-07-01

    Mechanistic effect models (MEMs) consider the mechanisms of how chemicals affect individuals and ecological systems such as populations and communities. There is an increasing awareness that MEMs have high potential to make risk assessment of chemicals more ecologically relevant than current standard practice. Here we discuss what kinds of MEMs are needed to improve scientific and regulatory aspects of risk assessment. To make valid predictions for a wide range of environmental conditions, MEMs need to include a sufficient amount of emergence, for example, population dynamics emerging from what individual organisms do. We present 1 example where the life cycle of individuals is described using Dynamic Energy Budget theory. The resulting individual-based population model is thus parameterized at the individual level but correctly predicts multiple patterns at the population level. This is the case for both control and treated populations. We conclude that the state-of-the-art in mechanistic effect modeling has reached a level where MEMs are robust and predictive enough to be used in regulatory risk assessment. Mechanistic effect models will thus be used to advance the scientific basis of current standard practice and will, if their development follows Good Modeling Practice, be included in a standardized way in future regulatory risk assessments. Copyright © 2013 SETAC.

  4. Hierarchical modeling of activation mechanisms in the ABL and EGFR kinase domains: thermodynamic and mechanistic catalysts of kinase activation by cancer mutations.

    Directory of Open Access Journals (Sweden)

    Anshuman Dixit

    2009-08-01

    Full Text Available Structural and functional studies of the ABL and EGFR kinase domains have recently suggested a common mechanism of activation by cancer-causing mutations. However, dynamics and mechanistic aspects of kinase activation by cancer mutations that stimulate conformational transitions and thermodynamic stabilization of the constitutively active kinase form remain elusive. We present a large-scale computational investigation of activation mechanisms in the ABL and EGFR kinase domains by a panel of clinically important cancer mutants ABL-T315I, ABL-L387M, EGFR-T790M, and EGFR-L858R. We have also simulated the activating effect of the gatekeeper mutation on conformational dynamics and allosteric interactions in functional states of the ABL-SH2-SH3 regulatory complexes. A comprehensive analysis was conducted using a hierarchy of computational approaches that included homology modeling, molecular dynamics simulations, protein stability analysis, targeted molecular dynamics, and molecular docking. Collectively, the results of this study have revealed thermodynamic and mechanistic catalysts of kinase activation by major cancer-causing mutations in the ABL and EGFR kinase domains. By using multiple crystallographic states of ABL and EGFR, computer simulations have allowed one to map dynamics of conformational fluctuations and transitions in the normal (wild-type and oncogenic kinase forms. A proposed multi-stage mechanistic model of activation involves a series of cooperative transitions between different conformational states, including assembly of the hydrophobic spine, the formation of the Src-like intermediate structure, and a cooperative breakage and formation of characteristic salt bridges, which signify transition to the active kinase form. We suggest that molecular mechanisms of activation by cancer mutations could mimic the activation process of the normal kinase, yet exploiting conserved structural catalysts to accelerate a conformational transition

  5. Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices

    Science.gov (United States)

    Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.

    2017-12-01

    The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to more mechanistically and consistently represent the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates to account for factors like nitrite (NO2-) accumulation not addressed in EPIC, will also be made. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions found to be most probable under high NO2- availability will be based on observed ratios of HONO to NO emissions under different soil moistures, pH and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed. Our updated framework will help to predict the diurnal and daily variability

  6. Mechanistic species distribution modeling reveals a niche shift during invasion.

    Science.gov (United States)

    Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M

    2017-06-01

    Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual

  7. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    Science.gov (United States)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2007-05-01

    calibration of mechanistic hydrological models, making their properties more transparent. It also helps to highlight possible mis-specification problems, if these are identified. The results of the exercise show that the two modelling methodologies have good synergy; combining well to produce a complete joint modelling approach that has the kinds of checks-and-balances required in practical data-based modelling of rainfall-flow systems. Such a combined approach also produces models that are suitable for different kinds of application. As such, the DBM model considered in the paper is developed specifically as a vehicle for flow and flood forecasting (although the generality of DBM modelling means that a simulation version of the model could be developed if required); while TOPMODEL, suitably calibrated (and perhaps modified) in the light of the DBM and GSA results, immediately provides a simulation model with a variety of potential applications, in areas such as catchment management and planning.

  8. The coefficient of restitution of pressurized balls: a mechanistic model

    Science.gov (United States)

    Georgallas, Alex; Landry, Gaëtan

    2016-01-01

    Pressurized, inflated balls used in professional sports are regulated so that their behaviour upon impact can be anticipated and allow the game to have its distinctive character. However, the dynamics governing the impacts of such balls, even on stationary hard surfaces, can be extremely complex. The energy transformations, which arise from the compression of the gas within the ball and from the shear forces associated with the deformation of the wall, are examined in this paper. We develop a simple mechanistic model of the dependence of the coefficient of restitution, e, upon both the gauge pressure, P_G, of the gas and the shear modulus, G, of the wall. The model is validated using the results from a simple series of experiments using three different sports balls. The fits to the data are extremely good for P_G > 25 kPa and consistent values are obtained for the value of G for the wall material. As far as the authors can tell, this simple, mechanistic model of the pressure dependence of the coefficient of restitution is the first in the literature. *%K Coefficient of Restitution, Dynamics, Inflated Balls, Pressure, Impact Model

  9. Mechanistic modeling of CHF in forced-convection subcooled boiling

    International Nuclear Information System (INIS)

    Podowski, M.Z.; Alajbegovic, A.; Kurul, N.; Drew, D.A.; Lahey, R.T. Jr.

    1997-05-01

    Because of the complexity of phenomena governing boiling heat transfer, the approach to solve practical problems has traditionally been based on experimental correlations rather than mechanistic models. The recent progress in computational fluid dynamics (CFD), combined with improved experimental techniques in two-phase flow and heat transfer, makes the use of rigorous physically-based models a realistic alternative to the current simplistic phenomenological approach. The objective of this paper is to present a new CFD model for critical heat flux (CHF) in low quality (in particular, in subcooled boiling) forced-convection flows in heated channels

  10. Development of Improved Mechanistic Deterioration Models for Flexible Pavements

    DEFF Research Database (Denmark)

    Ullidtz, Per; Ertman, Hans Larsen

    1998-01-01

    The paper describes a pilot study in Denmark with the main objective of developing improved mechanistic deterioration models for flexible pavements based on an accelerated full scale test on an instrumented pavement in the Danish Road Tessting Machine. The study was the first in "International...... Pavement Subgrade Performance Study" sponsored by the Federal Highway Administration (FHWA), USA. The paper describes in detail the data analysis and the resulting models for rutting, roughness, and a model for the plastic strain in the subgrade.The reader will get an understanding of the work needed...

  11. Can simulation models help design rice cultivars that are more competitive against weeds?

    NARCIS (Netherlands)

    Bastiaans, L.; Kropff, M.J.; Kempuchetty, N.; Rajan, A.; Migo, T.R.

    1997-01-01

    Differences in competitive ability between rice cultivars IR8 and Mahsuri, grown in well-fertilised irrigated conditions, were analysed by means of a mechanistic simulation model (INTERCOM) for crop-weed interaction. The analysis revealed that the greater competitive ability of Mahsuri was due

  12. Calibrating mechanistic-empirical pavement performance models with an expert matrix

    Energy Technology Data Exchange (ETDEWEB)

    Tighe, S.; AlAssar, R.; Haas, R. [Waterloo Univ., ON (Canada). Dept. of Civil Engineering; Zhiwei, H. [Stantec Consulting Ltd., Cambridge, ON (Canada)

    2001-07-01

    Proper management of pavement infrastructure requires pavement performance modelling. For the past 20 years, the Ontario Ministry of Transportation has used the Ontario Pavement Analysis of Costs (OPAC) system for pavement design. Pavement needs, however, have changed substantially during that time. To address this need, a new research contract is underway to enhance the model and verify the predictions, particularly at extreme points such as low and high traffic volume pavement design. This initiative included a complete evaluation of the existing OPAC pavement design method, the construction of a new set of pavement performance prediction models, and the development of the flexible pavement design procedure that incorporates reliability analysis. The design was also expanded to include rigid pavement designs and modification of the existing life cycle cost analysis procedure which includes both the agency cost and road user cost. Performance prediction and life-cycle costs were developed based on several factors, including material properties, traffic loads and climate. Construction and maintenance schedules were also considered. The methodology for the calibration and validation of a mechanistic-empirical flexible pavement performance model was described. Mechanistic-empirical design methods combine theory based design such as calculated stresses, strains or deflections with empirical methods, where a measured response is associated with thickness and pavement performance. Elastic layer analysis was used to determine pavement response to determine the most effective design using cumulative Equivalent Single Axle Loads (ESALs), below grade type and layer thickness.The new mechanistic-empirical model separates the environment and traffic effects on performance. This makes it possible to quantify regional differences between Southern and Northern Ontario. In addition, roughness can be calculated in terms of the International Roughness Index or Riding comfort Index

  13. Prediction of warfarin maintenance dose in Han Chinese patients using a mechanistic model based on genetic and non-genetic factors.

    Science.gov (United States)

    Lu, Yuan; Yang, Jinbo; Zhang, Haiyan; Yang, Jin

    2013-07-01

    Many attempts have been made to predict the warfarin maintenance dose in patients beginning warfarin therapy using a descriptive model based on multiple linear regression. Here we report the first attempt to develop a comprehensive mechanistic model integrating in vitro-in vivo extrapolation (IVIVE) with a pharmacokinetic-pharmacodynamic model to predict the warfarin maintenance dose in Han Chinese patients. The model incorporates demographic factors [sex, age, body weight (BW)] and the genetic polymorphisms of cytochrome P450 (CYP) 2C9 (CYP2C9) and vitamin K epoxide reductase complex subunit 1 (VKORC1). Information on the various factors, mean warfarin daily dose and International Normalized Ratio (INR) was available for a cohort of 197 Han Chinese patients. Based on in vitro enzyme kinetic parameters for S-warfarin metabolism, demographic data for Han Chinese and some scaling factors, the S-warfarin clearance (CL) was predicted for patients in the cohort with different CYP2C9 genotypes using IVIVE. The plasma concentration of S-warfarin after a single oral dose was simulated using a one-compartment pharmacokinetic model with first-order absorption and a lag time and was combined with a mechanistic coagulation model to simulate the INR response. The warfarin maintenance dose was then predicted based on the demographic data and genotypes of CYP2C9 and VKORC1 for each patient and using the observed steady-state INR (INRss) as a target value. Finally, sensitivity analysis was carried out to determine which factor(s) affect the warfarin maintenance dose most strongly. The predictive performance of this mechanistic model is not inferior to that of our previous descriptive model. There were significant differences in the mean warfarin daily dose in patients with different CYP2C9 and VKORC1 genotypes. Using IVIVE, the predicted mean CL of S-warfarin for patients with CYP2C9*1/*3 (0.092 l/h, n = 11) was 57 % less than for those with wild-type *1/*1 (0.215 l/h, n

  14. Food supply and demand, a simulation model of the functional response of grazing ruminants

    NARCIS (Netherlands)

    Smallegange, I.M.; Brunsting, A.M.H.

    2002-01-01

    A dynamic model of the functional response is a first prerequisite to be able to bridge the gap between local feeding ecology and grazing rules that pertain to larger scales. A mechanistic model is presented that simulates the functional response, growth and grazing time of ruminants. It is based on

  15. INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS

    Science.gov (United States)

    INCORPORATING MECHANISTIC INSIGHTS IN A PBPK MODEL FOR ARSENICElaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc.A physiologically based phar...

  16. Conceptual models for waste tank mechanistic analysis. Status report, January 1991

    Energy Technology Data Exchange (ETDEWEB)

    Allemann, R. T.; Antoniak, Z. I.; Eyler, L. L.; Liljegren, L. M.; Roberts, J. S.

    1992-02-01

    Pacific Northwest Laboratory (PNL) is conducting a study for Westinghouse Hanford Company (Westinghouse Hanford), a contractor for the US Department of Energy (DOE). The purpose of the work is to study possible mechanisms and fluid dynamics contributing to the periodic release of gases from double-shell waste storage tanks at the Hanford Site in Richland, Washington. This interim report emphasizing the modeling work follows two other interim reports, Mechanistic Analysis of Double-Shell Tank Gas Release Progress Report -- November 1990 and Collection and Analysis of Existing Data for Waste Tank Mechanistic Analysis Progress Report -- December 1990, that emphasized data correlation and mechanisms. The approach in this study has been to assemble and compile data that are pertinent to the mechanisms, analyze the data, evaluate physical properties and parameters, evaluate hypothetical mechanisms, and develop mathematical models of mechanisms.

  17. A mechanistic model for electricity consumption on dairy farms: definition, validation, and demonstration.

    Science.gov (United States)

    Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical data of 1yr on commercial spring calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat

  18. Mechanistic models for the evaluation of biocatalytic reaction conditions and biosensor design optimization

    DEFF Research Database (Denmark)

    Semenova, Daria

    . In the first case study a mechanistic model was developed to describe the enzymatic reaction of glucose oxidase and glucose in the presence of catalase inside a commercial microfluidic platform with integrated oxygen sensor spots. The simplicity of the proposed model allowed an easy calibration of the reaction...... the microfluidic device. In the second case study the flexible microfluidic platform with integrated amperometric glucose biosensors was developed for continuous monitoring of glucose consumption rates. The integration of the mixing chamber inside the platform allowed performing sample dilutions which subsequently......BRs. In the third case study the mechanistic model of the cyclic voltammetry response of the first generation glucose biosensors was developed and applied for the biosensor design optimization. Furthermore the obtained qualitative and quantitative dependencies between the model output and experimental results were...

  19. Numerical simulation in steam injection wellbores by mechanistic approach; Simulacao numerica do escoamento de vapor em pocos por uma abordagem mecanicista

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, J.C. de; Campos, W.; Lopes, D.; Moura, L.S.S. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Thomas, A. Clecio F. [Universidade Estadual do Ceara (UECE), CE (Brazil)

    2008-07-01

    This work addresses to the development of a hydrodynamic and heat transfer mechanistic model for steam flow in injection wellbores. The problem of two-phase steam flow in wellbores has been solved recently by using available empirical correlations from petroleum industry (Lopes, 1986) and nuclear industry (Moura, 1991).The good performance achieved by mechanistic models developed by Ansari (1994), Hasan (1995), Gomez (2000) and Kaya (2001) supports the importance of the mechanistic approach for the steam flow problem in injection wellbores. In this study, the methodology to solve the problem consists in the application of a numerical method to the governing equations of steam flow and a marching algorithm to determine the distribution of the pressure and temperature along the wellbore. So, a computer code has been formulated to get numerical results, which provides a comparative study to the main models found in the literature. Finally, when compared to available field data, the mechanistic model for downward vertical steam flow in wellbores gave better results than the empirical correlations. (author)

  20. Optimization of Hybrid Power Trains by Mechanistic System Simulations Optimisation de groupes motopropulseurs électriques hybrides par simulation du système mécanique

    Directory of Open Access Journals (Sweden)

    Katrašnik T.

    2013-05-01

    Full Text Available The paper presents a mechanistic system level simulation model for mode/big hybrid and conventional vehicle topologies. The paper addresses the Dynamic interaction between different domains: internal combustion engine. exhaust after treatment devices, electric components. mechanical drive train. cooling circuit system and corresponding control units. To achieve a good ratio between accuracy. predictability and computational speed of the model an innovative time domain decoupling is presented, which is based on applying domain specific integration steps to ditferent domains and subsequent consistent cross-domain coupling ol’thefluxes. In addition, a computationally efficient frunieveork for transporting active and passive gaseous species is introduced to combine computational efficiency with the need for modeling pollutant transport in the gas path. The applicability and versatility of the mechanistic system level simulations model is presented through analyses of transient phenomena caused by the high interdependency of the sub-systems, i.e. domains. Results of a hyt’hrid vehicle are compared to results of a conventional vehicle to highlight differences in operating regimes of partiular components that are inherent to particular poster train topology. L’article présente un modèle de simulation au niveau mécanique destiné à la modélisation de topologies de véhicules hydrides et conventionnels. L’article décrit l’interaction dynamique entre différents domaines : moteur à combustion interne, dispositifs de post-traitement d’échappement, composants électriques, chaîne cinématique mécanique, circuit de refroidissement et les unités de contrôle correspondantes. Afin d’obtenir un rapport correct entre précision, prévisibilité et vitesse de calculs du modèle, un découplage innovant du domaine temporel est présenté, lequel est basé sur l’application à différents domaines, d’étapes d’intégration sp

  1. INTEGRATION OF QSAR AND SAR METHODS FOR THE MECHANISTIC INTERPRETATION OF PREDICTIVE MODELS FOR CARCINOGENICITY

    Directory of Open Access Journals (Sweden)

    Natalja Fjodorova

    2012-07-01

Full Text Available The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter-propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals, using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using an intrinsic mapping technique, specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules, such as their shape and the electronic surroundings related to the reactivity of molecules. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity with a recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of the safety of chemicals.

  2. The Combined Use of Correlative and Mechanistic Species Distribution Models Benefits Low Conservation Status Species.

    Directory of Open Access Journals (Sweden)

    Thibaud Rougier

Full Text Available Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range-shift responses. Both correlative and mechanistic SDMs have been built for only a few species, but allis shad (Alosa alosa), an endangered anadromous fish species, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from the correlative and mechanistic models provided congruent trends in the probability of habitat suitability and in population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change. Climate change could accordingly not be listed as a major threat for allis shad. The congruence in predicted range limits between the SDM projections was the next point of interest. The difference, when noticed, required us to deepen our understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit between the two methods strongly suggested that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local

  3. Mechanistic model for void distribution in flashing flow

    International Nuclear Information System (INIS)

    Riznic, J.; Ishii, M.; Afgan, N.

    1987-01-01

The problem of discharging an initially subcooled liquid from a high-pressure condition into a low-pressure environment is quite important in several industrial systems such as nuclear and chemical reactors. A new model for the flashing process is proposed here based on wall nucleation theory, a bubble growth model and a drift-flux bubble transport model. In order to calculate the bubble number density, the bubble number transport equation with a distributed source from the wall nucleation sites is used. The model predictions in terms of void fraction are compared to Moby Dick and BNL experimental data. The comparison shows that satisfactory agreement can be obtained from the present model without any floating parameter adjusted to the data. This result indicates that, at least for the experimental conditions considered here, a mechanistic prediction of the flashing phenomenon is possible based on the present wall-nucleation-based model. 43 refs., 4 figs
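For illustration, the drift-flux part of such a model relates the void fraction to the phase volumetric fluxes through a distribution parameter and a drift velocity. The sketch below uses commonly quoted round-tube values (C0 of about 1.2 and a churn-turbulent drift-velocity form) purely as stand-ins, not the closures of the cited paper.

```python
def drift_flux_void_fraction(j_g, j_l, rho_l, rho_g, sigma, c0=1.2):
    """Void fraction from the drift-flux relation alpha = j_g / (C0*j + V_gj).

    V_gj uses a churn-turbulent drift-velocity expression as a stand-in closure.
    """
    g = 9.81
    j = j_g + j_l                                   # total volumetric flux [m/s]
    v_gj = 1.41 * (g * sigma * (rho_l - rho_g) / rho_l**2) ** 0.25
    return j_g / (c0 * j + v_gj)

# Example: steam-water near 1 bar with modest vapour generation (illustrative numbers)
alpha = drift_flux_void_fraction(j_g=0.3, j_l=1.0, rho_l=958.0, rho_g=0.6, sigma=0.059)
print(f"void fraction ~ {alpha:.2f}")
```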

  4. Can ligand addition to soil enhance Cd phytoextraction? A mechanistic model study.

    Science.gov (United States)

    Lin, Zhongbing; Schneider, André; Nguyen, Christophe; Sterckeman, Thibault

    2014-11-01

Phytoextraction is a potential method for cleaning Cd-polluted soils, and ligand addition to soil is expected to enhance Cd phytoextraction. However, experimental results show that this addition has contradictory effects on plant Cd uptake. A mechanistic model simulating the reaction kinetics (adsorption on the solid phase, complexation in solution), transport (convection, diffusion) and root absorption (symplastic, apoplastic) of Cd and its complexes in soil was developed. This was used to calculate plant Cd uptake with and without ligand addition in a great number of combinations of soil, ligand and plant characteristics, varying the parameters within defined domains. Ligand addition generally strongly reduced the hydrated Cd (Cd2+) concentration in soil solution through Cd complexation. Dissociation of the Cd-ligand complex could not compensate for this reduction, which greatly lowered symplastic Cd2+ uptake by roots. The apoplastic uptake of the complex was not sufficient to compensate for the decrease in symplastic uptake. This explains why, in the majority of cases, ligand addition resulted in a reduction of the simulated Cd phytoextraction. A few results showed enhanced phytoextraction under very particular conditions (strong plant transpiration with high apoplastic Cd uptake capacity), but this enhancement was very limited, making chelant-enhanced phytoextraction poorly efficient for Cd.
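The drop in free Cd2+ upon ligand addition can be illustrated with a single 1:1 complexation equilibrium, Cd2+ + L = CdL with stability constant K. The constant and concentrations below are arbitrary illustrative values, not parameters of the model in the paper.

```python
def free_cd_fraction(cd_total, ligand_free, log_k):
    """Free Cd2+ for a 1:1 Cd-ligand equilibrium with K = [CdL] / ([Cd2+][L]).

    Assumes the free ligand concentration is known (e.g. ligand in large excess).
    Concentrations in mol/L.
    """
    k = 10.0 ** log_k
    cd_free = cd_total / (1.0 + k * ligand_free)
    return cd_free, cd_free / cd_total

# Illustrative numbers: 1 micromolar total Cd, 0.1 millimolar free ligand, log K = 7
cd_free, fraction = free_cd_fraction(cd_total=1e-6, ligand_free=1e-4, log_k=7.0)
print(f"free Cd2+ = {cd_free:.2e} M ({fraction:.1%} of total)")
```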

  5. Mechanistic modeling of heat transfer process governing pressure tube-to-calandria tube contact and fuel channel failure

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2002-01-01

Heat transfer behaviour and phenomena associated with ballooning deformation of a pressure tube into contact with a calandria tube have been analyzed, and mechanistic models have been developed to describe the heat transfer and thermal-mechanical processes. These mechanistic models are applied to analyze experiments performed in the various COG-funded Contact Boiling Test series. Particular attention is given in the modeling to characterization of the conditions for which fuel channel failure may occur. Mechanistic models describing the governing heat transfer and thermal-mechanical processes are presented. The technical basis for characterizing the parameters of the models from the general heat transfer literature is described. The validity of the models is demonstrated by comparison with experimental data. Fuel channel integrity criteria are proposed which are based upon three necessary and sequential mechanisms: onset of CHF and local dry patch formation at contact; sustained film boiling in the post-contact period; and creep strain to failure of the calandria tube while in sustained film boiling. (author)

  6. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    NARCIS (Netherlands)

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; Boer, de I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on

  7. Mechanistic modelling of cancer: some reflections from software engineering and philosophy of science.

    Science.gov (United States)

    Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran

    2012-12-01

There is growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological change in how research is done, in which models help to actively generate hypotheses instead of waiting for general principles to become apparent once sufficient data have accumulated. This paper applies recent research from the philosophy of science to uncover three important problems of mechanistic modelling which may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence, and the need for an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can open up new opportunities for further and profitable interdisciplinary research in the field.

  8. Toward a mechanistic modeling of nitrogen limitation on vegetation dynamics.

    Science.gov (United States)

    Xu, Chonggang; Fisher, Rosie; Wullschleger, Stan D; Wilson, Cathy J; Cai, Michael; McDowell, Nate G

    2012-01-01

Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO2 concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO2 concentration, temperature, and radiation when evaluated against published data of Vc,max (maximum carboxylation rate) and Jmax (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO2 concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the vegetation

  9. Toward a Mechanistic Modeling of Nitrogen Limitation on Vegetation Dynamics

    Science.gov (United States)

    Xu, Chonggang; Fisher, Rosie; Wullschleger, Stan D.; Wilson, Cathy J.; Cai, Michael; McDowell, Nate G.

    2012-01-01

    Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO2 concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO2 concentration, temperature, and radiation when evaluated against published data of Vc,max (maximum carboxylation rate) and Jmax (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO2 concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the vegetation feedbacks

  10. Toward a mechanistic modeling of nitrogen limitation on vegetation dynamics.

    Directory of Open Access Journals (Sweden)

    Chonggang Xu

Full Text Available Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO2 concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO2 concentration, temperature, and radiation when evaluated against published data of Vc,max (maximum carboxylation rate) and Jmax (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO2 concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the

  11. Mechanistic model of mass-specific basal metabolic rate: evaluation in healthy young adults.

    Science.gov (United States)

    Wang, Z; Bosy-Westphal, A; Schautz, B; Müller, M

    2011-12-01

Mass-specific basal metabolic rate (mass-specific BMR), defined as the resting energy expenditure per unit body mass per day, is an important parameter in energy metabolism research. However, a mechanistic explanation for the magnitude of mass-specific BMR remains lacking. The objective of the present study was to validate the applicability of a proposed mass-specific BMR model in healthy adults. A mechanistic model was developed at the organ-tissue level, mass-specific BMR = Σ(Ki × Fi), where Fi is the fraction of body mass represented by individual organs and tissues, and Ki is the specific resting metabolic rate of the major organs and tissues. The Fi values were measured by multiple MRI scans and the Ki values were those suggested by Elia in 1992. A database of healthy non-elderly, non-obese adults (age 20-49 yr) was analyzed; the measured and predicted mass-specific BMR of all subjects was 21.6 ± 1.9 (mean ± SD) and 21.7 ± 1.6 kcal/kg per day, respectively. The measured mass-specific BMR was correlated with the predicted mass-specific BMR (r = 0.82), and the difference between measured and predicted mass-specific BMR was examined against the average of the two. In conclusion, the proposed mechanistic model was validated in non-elderly, non-obese adults and can help to understand the inherent relationship between mass-specific BMR and body composition.
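The organ-tissue sum above is simple to compute directly. The sketch below uses organ-specific metabolic rates of the kind attributed to Elia (1992) together with illustrative body-composition fractions; both sets of numbers are stand-ins for demonstration, not the study's measured values.

```python
# Mass-specific BMR as a weighted sum over organs/tissues: BMR = sum(K_i * F_i).
# K_i values are of the kind attributed to Elia (1992); F_i fractions are illustrative.

K_KCAL_PER_KG_DAY = {      # specific resting metabolic rates
    "brain": 240.0,
    "liver": 200.0,
    "heart": 440.0,
    "kidneys": 440.0,
    "skeletal_muscle": 13.0,
    "adipose_tissue": 4.5,
    "residual": 12.0,
}

F_BODY_MASS_FRACTION = {   # illustrative fractions of whole-body mass (sum to 1)
    "brain": 0.020,
    "liver": 0.026,
    "heart": 0.005,
    "kidneys": 0.004,
    "skeletal_muscle": 0.400,
    "adipose_tissue": 0.215,
    "residual": 0.330,
}

mass_specific_bmr = sum(
    K_KCAL_PER_KG_DAY[organ] * F_BODY_MASS_FRACTION[organ]
    for organ in K_KCAL_PER_KG_DAY
)
print(f"mass-specific BMR ~ {mass_specific_bmr:.1f} kcal/kg per day")
```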

  12. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    International Nuclear Information System (INIS)

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-01-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between

  13. Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing

    Science.gov (United States)

    Smith, James A.; Blattner, Tim; Messmer, Peter

    2010-01-01

The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to the modeling of migration behaviour to understand the impact of these changes on migratory bird populations. These include models based on causal processes and their response to environmental stimulation, "mechanistic models", or models that primarily are based on observed animal distribution patterns and the correlation of these patterns with environmental variables, i.e. "data-driven" models. Investigators have applied the latter technique to forecast changes in migration patterns with changes in the environment, for example as might be expected under climate change, by forecasting how the underlying environmental data layers upon which the relationships are built will change over time. The learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e. that the organism will not adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism, as well as the underlying changes in the landscape, offer an alternative that addresses these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of mechanistic models to predict how continental bird migration patterns may change in response to environmental change. In earlier work, we simulated the impact of wetland loss and inter-annual variability on the fitness of

  14. Mechanistic CHF modeling for natural circulation applications in SMR

    Energy Technology Data Exchange (ETDEWEB)

    Luitjens, Jeffrey [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Wu, Qiao, E-mail: qiao.wu@oregonstate.edu [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Greenwood, Scott; Corradini, Michael [Department of Engineering Physics, University of Wisconsin, 1415 Engineering Drive, Madison, WI 53706 (United States)

    2016-12-15

A mechanistic critical heat flux correlation has been developed for a wide range of operating conditions, which include low mass fluxes of 540–890 kg/m²-s, high pressures of 12–13 MPa, and critical heat fluxes of 835–1100 kW/m². Eleven experimental data points were collected over these conditions to inform the development of the model using bundle geometry. Errors within 15% were obtained with the proposed model for predicting the critical heat flux value, location, and critical pin power for a non-uniform heat flux applied to a 2 × 2 bundle configuration.

  15. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, D.; Wagenmakers, E.-J.; Romeijn, J.-W.

    2011-01-01

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  16. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, Denny; Wagenmakers, Eric-Jan; Romeijn, Jan-Willem

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  17. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    International Nuclear Information System (INIS)

    Kutnjak, Josip

    2013-01-01

During evaporation the geometric shape of the vapour is not described by thermodynamics. In bubbly flows the bubble shape is considered spherical at small diameters, changing into various shapes as the bubbles grow. The heat and mass transfer happens at the interfacial area. The forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat-input area; thus the boiling effects occurring inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning a stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built, consisting of an instrumented, partly transparent, tall and slender boiling container for visual observation. Direct visual observation of the boiling phenomena is necessary for the identification of the basic mechanisms, which should be incorporated in the simulation model. The boiling process was recorded as video images and subsequently evaluated by digital image processing methods; in this way data on the characteristics of the boiling process were generated for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases in the bubble number density. The GSM was implemented in the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure with the Young-Laplace equation. In this way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases in the bubble number density are explainable by liquid super
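The Young-Laplace relation used for the internal bubble pressure is straightforward to evaluate. The sketch below applies it to a spherical bubble with an illustrative surface tension for water near atmospheric saturation; it is not a reproduction of the CEL implementation in the thesis.

```python
def internal_bubble_pressure(p_liquid, radius, sigma=0.059):
    """Young-Laplace for a spherical bubble: p_in = p_liquid + 2*sigma/r.

    sigma defaults to an illustrative surface tension [N/m] for water near 100 degC.
    """
    return p_liquid + 2.0 * sigma / radius

# Smaller bubbles carry a higher internal pressure (the basis of the hysteresis noted above)
for r in (1e-3, 1e-4, 1e-5):   # bubble radius in metres
    print(r, internal_bubble_pressure(p_liquid=101325.0, radius=r))
```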

  18. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kutnjak, Josip

    2013-06-27

During evaporation the geometric shape of the vapour is not described by thermodynamics. In bubbly flows the bubble shape is considered spherical at small diameters, changing into various shapes as the bubbles grow. The heat and mass transfer happens at the interfacial area. The forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat-input area; thus the boiling effects occurring inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning a stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built, consisting of an instrumented, partly transparent, tall and slender boiling container for visual observation. Direct visual observation of the boiling phenomena is necessary for the identification of the basic mechanisms, which should be incorporated in the simulation model. The boiling process was recorded as video images and subsequently evaluated by digital image processing methods; in this way data on the characteristics of the boiling process were generated for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases in the bubble number density. The GSM was implemented in the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure with the Young-Laplace equation. In this way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases in the bubble number density are explainable by liquid super

  19. An Emphasis on Perception: Teaching Image Formation Using a Mechanistic Model of Vision.

    Science.gov (United States)

    Allen, Sue; And Others

    An effective way to teach the concept of image is to give students a model of human vision which incorporates a simple mechanism of depth perception. In this study two almost identical versions of a curriculum in geometrical optics were created. One used a mechanistic, interpretive eye model, and in the other the eye was modeled as a passive,…

  20. Predicting soil-to-plant transfer of radionuclides with a mechanistic model (BioRUR)

    Energy Technology Data Exchange (ETDEWEB)

    Casadesus, J. [Servei de Camps Experimentals, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain); Sauras-Yera, T. [Departament de Biologia Vegetal, Facultat de Biologia, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain)], E-mail: msauras@ub.edu; Vallejo, V.R. [Departament de Biologia Vegetal, Facultat de Biologia, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain); Centro de Estudios Ambientales del Mediterraneo, Charles Darwin 14, Parc Tecnologic, 46980 Paterna, Valencia (Spain)

    2008-05-15

The BioRUR model has been developed for the simulation of radionuclide (RN) transfer through physical and biological compartments, based on the available information on the transfer of their nutrient analogues. The model assumes that radionuclides are transferred from soil to plant through the same pathways as their nutrient analogues, where K and Ca are the analogues of Cs and Sr, respectively. Basically, the transfer of a radionuclide between two compartments is calculated as the transfer of the nutrient multiplied by the ratio of RN to nutrient concentrations, corrected by a selectivity coefficient. Hydroponic experiments showed the validity of this assumption for root uptake of Cs and Sr and reported a selectivity coefficient of around 1.0 for both. However, the application of this approach to soil-to-plant transfer raises questions about the effective concentrations of RN and nutrient seen by the plant uptake mechanism. This paper describes the evaluation of two configurations of BioRUR, one which simplifies the soil to a homogeneous pool, and another which considers that concentration gradients develop around roots, so that ion concentrations at the root surface differ from those of the bulk soil. The results show a good fit between the observed Sr transfer and the mechanistic simulations, even when a homogeneous soil is considered. On the other hand, Cs transfer is overestimated by two orders of magnitude if the development of a decreasing K profile around the roots is not taken into account.
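The nutrient-analogue rule described above (RN flux = nutrient flux × concentration ratio × selectivity coefficient) can be written in a few lines. The fluxes and concentrations in the example are illustrative placeholders; only the selectivity coefficient of about 1.0 echoes the hydroponic finding quoted in the abstract.

```python
def radionuclide_flux(nutrient_flux, c_rn, c_nutrient, selectivity=1.0):
    """RN transfer between two compartments, scaled from the analogue nutrient transfer.

    nutrient_flux : analogue nutrient flux between the compartments (e.g. mol/day)
    c_rn, c_nutrient : concentrations of the RN and its analogue in the source compartment
    selectivity : selectivity coefficient (about 1.0 for Cs/K and Sr/Ca in hydroponics)
    """
    return nutrient_flux * (c_rn / c_nutrient) * selectivity

# Illustrative: Cs-137 riding on root K uptake (all numbers invented for the example)
cs_uptake = radionuclide_flux(nutrient_flux=2.0e-3, c_rn=1.0e-9, c_nutrient=5.0e-4)
print(f"Cs flux ~ {cs_uptake:.2e} (same units as the nutrient flux)")
```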

  1. Predicting soil-to-plant transfer of radionuclides with a mechanistic model (BioRUR)

    International Nuclear Information System (INIS)

    Casadesus, J.; Sauras-Yera, T.; Vallejo, V.R.

    2008-01-01

The BioRUR model has been developed for the simulation of radionuclide (RN) transfer through physical and biological compartments, based on the available information on the transfer of their nutrient analogues. The model assumes that radionuclides are transferred from soil to plant through the same pathways as their nutrient analogues, where K and Ca are the analogues of Cs and Sr, respectively. Basically, the transfer of a radionuclide between two compartments is calculated as the transfer of the nutrient multiplied by the ratio of RN to nutrient concentrations, corrected by a selectivity coefficient. Hydroponic experiments showed the validity of this assumption for root uptake of Cs and Sr and reported a selectivity coefficient of around 1.0 for both. However, the application of this approach to soil-to-plant transfer raises questions about the effective concentrations of RN and nutrient seen by the plant uptake mechanism. This paper describes the evaluation of two configurations of BioRUR, one which simplifies the soil to a homogeneous pool, and another which considers that concentration gradients develop around roots, so that ion concentrations at the root surface differ from those of the bulk soil. The results show a good fit between the observed Sr transfer and the mechanistic simulations, even when a homogeneous soil is considered. On the other hand, Cs transfer is overestimated by two orders of magnitude if the development of a decreasing K profile around the roots is not taken into account.

  2. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

Modern model-based approaches to cardiac safety and efficacy assessment require accurate establishment of the drug concentration-effect relationship. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with estimation of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in drug disposition within the heart. The model accounted for inter-individual variability. The estimates of the optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific, model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
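As an illustration of the kind of compartmental structure described (the original model was implemented in R), a toy exchange between heart extracellular water and a single intracellular pool, driven by a prescribed plasma concentration, can be written as a small ODE system. All volumes, clearances and the forcing profile below are invented for the sketch and are not the study's parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy heart-tissue PK sketch: plasma -> extracellular water -> intracellular pool.
# All parameters are invented placeholders, not the published model's values.
V_ECW, V_ICW = 0.06, 0.18          # compartment volumes [L]
PS = 0.5                           # passive diffusion clearance, plasma <-> ECW [L/h]
CL_uptake, CL_efflux = 0.3, 0.1    # active uptake / efflux clearances, ECW <-> ICW [L/h]

def c_plasma(t):
    """Forcing plasma concentration [mg/L]: 1 h infusion, then exponential washout."""
    return 1.0 if t < 1.0 else np.exp(-0.5 * (t - 1.0))

def rhs(t, y):
    c_ecw, c_icw = y
    d_ecw = (PS * (c_plasma(t) - c_ecw)
             - CL_uptake * c_ecw + CL_efflux * c_icw) / V_ECW
    d_icw = (CL_uptake * c_ecw - CL_efflux * c_icw) / V_ICW
    return [d_ecw, d_icw]

sol = solve_ivp(rhs, (0.0, 12.0), [0.0, 0.0])
print(sol.y[:, -1])   # extracellular and intracellular concentrations at 12 h
```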

  3. Modeling of the pyruvate production with Escherichia coli: comparison of mechanistic and neural networks-based models.

    Science.gov (United States)

    Zelić, B; Bolf, N; Vasić-Racki, D

    2006-06-01

Three different models, the unstructured mechanistic black-box model, the input-output neural network-based model and the externally recurrent neural network model, were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain. The experimental data were taken from recently described batch and fed-batch experiments [Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003 (in English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al. Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built from the experimental data obtained in the fed-batch pyruvate production experiments with a constant glucose feed rate. The model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with a constant acetate feed rate. The dynamics of the substrate and product concentration changes were estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks can be used for the modeling of complex microbial fermentation processes, even under conditions in which mechanistic unstructured models cannot be applied.

  4. Improving Predictive Modeling in Pediatric Drug Development: Pharmacokinetics, Pharmacodynamics, and Mechanistic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Slikker, William; Young, John F.; Corley, Rick A.; Dorman, David C.; Conolly, Rory B.; Knudsen, Thomas; Erstad, Brian L.; Luecke, Richard H.; Faustman, Elaine M.; Timchalk, Chuck; Mattison, Donald R.

    2005-07-26

A workshop was conducted on November 18-19, 2004, to address the issue of improving predictive models for drug delivery to developing humans. Although considerable progress has been made for adult humans, large gaps remain for predicting pharmacokinetic/pharmacodynamic (PK/PD) outcome in children because most adult models have not been tested during development. The goals of the meeting included a description of when, during development, infants/children become adultlike in handling drugs. The issue of incorporating the most recent advances into the predictive models was also addressed: both the use of imaging approaches and genomic information were considered. Disease state, as exemplified by obesity, was addressed as a modifier of drug pharmacokinetics and pharmacodynamics during development. Issues addressed in this workshop should be considered in the development of new predictive and mechanistic models of drug kinetics and dynamics in the developing human.

  5. Assessing the ability of mechanistic volatilization models to simulate soil surface conditions: a study with the Volt'Air model.

    Science.gov (United States)

    Garcia, L; Bedos, C; Génermont, S; Braud, I; Cellier, P

    2011-09-01

Ammonia and pesticide volatilization in the field is a surface phenomenon involving physical and chemical processes that depend on the soil surface temperature and water content. The water transfer, heat transfer and energy budget sub-models of volatilization models are adapted from the most commonly accepted formalisms and parameterizations. They are less detailed than dedicated models describing water and heat transfers and surface status. The aim of this work was to assess the ability of one of the available mechanistic volatilization models, Volt'Air, to accurately describe the pedo-climatic conditions of a soil surface at the required time and space resolution. The assessment involves: (i) a sensitivity analysis, (ii) an evaluation of Volt'Air outputs in the light of outputs from a reference Soil-Vegetation-Atmosphere Transfer model (SiSPAT) and three experimental datasets, and (iii) the study of three tests based on modifications of SiSPAT to establish the potential impact of the simplifying assumptions used in Volt'Air. The analysis confirmed that a 5 mm surface layer was well suited, and that the Volt'Air surface temperature correlated well with the experimental measurements as well as with SiSPAT outputs. In terms of liquid water transfers, Volt'Air was overall consistent with SiSPAT, with discrepancies only during major rainfall events and dry weather conditions. The tests enabled us to identify the main source of the discrepancies between Volt'Air and SiSPAT: the lack of a description of gaseous water transfer in Volt'Air. They also helped to explain why neither Volt'Air nor SiSPAT was able to represent lower values of surface water content: current classical water retention and hydraulic conductivity models are not yet adapted to very dry conditions. Given the outcomes of this study, we discuss to what extent volatilization models can be improved and the questions they pose for current research in water transfer modeling and parameterization.

  6. Application of response surface methodology and semi-mechanistic model to optimize fluoride removal using crushed concrete in a fixed-bed column.

    Science.gov (United States)

    Gu, Bon-Wun; Lee, Chang-Gu; Park, Seong-Jik

    2018-03-01

The aim of this study was to investigate the removal of fluoride from aqueous solutions using crushed concrete fines as a filter medium under varying conditions of pH 3-7, flow rate of 0.3-0.7 mL/min, and filter depth of 10-20 cm. The performance of the fixed-bed columns was evaluated on the basis of the removal ratio (Re), uptake capacity (qe), degree of sorbent used (DoSU), and sorbent usage rate (SUR) obtained from breakthrough curves (BTCs). Three widely used semi-mechanistic models, namely the Bohart-Adams, Thomas, and Yoon-Nelson models, were applied to simulate the BTCs and to derive the design parameters. The Box-Behnken design of response surface methodology (RSM) was used to elucidate the individual and interactive effects of the three operational parameters on column performance and to optimize these parameters. The results demonstrated that pH is the most important factor in the performance of fluoride removal by a fixed-bed column. The flow rate had a significant negative influence on Re and DoSU, and the effect of filter depth was observed only in the regression model for DoSU. Statistical analysis indicated that the model obtained from the RSM study is suitable for describing the semi-mechanistic model parameters.
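For reference, the Thomas and Yoon-Nelson breakthrough-curve expressions named above have simple closed forms. The sketch below evaluates them with arbitrary illustrative parameter values; they are not the fitted values from this study.

```python
import math

def thomas(t, k_th, q0, m, c0, Q):
    """Thomas model: C/C0 = 1 / (1 + exp(k_th * (q0*m - c0*Q*t) / Q))."""
    return 1.0 / (1.0 + math.exp(k_th * (q0 * m - c0 * Q * t) / Q))

def yoon_nelson(t, k_yn, tau):
    """Yoon-Nelson model: C/C0 = exp(k_yn*(t - tau)) / (1 + exp(k_yn*(t - tau)))."""
    z = k_yn * (t - tau)
    return math.exp(z) / (1.0 + math.exp(z))

# Illustrative breakthrough curves (placeholder parameters):
# q0 [mg/g], m [g], c0 [mg/mL], Q [mL/min], k_th [mL/(min*mg)], tau [min]
for t in range(0, 8001, 2000):
    print(t,
          round(thomas(t, k_th=0.5, q0=1.0, m=20.0, c0=0.01, Q=0.5), 3),
          round(yoon_nelson(t, k_yn=0.002, tau=4000.0), 3))
```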

  7. Flow regimes and mechanistic modeling of critical heat flux under subcooled flow boiling conditions

    Science.gov (United States)

    Le Corre, Jean-Marie

Thermal performance of heat-flux-controlled boiling heat exchangers is usually limited by the Critical Heat Flux (CHF), above which the heat transfer degrades quickly, possibly leading to heater overheating and destruction. In an effort to better understand the phenomena, a literature review of CHF experimental visualizations under subcooled flow boiling conditions was performed and systematically analyzed. Three major types of CHF flow regimes were identified (bubbly, vapor clot and slug flow regimes) and a CHF flow regime map was developed, based on a dimensional analysis of the phenomena and the available data. It was found that, for similar geometric characteristics and pressure, a Weber number (We)/thermodynamic quality (x) map can be used to predict the CHF flow regime. Based on the experimental observations and a review of the available CHF mechanistic models under subcooled flow boiling conditions, hypothetical CHF mechanisms were selected for each CHF flow regime, all based on a concept of wall dry spot overheating, rewetting prevention and subsequent dry spot spreading. It is postulated that a high wall superheat occurs locally in a dry area of the heated wall, due to a cyclical event inherent to the considered CHF two-phase flow regime, preventing rewetting (Leidenfrost effect). The selected modeling concept has the potential to span CHF conditions from highly subcooled bubbly flow to the early stage of annular flow. A numerical model using a two-dimensional transient thermal analysis of the heater undergoing nucleation was developed to mechanistically predict CHF in the case of a bubbly flow regime. In this type of CHF two-phase flow regime, the high local wall superheat occurs underneath a nucleating bubble at the time of bubble departure. The model simulates the spatial and temporal heater temperature variations during nucleation at the wall, accounting for the stochastic nature of the boiling phenomena. The model also has the potential to evaluate
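The regime map mentioned above uses a Weber number and a thermodynamic (equilibrium) quality as coordinates. The sketch below computes these two quantities from bulk flow variables using their standard definitions; the mass-flux-based Weber number is one common choice (other definitions exist), the property values are approximate, and the thesis' actual regime boundaries are not reproduced.

```python
def weber_number(G, D, rho, sigma):
    """Weber number based on mass flux: We = G**2 * D / (rho * sigma).

    G: mass flux [kg/m2-s], D: hydraulic diameter [m], rho: liquid density [kg/m3],
    sigma: surface tension [N/m]. Velocity-based definitions are also in use.
    """
    return G ** 2 * D / (rho * sigma)

def thermodynamic_quality(h, h_f, h_fg):
    """Equilibrium quality x = (h - h_f) / h_fg; negative values indicate subcooling."""
    return (h - h_f) / h_fg

# Illustrative subcooled water flow near 7 MPa (approximate property values)
we = weber_number(G=2000.0, D=0.01, rho=740.0, sigma=0.018)
x = thermodynamic_quality(h=1.20e6, h_f=1.267e6, h_fg=1.505e6)
print(f"We ~ {we:.0f}, x ~ {x:.3f}")
```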

  8. Behavioural Procedural Models – a multipurpose mechanistic account

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2012-05-01

Full Text Available In this paper we outline an epistemological defence of what we call Behavioural Procedural Models (BPMs), which represent the processes of individual decisions that lead to relevant economic patterns as psychologically (rather than rationally) driven. Their general structure, and the way in which they may be incorporated into a multipurpose view of models, where the representational and interventionist goals are combined, is shown. It is argued that BPMs may provide "mechanistic-based explanations" in the sense defended by Hedström and Ylikoski (2010), which involve invariant regularities in Woodward's sense. Such mechanisms provide a causal sort of explanation of anomalous economic patterns, which allows for extra-market intervention and manipulability in order to correct and improve some key individual decisions. This capability sets the basis for the so-called libertarian paternalism (Sunstein and Thaler 2003).

  9. Mechanistic Drifting Forecast Model for A Small Semi-Submersible Drifter Under Tide-Wind-Wave Conditions

    Science.gov (United States)

    Zhang, Wei-Na; Huang, Hui-ming; Wang, Yi-gang; Chen, Da-ke; Zhang, lin

    2018-03-01

Understanding the drifting motion of a small semi-submersible drifter is of vital importance for monitoring surface currents and floating pollutants in coastal regions. This work addresses the issue by establishing a mechanistic drifting forecast model based on kinetic analysis. Taking tide, wind, and waves into consideration, the forecast model is validated against an in situ drifting experiment in the Radial Sand Ridges. Model results show good performance with respect to the measured drifting features, characterized by migrating back and forth twice a day with daily downwind displacements. Trajectory models are used to evaluate the influence of the individual hydrodynamic forcings. The tidal current is the fundamental dynamic condition in the Radial Sand Ridges and has the greatest impact on the drifting distance. However, it loses its leading position with respect to the daily displacement of the drifter. The simulations reveal that different hydrodynamic forces dominate the daily displacement of the drifter at different wind scales. The wave-induced mass transport has the greatest influence on the daily displacement at Beaufort wind scale 5-6, while wind drag contributes most at wind scale 2-4.
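A common way to express the kinetics behind such a forecast model is to compose the drifter velocity from the tidal current, a wind-drag (leeway) term and a wave-induced (Stokes) transport term. The decomposition and all coefficients below are a generic illustration, not the specific force balance of this paper.

```python
def drifter_velocity(u_tide, u_wind, u_stokes, leeway_coeff=0.03):
    """Drifter velocity as tidal current + leeway fraction of wind + Stokes drift.

    All inputs are (east, north) velocity components in m/s; the 3% leeway
    coefficient is a typical order of magnitude, used here only for illustration.
    """
    return tuple(ut + leeway_coeff * uw + us
                 for ut, uw, us in zip(u_tide, u_wind, u_stokes))

def integrate_track(x0, y0, velocities, dt=600.0):
    """Advance a position through a list of velocity samples (simple forward Euler)."""
    x, y = x0, y0
    for u, v in velocities:
        x += u * dt
        y += v * dt
    return x, y

# One hour of illustrative forcing sampled every 10 minutes
samples = [drifter_velocity((0.5, 0.1), (6.0, 2.0), (0.02, 0.01)) for _ in range(6)]
print(integrate_track(0.0, 0.0, samples))
```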

  10. A mechanistic modelling and data assimilation approach to estimate the carbon/chlorophyll and carbon/nitrogen ratios in a coupled hydrodynamical-biological model

    Directory of Open Access Journals (Sweden)

    B. Faugeras

    2004-01-01

Full Text Available The principal objective of hydrodynamical-biological models is to provide estimates of the main carbon fluxes such as total and export oceanic production. These models are nitrogen based, that is to say the variables are expressed in terms of their nitrogen content. Moreover, the models are calibrated using chlorophyll data sets. Therefore carbon to chlorophyll (C:Chl) and carbon to nitrogen (C:N) ratios have to be assumed. This paper addresses the problem of the representation of these ratios. In a 1D framework at the DYFAMED station (NW Mediterranean Sea) we propose a model which enables the estimation of the basic biogeochemical fluxes and in which the spatio-temporal variability of the C:Chl and C:N ratios is fully represented in a mechanistic way. This is achieved through the introduction of new state variables coming from the embedding of a phytoplankton growth model in a more classical Redfieldian NNPZD-DOM model (in which the C:N ratio is assumed to be constant). Following this modelling step, the parameters of the model are estimated using the adjoint data assimilation method, which enables the assimilation of the chlorophyll and nitrate data sets collected at DYFAMED in 1997. Comparing the predictions of the new Mechanistic model with those of the classical Redfieldian NNPZD-DOM model, which was calibrated with the same data sets, we find that both models reproduce the reference data in a comparable manner. Both fluxes and stocks can be equally well predicted by either model. However, while the models coincide on an average basis, they diverge in terms of predicted variability. In the Mechanistic model, biology adapts much faster to its environment, giving rise to higher short-term variations. Moreover, the seasonal variability in total production differs between the Redfieldian NNPZD-DOM model and the Mechanistic model. In summer the Mechanistic model predicts higher production values in carbon units than the Redfieldian NNPZD

  11. A mechanistic Eulerian-Lagrangian model for dispersed flow film boiling

    International Nuclear Information System (INIS)

    Andreani, M.; Yadigaroglu, G.

    1991-01-01

In this paper a new mechanistic model of heat transfer in the dispersed flow regime is presented. The usual assumptions that render most of the available models unsuitable for the analysis of the reflooding phase of the LOCA are discussed, and a two-dimensional, time-independent numerical model is developed. The gas temperature field is solved on a fixed (Eulerian) grid, with the droplets behaving as mass and energy sources. The histories of a large number of computational droplets are followed in a Lagrangian frame, considering evaporation, break-up and interactions with the vapor and with the wall. Comparisons of calculated wall and vapor temperatures with experimental data are shown for two reflooding tests.

  12. Mechanistic variables can enhance predictive models of endotherm distributions: The American pika under current, past, and future climates

    Science.gov (United States)

    Mathewson, Paul; Moyer-Horner, Lucas; Beever, Erik; Briscoe, Natalie; Kearney, Michael T.; Yahn, Jeremiah; Porter, Warren P.

    2017-01-01

    How climate constrains species’ distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8–19% less habitat loss in response to annual temperature increases of ~3–5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect

  13. Mechanistic variables can enhance predictive models of endotherm distributions: the American pika under current, past, and future climates.

    Science.gov (United States)

    Mathewson, Paul D; Moyer-Horner, Lucas; Beever, Erik A; Briscoe, Natalie J; Kearney, Michael; Yahn, Jeremiah M; Porter, Warren P

    2017-03-01

    How climate constrains species' distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8-19% less habitat loss in response to annual temperature increases of ~3-5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect

  14. Thermal tides and studies to tune the mechanistic tidal model using UARS observations

    Directory of Open Access Journals (Sweden)

    V. A. Yudin

    1997-09-01

Full Text Available Monthly simulations of the thermal diurnal and semidiurnal tides are compared to High-Resolution Doppler Imager (HRDI) and Wind Imaging Interferometer (WINDII) wind and temperature measurements on the Upper-Atmosphere Research Satellite (UARS). There is encouraging agreement between the observations and the linear global mechanistic tidal model results, both for the diurnal and semidiurnal components, in the equatorial and mid-latitude regions. This gives us the confidence to outline the first steps of an assimilative analysis/interpretation for tides, dissipation, and mean flow using a combination of model results and the global measurements from HRDI and WINDII. The sensitivity of the proposed technique to the initial guess employed to obtain a best fit to the data by tuning model parameters is discussed for the January and March 1993 cases, when the WINDII day and night measurements of the meridional winds between 90 and 110 km are used along with the daytime HRDI measurements. Several examples of the derivation of the tidal variables and the decomposition of the measured winds into tidal and mean flow components using this approach are compared with previous tidal estimates and modeling results for the migrating tides. The seasonal cycle of the derived diurnal tidal amplitudes is discussed and compared with radar observations between 80 and 100 km and between 40°S and 40°N.

  15. Thermal tides and studies to tune the mechanistic tidal model using UARS observations

    Directory of Open Access Journals (Sweden)

    V. A. Yudin

Full Text Available Monthly simulations of the thermal diurnal and semidiurnal tides are compared to High-Resolution Doppler Imager (HRDI) and Wind Imaging Interferometer (WINDII) wind and temperature measurements on the Upper-Atmosphere Research Satellite (UARS). There is encouraging agreement between the observations and the linear global mechanistic tidal model results, both for the diurnal and semidiurnal components, in the equatorial and mid-latitude regions. This gives us the confidence to outline the first steps of an assimilative analysis/interpretation for tides, dissipation, and mean flow using a combination of model results and the global measurements from HRDI and WINDII. The sensitivity of the proposed technique to the initial guess employed to obtain a best fit to the data by tuning model parameters is discussed for the January and March 1993 cases, when the WINDII day and night measurements of the meridional winds between 90 and 110 km are used along with the daytime HRDI measurements. Several examples of the derivation of the tidal variables and the decomposition of the measured winds into tidal and mean flow components using this approach are compared with previous tidal estimates and modeling results for the migrating tides. The seasonal cycle of the derived diurnal tidal amplitudes is discussed and compared with radar observations between 80 and 100 km and between 40°S and 40°N.

  16. Toward a mechanistic modeling of nitrogen limitation for photosynthesis

    Science.gov (United States)

    Xu, C.; Fisher, R. A.; Travis, B. J.; Wilson, C. J.; McDowell, N. G.

    2011-12-01

    Nitrogen limitation is an important regulator of vegetation growth and the global carbon cycle. Most current ecosystem process models simulate nitrogen effects on photosynthesis based on a prescribed relationship between leaf nitrogen and photosynthesis; however, there is a large amount of variability in this relationship under different light, temperature, nitrogen availability and CO2 conditions, which can affect the reliability of photosynthesis predictions under future climate conditions. To account for the variability in the nitrogen-photosynthesis relationship under different environmental conditions, in this study we developed a mechanistic model of nitrogen limitation for photosynthesis based on nitrogen trade-offs among light absorption, electron transport, carboxylation and carbon sinks. Our model shows that the strategy of nitrogen allocation to storage, as determined by the trade-off between growth and persistence, is a key factor contributing to the variability in the relationship between leaf nitrogen and photosynthesis. Nitrogen fertilization substantially increases the proportion of nitrogen in storage for coniferous trees but much less for deciduous trees, suggesting that coniferous trees allocate more nitrogen toward persistence compared to deciduous trees. CO2 fertilization will cause lower nitrogen allocation for carboxylation but higher nitrogen allocation for storage, which leads to a weaker relationship between leaf nitrogen and maximum photosynthesis rate. Lower radiation will cause higher nitrogen allocation for light absorption and electron transport but less nitrogen allocation for carboxylation and storage, which also leads to a weaker relationship between leaf nitrogen and maximum photosynthesis rate. At the same time, lower growing temperature will cause higher nitrogen allocation for carboxylation but lower allocation for light absorption, electron transport and storage, which leads to a stronger relationship between leaf nitrogen and maximum photosynthesis rate.

  17. A mechanistic model for spread of livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) within a pig herd

    DEFF Research Database (Denmark)

    Sørensen, Anna Irene Vedel; Toft, Nils; Boklund, Anette

    2017-01-01

    Before an efficient control strategy for livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) in pigs can be decided upon, it is necessary to obtain a better understanding of how LA-MRSA spreads and persists within a pig herd once it is introduced. We here present a mechanistic stochastic discrete-event simulation model for spread of LA-MRSA within a farrow-to-finish sow herd to aid in this. The model was individual-based and included three different disease compartments: susceptible, intermittent shedder or persistent shedder of MRSA. The model was used for studying transmission dynamics and within-farm prevalence after different introductions of LA-MRSA into a farm. The spread of LA-MRSA throughout the farm mainly followed the movement of pigs. After spread of LA-MRSA had reached equilibrium, the prevalence of LA-MRSA shedders was predicted to be highest in the farrowing unit, independent...

  18. Mechanistic model coupling gas exchange dynamics and Listeria monocytogenes growth in modified atmosphere packaging of non respiring food.

    Science.gov (United States)

    Chaix, E; Broyart, B; Couvert, O; Guillaume, C; Gontard, N; Guillard, V

    2015-10-01

    A mechanistic model coupling O2 and CO2 mass transfer (namely diffusion and solubilisation in the food itself and permeation through the packaging material) to microbial growth models was developed to predict the shelf life of modified atmosphere packaging (MAP) systems. It was experimentally validated on a non-respiring food by concomitantly monitoring the O2/CO2 partial pressures in the packaging headspace and the growth of Listeria monocytogenes (average microbial count) within the food sample. A sensitivity analysis revealed that the reliability of predictions from this "super-parametrized" model (no fewer than 47 parameters were required to run one simulation) was strongly dependent on the accuracy of the microbial input parameters. Once validated, this model was used to decipher the role of O2/CO2 mass transfer in microbial growth and as a MAP design tool: an example of MAP dimensioning is provided in this paper as a proof of concept. Copyright © 2015 Elsevier Ltd. All rights reserved.
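
    As a rough illustration of the coupling idea (and explicitly not the authors' 47-parameter model), the sketch below integrates a toy system in which headspace O2 and CO2 change by permeation through the packaging film while Listeria growth slows as CO2 accumulates; every permeance, volume and growth parameter is an assumed placeholder.

        # Toy coupling of film permeation and microbial growth (not the validated model).
        # All parameter names and values are illustrative assumptions.
        import numpy as np
        from scipy.integrate import solve_ivp

        PERM_O2, PERM_CO2 = 3.0e-6, 1.0e-5   # assumed whole-package permeances, mol/(day*kPa)
        V_HEADSPACE = 2.0e-4                  # headspace volume, m3 (assumed)
        RT = 8.314 * 280 / 1000               # kPa*m3/mol at ~7 degC storage
        MU_MAX, N_MAX = 0.9, 9.0              # growth rate (1/day) and ceiling (log10 CFU/g)
        CO2_HALF = 60.0                       # kPa of CO2 that halves the growth rate

        def rhs(t, y):
            p_o2, p_co2, log_n = y
            # Permeation driven by the partial-pressure difference with outside air.
            dp_o2 = PERM_O2 * (21.0 - p_o2) * RT / V_HEADSPACE
            dp_co2 = PERM_CO2 * (0.04 - p_co2) * RT / V_HEADSPACE
            # Logistic-style growth of L. monocytogenes, slowed by headspace CO2.
            dlog_n = MU_MAX * (1 - log_n / N_MAX) / (1 + p_co2 / CO2_HALF)
            return [dp_o2, dp_co2, dlog_n]

        # Start from a 30 kPa CO2 / 0 kPa O2 atmosphere and 2 log10 CFU/g contamination.
        sol = solve_ivp(rhs, (0, 20), [0.0, 30.0, 2.0])
        print(sol.y[:, -1])   # headspace O2, CO2 (kPa) and microbial count after 20 days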

  19. Simulation and mechanistic investigation of the arrhythmogenic role of the late sodium current in human heart failure.

    Directory of Open Access Journals (Sweden)

    Beatriz Trenor

    Full Text Available Heart failure constitutes a major public health problem worldwide. The electrophysiological remodeling of failing hearts sets the stage for malignant arrhythmias, in which the role of the late Na+ current (I_NaL) is relevant and is currently under investigation. In this study we examined the role of I_NaL in the electrophysiological phenotype of ventricular myocytes, and its proarrhythmic effects in the failing heart. A model for cellular heart failure was proposed using a modified version of the Grandi et al. model for the human ventricular action potential that incorporates the formulation of I_NaL. A sensitivity analysis of the model was performed and simulations of the pathological electrical activity of the cell were conducted. The proposed model for the human I_NaL and the electrophysiological remodeling of myocytes from failing hearts accurately reproduce experimental observations. The sensitivity analysis of the modulation of electrophysiological parameters of myocytes from failing hearts due to ion channel remodeling revealed a role for I_NaL in the prolongation of action potential duration (APD), triangulation of the shape of the AP, and changes in the Ca2+ transient. A mechanistic investigation of intracellular Na+ accumulation and APD shortening with increasing frequency of stimulation of failing myocytes revealed a role for the Na+/K+ pump, the Na+/Ca2+ exchanger and I_NaL. The results of the simulations also showed that in failing myocytes, the enhancement of I_NaL increased the reverse rate-dependent APD prolongation and the probability of initiating early afterdepolarizations. The electrophysiological remodeling of failing hearts, and especially the enhancement of I_NaL, prolongs APD and alters the Ca2+ transient, facilitating the development of early afterdepolarizations. An enhanced I_NaL appears to be an important contributor to the electrophysiological phenotype and to the dysregulation of [Ca2+]i homeostasis of failing myocytes.

  20. Rational and Mechanistic Perspectives on Reinforcement Learning

    Science.gov (United States)

    Chater, Nick

    2009-01-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…

  1. Mechanistic Systems Modeling to Improve Understanding and Prediction of Cardiotoxicity Caused by Targeted Cancer Therapeutics

    Directory of Open Access Journals (Sweden)

    Jaehee V. Shim

    2017-09-01

    Full Text Available Tyrosine kinase inhibitors (TKIs) are highly potent cancer therapeutics that have been linked with serious cardiotoxicity, including left ventricular dysfunction, heart failure, and QT prolongation. TKI-induced cardiotoxicity is thought to result from interference with tyrosine kinase activity in cardiomyocytes, where these signaling pathways help to control critical processes such as survival signaling, energy homeostasis, and excitation–contraction coupling. However, mechanistic understanding is limited at present due to the complexities of tyrosine kinase signaling, and the wide range of targets inhibited by TKIs. Here, we review the use of TKIs in cancer and the cardiotoxicities that have been reported, discuss potential mechanisms underlying cardiotoxicity, and describe recent progress in achieving a more systematic understanding of cardiotoxicity via the use of mechanistic models. In particular, we argue that future advances are likely to be enabled by studies that combine large-scale experimental measurements with Quantitative Systems Pharmacology (QSP) models describing biological mechanisms and dynamics. As such approaches have proven extremely valuable for understanding and predicting other drug toxicities, it is likely that QSP modeling can be successfully applied to cardiotoxicity induced by TKIs. We conclude by discussing a potential strategy for integrating genome-wide expression measurements with models, illustrate initial advances in applying this approach to cardiotoxicity, and describe challenges that must be overcome to truly develop a mechanistic and systematic understanding of cardiotoxicity caused by TKIs.

  2. A tissue-engineered gastric cancer model for mechanistic study of anti-tumor drugs

    International Nuclear Information System (INIS)

    Gao, Ming; Cai, Yiting; Wu, Wei; Shi, Yazhou; Fei, Zhewei

    2013-01-01

    The use of the traditional xenograft subcutaneous tumor model has been contested because of its limitations, such as slow tumorigenesis, inconsistent chemotherapeutic results, etc. In light of these challenges, we aim to revamp the traditional model by employing an electrospun scaffold composed of polydioxanone, gelatin and elastin to boost tumorigenesis. The scaffold featured a highly porous microstructure and successfully supported the growth of tumor cells in vitro without provoking apoptosis. In vivo studies showed that in the scaffold model the tumor volume increased by 43.27% and the weight by 75.58%, respectively, within a 12-week period. In addition, the scaffold model showed an increase of CD24+ and CD44+ cells in the tumor mass by 42% and 313%, respectively. The scaffolding materials did not lead to phenotypic changes during tumorigenesis. Thereafter, in the scaffold model, we found that the chemotherapeutic regimen of docetaxel, cisplatin and fluorouracil showed a stronger capability than the regimen comprising cisplatin and fluorouracil to deplete the CD44+ subpopulation. This discovery sheds mechanistic light on the role of docetaxel for its future chemotherapeutic applications. This revamped model affords cancer scientists a convenient and reliable platform to mechanistically investigate chemotherapeutic drugs on gastric cancer stem cells. (paper)

  3. A mechanistic model for the evolution of multicellularity

    Science.gov (United States)

    Amado, André; Batista, Carlos; Campos, Paulo R. A.

    2018-02-01

    Through a mechanistic approach we investigate the formation of aggregates of variable sizes, accounting for mechanisms of aggregation, dissociation, death and reproduction. In our model, cells can produce two metabolites, but the simultaneous production of both metabolites is costly in terms of fitness. Thus, the formation of larger groups can favor the aggregates evolving toward a configuration where division of labor arises. It is assumed that the states of the cells in a group are those that maximize organismal fitness. In the model it is considered that the groups can grow linearly, forming a chain, or compactly, keeping a roughly spherical shape. Starting from a population consisting of single-celled organisms, we observe the formation of groups of variable sizes, usually much larger than two-cell aggregates. Natural selection can favor the formation of large groups, which allows the system to achieve new and larger fitness maxima.

  4. Drug-disease modeling in the pharmaceutical industry - where mechanistic systems pharmacology and statistical pharmacometrics meet.

    Science.gov (United States)

    Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald

    2017-11-15

    Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. A mechanistic ecohydrological model to investigate complex interactions in cold and warm water-controlled environments. 2. Spatiotemporal analyses

    Directory of Open Access Journals (Sweden)

    Simone Fatichi

    2012-05-01

    Full Text Available An ecohydrological model, Tethys-Chloris (T&C), described in the companion paper is applied to two semiarid systems characterized by different climate and vegetation cover conditions. The Lucky Hills watershed in Arizona represents a typical small, "unit-source" catchment of a desert shrub system of the U.S. southwest. Two nested basins of the Reynolds Creek Experimental watershed (Idaho, U.S.A.), the Reynolds Creek Mountain East and Tollgate catchments, are representative of a semiarid cold climate with seasonal snow cover. Both exhibit a highly non-uniform vegetation cover. A range of ecohydrological metrics of the long-term model performance is presented to highlight the model capabilities in reproducing hydrological and vegetation dynamics both at the plot and the watershed scales. A diverse set of observations is used to confirm the simulated dynamics. Highly satisfactory results are obtained without significant (or any) calibration efforts despite the large phase-space dimensionality of the model, the uncertainty of imposed boundary conditions, and limited data availability. It is argued that a significant investment into the model design based on the description of physical, biophysical, and ecological processes leads to such a consistent simulation skill. The simulated patterns mimic the outcome of hydrological and vegetation dynamics with high realism, as confirmed from spatially distributed remote sensing data. Further community efforts are warranted to address the issue of thorough quantitative assessment. The current lack of appropriate data hampers the development and testing of process-based ecohydrological models. It is further argued that the mechanistic nature of the T&C model can be valuable for designing virtual experiments and developing questions of scientific inquiry at a range of spatiotemporal scales.

  6. Mechanistic electronic model to simulate and predict the effect of heat stress on the functional genomics of HO-1 system: Vasodilation.

    Science.gov (United States)

    Aggarwal, Yogender; Karan, Bhuwan Mohan; Das, Barda Nand; Sinha, Rakesh Kumar

    2010-05-01

    The present work is concerned with modelling the molecular signalling pathway for vasodilation and predicting the resting young human forearm blood flow under heat stress. The mechanistic electronic modelling technique was designed and implemented using MULTISIM 8.0, with an assumed scaling of 1 V/°C, for prediction of forearm blood flow, and digital logic was used to design the molecular signalling pathway for vasodilation. The minimum forearm blood flow was observed at 35 °C (0 ml 100 ml⁻¹ min⁻¹) and the maximum at 42 °C (18.7 ml 100 ml⁻¹ min⁻¹) environmental temperature, with respect to the base value of 2 ml 100 ml⁻¹ min⁻¹. This model may also make it possible to identify many therapeutic targets that can be used in the treatment of inflammations and disorders due to heat-related illnesses. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Incorporation of lysosomal sequestration in the mechanistic model for prediction of tissue distribution of basic drugs.

    Science.gov (United States)

    Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra

    2017-11-15

    The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48.) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8; 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for the lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% drugs within 2-fold or 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performances for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively. High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among
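
    The lysosomal term added by the extended model is, at its core, a pH-partition ("ion trapping") calculation. The snippet below is a minimal sketch of that calculation for a monoprotic base, assuming only the neutral species equilibrates across membranes; the pKa and pH values are assumptions for the example, and this is not the full Kpu equation of the paper.

        # Minimal pH-partition (ion trapping) ratio for a monoprotic base; illustrative only.
        def lysosome_to_cytosol_ratio(pka, ph_lysosome=4.8, ph_cytosol=7.0):
            """Total-drug concentration ratio lysosome:cytosol at equilibrium."""
            frac_lys = 1 + 10 ** (pka - ph_lysosome)   # 1 + ionised/neutral in the lysosome
            frac_cyt = 1 + 10 ** (pka - ph_cytosol)    # 1 + ionised/neutral in the cytosol
            return frac_lys / frac_cyt

        # A propranolol-like base (assumed pKa 9.4) accumulates ~160-fold under these pH values;
        # additional binding to acidic phospholipids would raise the predicted ratio further.
        print(round(lysosome_to_cytosol_ratio(pka=9.4)))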

  8. Mechanistic modelling of the drying behaviour of single pharmaceutical granules

    DEFF Research Database (Denmark)

    Thérèse F.C. Mortier, Séverine; Beer, Thomas De; Gernaey, Krist

    2012-01-01

    The trend to move towards continuous production processes in pharmaceutical applications enhances the necessity to develop mechanistic models to understand and control these processes. This work focuses on the drying behaviour of a single wet granule before tabletting, using a six [...] phase (submodel 2), the water inside the granule evaporates. The second submodel contains an empirical power coefficient, b. A sensitivity analysis was performed to study the influence of parameters on the moisture content of single pharmaceutical granules, which clearly points towards the importance [...]

  9. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    Directory of Open Access Journals (Sweden)

    Max S Y Lau

    2017-10-01

    Full Text Available In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.

  10. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    Science.gov (United States)

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging. PMID:29084216

  11. Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution.

    Science.gov (United States)

    Warnock, Rachel C M; Yang, Ziheng; Donoghue, Philip C J

    2017-06-28

    Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions. © 2017 The Authors.

  12. Evaluation of Three Models for Simulating Pesticide Runoff from Irrigated Agricultural Fields.

    Science.gov (United States)

    Zhang, Xuyang; Goh, Kean S

    2015-11-01

    Three models were evaluated for their accuracy in simulating pesticide runoff at the edge of agricultural fields: Pesticide Root Zone Model (PRZM), Root Zone Water Quality Model (RZWQM), and OpusCZ. Modeling results on runoff volume, sediment erosion, and pesticide loss were compared with measurements taken from field studies. Models were also compared on their theoretical foundations and ease of use. For runoff events generated by sprinkler irrigation and rainfall, all models performed equally well with small errors in simulating water, sediment, and pesticide runoff. The mean absolute percentage errors (MAPEs) were between 3 and 161%. For flood irrigation, OpusCZ simulated runoff and pesticide mass with the highest accuracy, followed by RZWQM and PRZM, likely owing to its unique hydrological algorithm for runoff simulations during flood irrigation. Simulation results from cold model runs by OpusCZ and RZWQM using measured values for model inputs matched the observed values closely. The MAPE ranged from 28 to 384% and from 42 to 168% for OpusCZ and RZWQM, respectively. These satisfactory model outputs showed the models' ability to mimic reality. Theoretical evaluations indicated that OpusCZ and RZWQM use mechanistic approaches for hydrology simulation, output data at a subdaily time step, and were able to simulate management practices and subsurface flow via tile drainage. In contrast, PRZM operates at a daily time step and simulates surface runoff using the USDA Soil Conservation Service's curve number method. Among the three models, OpusCZ and RZWQM were suitable for simulating pesticide runoff in semiarid areas where agriculture is heavily dependent on irrigation. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
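
    For reference, the curve-number calculation that PRZM relies on is compact enough to sketch directly; the snippet below is a minimal illustration of that standard USDA SCS formula, with the curve number and storm depth chosen arbitrarily for the example.

        # Minimal sketch of the USDA SCS curve-number runoff estimate used by PRZM.
        # The curve number and rainfall depth are arbitrary example values.
        def scs_runoff_inches(rainfall_in, curve_number):
            """Event runoff depth (inches) from rainfall depth via the curve-number method."""
            s = 1000.0 / curve_number - 10.0   # potential maximum retention (inches)
            ia = 0.2 * s                        # initial abstraction
            if rainfall_in <= ia:
                return 0.0
            return (rainfall_in - ia) ** 2 / (rainfall_in + 0.8 * s)

        print(round(scs_runoff_inches(rainfall_in=2.0, curve_number=85), 2))   # ~0.8 in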

  13. A Mechanistic Beta-Binomial Probability Model for mRNA Sequencing Data.

    Science.gov (United States)

    Smith, Gregory R; Birtwistle, Marc R

    2016-01-01

    A main application for mRNA sequencing (mRNAseq) is determining lists of differentially-expressed genes (DEGs) between two or more conditions. Several software packages exist to produce DEGs from mRNAseq data, but they typically yield different DEGs, sometimes markedly so. The underlying probability model used to describe mRNAseq data is central to deriving DEGs, and not surprisingly, most software packages use different models and assumptions to analyze mRNAseq data. Here, we propose a mechanistic justification to model mRNAseq as a binomial process, with data from technical replicates given by a binomial distribution, and data from biological replicates well-described by a beta-binomial distribution. We demonstrate good agreement of this model with two large datasets. We show that an emergent feature of the beta-binomial distribution, given parameter regimes typical for mRNAseq experiments, is the well-known quadratic polynomial scaling of variance with the mean. The so-called dispersion parameter controls this scaling, and our analysis suggests that the dispersion parameter is a continually decreasing function of the mean, as opposed to current approaches that impose an asymptotic value to the dispersion parameter at moderate mean read counts. We show how this leads to current approaches overestimating variance for moderately to highly expressed genes, which inflates false negative rates. Describing mRNAseq data with a beta-binomial distribution thus may be preferred since its parameters are relatable to the mechanistic underpinnings of the technique and may improve the consistency of DEG analysis across software packages, particularly for moderately to highly expressed genes.
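
    The quadratic mean-variance relationship the authors describe can be reproduced with a few lines of simulation. The sketch below is an illustration of that scaling under assumed read depth and dispersion, not the authors' fitting procedure.

        # Illustration of beta-binomial read counts whose variance grows ~ mu + phi*mu^2.
        # Read depth, dispersion and replicate number are assumptions for the demo.
        import numpy as np

        rng = np.random.default_rng(0)
        n_reads = 1_000_000      # total mapped reads per biological replicate (assumed)
        phi = 0.05               # target dispersion in Var = mu + phi*mu^2 (assumed)

        def simulate_gene(mean_count, replicates=2000):
            """Beta-binomial counts for one gene across biological replicates."""
            p = mean_count / n_reads
            total = (1 - p) / (p * phi)          # beta 'sample size' giving CV^2(p) ~ phi
            probs = rng.beta(p * total, (1 - p) * total, size=replicates)
            return rng.binomial(n_reads, probs)  # binomial layer = technical sampling

        for mu in (10, 100, 1000, 10000):
            counts = simulate_gene(mu)
            print(mu, round(counts.var()))       # close to mu + 0.05*mu^2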

  14. Problems in mechanistic theoretical models for cell transformation by ionizing radiation

    International Nuclear Information System (INIS)

    Chatterjee, Aloke; Holley, W.R.

    1992-01-01

    A mechanistic model based on yields of double strand breaks has been developed to determine the dose response curves for cell transformation frequencies. At its present stage the model is applicable to immortal cell lines and to various qualities (X-rays, Neon and Iron) of ionizing radiation. Presently, we have considered four types of processes which can lead to activation phenomena: (i) point mutation events on a regulatory segment of selected oncogenes, (ii) inactivation of suppressor genes, through point mutation, (iii) deletion of a suppressor gene by a single track, and (iv) deletion of a suppressor gene by two tracks. (author)

  15. Mechanistic modeling for mammography screening risks

    International Nuclear Information System (INIS)

    Bijwaard, Harmen

    2008-01-01

    Full text: Western populations show a very high incidence of breast cancer and in many countries mammography screening programs have been set up for the early detection of these cancers. Through these programs large numbers of women (in the Netherlands, 700,000 per year) are exposed to low but not insignificant X-ray doses. ICRP-based risk estimates indicate that the number of breast cancer casualties due to mammography screening can be as high as 50 per year in the Netherlands. The number of lives saved is estimated to be much higher, but for an accurate calculation of the benefits of screening a better estimate of these risks is indispensable. Here it is attempted to better quantify the radiological risks of mammography screening through the application of a biologically based model for breast tumor induction by X-rays. The model is applied to data obtained from the National Institutes of Health in the U.S. These concern epidemiological data on female TB patients who received high X-ray breast doses in the period 1930-1950 through frequent fluoroscopy of their lungs. The mechanistic model that is used to describe the increased breast cancer incidence is based on an earlier study by Moolgavkar et al. (1980), in which the natural background incidence of breast cancer was modeled. The model allows for a more sophisticated extrapolation of risks to the low dose X-ray exposures that are common in mammography screening and to the higher ages that are usually involved. Furthermore, it allows for risk transfer to other (non-western) populations. The results have implications for decisions on the frequency of screening, the number of mammograms taken at each screening, minimum and maximum ages for screening and the transfer to digital equipment. (author)
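
    The Moolgavkar-type construction referred to above can be sketched in its simplest deterministic form: normal cells are initiated at a constant rate, initiated cells expand clonally, and malignant conversion drives the hazard. The rates below are illustrative assumptions, not values fitted to the fluoroscopy cohort, and a dose effect would enter by perturbing the initiation or conversion rates.

        # Deterministic sketch of a two-stage clonal expansion (Moolgavkar-type) model.
        # All rates are illustrative assumptions; not fitted to any cohort or dose group.
        import numpy as np

        def two_stage_hazard(ages, nu=1e-7, alpha=0.11, beta=0.10, mu=1e-5, n_normal=1e7):
            """Approximate cancer hazard per year: mu times the expected initiated cells."""
            initiated = np.zeros_like(ages, dtype=float)
            for i in range(1, len(ages)):
                dt = ages[i] - ages[i - 1]
                growth = nu * n_normal + (alpha - beta) * initiated[i - 1]
                initiated[i] = initiated[i - 1] + growth * dt   # Euler step
            return mu * initiated

        ages = np.arange(0.0, 81.0)
        hazard = two_stage_hazard(ages)
        print(hazard[[40, 60, 80]])   # the hazard rises steeply with age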

  16. MECHANISTIC KINETIC MODELS FOR STEAM REFORMING OF CONCENTRATED CRUDE ETHANOL ON NI/AL2O3 CATALYST

    Directory of Open Access Journals (Sweden)

    O. A. OLAFADEHAN

    2015-05-01

    Full Text Available Mechanistic kinetic models were postulated for the catalytic steam reforming of concentrated crude ethanol on a Ni-based commercial catalyst at atmospheric pressure in the temperature range of 673-863 K, and at different ratios of catalyst weight to crude ethanol molar flow rate (in the range 0.9645-9.6451 kg catalyst h/kg mole crude ethanol) in a stainless steel packed-bed tubular microreactor. The models were based on Langmuir-Hinshelwood-Hougen-Watson (LHHW) and Eley-Rideal (ER) mechanisms. The Nelder-Mead simplex optimization routine was used to estimate the inherent kinetic parameters in the proposed models. The selection of the best kinetic model amongst the rival kinetic models was based on physicochemical, statistical and thermodynamic scrutiny. The rate-determining step for the steam reforming of concentrated crude ethanol on Ni/Al2O3 catalyst was found to be the surface reaction between chemisorbed CH3O and O when hydrogen and oxygen were adsorbed as monomolecular species on the catalyst surface. Excellent agreement was obtained between the experimental rate of reaction and conversion of crude ethanol, and the simulated results, with an ADD% of ±0.46.
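
    The parameter-estimation step described above, minimising the misfit of an LHHW-type rate expression with the Nelder-Mead simplex, can be illustrated in a few lines. The rate form, "data" and starting guesses in the sketch are invented placeholders, not the paper's model or measurements.

        # Toy Nelder-Mead fit of an LHHW-type rate law; data and rate form are placeholders.
        import numpy as np
        from scipy.optimize import minimize

        p_etoh = np.array([0.2, 0.4, 0.6, 0.8, 1.0])            # partial pressures (made up)
        p_h2o = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
        r_obs = np.array([0.021, 0.034, 0.042, 0.047, 0.051])   # invented rates

        def lhhw_rate(params, pa, pw):
            k, ka, kw = params
            # surface reaction between two adsorbed species as the rate-determining step
            return k * ka * kw * pa * pw / (1 + ka * pa + kw * pw) ** 2

        def sse(params):
            return np.sum((r_obs - lhhw_rate(params, p_etoh, p_h2o)) ** 2)

        fit = minimize(sse, x0=[0.5, 1.0, 1.0], method="Nelder-Mead")
        print(fit.x)   # fitted k and the two adsorption constants for the toy data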

  17. Reducing Uncertainty in the Daycent Model of Heterotrophic Respiration with a More Mechanistic Representation of Microbial Processes.

    Science.gov (United States)

    Berardi, D.; Gomez-Casanovas, N.; Hudiburg, T. W.

    2017-12-01

    Improving the certainty of ecosystem models is essential to ensuring their legitimacy, value, and ability to inform management and policy decisions. With more than a century of research exploring the variables controlling soil respiration, a high level of uncertainty remains in the ability of ecosystem models to accurately estimate respiration with changing climatic conditions. Refining model estimates of soil carbon fluxes is a high priority for climate change scientists to determine whether soils will be carbon sources or sinks in the future. We found that DayCent underestimates heterotrophic respiration by several orders of magnitude for our temperate mixed conifer forest site. While traditional ecosystem models simulate decomposition through first-order kinetics, recent research has found that including microbial mechanisms explains 20 percent more spatial heterogeneity. We modified the DayCent heterotrophic respiration model to include a more mechanistic representation of microbial dynamics and compared the new model with continuous and survey observations from our experimental forest site in the Northern Rockies ecoregion. We also calibrated the model's soil moisture and temperature sensitivity against our experimental data. We expect to improve the accuracy of the model by 20-30 percent. By using a more representative and calibrated model of soil carbon dynamics, we can better predict feedbacks between climate and soil carbon pools.
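
    The difference between the two formulations being compared, first-order decay of a carbon pool versus decomposition mediated by an explicit microbial biomass, is easy to state in code. The pool sizes, rate constants and carbon-use efficiency below are illustrative assumptions, not DayCent parameters.

        # First-order vs. microbially explicit heterotrophic respiration (illustrative only).
        def first_order_resp(soil_c, k=0.002):
            """Respiration proportional to substrate alone (g C m-2 day-1)."""
            return k * soil_c

        def microbial_resp(soil_c, microbial_c, vmax=0.02, km=500.0, cue=0.4):
            """Respiration as the non-assimilated share of Michaelis-Menten uptake."""
            uptake = vmax * microbial_c * soil_c / (km + soil_c)
            return (1.0 - cue) * uptake

        soil_c, microbial_c = 2000.0, 40.0   # assumed pools, g C m-2
        print(first_order_resp(soil_c), microbial_resp(soil_c, microbial_c))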

  18. Capabilities of a mechanistic model for containment condenser simulation

    International Nuclear Information System (INIS)

    Broxtermann, Philipp; Cron, Daniel von der; Allelein, Hans-Josef

    2011-01-01

    In this paper the first application of the new containment COndenser MOdule (COMO) is presented. COMO, currently under development, represents a newly introduced part of the German containment code system COCOSYS and evaluates the contribution of containment condensers (CC) to passive containment cooling. Several next-generation Light Water Reactors (LWR) will feature innovative systems for passive heat removal during accidents from both the primary circuit and the containment. The passivity is based on natural driving forces only, such as gravity and natural circulation. To investigate the complex thermal-hydraulics and their propagation within a containment during accidents, containment code systems have been developed and validated against a wide variety of experiments. Furthermore, these codes are constantly improved to reflect the growing interest in and knowledge of individual phenomena and the technological evolution of the actual containment systems, e.g. the installation of passive devices. Accordingly, COCOSYS has been developed and is being continuously enhanced by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) and other partners like LRST. As has been shown by GRS and the calculation of the PANDA BC4 experiment, the interaction between a CC and the atmosphere of the surrounding vessel can be reproduced with the help of COCOSYS. However, up to now this has only been achieved on account of good knowledge of the outcome of the experiment, the user's skills and a complex input deck. The main goal of the newly introduced COMO is to improve the simulation of physical processes of CCs. This will be achieved by considering the passive driving forces which the CCs are based on. So far, natural circulation within the CC tubes has been implemented. The simulation of boiling conditions and the impact on the flow will be addressed in future studies. Additionally, the application of CCs to reactor simulation is being simplified and thus is supposed to reduce

  19. BIOMAP A Daily Time Step, Mechanistic Model for the Study of Ecosystem Dynamics

    Science.gov (United States)

    Wells, J. R.; Neilson, R. P.; Drapek, R. J.; Pitts, B. S.

    2010-12-01

    BIOMAP simulates competition between two Plant Functional Types (PFT) at any given point in the conterminous U.S. using a time series of daily temperature (mean, minimum, maximum), precipitation, humidity, light and nutrients, with PFT-specific rooting within a multi-layer soil. The model employs a 2-layer canopy biophysics, Farquhar photosynthesis, the Beer-Lambert Law for light attenuation and a mechanistic soil hydrology. In essence, BIOMAP is the biogeochemistry model BIOME-BGC re-built into the form of the MAPSS biogeography model. Specific enhancements are: 1) the 2-layer canopy biophysics of Dolman (1993); 2) the unique MAPSS-based hydrology, which incorporates canopy evaporation, snow dynamics, infiltration and saturated and unsaturated percolation with ‘fast’ flow and base flow and a ‘tunable aquifer’ capacity, a metaphor of Darcy’s Law; and, 3) a unique MAPSS-based stomatal conductance algorithm, which simultaneously incorporates vapor pressure and soil water potential constraints, based on physiological information and many other improvements. Over small domains the PFTs can be parameterized as individual species to investigate fundamental vs. potential niche theory; while, at more coarse scales the PFTs can be rendered as more general functional groups. Since all of the model processes are intrinsically leaf to plot scale (physiology to PFT competition), it essentially has no ‘intrinsic’ scale and can be implemented on a grid of any size, taking on the characteristics defined by the homogeneous climate of each grid cell. Currently, the model is implemented on the VEMAP 1/2 degree, daily grid over the conterminous U.S. Although both the thermal and water-limited ecotones are dynamic, following climate variability, the PFT distributions remain fixed. Thus, the model is currently being fitted with a ‘reproduction niche’ to allow full dynamic operation as a Dynamic General Vegetation Model (DGVM). While global simulations
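
    The Beer-Lambert attenuation step mentioned above is simple to sketch for a two-layer canopy; the extinction coefficient and leaf-area values below are assumptions for the example, not BIOMAP's parameterization.

        # Beer-Lambert light attenuation through a two-layer canopy (illustrative values).
        import math

        def light_profile(par_top, lai_layers, k=0.5):
            """PAR above the canopy, below each layer, and at the ground."""
            levels = [par_top]
            cumulative_lai = 0.0
            for lai in lai_layers:
                cumulative_lai += lai
                levels.append(par_top * math.exp(-k * cumulative_lai))
            return levels

        # Overstory PFT (LAI 2.5) above an understory PFT (LAI 1.0), PAR in umol m-2 s-1.
        print(light_profile(1800.0, [2.5, 1.0]))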

  20. Modeling and validation of a mechanistic tool (MEFISTO) for the prediction of critical power in BWR fuel assemblies

    International Nuclear Information System (INIS)

    Adamsson, Carl; Le Corre, Jean-Marie

    2011-01-01

    Highlights: → The MEFISTO code efficiently and accurately predicts the dryout event in a BWR fuel bundle, using a mechanistic model. → A hybrid approach between a fast and robust sub-channel analysis and a three-field two-phase analysis is adopted. → MEFISTO modeling approach, calibration, CPU usage, sensitivity, trend analysis and performance evaluation are presented. → The calibration parameters and process were carefully selected to preserve the mechanistic nature of the code. → The code dryout prediction performance is near the level of fuel-specific empirical dryout correlations. - Abstract: Westinghouse is currently developing the MEFISTO code with the main goal to achieve fast, robust, practical and reliable prediction of steady-state dryout Critical Power in Boiling Water Reactor (BWR) fuel bundle based on a mechanistic approach. A computationally efficient simulation scheme was used to achieve this goal, where the code resolves all relevant field (drop, steam and multi-film) mass balance equations, within the annular flow region, at the sub-channel level while relying on a fast and robust two-phase (liquid/steam) sub-channel solution to provide the cross-flow information. The MEFISTO code can hence provide highly detailed solution of the multi-film flow in BWR fuel bundle while enhancing flexibility and reducing the computer time by an order of magnitude as compared to a standard three-field sub-channel analysis approach. Models for the numerical computation of the one-dimensional field flowrate distributions in an open channel (e.g. a sub-channel), including the numerical treatment of field cross-flows, part-length rods, spacers grids and post-dryout conditions are presented in this paper. The MEFISTO code is then applied to dryout prediction in BWR fuel bundle using VIPRE-W as a fast and robust two-phase sub-channel driver code. The dryout power is numerically predicted by iterating on the bundle power so that the minimum film flowrate in the

  1. Comparison of Two Models for Damage Accumulation in Simulations of System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. [Idaho National Laboratory, Idaho Falls, ID (United States); Mandelli, D. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    A comprehensive simulation study of system performance needs to address variations in component behavior, variations in phenomenology, and the coupling between phenomenology and component failure. This paper discusses two models of this: 1. damage accumulation is modeled as a random walk process in each time history, with component failure occurring when damage accumulation reaches a specified threshold; or 2. damage accumulation is modeled mechanistically within each time history, but failure occurs when damage reaches a time-history-specific threshold, sampled at time zero from each component’s distribution of damage tolerance. A limiting case of the latter is classical discrete-event simulation, with component failure times sampled a priori from failure time distributions; but in such models, the failure times are not typically adjusted for operating conditions varying within a time history. Nowadays, as discussed below, it is practical to account for this. The paper compares the interpretations and computational aspects of the two models mentioned above.
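
    The two treatments compared in the paper can be caricatured side by side: stochastic damage increments against a fixed threshold, versus deterministic accumulation against a threshold sampled once per time history. The step sizes and threshold distribution below are assumptions chosen only to make the contrast visible.

        # Toy comparison of the two damage-accumulation models (assumed parameters).
        import random

        def failure_time_random_walk(threshold=100.0, step_mean=1.0, step_sd=0.5, horizon=500):
            """Model 1: random-walk damage increments, fixed failure threshold."""
            damage = 0.0
            for t in range(1, horizon + 1):
                damage += random.gauss(step_mean, step_sd)
                if damage >= threshold:
                    return t
            return horizon   # survived the mission time

        def failure_time_sampled_threshold(rate=1.0, horizon=500):
            """Model 2: deterministic damage rate, tolerance sampled at time zero."""
            threshold = random.gauss(100.0, 15.0)
            return min(horizon, max(1, round(threshold / rate)))

        random.seed(1)
        n = 1000
        mean_rw = sum(failure_time_random_walk() for _ in range(n)) / n
        mean_st = sum(failure_time_sampled_threshold() for _ in range(n)) / n
        print(mean_rw, mean_st)   # similar means, different spread across histories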

  2. Mechanistic movement models to understand epidemic spread.

    Science.gov (United States)

    Fofana, Abdou Moutalab; Hurford, Amy

    2017-05-05

    An overlooked aspect of disease ecology is considering how and why animals come into contact with one another, resulting in disease transmission. Mathematical models of disease spread frequently assume mass-action transmission, justified by stating that susceptible and infectious hosts mix readily, and foregoing any detailed description of host movement. Numerous recent studies have recorded, analysed and modelled animal movement. These movement models describe how animals move with respect to resources, conspecifics and previous movement directions and have been used to understand the conditions for the occurrence and the spread of infectious diseases when hosts perform a given type of movement. Here, we summarize the effect of the different types of movement on the threshold conditions for disease spread. We identify gaps in the literature and suggest several promising directions for future research. The mechanistic inclusion of movement in epidemic models may be beneficial for the following two reasons. Firstly, the estimation of the transmission coefficient in an epidemic model is possible because animal movement data can be used to estimate the rate of contacts between conspecifics. Secondly, unsuccessful transmission events, where a susceptible host contacts an infectious host but does not become infected, can be quantified. Following an outbreak, this enables disease ecologists to identify 'near misses' and to explore possible alternative epidemic outcomes given shifts in ecological or immunological parameters. This article is part of the themed issue 'Opening the black box: re-examining the ecology and evolution of parasite transmission'. © 2017 The Author(s).

  3. Mechanistic modelling of genetic and epigenetic events in radiation carcinogenesis

    International Nuclear Information System (INIS)

    Andreev, S. G.; Eidelman, Y. A.; Salnikov, I. V.; Khvostunov, I. K.

    2006-01-01

    Methodological problems arise when modelling radiation carcinogenesis with the incorporation of mechanistic data from radiobiology and cancer biology. The results of biophysical modelling of different endpoints [DNA DSB induction, repair, chromosome aberrations (CA) and cell proliferation] are presented and applied to the analysis of RBE-LET relationships for radiation-induced neoplastic transformation (RINT) of C3H/10T1/2 cells in culture. Predicted values for some endpoints correlate well with the data. It is concluded that slowly repaired DSB clusters, as well as some kinds of CA, may be initiating events for RINT. As an alternative interpretation, it is possible that DNA damage can induce RINT indirectly via an epigenetic process. A hypothetical epigenetic pathway for RINT is discussed. (authors)

  4. Prediction of net hepatic release of glucose using a “hybrid” mechanistic model in ruminants applied to positive energy balance

    OpenAIRE

    Bahloul, Lahlou; Ortigues, Isabelle; Vernet, Jean; Lapierre, Helène; Noziere, Pierre; Sauvant, Daniel

    2013-01-01

    Ruminants depend on hepatic gluconeogenesis to meet most of their metabolic demand for glucose, which relies on the availability of precursors from diet supply and animal requirements (Loncke et al., 2010). Several mechanistic models of the metabolic fate of nutrients across the liver exist that have been parameterized for dairy cows. They cannot be directly used to predict hepatic gluconeogenesis in all types of ruminants in different physiological states. A hybrid mechanistic model of nutrient f...

  5. Mechanistic systems modeling to guide drug discovery and development.

    Science.gov (United States)

    Schmidt, Brian J; Papin, Jason A; Musante, Cynthia J

    2013-02-01

    A crucial question that must be addressed in the drug development process is whether the proposed therapeutic target will yield the desired effect in the clinical population. Pharmaceutical and biotechnology companies place a large investment on research and development, long before confirmatory data are available from human trials. Basic science has greatly expanded the computable knowledge of disease processes, both through the generation of large omics data sets and a compendium of studies assessing cellular and systemic responses to physiologic and pathophysiologic stimuli. Given inherent uncertainties in drug development, mechanistic systems models can better inform target selection and the decision process for advancing compounds through preclinical and clinical research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Mechanistic modeling of biocorrosion caused by biofilms of sulfate reducing bacteria and acid producing bacteria.

    Science.gov (United States)

    Xu, Dake; Li, Yingchao; Gu, Tingyue

    2016-08-01

    Biocorrosion is also known as microbiologically influenced corrosion (MIC). Most anaerobic MIC cases can be classified into two major types. Type I MIC involves non-oxygen oxidants such as sulfate and nitrate that require biocatalysis for their reduction in the cytoplasm of microbes such as sulfate reducing bacteria (SRB) and nitrate reducing bacteria (NRB). This means that the extracellular electrons from the oxidation of metal such as iron must be transported across cell walls into the cytoplasm. Type II MIC involves oxidants such as protons that are secreted by microbes such as acid producing bacteria (APB). The biofilms in this case supply the locally high concentrations of oxidants that are corrosive without biocatalysis. This work describes a mechanistic model that is based on the biocatalytic cathodic sulfate reduction (BCSR) theory. The model utilizes charge transfer and mass transfer concepts to describe the SRB biocorrosion process. The model also includes a mechanism to describe APB attack based on the local acidic pH at a pit bottom. A pitting prediction software package has been created based on the mechanisms. It predicts long-term pitting rates and worst-case scenarios after calibration using SRB short-term pit depth data. Various parameters can be investigated through computer simulation. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Mechanistic modeling analysis of micro-evolutive responses from a Caenorhabditis elegans population exposed to a radioactive metallic stress

    International Nuclear Information System (INIS)

    Goussen, Benoit

    2013-01-01

    The evolution of toxic effects at a relevant scale is an important challenge for ecosystem protection. Indeed, pollutants may impact populations over the long term and represent an evolutionary force that adds to natural selection. It is therefore necessary to acquire knowledge of the phenotypic and genetic changes that may appear in populations subjected to stress over several generations. Statistical analyses are usually performed to analyse such multi-generational studies. The use of a mechanistic mathematical model may provide a way to fully understand the impact of pollutants on population dynamics. Such models allow the integration of biological and toxic processes into the analysis of eco-toxicological data and the assessment of interactions between these processes. The aim of this Ph.D. project was to assess the contribution of mechanistic modelling to the analysis of evolutionary experiments assessing long-term exposure. To do so, a three-step strategy was developed. First, a multi-generational study was performed to assess the evolution of two populations of the ubiquitous nematode Caenorhabditis elegans in control conditions or exposed to 1.1 mM of uranium. Several generations were selected to assess growth, reproduction, and dose-response relationships, through exposure to a range of concentrations (from 0 to 1.2 mM U) with all endpoints measured daily. A first statistical analysis was then performed. In a second step, a bio-energetic model adapted to the assessment of eco-toxicological data (DEBtox) was developed for C. elegans. Its numerical behaviour was analysed. Finally, this model was applied to all the selected generations in order to infer parameter values for the two populations and to assess their evolution. Results highlighted an impact of uranium from 0.4 mM U onward on both C. elegans growth and reproduction. Results from the mechanistic analysis indicate this effect is due

  8. Development of a mechanistic model for release of radionuclides from spent fuel in brines: Salt Repository Project

    International Nuclear Information System (INIS)

    Reimus, P.W.; Windisch, C.F.

    1988-03-01

    At present there are no comprehensive mechanistic models describing the release of radionuclides from spent fuel in brine environments. This report provides a comprehensive review of the various factors that can affect radionuclide release from spent fuel, suggests a modeling approach, and discusses proposed experiments for obtaining a better mechanistic understanding of the radionuclide release processes. Factors affecting radionuclide release include the amount, location, and disposition of radionuclides in the fuel and environmental factors such as redox potential, pH, the presence of complexing anions, temperature, and radiolysis. It is concluded that a model describing the release of radionuclides from spent fuel should contain separate terms for release from the gap, grain boundaries, and grains of the fuel. Possible functional forms for these terms are discussed in the report. Experiments for assessing their validity and obtaining key model parameters are proposed. 71 refs., 4 figs., 6 tabs

  9. A mechanistic approach to postirradiation spoilage kinetics of fish

    International Nuclear Information System (INIS)

    Tukenmez, I.

    2004-01-01

    Full text: In order to simulate postirradiation spoilage of fish, the mechanistic aspects of the growth of surviving microorganisms during chill storage and their product formation in irradiated fish were analyzed. Anchovy (Engraulis encrasicholus) samples, both unirradiated and irradiated at 1, 2 and 3 kGy doses of gamma radiation, were stored at +2 °C for 21 days. Total bacterial counts (TBC) and trimethylamine (TMA) analyses of the samples were done periodically during storage. Based on the proposed spoilage mechanism, kinetic model equations were derived. By using experimental data of TBC and TMA in the developed model, the postirradiation spoilage parameters, including growth rate constant, initial and maximum attainable TBC, lag time and TMA yield, were evaluated and microbial spoilage of fish was simulated for postirradiation storage. The shelf life of irradiated fish was estimated from the spoilage kinetics. Dose effects on the kinetic parameters were analyzed. It is suggested that the kinetic evaluation method developed in this study may be used for quality assessment, shelf life determination and dose optimization for radiation preservation of fish.

  10. Mechanistic Modeling of Water Replenishment Rate of Zeer Refrigerator

    Directory of Open Access Journals (Sweden)

    B. N. Nwankwojike

    2017-06-01

    Full Text Available A model for predicting the water replenishment rate of a zeer pot refrigerator was developed in this study using a mechanistic modeling approach and evaluated at Obowo, Imo State, Nigeria using six fruits: tomatoes, guava, okra, banana, orange and avocado pear. The developed model confirmed the zeer pot water replenishment rate to be a function of ambient temperature, relative humidity, wind speed, thermal conductivity of the pot materials and sand, density of air and water vapor, permeability coefficient of clay and heat transfer coefficient of water into air, circumferential length, height of pot, geometrical profile of the pot, heat load of the food preserved, heat flow into the device and gradient at which the pot is placed above ground level. Compared to the conventional approach of water replenishment, performance analysis results revealed 44% to 58% water economy when the zeer pot’s water was replenished based on the model’s prediction; while there was no significant difference in the shelf-life of the fruits preserved with both replenishment methods. Application of the developed water replenishment model facilitates optimal water usage in this system, thereby reducing the operational cost of the zeer pot refrigerator.

  11. Growth and lipid production of Umbelopsis isabellina on a solid substrate - Mechanistic modeling and validation

    NARCIS (Netherlands)

    Meeuwse, P.; Klok, A.J.; Haemers, S.; Tramper, J.; Rinzema, A.

    2012-01-01

    Microbial lipids are an interesting feedstock for biodiesel. Their production from agricultural waste streams by fungi cultivated in solid-state fermentation may be attractive, but the yield of this process is still quite low. In this article, a mechanistic model is presented that describes growth,

  12. Rapid Discrimination Among Putative Mechanistic Models of Biochemical Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-08-31

    An overarching goal in molecular biology is to gain an understanding of the mechanistic basis underlying biochemical systems. Success is critical if we are to predict effectively the outcome of drug treatments and the development of abnormal phenotypes. However, data from most experimental studies is typically noisy and sparse. This allows multiple potential mechanisms to account for experimental observations, and often devising experiments to test each is not feasible. Here, we introduce a novel strategy that discriminates among putative models based on their repertoire of qualitatively distinct phenotypes, without relying on knowledge of specific values for rate constants and binding constants. As an illustration, we apply this strategy to two synthetic gene circuits exhibiting anomalous behaviors. Our results show that the conventional models, based on their well-characterized components, cannot account for the experimental observations. We examine a total of 40 alternative hypotheses and show that only 5 have the potential to reproduce the experimental data, and one can do so with biologically relevant parameter values.

  13. Mechanistic modelling of the corrosion behaviour of copper nuclear fuel waste containers

    Energy Technology Data Exchange (ETDEWEB)

    King, F; Kolar, M

    1996-10-01

    A mechanistic model has been developed to predict the long-term corrosion behaviour of copper nuclear fuel waste containers in a Canadian disposal vault. The model is based on a detailed description of the electrochemical, chemical, adsorption and mass-transport processes involved in the uniform corrosion of copper, developed from the results of an extensive experimental program. Predictions from the model are compared with the results of some of these experiments and with observations from a bronze cannon submerged in seawater saturated clay sediments. Quantitative comparisons are made between the observed and predicted corrosion potential, corrosion rate and copper concentration profiles adjacent to the corroding surface, as a way of validating the long-term model predictions. (author). 12 refs., 5 figs.

  14. Productivity of "collisions generate heat" for reconciling an energy model with mechanistic reasoning: A case study

    Science.gov (United States)

    Scherr, Rachel E.; Robertson, Amy D.

    2015-06-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a byproduct of individual particle collisions, which is represented in science education research literature as an obstacle to learning. We demonstrate that in this instructional context, the idea that individual particle collisions generate thermal energy is not an obstacle to learning, but instead is productive: it initiates intellectual progress. Specifically, this idea initiates the reconciliation of the teachers' energy model with mechanistic reasoning about adiabatic compression, and leads to a canonically correct model of the transformation of kinetic energy into thermal energy. We claim that the idea's productivity is influenced by features of our particular instructional context, including the instructional goals of the course, the culture of collaborative sense making, and the use of certain representations of energy.

  15. Mechanistic modeling of insecticide risks to breeding birds in ...

    Science.gov (United States)

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At present, USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides that were assessed in this study are all used to treat major pests of corn (corn root worm borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (

  16. Fast Biological Modeling for Voxel-based Heavy Ion Treatment Planning Using the Mechanistic Repair-Misrepair-Fixation Model and Nuclear Fragment Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Kamp, Florian [Department of Therapeutic Radiology, Yale University School of Medicine, New Haven, Connecticut (United States); Department of Radiation Oncology, Technische Universität München, Klinikum Rechts der Isar, München (Germany); Physik-Department, Technische Universität München, Garching (Germany); Cabal, Gonzalo [Experimental Physics–Medical Physics, Ludwig Maximilians University Munich, Garching (Germany); Mairani, Andrea [Medical Physics Unit, Centro Nazionale Adroterapia Oncologica (CNAO), Pavia (Italy); Heidelberg Ion-Beam Therapy Center, Heidelberg (Germany); Parodi, Katia [Experimental Physics–Medical Physics, Ludwig Maximilians University Munich, Garching (Germany); Wilkens, Jan J. [Department of Radiation Oncology, Technische Universität München, Klinikum Rechts der Isar, München (Germany); Physik-Department, Technische Universität München, Garching (Germany); Carlson, David J., E-mail: david.j.carlson@yale.edu [Department of Therapeutic Radiology, Yale University School of Medicine, New Haven, Connecticut (United States)

    2015-11-01

    Purpose: The physical and biological differences between heavy ions and photons have not been fully exploited and could improve treatment outcomes. In carbon ion therapy, treatment planning must account for physical properties, such as the absorbed dose and nuclear fragmentation, and for differences in the relative biological effectiveness (RBE) of ions compared with photons. We combined the mechanistic repair-misrepair-fixation (RMF) model with Monte Carlo-generated fragmentation spectra for biological optimization of carbon ion treatment plans. Methods and Materials: Relative changes in double-strand break yields and radiosensitivity parameters with particle type and energy were determined using the independently benchmarked Monte Carlo damage simulation and the RMF model to estimate the RBE values for primary carbon ions and secondary fragments. Depth-dependent energy spectra were generated with the Monte Carlo code FLUKA for clinically relevant initial carbon ion energies. The predicted trends in RBE were compared with the published experimental data. Biological optimization for carbon ions was implemented in a 3-dimensional research treatment planning tool. Results: We compared the RBE and RBE-weighted dose (RWD) distributions of different carbon ion treatment scenarios with and without nuclear fragments. The inclusion of fragments in the simulations led to smaller RBE predictions. A validation of RMF against measured cell survival data reported in published studies showed reasonable agreement. We calculated and optimized the RWD distributions on patient data and compared the RMF predictions with those from other biological models. The RBE values in an astrocytoma tumor ranged from 2.2 to 4.9 (mean 2.8) for a RWD of 3 Gy(RBE) assuming (α/β){sub X} = 2 Gy. Conclusions: These studies provide new information to quantify and assess uncertainties in the clinically relevant RBE values for carbon ion therapy based on biophysical mechanisms. We present results from

  17. Mutual Dependence Between Sedimentary Organic Carbon and Infaunal Macrobenthos Resolved by Mechanistic Modeling

    Science.gov (United States)

    Zhang, Wenyan; Wirtz, Kai

    2017-10-01

    The mutual dependence between sedimentary total organic carbon (TOC) and infaunal macrobenthos is here quantified by a mechanistic model. The model describes (i) the vertical distribution of infaunal macrobenthic biomass resulting from a trade-off between nutritional benefit (quantity and quality of TOC) and the costs of burial (respiration) and mortality, and (ii) the variable vertical distribution of TOC being in turn shaped by bioturbation of local macrobenthos. In contrast to conventional approaches, our model emphasizes variations of bioturbation both spatially and temporally depending on local food resources and macrobenthic biomass. Our implementation of the dynamic interaction between TOC and infaunal macrobenthos is able to capture a temporal benthic response to both depositional and erosional environments and provides improved estimates of the material exchange flux at the sediment-water interface. Applications to literature data for the North Sea demonstrate the robustness and accuracy of the model and its potential as an analysis tool for the status of TOC and macrobenthos in marine sediments. Results indicate that the vertical distribution of infaunal biomass is shaped by both the quantity and the quality of OC, while the community structure is determined only by the quality of OC. Bioturbation intensity may differ by 1 order of magnitude over different seasons owing to variations in the OC input, resulting in a significant modulation on the distribution of OC. Our relatively simple implementation may further improve models of early diagenesis and marine food web dynamics by mechanistically connecting the vertical distribution of both TOC and macrobenthic biomass.

  18. Proceedings of the international workshop on mechanistic understanding of radionuclide migration in compacted/intact systems

    International Nuclear Information System (INIS)

    Tachi, Yukio; Yui, Mikazu

    2010-03-01

    The international workshop on mechanistic understanding of radionuclide migration in compacted / intact systems was held at ENTRY, JAEA, Tokai on 21st - 23rd January, 2009. This workshop was hosted by Japan Atomic Energy Agency (JAEA) as part of the project on the mechanistic model/database development for radionuclide sorption and diffusion behavior in compacted / intact systems. The overall goal of the project is to develop the mechanistic model / database for a consistent understanding and prediction of migration parameters and its uncertainties for performance assessment of geological disposal of radioactive waste. The objective of the workshop is to integrate the state-of-the-art of mechanistic sorption and diffusion model in compacted / intact systems, especially in bentonite / clay systems, and discuss the JAEA's mechanistic approaches and future challenges, especially the following discussions points; 1) What's the status and difficulties for mechanistic model/database development? 2) What's the status and difficulties for applicability of mechanistic model to the compacted/intact system? 3) What's the status and difficulties for obtaining evidences for mechanistic model? 4) What's the status and difficulties for standardization of experimental methodology for batch sorption and diffusion? 5) What's the uncertainties of transport parameters in radionuclides migration analysis due to a lack of understanding/experimental methodologies, and how do we derive them? This report includes workshop program, overview and materials of each presentation, summary of discussions. (author)

  19. Linking spring phenology with mechanistic models of host movement to predict disease transmission risk

    Science.gov (United States)

    Merkle, Jerod A.; Cross, Paul C.; Scurlock, Brandon M.; Cole, Eric K.; Courtemanch, Alyson B.; Dewey, Sarah R.; Kauffman, Matthew J.

    2018-01-01

    Disease models typically focus on temporal dynamics of infection, while often neglecting environmental processes that determine host movement. In many systems, however, temporal disease dynamics may be slow compared to the scale at which environmental conditions alter host space-use and accelerate disease transmission. Using a mechanistic movement modelling approach, we made space-use predictions of a mobile host (elk [Cervus canadensis] carrying the bacterial disease brucellosis) under environmental conditions that change daily and annually (e.g., plant phenology, snow depth), and we used these predictions to infer how spring phenology influences the risk of brucellosis transmission from elk (through aborted foetuses) to livestock in the Greater Yellowstone Ecosystem. Using data from 288 female elk monitored with GPS collars, we fit step selection functions (SSFs) during the spring abortion season and then implemented a master equation approach to translate SSFs into predictions of daily elk distribution for five plausible winter weather scenarios (from a heavy snow year to an extreme winter drought year). We predicted abortion events by combining elk distributions with empirical estimates of daily abortion rates, spatially varying elk seroprevalence and elk population counts. Our results reveal strong spatial variation in disease transmission risk at daily and annual scales that is strongly governed by variation in host movement in response to spring phenology. For example, in comparison with an average snow year, years with early snowmelt are predicted to shift 64% of the abortions that would occur on feedgrounds to mainly public lands and, to a lesser extent, private lands. Synthesis and applications. Linking mechanistic models of host movement with disease dynamics leads to a novel bridge between movement and disease ecology. Our analysis framework offers new avenues for predicting disease spread, while providing managers tools to proactively mitigate
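
    A toy Python sketch of the modelling chain described above (step-selection weights, a master-equation update of the daily elk distribution, and expected abortion events); the landscape, selection coefficients and epidemiological rates are invented, and movement is simplified to a global redistribution rather than neighbourhood-limited steps:

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy landscape: 20 habitat cells with two covariates (e.g., snow depth, green-up stage)
      n_cells = 20
      covariates = rng.normal(size=(n_cells, 2))
      beta = np.array([-0.8, 1.2])             # illustrative selection coefficients, not the fitted SSF

      # Step-selection weights; the master-equation step redistributes elk among cells
      # in proportion to these weights (a deliberate simplification of local movement).
      weights = np.exp(covariates @ beta)
      transition = np.tile(weights / weights.sum(), (n_cells, 1))

      p = np.full(n_cells, 1.0 / n_cells)      # initial elk distribution over cells
      n_elk, seroprevalence, daily_abortion_rate = 1000, 0.25, 0.002
      expected_abortions = np.zeros(n_cells)
      for day in range(90):                    # spring abortion season
          p = p @ transition                   # daily update of the elk distribution
          expected_abortions += p * n_elk * seroprevalence * daily_abortion_rate

      print(f"total expected abortion events: {expected_abortions.sum():.1f}")
      print(f"highest-risk cell index: {int(expected_abortions.argmax())}")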

  20. Unification and mechanistic detail as drivers of model construction: models of networks in economics and sociology.

    Science.gov (United States)

    Kuorikoski, Jaakko; Marchionni, Caterina

    2014-12-01

    We examine the diversity of strategies of modelling networks in (micro) economics and (analytical) sociology. Field-specific conceptions of what explaining (with) networks amounts to or systematic preference for certain kinds of explanatory factors are not sufficient to account for differences in modelling methodologies. We argue that network models in both sociology and economics are abstract models of network mechanisms and that differences in their modelling strategies derive to a large extent from field-specific conceptions of the way in which a good model should be a general one. Whereas the economics models aim at unification, the sociological models aim at a set of mechanism schemas that are extrapolatable to the extent that the underlying psychological mechanisms are general. These conceptions of generality induce specific biases in mechanistic explanation and are related to different views of when knowledge from different fields should be seen as relevant.

  1. Higher plant modelling for life support applications: first results of a simple mechanistic model

    Science.gov (United States)

    Hezard, Pauline; Dussap, Claude-Gilles; Sasidharan L, Swathy

    2012-07-01

    In the case of closed ecological life support systems, air and water regeneration and food production are performed using microorganisms and higher plants. Wheat, rice, soybean, lettuce, tomato or other edible annual plants produce fresh food while recycling CO2 into breathable oxygen. Additionally, they evaporate a large quantity of water, which can be condensed and used as potable water. This shows that the recycling functions of air revitalization and food production are completely linked. Consequently, the control of a growth chamber for higher plant production has to be performed with efficient mechanistic models, in order to ensure a realistic prediction of plant behaviour and of water and gas recycling regardless of the environmental conditions. Purely mechanistic models of plant production in controlled environments are not available yet. This is the reason why new models must be developed and validated. This work concerns the design and testing of a simplified version of a mathematical model coupling plant architecture with mass balances, in order to compare its results with available data for lettuce grown in closed, controlled chambers. The carbon exchange rate, water absorption and evaporation rate, biomass fresh weight and leaf surface are modelled and compared with available data. The model consists of four modules. The first one evaluates plant architecture, such as total leaf surface, leaf area index and stem length. The second one calculates the rates of matter and energy exchange depending on architectural and environmental data: light absorption in the canopy, CO2 uptake or release, water uptake and evapotranspiration. The third module evaluates which of the previous rates is limiting overall biomass growth; and the last one calculates the biomass growth rate depending on the matter exchange rates, using a global stoichiometric equation. All these rates form a set of differential equations, which are integrated over time in order to provide
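
    A minimal Python sketch of the four-module structure described above, with Euler integration of the growth rate; every coefficient is an illustrative placeholder rather than a value fitted to the lettuce chamber data:

      import numpy as np

      def simulate_lettuce(days=30.0, dt=0.1, ppfd=300.0, co2=1000.0):
          """Euler integration of the four-module structure sketched above; all
          coefficients are illustrative placeholders, not values from the chambers."""
          biomass = 0.5                                    # g dry weight per plant
          for _ in np.arange(0.0, days, dt):
              # Module 1: architecture, leaf area taken proportional to biomass
              leaf_area = 0.03 * biomass                   # m2 per plant
              # Module 2: exchange rates (light-limited vs CO2-limited carbon uptake)
              light_rate = 1.3 * leaf_area * (ppfd / 300.0)    # g C/day
              co2_rate = 1.6 * leaf_area * (co2 / 1000.0)      # g C/day
              # Module 3: the slower exchange process limits overall growth
              c_uptake = min(light_rate, co2_rate)
              # Module 4: stoichiometric conversion of fixed carbon into biomass
              biomass += dt * c_uptake / 0.42              # assume ~42% C in dry matter
          return biomass, leaf_area

      b, a = simulate_lettuce()
      print(f"dry biomass ~{b:.1f} g/plant, leaf area ~{a * 1e4:.0f} cm2/plant")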

  2. VIC-CropSyst-v2: A regional-scale modeling platform to simulate the nexus of climate, hydrology, cropping systems, and human decisions

    Science.gov (United States)

    Malek, Keyvan; Stöckle, Claudio; Chinnayakanahalli, Kiran; Nelson, Roger; Liu, Mingliang; Rajagopalan, Kirti; Barik, Muhammad; Adam, Jennifer C.

    2017-08-01

    Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC-CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC-CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC-CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land-atmosphere interactions. The performance of VIC-CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.

  3. A dynamic, mechanistic model of metabolism in adipose tissue of lactating dairy cattle.

    Science.gov (United States)

    McNamara, J P; Huber, K; Kenéz, A

    2016-07-01

    Research in dairy cattle biology has resulted in a large body of knowledge on nutrition and metabolism in support of milk production and efficiency. This quantitative knowledge has been compiled in several model systems to balance and evaluate rations and predict requirements. There are also systems models for metabolism and reproduction in the cow that can be used to support research programs. Adipose tissue plays a significant role in the success and efficiency of lactation, and recent research has resulted in several data sets on genomic differences and changes in gene transcription of adipose tissue in dairy cattle. To fully use this knowledge, we need to build and expand mechanistic, dynamic models that integrate control of metabolism and production. Therefore, we constructed a second-generation dynamic, mechanistic model of adipose tissue metabolism of dairy cattle. The model describes the biochemical interconversions of glucose, acetate, β-hydroxybutyrate (BHB), glycerol, C16 fatty acids, and triacylglycerols. Data gathered from our own research and published references were used to set equation forms and parameter values. Acetate, glucose, BHB, and fatty acids are taken up from blood. The fatty acids are activated to the acyl coenzyme A moieties. Enzymatically catalyzed reactions are explicitly described with parameters including maximal velocity and substrate sensitivity. The control of enzyme activity is partially carried out by insulin and norepinephrine, portraying control in the cow. Model behavior was adequate, with sensitive responses to changing substrates and hormones. Increased nutrient uptake and increased insulin stimulate triacylglycerol synthesis, whereas a reduction in nutrient availability or increase in norepinephrine increases triacylglycerol hydrolysis and free fatty acid release to blood. This model can form a basis for more sophisticated integration of existing knowledge and future studies on metabolic efficiency of dairy cattle
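
    A small Python sketch of the kind of enzyme-level description outlined above: Michaelis-Menten style synthesis and hydrolysis fluxes for the triacylglycerol pool, modulated by insulin and norepinephrine; all parameter values are placeholders, not the published model parameters:

      import numpy as np

      def adipose_fluxes(glucose, fatty_acid, tag, insulin, norepinephrine):
          """Michaelis-Menten style fluxes (arbitrary mol/day) for triacylglycerol (TAG)
          turnover; Vmax, Km and hormone sensitivities below are placeholders."""
          v_syn = 2.0 * insulin / (insulin + 0.5)                  # insulin stimulates esterification
          synthesis = v_syn * (glucose / (glucose + 1.0)) * (fatty_acid / (fatty_acid + 0.3))
          v_lip = 1.5 * norepinephrine / (norepinephrine + 0.8)    # norepinephrine stimulates lipolysis
          hydrolysis = v_lip * tag / (tag + 10.0)
          return synthesis, hydrolysis

      def simulate(days=10.0, dt=0.01, insulin=1.0, norepinephrine=0.2):
          tag = 20.0                                               # TAG pool size, mol
          for _ in np.arange(0.0, days, dt):
              syn, hyd = adipose_fluxes(3.0, 0.4, tag, insulin, norepinephrine)
              tag += dt * (syn - hyd)
          return tag

      print(f"high insulin, low norepinephrine : TAG pool -> {simulate(insulin=1.5, norepinephrine=0.1):.1f} mol")
      print(f"low insulin, high norepinephrine : TAG pool -> {simulate(insulin=0.5, norepinephrine=1.0):.1f} mol")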

  4. Mechanistic approach for the kinetics of the decomposition of nitrous oxide over calcined hydrotalcites

    Energy Technology Data Exchange (ETDEWEB)

    Dandl, H.; Emig, G. [Lehrstuhl fuer Technische Chemie I, Erlangen (Germany)

    1998-03-27

    A highly active catalyst for the decomposition of N{sub 2}O was prepared by the thermal treatment of CoLaAl-hydrotalcite. For this catalyst the reaction rate was determined at various partial pressures of N{sub 2}O, O{sub 2} and H{sub 2}O in a temperature range from 573K to 823K. The kinetic simulation resulted in a mechanistic model. The energies of activation and rate coefficients are estimated for the main steps of the reaction
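
    A short Python sketch of the Arrhenius-type parameter estimation implied above (activation energy and pre-exponential factor from rate coefficients at several temperatures); the rate coefficients below are invented for illustration, not the measured values:

      import numpy as np

      # Hypothetical first-order rate coefficients k (1/s) for N2O decomposition at
      # several temperatures within the reported 573-823 K window (values invented).
      T = np.array([573.0, 648.0, 723.0, 823.0])          # K
      k = np.array([2.1e-5, 4.0e-4, 4.2e-3, 6.5e-2])      # 1/s

      R = 8.314  # J/(mol K)
      # ln k = ln A - Ea/(R T): fit a straight line in (1/T, ln k)
      slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
      Ea = -slope * R          # activation energy, J/mol
      A = np.exp(intercept)    # pre-exponential factor, 1/s

      print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, A ~ {A:.2e} 1/s")
      print(f"k(700 K) ~ {A * np.exp(-Ea / (R * 700.0)):.2e} 1/s")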

  5. Comparative ecophysiology of two sympatric lizards. Laying the groundwork for mechanistic distribution models

    Directory of Open Access Journals (Sweden)

    Enrique García-Muñoz

    2013-12-01

    Full Text Available Distribution modelling usually makes inferences correlating species presence and environmental variables but does not take biotic relations into account. Alternative approaches based on a mechanistic understanding of biological processes are now being applied. Regarding lacertid lizards, physiological traits such as preferred body temperature (Tp) are well known to correlate with several physiological optima. Much less is known about their water ecology, although body temperature and evaporative water loss (Wl) may trade off. Two saxicolous lacertids, Algyroides marchi and Podarcis hispanica ss, are sympatric in the Subbetic Mountains (SE Spain), where they can be found in syntopy. Previous distribution modelling indicates the first species is associated with mountains, low temperatures, high precipitation and forest cover, whereas the second one is more generalistic. Here, we perform two ecophysiological tests with both species: a Tp experiment in a thermal gradient and a Wl experiment in sealed chambers. Although both species attained similar body temperatures, A. marchi lost more water, and more uniformly in time, than P. hispanica ss, which displayed an apparent response to dehydration. These results suggest that water loss rather than temperature is crucial to explain the distribution patterns of A. marchi in relation to P. hispanica ss, the former risking dehydration in dry areas regardless of temperature. Ecophysiological traits represent a promising tool to build future mechanistic models for (lacertid) lizards. Additionally, the implications for their biogeography and conservation are discussed.

  6. Malaria's missing number: calculating the human component of R0 by a within-host mechanistic model of Plasmodium falciparum infection and transmission.

    Directory of Open Access Journals (Sweden)

    Geoffrey L Johnston

    2013-04-01

    Full Text Available Human infection by malarial parasites of the genus Plasmodium begins with the bite of an infected Anopheles mosquito. Current estimates place malaria mortality at over 650,000 individuals each year, mostly in African children. Efforts to reduce disease burden can benefit from the development of mathematical models of disease transmission. To date, however, comprehensive modeling of the parameters defining human infectivity to mosquitoes has remained elusive. Here, we describe a mechanistic within-host model of Plasmodium falciparum infection in humans and pathogen transmission to the mosquito vector. Our model incorporates the entire parasite lifecycle, including the intra-erythrocytic asexual forms responsible for disease, the onset of symptoms, the development and maturation of intra-erythrocytic gametocytes that are transmissible to Anopheles mosquitoes, and human-to-mosquito infectivity. These model components were parameterized from malaria therapy data and other studies to simulate individual infections, and the ensemble of outputs was found to reproduce the full range of patient responses to infection. Using this model, we assessed human infectivity over the course of untreated infections and examined the effects in relation to transmission intensity, expressed by the basic reproduction number R0 (defined as the number of secondary cases produced by a single typical infection in a completely susceptible population). Our studies predict that net human-to-mosquito infectivity from a single non-immune individual is on average equal to 32 fully infectious days. This estimate of mean infectivity is equivalent to calculating the human component of malarial R0. We also predict that mean daily infectivity exceeds five percent for approximately 138 days. The mechanistic framework described herein, made available as stand-alone software, will enable investigators to conduct detailed studies into theories of malaria control, including the effects of
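
    A brief Python sketch of how the "human component of R0" and the number of days above five percent infectivity follow from a daily infectivity curve; the curve below is invented for illustration and is not an output of the within-host model:

      import numpy as np

      # Hypothetical daily human-to-mosquito infectivity over an untreated infection
      # (probability that a feeding mosquito becomes infected on each day); the curve
      # shape is invented for illustration only.
      days = np.arange(0, 300)
      infectivity = 0.35 * np.exp(-((days - 50) / 45.0) ** 2) \
                  + 0.04 * np.exp(-((days - 150) / 40.0) ** 2)

      # Net infectivity expressed as equivalent "fully infectious days" (the human
      # component of R0 discussed above), and the number of days above 5% infectivity.
      fully_infectious_days = infectivity.sum()        # discrete integral of the daily curve
      days_above_5pct = int((infectivity > 0.05).sum())

      print(f"net infectivity ~ {fully_infectious_days:.0f} fully infectious days")
      print(f"daily infectivity exceeds 5% on ~ {days_above_5pct} days")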

  7. Emulation of dynamic simulators with application to hydrology

    Energy Technology Data Exchange (ETDEWEB)

    Machac, David, E-mail: david.machac@eawag.ch [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland); ETH Zurich, Department of Environmental Systems Science, 8092 Zurich (Switzerland); Reichert, Peter [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland); ETH Zurich, Department of Environmental Systems Science, 8092 Zurich (Switzerland); Albert, Carlo [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Department of Systems Analysis, Integrated Assessment and Modelling, 8600 Dübendorf (Switzerland)

    2016-05-15

    Many simulation-intensive tasks in the applied sciences, such as sensitivity analysis, parameter inference or real time control, are hampered by slow simulators. Emulators provide the opportunity of speeding up simulations at the cost of introducing some inaccuracy. An emulator is a fast approximation to a simulator that interpolates between design input–output pairs of the simulator. Increasing the number of design data sets is a computationally demanding way of improving the accuracy of emulation. We investigate the complementary approach of increasing emulation accuracy by including knowledge about the mechanisms of the simulator into the formulation of the emulator. To approximately reproduce the output of dynamic simulators, we consider emulators that are based on a system of linear, ordinary or partial stochastic differential equations with a noise term formulated as a Gaussian process of the parameters to be emulated. This stochastic model is then conditioned to the design data so that it mimics the behavior of the nonlinear simulator as a function of the parameters. The drift terms of the linear model are designed to provide a simplified description of the simulator as a function of its key parameters so that the required corrections by the conditioned Gaussian process noise are as small as possible. The goal of this paper is to compare the gain in accuracy of these emulators by enlarging the design data set and by varying the degree of simplification of the linear model. We apply this framework to a simulator for the shallow water equations in a channel and compare emulation accuracy for emulators based on different spatial discretization levels of the channel and for a standard non-mechanistic emulator. Our results indicate that we have a large gain in accuracy already when using the simplest mechanistic description by a single linear reservoir to formulate the drift term of the linear model. Adding some more reservoirs does not lead to a significant
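
    A minimal Python sketch of the core emulation idea, a Gaussian process conditioned on design input-output pairs of a slow simulator; it omits the mechanistic linear-model drift term that the paper adds, and the toy simulator and kernel settings are assumptions:

      import numpy as np

      def slow_simulator(theta):
          """Stand-in for an expensive simulator: returns a scalar output for a parameter."""
          return np.sin(3.0 * theta) + 0.5 * theta

      def rbf_kernel(a, b, length=0.3, var=1.0):
          d = a.reshape(-1, 1) - b.reshape(1, -1)
          return var * np.exp(-0.5 * (d / length) ** 2)

      # Design data: a handful of simulator runs at chosen parameter values
      theta_design = np.linspace(0.0, 2.0, 7)
      y_design = np.array([slow_simulator(t) for t in theta_design])

      # Condition a zero-mean GP on the design pairs (plain GP emulation; the paper's
      # emulators additionally use a mechanistic linear model as the drift/mean term)
      K = rbf_kernel(theta_design, theta_design) + 1e-8 * np.eye(len(theta_design))
      alpha = np.linalg.solve(K, y_design)

      theta_new = np.linspace(0.0, 2.0, 5)
      y_emulated = rbf_kernel(theta_new, theta_design) @ alpha
      y_true = np.array([slow_simulator(t) for t in theta_new])
      print("max emulation error:", np.max(np.abs(y_emulated - y_true)))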

  8. Mechanistic kinetic models of enzymatic cellulose hydrolysis-A review.

    Science.gov (United States)

    Jeoh, Tina; Cardona, Maria J; Karuna, Nardrapee; Mudinoor, Akshata R; Nill, Jennifer

    2017-07-01

    Bioconversion of lignocellulose forms the basis for renewable, advanced biofuels, and bioproducts. Mechanisms of hydrolysis of cellulose by cellulases have been actively studied for nearly 70 years with significant gains in understanding of the cellulolytic enzymes. Yet, a full mechanistic understanding of the hydrolysis reaction has been elusive. We present a review to highlight new insights gained since the most recent comprehensive review of cellulose hydrolysis kinetic models by Bansal et al. (2009) Biotechnol Adv 27:833-848. Recent models have taken a two-pronged approach to tackle the challenge of modeling the complex heterogeneous reaction-an enzyme-centric modeling approach centered on the molecularity of the cellulase-cellulose interactions to examine rate limiting elementary steps and a substrate-centric modeling approach aimed at capturing the limiting property of the insoluble cellulose substrate. Collectively, modeling results suggest that at the molecular-scale, how rapidly cellulases can bind productively (complexation) and release from cellulose (decomplexation) is limiting, while the overall hydrolysis rate is largely insensitive to the catalytic rate constant. The surface area of the insoluble substrate and the degrees of polymerization of the cellulose molecules in the reaction both limit initial hydrolysis rates only. Neither enzyme-centric models nor substrate-centric models can consistently capture hydrolysis time course at extended reaction times. Thus, questions of the true reaction limiting factors at extended reaction times and the role of complexation and decomplexation in rate limitation remain unresolved. Biotechnol. Bioeng. 2017;114: 1369-1385. © 2017 Wiley Periodicals, Inc.

  9. Multiscale development of a fission gas thermal conductivity model: Coupling atomic, meso and continuum level simulations

    International Nuclear Information System (INIS)

    Tonks, Michael R.; Millett, Paul C.; Nerikar, Pankaj; Du, Shiyu; Andersson, David; Stanek, Christopher R.; Gaston, Derek; Andrs, David; Williamson, Richard

    2013-01-01

    Fission gas production and evolution significantly impact the fuel performance, causing swelling, a reduction in the thermal conductivity and fission gas release. However, typical empirical models of fuel properties treat each of these effects separately and uncoupled. Here, we couple a fission gas release model to a model of the impact of fission gas on the fuel thermal conductivity. To quantify the specific impact of grain boundary (GB) bubbles on the thermal conductivity, we use atomistic and mesoscale simulations. Atomistic molecular dynamics simulations were employed to determine the GB thermal resistance. These values were then used in mesoscale heat conduction simulations to develop a mechanistic expression for the effective GB thermal resistance of a GB containing gas bubbles, as a function of the percentage of the GB covered by fission gas. The coupled fission gas release and thermal conductivity model was implemented in Idaho National Laboratory’s BISON fuel performance code to model the behavior of a 10-pellet LWR fuel rodlet, showing how the fission gas impacts the UO2 thermal conductivity. Furthermore, additional BISON simulations were conducted to demonstrate the impact of average grain size on both the fuel thermal conductivity and the fission gas release
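
    A small Python sketch of how a coverage-dependent grain-boundary thermal resistance can feed into an effective polycrystal conductivity via a simple series estimate; the resistance values and mixing rule are illustrative placeholders, not the MD or mesoscale results:

      import numpy as np

      def gb_resistance(coverage, r_clean=1.0e-8, r_bubble=1.0e-6):
          """Effective grain-boundary (GB) thermal resistance (m^2 K/W) for a boundary
          partially covered by gas bubbles: clean GB and bubble-covered GB treated as
          parallel heat-flow paths.  Resistance values are illustrative placeholders."""
          conductance = (1.0 - coverage) / r_clean + coverage / r_bubble
          return 1.0 / conductance

      def effective_conductivity(k_bulk, grain_size, coverage):
          """Series estimate: one grain interior plus one GB resistance per grain."""
          return grain_size / (grain_size / k_bulk + gb_resistance(coverage))

      k_uo2 = 4.5  # W/(m K), approximate bulk UO2 conductivity at operating temperature
      for d in (2e-6, 10e-6):                       # grain sizes, m
          for cov in (0.0, 0.5, 0.9):               # fraction of GB covered by bubbles
              print(f"grain {d * 1e6:4.0f} um, coverage {cov:.1f}: "
                    f"R_gb = {gb_resistance(cov):.2e} m2K/W, "
                    f"k_eff = {effective_conductivity(k_uo2, d, cov):.2f} W/m/K")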

  10. Effects of septum and pericardium on heart-lung interactions in a cardiopulmonary simulation model.

    Science.gov (United States)

    Karamolegkos, Nikolaos; Albanese, Antonio; Chbat, Nicolas W

    2017-07-01

    Mechanical heart-lung interactions are often overlooked in clinical settings. However, their impact on cardiac function can be quite significant. Mechanistic physiology-based models can provide invaluable insights into such cardiorespiratory interactions, which occur not only under external mechanical ventilatory support but in normal physiology as well. In this work, we focus on the cardiac component of a previously developed mathematical model of the human cardiopulmonary system, aiming to improve the model's response to the intrathoracic pressure variations that are associated with the respiratory cycle. Interventricular septum and pericardial membrane are integrated into the existing model. Their effect on the overall cardiac response is explained by means of comparison against simulation results from the original model as well as experimental data from literature.

  11. VIC–CropSyst-v2: A regional-scale modeling platform to simulate the nexus of climate, hydrology, cropping systems, and human decisions

    Directory of Open Access Journals (Sweden)

    K. Malek

    2017-08-01

    Full Text Available Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC–CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC–CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC–CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land–atmosphere interactions. The performance of VIC–CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.

  12. Productivity of "Collisions Generate Heat" for Reconciling an Energy Model with Mechanistic Reasoning: A Case Study

    Science.gov (United States)

    Scherr, Rachel E.; Robertson, Amy D.

    2015-01-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a…

  13. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    International Nuclear Information System (INIS)

    Brown, L.; Jarvis, S.C.; Syed, B.; Goulding, K.W.T.; Li, C.

    2002-01-01

    A mechanistic model of N2O emission from agricultural soil (DeNitrification-DeComposition - DNDC) was modified for application to the UK, and was used as the basis of an inventory of N2O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N2O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N2O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N2O-N from agricultural land in 1990 to 78.3 Gg. (Author)

  14. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.; Jarvis, S.C. [Institute of Grassland and Environmental Research, Okehampton (United Kingdom); Syed, B. [Cranfield Univ., Silsoe (United Kingdom). Soil Survey and Land Research Centre; Sneath, R.W.; Phillips, V.R. [Silsoe Research Inst. (United Kingdom); Goulding, K.W.T. [Institute of Arable Crops Research, Rothamsted (United Kingdom); Li, C. [University of New Hampshire (United States). Inst. for the Study of Earth, Oceans and Space

    2002-07-01

    A mechanistic model of N{sub 2}O emission from agricultural soil (DeNitrification-DeComposition - DNDC) was modified for application to the UK, and was used as the basis of an inventory of N{sub 2}O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N{sub 2}O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N{sub 2}O-N emission from UK current agricultural practice in 1990 was 50.9Gg. This total comprised 31.7Gg from the soil sector, 5.9Gg from animals and 13.2Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8Gg in total, 27.4Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N{sub 2}O-N from agricultural land in 1990 to 78.3Gg. (Author)

  15. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    Science.gov (United States)

    Brown, L.; Syed, B.; Jarvis, S. C.; Sneath, R. W.; Phillips, V. R.; Goulding, K. W. T.; Li, C.

    A mechanistic model of N2O emission from agricultural soil (DeNitrification-DeComposition - DNDC) was modified for application to the UK, and was used as the basis of an inventory of N2O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N2O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N2O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N2O-N from agricultural land in 1990 to 78.3 Gg.
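
    A short Python sketch of the inventory arithmetic common to the three records above: model-derived emission factors applied to activity data and contrasted with a single default factor; all activity values and model-derived factors below are hypothetical:

      # Illustrative inventory arithmetic: an emission factor derived from model runs
      # is applied to activity data and compared with an IPCC-style default factor.
      # All numbers below are hypothetical.
      activity = {                       # N input, Gg N/yr, by source (hypothetical)
          "synthetic fertiliser": 1200.0,
          "animal excreta": 900.0,
          "crop residues": 300.0,
      }
      ef_model = {"synthetic fertiliser": 0.008,   # model-derived emission factors
                  "animal excreta": 0.015,         # (kg N2O-N emitted per kg N applied)
                  "crop residues": 0.006}
      ef_default = 0.0125                          # single default-style factor, for contrast

      total_model = sum(activity[s] * ef_model[s] for s in activity)
      total_default = sum(activity[s] * ef_default for s in activity)
      print(f"model-based inventory   : {total_model:.1f} Gg N2O-N/yr")
      print(f"default-factor inventory: {total_default:.1f} Gg N2O-N/yr")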

  16. Description and evaluation of a mechanistically based conceptual model for spall

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, F.D.; Knowles, M.K.; Thompson, T.W. [and others]

    1997-08-01

    A mechanistically based model for a possible spall event at the WIPP site is developed and evaluated in this report. Release of waste material to the surface during an inadvertent borehole intrusion is possible if future states of the repository include high gas pressure and waste material consisting of fine particulates having low mechanical strength. The conceptual model incorporates the physics of wellbore hydraulics coupled to transient gas flow to the intrusion borehole, and mechanical response of the waste. Degraded waste properties are used in the model. The evaluations include both numerical and analytical implementations of the conceptual model. A tensile failure criterion is assumed appropriate for calculation of volumes of waste experiencing fragmentation. Calculations show that for repository gas pressures less than 12 MPa, no tensile failure occurs. Minimal volumes of material experience failure below a gas pressure of 14 MPa. Repository conditions dictate that the probability of gas pressures exceeding 14 MPa is approximately 1%. For these conditions, a maximum failed volume of 0.25 m{sup 3} is calculated.

  17. Description and evaluation of a mechanistically based conceptual model for spall

    International Nuclear Information System (INIS)

    Hansen, F.D.; Knowles, M.K.; Thompson, T.W.

    1997-08-01

    A mechanistically based model for a possible spall event at the WIPP site is developed and evaluated in this report. Release of waste material to the surface during an inadvertent borehole intrusion is possible if future states of the repository include high gas pressure and waste material consisting of fine particulates having low mechanical strength. The conceptual model incorporates the physics of wellbore hydraulics coupled to transient gas flow to the intrusion borehole, and mechanical response of the waste. Degraded waste properties are used in the model. The evaluations include both numerical and analytical implementations of the conceptual model. A tensile failure criterion is assumed appropriate for calculation of volumes of waste experiencing fragmentation. Calculations show that for repository gas pressures less than 12 MPa, no tensile failure occurs. Minimal volumes of material experience failure below a gas pressure of 14 MPa. Repository conditions dictate that the probability of gas pressures exceeding 14 MPa is approximately 1%. For these conditions, a maximum failed volume of 0.25 m3 is calculated.

  18. Fetal programming of CVD and renal disease: animal models and mechanistic considerations.

    Science.gov (United States)

    Langley-Evans, Simon C

    2013-08-01

    The developmental origins of health and disease hypothesis postulates that exposure to a less than optimal maternal environment during fetal development programmes physiological function, and determines risk of disease in adult life. Much evidence of such programming comes from retrospective epidemiological cohorts, which demonstrate associations between birth anthropometry and non-communicable diseases of adulthood. The assertion that variation in maternal nutrition drives these associations is supported by studies using animal models, which demonstrate that maternal under- or over-nutrition during pregnancy can programme offspring development. Typically, the offspring of animals that are undernourished in pregnancy exhibit a relatively narrow range of physiological phenotypes that includes higher blood pressure, glucose intolerance, renal insufficiency and increased adiposity. The observation that common phenotypes arise from very diverse maternal nutritional insults has led to the proposal that programming is driven by a small number of mechanistic processes. The remodelling of tissues during development as a consequence of maternal nutritional status being signalled by endocrine imbalance or key nutrients limiting processes in the fetus may lead to organs having irreversibly altered structures that may limit their function with ageing. It has been proposed that the maternal diet may impact upon epigenetic marks that determine gene expression in fetal tissues, and this may be an important mechanism connecting maternal nutrient intakes to long-term programming of offspring phenotype. The objective for this review is to provide an overview of the mechanistic basis of fetal programming, demonstrating the critical role of animal models as tools for the investigation of programming phenomena.

  19. Computational Fluid Dynamic Simulation of Single Bubble Growth under High-Pressure Pool Boiling Conditions

    Directory of Open Access Journals (Sweden)

    Janani Murallidharan

    2016-08-01

    Full Text Available Component-scale modeling of boiling is predominantly based on the Eulerian–Eulerian two-fluid approach. Within this framework, wall boiling is accounted for via the Rensselaer Polytechnic Institute (RPI) model and, within this model, the bubble is characterized using three main parameters: departure diameter (D), nucleation site density (N), and departure frequency (f). Typically, the magnitudes of these three parameters are obtained from empirical correlations. However, in recent years, efforts have been directed toward mechanistic modeling of the boiling process. Of the three parameters mentioned above, the departure diameter (D) is least affected by the intrinsic uncertainties of the nucleate boiling process. This feature, along with its prominence within the RPI boiling model, has made it the primary candidate for mechanistic modeling ventures. Mechanistic modeling of D is mostly carried out through solving of force balance equations on the bubble. Forces incorporated in these equations are formulated as functions of the radius of the bubble and have been developed for, and applied to, low-pressure conditions only. Conversely, for high-pressure conditions, no mechanistic information is available regarding the growth rates of bubbles and the forces acting on them. In this study, we use direct numerical simulation coupled with an interface tracking method to simulate bubble growth under high (up to 45 bar) pressure, to obtain the kind of mechanistic information required for an RPI-type approach. In this study, we compare the resulting bubble growth rate curves with predictions made with existing experimental data.
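
    A compact Python sketch of the force-balance idea behind the departure diameter: the bubble detaches once buoyancy plus a growth-induced force exceeds surface-tension retention; the closure forms, coefficients and water properties are simplified assumptions for illustration, not the interface-tracking results discussed above:

      import numpy as np

      def departure_diameter(sigma, rho_l, rho_v, v_growth, theta_c=0.2, g=9.81):
          """Simplified force balance at bubble departure: surface-tension retention
          versus buoyancy plus a growth-induced inertial force,
              (pi/6)(rho_l - rho_v) g d^3 + rho_l v_growth^2 d^2 >= pi theta_c sigma d.
          Closure forms and coefficients are crude placeholders for illustration."""
          d = np.linspace(1e-5, 5e-3, 5000)                  # candidate diameters, m
          detaching = (np.pi / 6.0) * (rho_l - rho_v) * g * d**3 + rho_l * v_growth**2 * d**2
          retaining = np.pi * theta_c * sigma * d
          crossed = detaching >= retaining
          return d[np.argmax(crossed)] if crossed.any() else None

      # Saturated water at roughly 1 bar (approximate properties, illustrative growth speed)
      d_dep = departure_diameter(sigma=0.059, rho_l=958.0, rho_v=0.6, v_growth=0.1)
      print(f"predicted departure diameter ~ {d_dep * 1e3:.2f} mm")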

  20. Mechanistic modelling of a cathode-supported tubular solid oxide fuel cell

    Science.gov (United States)

    Suwanwarangkul, R.; Croiset, E.; Pritzker, M. D.; Fowler, M. W.; Douglas, P. L.; Entchev, E.

    A two-dimensional mechanistic model of a tubular solid oxide fuel cell (SOFC) considering momentum, energy, mass and charge transport is developed. The model geometry of a single cell comprises an air-preheating tube, air channel, fuel channel, anode, cathode and electrolyte layers. The heat radiation between cell and air-preheating tube is also incorporated into the model. This allows the model to predict heat transfer between the cell and air-preheating tube accurately. The model is validated and shows good agreement with literature data. It is anticipated that this model can be used to help develop efficient fuel cell designs and set operating variables under practical conditions. The transport phenomena inside the cell, including gas flow behaviour, temperature, overpotential, current density and species concentration, are analysed and discussed in detail. Fuel and air velocities are found to vary along flow passages depending on the local temperature and species concentrations. This model demonstrates the importance of incorporating heat radiation into a tubular SOFC model. Furthermore, the model shows that the overall cell performance is limited by O2 diffusion through the thick porous cathode and points to the development of new cathode materials and designs being important avenues to enhance cell performance.

  1. Improving the prediction of methane production and representation of rumen fermentation for finishing beef cattle within a mechanistic model

    NARCIS (Netherlands)

    Ellis, J.L.; Dijkstra, J.; Bannink, A.; Kebreab, E.; Archibeque, S.; Benchaar, C.; Beauchemin, K.; Nkrumah, D.J.; France, J.

    2014-01-01

    The purpose of this study was to evaluate prediction of methane emissions from finishing beef cattle using an extant mechanistic model with pH-independent or pH-dependent VFA stoichiometries, a recent stoichiometry adjustment for the use of monensin, and adaptation of the underlying model structure,

  2. Stochastic Simulation Using @ Risk for Dairy Business Investment Decisions

    Science.gov (United States)

    A dynamic, stochastic, mechanistic simulation model of a dairy business was developed to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economical complexities of a dairy farm system within a partial budgeting fram...

  3. Simulation of Drought-induced Tree Mortality Using a New Individual and Hydraulic Trait-based Model (S-TEDy)

    Science.gov (United States)

    Sinha, T.; Gangodagamage, C.; Ale, S.; Frazier, A. G.; Giambelluca, T. W.; Kumagai, T.; Nakai, T.; Sato, H.

    2017-12-01

    Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change, we developed the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM) into a form capable of simulating mechanistic tree mortality induced by climatic impacts on individual-tree-scale ecophysiology, such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its computational results with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing the future of the tropical rainforest were conducted using global climate model (GCM) simulation outputs.

  4. The simulation of multidimensional multiphase flows

    International Nuclear Information System (INIS)

    Lahey, Richard T.

    2005-01-01

    This paper presents an assessment of various models which can be used for the multidimensional simulation of multiphase flows, such as may occur in nuclear reactors. In particular, a model appropriate for the direct numerical simulation (DNS) of multiphase flows and a mechanistically based, three-dimensional, four-field, turbulent, two-fluid computational multiphase fluid dynamics (CMFD) model are discussed. A two-fluid bubbly flow model, which was derived using potential flow theory, can be extended to other flow regimes, but this will normally involve ensemble-averaging the results from direct numerical simulations (DNS) of various flow regimes to provide the detailed numerical data necessary for the development of flow-regime-specific interfacial and wall closure laws

  5. Development of mechanistic sorption model and treatment of uncertainties for Ni sorption on montmorillonite/bentonite

    International Nuclear Information System (INIS)

    Ochs, Michael; Ganter, Charlotte; Tachi, Yukio; Suyama, Tadahiro; Yui, Mikazu

    2011-02-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the detailed/coupled processes of sorption and diffusion in compacted bentonite and develop mechanistic/predictive models, so that reliable parameters can be set under a variety of geochemical conditions relevant to performance assessment (PA). For this purpose, JAEA has developed the integrated sorption and diffusion (ISD) model/database in montmorillonite/bentonite systems. The main goal of the mechanistic model/database development is to provide a tool for a consistent explanation, prediction, and uncertainty assessment of Kd as well as diffusion parameters needed for the quantification of radionuclide transport. The present report focuses on developing the thermodynamic sorption model (TSM) and on the quantification and handling of model uncertainties in applications, illustrated by the example of Ni sorption on montmorillonite/bentonite. This includes 1) a summary of the present state of the art of thermodynamic sorption modeling, 2) a discussion of the selection of surface species and model design appropriate for the present purpose, 3) possible sources and representations of TSM uncertainties, and 4) details of modeling, testing and uncertainty evaluation for Ni sorption. Two fundamentally different approaches are presented and compared for representing TSM uncertainties: 1) TSM parameter uncertainties calculated by FITEQL optimization routines and a statistical procedure, 2) overall error estimated by direct comparison of modeled and experimental Kd values. The overall error in Kd is viewed as the best representation of model uncertainty in ISD model/database development. (author)
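
    A tiny Python sketch of the second approach, overall error from direct comparison of modeled and experimental Kd values, expressed in log10 units; the Kd values below are invented for illustration:

      import numpy as np

      # Hypothetical modeled vs. experimental Kd values (m^3/kg) for Ni on
      # montmorillonite at several chemical conditions; values are invented.
      kd_experimental = np.array([0.02, 0.15, 1.2, 8.5, 30.0])
      kd_modeled      = np.array([0.03, 0.10, 1.6, 6.0, 45.0])

      # Overall model uncertainty expressed in log10 units of Kd, as in the
      # "direct comparison" approach described above.
      log_residuals = np.log10(kd_modeled) - np.log10(kd_experimental)
      rmse_log = np.sqrt(np.mean(log_residuals**2))
      print(f"overall error: ~{rmse_log:.2f} log10 units "
            f"(factor of ~{10**rmse_log:.1f} in Kd)")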

  6. Applicability of one-dimensional mechanistic post-dryout prediction model

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; No Hee Cheon

    1996-01-01

    Through the analysis of many experimental post-dryout data, it is shown that the most probable flow regime near the dryout or quench front is not annular flow but churn-turbulent flow when the mass flux is low. A correlation describing the initial droplet size just after the CHF position at low mass flux is suggested through regression analysis. In the post-dryout region at low pressure and low flow, it is found that the suggested one-dimensional mechanistic model is not applicable when the vapor superficial velocity is very low, i.e., when the flow is in the bubbly or slug flow regime. This is explained by the change of the main entrainment mechanism with the change of flow regime. Therefore, the suggested correlation is valid only in the churn-turbulent flow regime (j*g = 0.5-4.5).
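
    A short Python sketch of the regime check implied by the validity range quoted above, using a Wallis-type definition of the dimensionless vapor superficial velocity (an assumed definition, since the abstract does not give one); the property values are approximate:

      import numpy as np

      def j_star_g(j_g, rho_g, rho_l, diameter, g=9.81):
          """Wallis-type dimensionless vapor superficial velocity (assumed definition):
          j*_g = j_g * sqrt(rho_g) / sqrt(g * D * (rho_l - rho_g))."""
          return j_g * np.sqrt(rho_g) / np.sqrt(g * diameter * (rho_l - rho_g))

      # Approximate saturated water/steam properties at ~1 bar in a 10 mm channel
      rho_l, rho_g, D = 958.0, 0.60, 0.010
      for j_g in (0.5, 2.0, 10.0, 40.0):        # vapor superficial velocity, m/s
          js = j_star_g(j_g, rho_g, rho_l, D)
          regime = ("inside the churn-turbulent validity range"
                    if 0.5 <= js <= 4.5 else "outside the validity range")
          print(f"j_g = {j_g:5.1f} m/s -> j*_g = {js:4.2f}  ({regime})")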

  7. Stochastic simulation using @Risk for dairy business investment decisions

    NARCIS (Netherlands)

    Bewley, J.D.; Boehlje, M.D.; Gray, A.W.; Hogeveen, H.; Kenyon, S.J.; Eicher, S.D.; Schutz, M.M.

    2010-01-01

    Purpose – The purpose of this paper is to develop a dynamic, stochastic, mechanistic simulation model of a dairy business to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economic complexities of a dairy farm.

  8. Mechanistic modelling of gaseous fission product behaviour in UO2 fuel by Rtop code

    International Nuclear Information System (INIS)

    Kanukova, V.D.; Khoruzhii, O.V.; Kourtchatov, S.Y.; Likhanskii, V.V.; Matveew, L.V.

    2002-01-01

    The current status of a mechanistic modelling by the RTOP code of the fission product behaviour in polycrystalline UO2 fuel is described. An outline of the code and implemented physical models is presented. The general approach to code validation is discussed. It is exemplified by the results of validation of the models of fuel oxidation and grain growth. The different models of intragranular and intergranular gas bubble behaviour have been tested and the sensitivity of the code in the framework of these models has been analysed. An analysis of available models of the resolution of grain face bubbles is also presented. The possibilities of the RTOP code are presented through the example of modelling behaviour of WWER fuel over the course of a comparative WWER-PWR experiment performed at Halden and by comparison with Yanagisawa experiments. (author)

  9. Mechanistic study of manganese-substituted glycerol dehydrogenase using a kinetic and thermodynamic analysis.

    Science.gov (United States)

    Fang, Baishan; Niu, Jin; Ren, Hong; Guo, Yingxia; Wang, Shizhen

    2014-01-01

    Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.
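
    As a rough illustration of the type of rate law involved, the sketch below integrates a single-substrate simplification with noncompetitive product inhibition by DHA and competitive product inhibition by NADH; the functional form and all parameter values are assumptions chosen for illustration, not the fitted GDH or Mn-GDH parameters.

```python
from scipy.integrate import solve_ivp

# Assumed illustrative parameters (not the fitted GDH / Mn-GDH values)
Vmax, Km = 1.0, 5.0           # mM/min, mM
Ki_dha, Ki_nadh = 2.0, 0.5    # product inhibition constants, mM

def rate(glycerol, dha, nadh):
    # Competitive inhibition by NADH (inflates the Km term),
    # noncompetitive inhibition by DHA (scales the whole rate)
    return (Vmax * glycerol / (Km * (1.0 + nadh / Ki_nadh) + glycerol)) / (1.0 + dha / Ki_dha)

def odes(t, y):
    glycerol, dha, nadh = y
    v = rate(glycerol, dha, nadh)
    return [-v, v, v]          # glycerol -> DHA + NADH, 1:1:1 stoichiometry

sol = solve_ivp(odes, (0.0, 60.0), [20.0, 0.0, 0.0], max_step=0.5)
print(f"after 60 min: glycerol = {sol.y[0, -1]:.2f} mM, DHA = {sol.y[1, -1]:.2f} mM")
```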

  10. Mechanistic study of manganese-substituted glycerol dehydrogenase using a kinetic and thermodynamic analysis.

    Directory of Open Access Journals (Sweden)

    Baishan Fang

    Full Text Available Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.

  11. Application of mechanistic empirical approach to predict rutting of superpave mixtures in Iraq

    Directory of Open Access Journals (Sweden)

    Qasim Zaynab

    2018-01-01

    Full Text Available In Iraq, rutting is considered a serious distress in flexible pavements as a result of high summer temperatures and increased axle loads. This distress strongly affects asphalt pavement performance, shortens the pavement's useful service life and creates serious hazards for highway users. The rutting performance of HMA mixtures is predicted with a Mechanistic-Empirical approach based on the wheel-tracking test and the Superpave mix design requirements. A roller wheel compactor was locally manufactured to prepare slab specimens. From laboratory results judged to be representative of field loading conditions, models were developed for predicting the permanent strain of compacted samples of local asphalt concrete mixtures, accounting for the stress level, local material properties and environmental variables. The laboratory results were analysed statistically with the aid of SPSS software. Permanent strain models for asphalt concrete mixtures were developed as a function of number of passes, temperature, asphalt content, viscosity, air voids and additive content. The Mechanistic-Empirical design approach, through the MnPAVE software, was applied to characterize rutting in HMA and to predict the allowable number of loading repetitions of mixtures as a function of expected traffic loads, material properties, and environmental temperature.
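
    A minimal sketch of fitting such a permanent-strain model by nonlinear regression is given below; the multiplicative power-law form, the reduced variable set and the synthetic data are assumptions for illustration, not the published Superpave mixture models.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Hypothetical test design: wheel passes N, temperature T (deg C), asphalt content AC (%)
N = 10 ** rng.uniform(2, 4, 12)
T = rng.uniform(40, 60, 12)
AC = rng.uniform(4.0, 5.5, 12)

def perm_strain(X, a, b, c, d):
    n, temp, ac = X
    return a * n**b * temp**c * ac**d      # assumed multiplicative power-law form

# Synthetic "measurements" from assumed true coefficients plus 5% multiplicative noise
strain_obs = perm_strain((N, T, AC), 50.0, 0.35, 1.2, 0.8) * rng.normal(1.0, 0.05, N.size)

popt, _ = curve_fit(perm_strain, (N, T, AC), strain_obs, p0=[30.0, 0.3, 1.0, 1.0])
print("fitted a, b, c, d:", popt)
```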

  12. Mechanistic site-based emulation of a global ocean biogeochemical model (MEDUSA 1.0) for parametric analysis and calibration: an application of the Marine Model Optimization Testbed (MarMOT 1.1)

    Directory of Open Access Journals (Sweden)

    J. C. P. Hemmings

    2015-03-01

    Full Text Available Biogeochemical ocean circulation models used to investigate the role of plankton ecosystems in global change rely on adjustable parameters to capture the dominant biogeochemical dynamics of a complex biological system. In principle, optimal parameter values can be estimated by fitting models to observational data, including satellite ocean colour products such as chlorophyll that achieve good spatial and temporal coverage of the surface ocean. However, comprehensive parametric analyses require large ensemble experiments that are computationally infeasible with global 3-D simulations. Site-based simulations provide an efficient alternative but can only be used to make reliable inferences about global model performance if robust quantitative descriptions of their relationships with the corresponding 3-D simulations can be established. The feasibility of establishing such a relationship is investigated for an intermediate complexity biogeochemistry model (MEDUSA) coupled with a widely used global ocean model (NEMO). A site-based mechanistic emulator is constructed for surface chlorophyll output from this target model as a function of model parameters. The emulator comprises an array of 1-D simulators and a statistical quantification of the uncertainty in their predictions. The unknown parameter-dependent biogeochemical environment, in terms of initial tracer concentrations and lateral flux information required by the simulators, is a significant source of uncertainty. It is approximated by a mean environment derived from a small ensemble of 3-D simulations representing variability of the target model behaviour over the parameter space of interest. The performance of two alternative uncertainty quantification schemes is examined: a direct method based on comparisons between simulator output and a sample of known target model "truths" and an indirect method that is only partially reliant on knowledge of the target model output. In general, chlorophyll

  13. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    International Nuclear Information System (INIS)

    Devau, Nicolas; Cadre, Edith Le; Hinsinger, Philippe; Jaillard, Benoit; Gerard, Frederic

    2009-01-01

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl₂ (P-CaCl₂) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented into the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl₂ with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron-oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pHs, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling to understand soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional contribution of soil organic matter.

  14. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Devau, Nicolas [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Cadre, Edith Le [Supagro, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Hinsinger, Philippe; Jaillard, Benoit [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Gerard, Frederic, E-mail: gerard@supagro.inra.fr [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France)

    2009-11-15

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl₂ (P-CaCl₂) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented into the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing mineral is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl₂ with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron-oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pHs, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling to understand soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional

  15. Insight into the hydraulics and resilience of Ponderosa pine seedlings using a mechanistic ecohydrologic model

    Science.gov (United States)

    Maneta, M. P.; Simeone, C.; Dobrowski, S.; Holden, Z.; Sapes, G.; Sala, A.; Begueria, S.

    2017-12-01

    In semiarid regions, drought-induced seedling mortality is considered to be caused by failure in the tree hydraulic column. Understanding the mechanisms that cause hydraulic failure and death in seedlings is important, among other things, to diagnose where some tree species may fail to regenerate, triggering demographic imbalances in the forest that could result in climate-driven shifts of tree species. Ponderosa pine is a common lower tree line species in the western US. Seedlings of ponderosa pine are often subject to low soil water potentials, which require lower water potentials in the xylem and leaves to maintain the negative pressure gradient that drives water upward. The resilience of the hydraulic column to hydraulic tension is species dependent, but from greenhouse experiments, we have identified general tension thresholds beyond which loss of xylem conductivity becomes critical and mortality in Ponderosa pine seedlings starts to occur. We describe this hydraulic behavior of plants using a mechanistic soil-vegetation-atmosphere transfer model. Before we use this model to understand water-stress-induced seedling mortality at the landscape scale, we perform a modeling analysis of the dynamics of soil moisture, transpiration, leaf water potential and loss of plant water conductivity using detailed data from our greenhouse experiments. The analysis is done using a spatially distributed model that simulates water fluxes, energy exchanges and water potentials in the soil-vegetation-atmosphere continuum. Plant hydraulic and physiological parameters of this model were calibrated using Monte Carlo methods against information on soil moisture, soil hydraulic potential, transpiration, leaf water potential and percent loss of conductivity in the xylem. This analysis permits us to construct a full portrait of the parameter space for Ponderosa pine seedlings and generate posterior predictive distributions of tree response to understand the sensitivity of transpiration

  16. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  17. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  18. Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI

    Science.gov (United States)

    2015-10-01

    Award Number W81XWH-13-2-0091, "Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI"; annual report covering 25 Sep 2014 - 24 Sep 2015. The work assesses the efficacy of veliparib and NAD as agents for suppressing inflammation and improving outcomes after traumatic brain injury. The animal models include

  19. Hardware in the loop simulation test platform of fuel cell backup system

    Directory of Open Access Journals (Sweden)

    Ma Tiancai

    2015-01-01

    Full Text Available Based on an analysis of a mechanistic voltage model, a real-time simulation model of the proton exchange membrane (PEM) fuel cell backup system is developed and verified against measured experimental data. The method of online parameter identification for the model is also improved. Based on the LabVIEW/VeriStand real-time software environment and the PXI Express hardware system, a hardware-in-the-loop (HIL) simulation platform for the PEM fuel cell system controller is established. Controller simulation test results demonstrate the accuracy of the HIL simulation platform.

  20. A mechanistic modeling and data assimilation framework for Mojave Desert ecohydrology

    Science.gov (United States)

    Ng, Gene-Hua Crystal; Bedford, David R.; Miller, David M.

    2014-06-01

    This study demonstrates and addresses challenges in coupled ecohydrological modeling in deserts, which arise due to unique plant adaptations, marginal growing conditions, slow net primary production rates, and highly variable rainfall. We consider model uncertainty from both structural and parameter errors and present a mechanistic model for the shrub Larrea tridentata (creosote bush) under conditions found in the Mojave National Preserve in southeastern California (USA). Desert-specific plant and soil features are incorporated into the CLM-CN model by Oleson et al. (2010). We then develop a data assimilation framework using the ensemble Kalman filter (EnKF) to estimate model parameters based on soil moisture and leaf-area index observations. A new implementation procedure, the "multisite loop EnKF," tackles parameter estimation difficulties found to affect desert ecohydrological applications. Specifically, the procedure iterates through data from various observation sites to alleviate adverse filter impacts from non-Gaussianity in small desert vegetation state values. It also readjusts inconsistent parameters and states through a model spin-up step that accounts for longer dynamical time scales due to infrequent rainfall in deserts. Observation error variance inflation may also be needed to help prevent divergence of estimates from true values. Synthetic test results highlight the importance of adequate observations for reducing model uncertainty, which can be achieved through data quality or quantity.
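
    A minimal sketch of the EnKF analysis step used for parameter estimation in this kind of framework is shown below, assuming a generic model that maps parameters to predicted observations; the multisite looping, spin-up readjustment and observation-error inflation described above are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_parameter_update(params_ens, predicted_obs_ens, obs, obs_err_var):
    """One stochastic EnKF analysis step applied to an ensemble of parameter vectors.

    params_ens:        (n_ens, n_par) parameter ensemble
    predicted_obs_ens: (n_ens, n_obs) model-predicted observations per member
    obs:               (n_obs,) observed soil moisture / LAI values
    obs_err_var:       (n_obs,) observation error variances
    """
    n_ens = params_ens.shape[0]
    P = params_ens - params_ens.mean(axis=0)
    Y = predicted_obs_ens - predicted_obs_ens.mean(axis=0)
    C_py = P.T @ Y / (n_ens - 1)                      # parameter-observation covariance
    C_yy = Y.T @ Y / (n_ens - 1) + np.diag(obs_err_var)
    K = C_py @ np.linalg.inv(C_yy)                    # Kalman gain
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_err_var), predicted_obs_ens.shape)
    return params_ens + (obs_pert - predicted_obs_ens) @ K.T

# Toy usage: a scalar parameter observed through a linear "model" y = 2 * p, truth p = 0.3
params = rng.normal(0.5, 0.2, size=(50, 1))
pred_obs = 2.0 * params
updated = enkf_parameter_update(params, pred_obs, np.array([0.6]), np.array([0.01]))
print("prior mean:", params.mean().round(3), "-> posterior mean:", updated.mean().round(3))
```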

  1. A mechanistic modeling and data assimilation framework for Mojave Desert ecohydrology

    Science.gov (United States)

    Ng, Gene-Hua Crystal.; Bedford, David; Miller, David

    2014-01-01

    This study demonstrates and addresses challenges in coupled ecohydrological modeling in deserts, which arise due to unique plant adaptations, marginal growing conditions, slow net primary production rates, and highly variable rainfall. We consider model uncertainty from both structural and parameter errors and present a mechanistic model for the shrub Larrea tridentata (creosote bush) under conditions found in the Mojave National Preserve in southeastern California (USA). Desert-specific plant and soil features are incorporated into the CLM-CN model by Oleson et al. (2010). We then develop a data assimilation framework using the ensemble Kalman filter (EnKF) to estimate model parameters based on soil moisture and leaf-area index observations. A new implementation procedure, the “multisite loop EnKF,” tackles parameter estimation difficulties found to affect desert ecohydrological applications. Specifically, the procedure iterates through data from various observation sites to alleviate adverse filter impacts from non-Gaussianity in small desert vegetation state values. It also readjusts inconsistent parameters and states through a model spin-up step that accounts for longer dynamical time scales due to infrequent rainfall in deserts. Observation error variance inflation may also be needed to help prevent divergence of estimates from true values. Synthetic test results highlight the importance of adequate observations for reducing model uncertainty, which can be achieved through data quality or quantity.

  2. Combining experimental and simulation data of molecular processes via augmented Markov models.

    Science.gov (United States)

    Olsson, Simon; Wu, Hao; Paul, Fabian; Clementi, Cecilia; Noé, Frank

    2017-08-01

    Accurate mechanistic description of structural changes in biomolecules is an increasingly important topic in structural and chemical biology. Markov models have emerged as a powerful way to approximate the molecular kinetics of large biomolecules while keeping full structural resolution in a divide-and-conquer fashion. However, the accuracy of these models is limited by that of the force fields used to generate the underlying molecular dynamics (MD) simulation data. Whereas the quality of classical MD force fields has improved significantly in recent years, remaining errors in the Boltzmann weights are still on the order of a few [Formula: see text], which may lead to significant discrepancies when comparing to experimentally measured rates or state populations. Here we take the view that simulations using a sufficiently good force-field sample conformations that are valid but have inaccurate weights, yet these weights may be made accurate by incorporating experimental data a posteriori. To do so, we propose augmented Markov models (AMMs), an approach that combines concepts from probability theory and information theory to consistently treat systematic force-field error and statistical errors in simulation and experiment. Our results demonstrate that AMMs can reconcile conflicting results for protein mechanisms obtained by different force fields and correct for a wide range of stationary and dynamical observables even when only equilibrium measurements are incorporated into the estimation process. This approach constitutes a unique avenue to combine experiment and computation into integrative models of biomolecular structure and dynamics.
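
    The core idea of correcting simulation weights a posteriori with experimental data can be illustrated, very loosely, by a maximum-entropy reweighting of simulation frames so that one experimental ensemble average is reproduced; this sketch is not the AMM estimator itself, and the observable values, prior weights and target average are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical per-frame observable values from a simulation, uniform prior weights,
# and an experimental ensemble average the reweighted ensemble should reproduce
obs = np.array([1.2, 2.5, 3.1, 4.0, 5.5])   # e.g. a distance in Angstrom per frame
w0 = np.ones_like(obs) / obs.size
obs_exp = 3.5

def reweighted_mean(lam):
    # Maximum-entropy form: w_i proportional to w0_i * exp(-lam * obs_i)
    w = w0 * np.exp(-lam * obs)
    w /= w.sum()
    return w @ obs

# Choose the Lagrange multiplier so the reweighted average matches experiment
lam = brentq(lambda l: reweighted_mean(l) - obs_exp, -10.0, 10.0)
w = w0 * np.exp(-lam * obs)
w /= w.sum()
print("lambda =", round(lam, 3), "reweighted frame weights:", np.round(w, 3))
```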

  3. A rigorous mechanistic model for predicting gas hydrate formation kinetics: The case of CO2 recovery and sequestration

    International Nuclear Information System (INIS)

    ZareNezhad, Bahman; Mottahedin, Mona

    2012-01-01

    Highlights: ► A mechanistic model for predicting gas hydrate formation kinetics is presented. ► A secondary nucleation rate model is proposed for the first time. ► Crystal–crystal collisions and crystal–impeller collisions are distinguished. ► Simultaneous determination of nucleation and growth kinetics are established. ► Important for design of gas hydrate based energy storage and CO2 recovery systems. - Abstract: A rigorous mechanistic model for predicting gas hydrate formation crystallization kinetics is presented and the special case of CO2 gas hydrate formation regarding CO2 recovery and sequestration processes has been investigated by using the proposed model. A physical model for prediction of secondary nucleation rate is proposed for the first time and the formation rates of secondary nuclei by crystal–crystal collisions and crystal–impeller collisions are formulated. The objective functions for simultaneous determination of nucleation and growth kinetics are presented and a theoretical framework for predicting the dynamic behavior of gas hydrate formation is presented. Predicted time variations of CO2 content, total number and surface area of produced hydrate crystals are in good agreement with the available experimental data. The proposed approach can have considerable application for design of gas hydrate converters regarding energy storage and CO2 recovery processes.
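
    The overall structure of such a crystallization model can be sketched as coupled balances for crystal number, size and surface area together with the dissolved gas, with a secondary nucleation term proportional to the existing crystal surface area to mimic collision-driven nuclei formation; the rate expressions and constants below are placeholders for illustration, not the authors' correlations.

```python
from scipy.integrate import solve_ivp

# Placeholder constants chosen only so the example runs to near-equilibrium (not the authors' values)
kb1, kb2 = 1e3, 5e2    # primary and secondary (collision-driven) nucleation rate constants
kg = 1e-5              # linear growth rate constant per unit supersaturation
kc = 1e7               # factor linking crystal growth to dissolved CO2 consumption
C_eq = 0.03            # dissolved CO2 in equilibrium with hydrate (mol/L)

def moments(t, y):
    mu0, mu1, mu2, C = y              # number, length and area moments + dissolved CO2
    dC = max(C - C_eq, 0.0)           # supersaturation driving force
    B = kb1 * dC + kb2 * dC * mu2     # secondary nucleation scales with crystal surface area
    G = kg * dC                       # size-independent linear growth rate
    return [B, G * mu0, 2.0 * G * mu1, -kc * G * mu2]

sol = solve_ivp(moments, (0.0, 3600.0), [0.0, 0.0, 0.0, 0.06], max_step=10.0)
print(f"dissolved CO2 after 1 h: {sol.y[3, -1]:.4f} mol/L (started at 0.06, equilibrium 0.03)")
```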

  4. A mechanistic compartmental model for total antibody uptake in tumors.

    Science.gov (United States)

    Thurber, Greg M; Dane Wittrup, K

    2012-12-07

    Antibodies are under development to treat a variety of cancers, such as lymphomas, colon, and breast cancer. A major limitation to greater efficacy for this class of drugs is poor distribution in vivo. Localization of antibodies occurs slowly, often in insufficient therapeutic amounts, and distributes heterogeneously throughout the tumor. While the microdistribution around individual vessels is important for many therapies, the total amount of antibody localized in the tumor is paramount for many applications such as imaging, determining the therapeutic index with antibody drug conjugates, and dosing in radioimmunotherapy. With imaging and pretargeted therapeutic strategies, the time course of uptake is critical in determining when to take an image or deliver a secondary reagent. We present here a simple mechanistic model of antibody uptake and retention that captures the major rates that determine the time course of antibody concentration within a tumor including dose, affinity, plasma clearance, target expression, internalization, permeability, and vascularization. Since many of the parameters are known or can be estimated in vitro, this model can approximate the time course of antibody concentration in tumors to aid in experimental design, data interpretation, and strategies to improve localization. Copyright © 2012 Elsevier Ltd. All rights reserved.
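
    A minimal compartmental sketch of this type of uptake model is given below, with plasma clearance, vascular exchange, target binding and internalization; the structure is a generic simplification and all parameter values are arbitrary illustrations rather than those of the published model.

```python
from scipy.integrate import solve_ivp

# Arbitrary illustrative parameters (not the published model's values)
k_clear = 0.05   # 1/h, plasma clearance
k_in = 0.02      # 1/h, plasma -> tumour interstitium (permeability x vascularization)
k_out = 0.05     # 1/h, tumour interstitium -> plasma
k_on = 0.1       # 1/(nM h), binding to the target antigen
k_off = 0.05     # 1/h, dissociation from the target
k_int = 0.02     # 1/h, internalization/degradation of bound antibody
Ag0 = 100.0      # nM, target antigen concentration in the tumour

def odes(t, y):
    C_plasma, C_free, C_bound = y
    Ag_free = max(Ag0 - C_bound, 0.0)
    bind = k_on * C_free * Ag_free - k_off * C_bound
    return [
        -k_clear * C_plasma - k_in * C_plasma + k_out * C_free,   # plasma
        k_in * C_plasma - k_out * C_free - bind,                  # free antibody in tumour
        bind - k_int * C_bound,                                   # target-bound antibody
    ]

sol = solve_ivp(odes, (0.0, 168.0), [100.0, 0.0, 0.0], max_step=0.5)  # one week after dosing
total = sol.y[1] + sol.y[2]
print(f"peak tumour antibody: {total.max():.1f} nM at t = {sol.t[total.argmax()]:.0f} h")
```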

  5. Modeling of iodine radiation chemistry in the presence of organic compounds

    International Nuclear Information System (INIS)

    Taghipour, Fariborz; Evans, Greg J.

    2002-01-01

    A kinetic-based model was developed that simulates the radiation chemistry of iodine in the presence of organic compounds. The model's mechanistic description of iodine chemistry and generic semi-mechanistic reactions for various classes of organics provided a reasonable representation of experimental results. The majority of the modelled and experimental iodine volatilization rates were in agreement within an order of magnitude.

  6. Four Mechanistic Models of Peer Influence on Adolescent Cannabis Use.

    Science.gov (United States)

    Caouette, Justin D; Feldstein Ewing, Sarah W

    2017-06-01

    Most adolescents begin exploring cannabis in peer contexts, but the neural mechanisms that underlie peer influence on adolescent cannabis use are still unknown. This theoretical overview elucidates the intersecting roles of neural function and peer factors in cannabis use in adolescents. Novel paradigms using functional magnetic resonance imaging (fMRI) in adolescents have identified distinct neural mechanisms of risk decision-making and incentive processing in peer contexts, centered on reward-motivation and affect regulatory neural networks; these findings inform a theoretical model of peer-driven cannabis use decisions in adolescents. We propose four "mechanistic profiles" of social facilitation of cannabis use in adolescents: (1) peer influence as the primary driver of use; (2) cannabis exploration as the primary driver, which may be enhanced in peer contexts; (3) social anxiety; and (4) negative peer experiences. Identification of "neural targets" involved in motivating cannabis use may inform clinicians about which treatment strategies work best in adolescents with cannabis use problems, and via which social and neurocognitive processes.

  7. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    Science.gov (United States)

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
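
    As a minimal illustration of the benchmark dose idea for a quantal endpoint, the sketch below fits a log-logistic dose-response curve and solves for the dose giving 10% extra risk; both the functional form and the data are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

# Hypothetical quantal data: dose (mg/kg/day) and observed fraction of affected fetuses
dose = np.array([0.0, 10.0, 30.0, 100.0, 300.0])
resp = np.array([0.02, 0.05, 0.15, 0.40, 0.75])

def log_logistic(d, background, ed50, slope):
    d = np.maximum(np.asarray(d, dtype=float), 1e-12)
    f = 1.0 / (1.0 + (ed50 / d) ** slope)
    return background + (1.0 - background) * f

popt, _ = curve_fit(log_logistic, dose, resp, p0=[0.02, 100.0, 1.0])
bg = popt[0]

def extra_risk(d):
    # Extra risk over background: (P(d) - P(0)) / (1 - P(0))
    return (log_logistic(d, *popt) - bg) / (1.0 - bg) - 0.10

bmd10 = brentq(extra_risk, 1e-3, 300.0)   # dose giving 10% extra risk
print(f"BMD10 is roughly {bmd10:.0f} mg/kg/day for these hypothetical data")
```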

  8. Simulations of NLC formation using a microphysical model driven by three-dimensional dynamics

    Science.gov (United States)

    Kirsch, Annekatrin; Becker, Erich; Rapp, Markus; Megner, Linda; Wilms, Henrike

    2014-05-01

    Noctilucent clouds (NLCs) represent an optical phenomenon occurring in the polar summer mesopause region. These clouds have been known since the late 19th century. Current physical understanding of NLCs is based on numerous observational and theoretical studies, in recent years especially observations from satellites and by lidars from the ground. Theoretical studies based on numerical models that simulate NLCs with the underlying microphysical processes are uncommon. To date, no three-dimensional numerical simulations of NLCs exist that take all relevant dynamical scales into account, i.e., from the planetary scale down to gravity waves and turbulence. Rather, modeling is usually restricted to certain flow regimes. In this study we make a more rigorous attempt and simulate NLC formation in the environment of the general circulation of the mesopause region by explicitly including gravity wave motions. For this purpose we couple the Community Aerosol and Radiation Model for Atmosphere (CARMA) to gravity-wave resolving dynamical fields simulated beforehand with the Kuehlungsborn Mechanistic Circulation Model (KMCM). In our case, the KMCM is run with a horizontal resolution of T120 which corresponds to a minimum horizontal wavelength of 350 km. This restriction causes the resolved gravity waves to be somewhat biased to larger scales. The simulated general circulation is dynamically controlled by these waves in a self-consistent fashion and provides realistic temperatures and wind fields for July conditions. Assuming a water vapor mixing ratio profile in agreement with current observations results in reasonable supersaturations of up to 100. In a first step, CARMA is applied to a horizontal section covering the Northern hemisphere. The vertical resolution is 120 levels ranging from 72 to 101 km. In this paper we will present initial results of this coupled dynamical microphysical model focussing on the interaction of waves and turbulent diffusion with NLC-microphysics.

  9. Rotary ultrasonic machining of CFRP: a mechanistic predictive model for cutting force.

    Science.gov (United States)

    Cong, W L; Pei, Z J; Sun, X; Zhang, C L

    2014-02-01

    Cutting force is one of the most important output variables in rotary ultrasonic machining (RUM) of carbon fiber reinforced plastic (CFRP) composites. Many experimental investigations on cutting force in RUM of CFRP have been reported. However, in the literature, there are no cutting force models for RUM of CFRP. This paper develops a mechanistic predictive model for cutting force in RUM of CFRP. The material removal mechanism of CFRP in RUM has been analyzed first. The model is based on the assumption that brittle fracture is the dominant mode of material removal. CFRP micromechanical analysis has been conducted to represent CFRP as an equivalent homogeneous material to obtain the mechanical properties of CFRP from its components. Based on this model, relationships between input variables (including ultrasonic vibration amplitude, tool rotation speed, feedrate, abrasive size, and abrasive concentration) and cutting force can be predicted. The relationships between input variables and important intermediate variables (indentation depth, effective contact time, and maximum impact force of single abrasive grain) have been investigated to explain predicted trends of cutting force. Experiments are conducted to verify the model, and experimental results agree well with predicted trends from this model. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Radiation fields, dosimetry, biokinetics and biophysical models for cancer induction by ionising radiation 1996-1999. Biophysical models for the induction of cancer by radiation. Final report

    International Nuclear Information System (INIS)

    Paretzke, H.G.; Ballarini, F.; Brugmans, M.

    2000-01-01

    The overall project is organised into seven work packages. WP1 concentrates on the development of mechanistic, quantitative models for radiation oncogenesis using selected data sets from radiation epidemiology and from experimental animal studies. WP2 concentrates on the development of mechanistic, mathematical models for the induction of chromosome aberrations. WP3 develops mechanistic models for radiation mutagenesis, particularly using the HPRT-mutation as a paradigm. WP4 will develop mechanistic models for damage and repair of DNA, and compare these with experimentally derived data. WP5 concentrates on the improvement of our knowledge on the chemical reaction pathways of initial radiation chemical species, in particular those that migrate to react with the DNA, and on their simulation in track structure codes. WP6 uses track structure simulation codes to model the production of initial physical and chemical species, within DNA, water and other components of mammalian cells, in the tracks of charged particles following the physical processes of energy transfer, migration, absorption, and decay of excited states. WP7 concentrates on the determination of the start spectra of those tracks considered in WP6 for different impinging radiation fields and different irradiated biological objects. (orig.)

  11. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process
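
    The transition-state-theory style rate law referred to here is commonly written as a forward rate scaled by a chemical-affinity (saturation) term; a minimal sketch with illustrative parameter values, not those of any particular glass model, is shown below.

```python
import numpy as np

R = 8.314  # J/(mol K)

def glass_dissolution_rate(T_kelvin, pH, Q_over_K,
                           k0=1e3, Ea=80e3, eta=-0.4, sigma=1.0):
    """TST-style rate law: rate = k0 * exp(-Ea/RT) * a_H+**eta * (1 - (Q/K)**(1/sigma)).

    k0 (intrinsic rate constant, g m-2 d-1), Ea (activation energy, J/mol),
    eta (pH dependence) and sigma (Temkin coefficient) are illustrative values only;
    Q/K is the saturation state of the rate-controlling reaction.
    """
    a_h = 10.0 ** (-pH)
    affinity_term = 1.0 - Q_over_K ** (1.0 / sigma)
    return k0 * np.exp(-Ea / (R * T_kelvin)) * a_h ** eta * affinity_term

# The rate falls as the solution approaches saturation with the rate-limiting phase
for q in (0.0, 0.5, 0.9, 0.99):
    print(f"Q/K = {q:4.2f}  rate = {glass_dissolution_rate(363.15, 9.0, q):.3e}")
```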

  12. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  13. Supporting Mechanistic Reasoning in Domain-Specific Contexts

    Science.gov (United States)

    Weinberg, Paul J.

    2017-01-01

    Mechanistic reasoning is an epistemic practice central within science, technology, engineering, and mathematics disciplines. Although there has been some work on mechanistic reasoning in the research literature and standards documents, much of this work targets domain-general characterizations of mechanistic reasoning; this study provides…

  14. A poor metabolizer of both CYP2C19 and CYP2D6 identified by mechanistic pharmacokinetic simulation in a fatal drug poisoning case involving venlafaxine

    DEFF Research Database (Denmark)

    Jornil, J; Nielsen, T S; Rosendal, I

    2013-01-01

    Abstract We present a fatal drug poisoning case involving venlafaxine (VEN). The deceased took his medication regularly (including 150 mg VEN twice daily), and nothing in the case or autopsy findings pointed towards suicide. The toxicological assessment concluded that the cause of death was most... combined with genotyping were considered very useful in this fatal drug poisoning case. Keywords: CYP2D6; CYP2C19; Venlafaxine; Poor metabolizer; Drug poisoning; Mechanistic pharmacokinetic simulation

  15. Spatially explicit and stochastic simulation of forest landscape fire disturbance and succession

    Science.gov (United States)

    Hong S. He; David J. Mladenoff

    1999-01-01

    Understanding disturbance and recovery of forest landscapes is a challenge because of complex interactions over a range of temporal and spatial scales. Landscape simulation models offer an approach to studying such systems at broad scales. Fire can be simulated spatially using mechanistic or stochastic approaches. We describe the fire module in a spatially explicit,...

  16. Precision and accuracy of mechanistic-empirical pavement design

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-09-01

    Full Text Available are discussed in general. The effects of variability and error on the design accuracy and design risk are lastly illustrated using a simple mechanistic-empirical design problem, showing that the engineering models alone determine the accuracy...

  17. A Mechanistically Informed User-Friendly Model to Predict Greenhouse Gas (GHG) Fluxes and Carbon Storage from Coastal Wetlands

    Science.gov (United States)

    Abdul-Aziz, O. I.; Ishtiaq, K. S.

    2015-12-01

    We present a user-friendly modeling tool on MS Excel to predict the greenhouse gas (GHG) fluxes and estimate potential carbon sequestration from the coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical and ecological drivers were first determined by employing a systematic data-analytics method, including Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. The mechanistic knowledge and understanding was then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes based on a sub-set of climatic, hydrologic and environmental drivers such as the photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from the Waquoit Bay, MA. The model estimated the annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and estuary. The Excel Spreadsheet model is a simple ecological engineering tool for coastal carbon management and their incorporation into a potential carbon market under a changing climate, sea level and environment. Specifically, the model can help to determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
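
    The parsimonious power-law structure described above can be emulated with a simple multiplicative regression fitted in log space; in the sketch below the drivers, data and fitted coefficients are hypothetical and are not the Waquoit Bay results.

```python
import numpy as np

# Hypothetical observations: PAR (umol m-2 s-1), soil temperature (deg C),
# salinity (ppt) and measured CO2 flux (umol m-2 s-1)
par = np.array([200.0, 400.0, 800.0, 1200.0, 1600.0])
temp = np.array([12.0, 16.0, 20.0, 24.0, 28.0])
sal = np.array([28.0, 30.0, 25.0, 27.0, 31.0])
flux = np.array([1.1, 2.0, 3.6, 5.0, 6.3])

# Power-law model flux = a * PAR^b * T^c * S^d becomes linear after a log transform
X = np.column_stack([np.ones_like(par), np.log(par), np.log(temp), np.log(sal)])
coef, *_ = np.linalg.lstsq(X, np.log(flux), rcond=None)
a, b, c, d = np.exp(coef[0]), coef[1], coef[2], coef[3]
print(f"flux = {a:.3g} * PAR^{b:.2f} * T^{c:.2f} * S^{d:.2f}")
```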

  18. Comparing Productivity Simulated with Inventory Data Using Different Modelling Technologies

    Science.gov (United States)

    Klopf, M.; Pietsch, S. A.; Hasenauer, H.

    2009-04-01

    The Lime Stone National Park in Austria was established in 1997 to protect sensitive limestone soils from degradation due to heavy forest management. Since 1997 the management activities were successively reduced, standing volume and coarse woody debris (CWD) increased, and degraded soils began to recover. One option to study the rehabilitation process towards a natural virgin forest state is the use of modelling technology. In this study we will test two different modelling approaches for their applicability to Lime Stone National Park. We will compare the standing tree volume simulated with (i) the individual tree growth model MOSES, and (ii) the species and management sensitive adaptation of the biogeochemical-mechanistic model Biome-BGC. The results from the two models are compared with field observations from repeated permanent forest inventory plots of the Lime Stone National Park in Austria. The simulated CWD predictions of the BGC-model were compared with dead wood measurements (standing and lying dead wood) recorded at the permanent inventory plots. The inventory was established between 1994 and 1996 and remeasured from 2004 to 2005. For this analysis 40 plots of this inventory were selected which comprise the required dead wood components and are dominated by a single tree species. First we used the distance-dependent individual tree growth model MOSES to derive the standing timber and the amount of mortality per hectare. MOSES is initialized with the inventory data at plot establishment and each sampling plot is treated as a forest stand. Biome-BGC is a process-based biogeochemical model with extensions for Austrian tree species, a self-initialization and a forest management tool. The initialization for the actual simulations with the BGC model was done as follows: We first used spin-up runs to derive a balanced forest vegetation, similar to an undisturbed forest. Next we considered the management history of the past centuries (heavy clear cuts

  19. Semi-Mechanistic Population Pharmacokinetic Modeling of L-Histidine Disposition and Brain Uptake in Wildtype and Pht1 Null Mice.

    Science.gov (United States)

    Wang, Xiao-Xing; Li, Yang-Bing; Feng, Meihua R; Smith, David E

    2018-01-05

    To develop a semi-mechanistic population pharmacokinetic (PK) model to quantitate the disposition kinetics of L-histidine, a peptide-histidine transporter 1 (PHT1) substrate, in the plasma, cerebrospinal fluid and brain parenchyma of wildtype (WT) and Pht1 knockout (KO) mice. L-[14C]Histidine (L-His) was administrated to WT and KO mice via tail vein injection, after which plasma, cerebrospinal fluid (CSF) and brain parenchyma samples were collected. A PK model was developed using non-linear mixed effects modeling (NONMEM). The disposition of L-His between the plasma, brain, and CSF was described by a combination of PHT1-mediated uptake, CSF bulk flow and first-order micro-rate constants. The PK profile of L-His was best described by a four-compartment model. A more rapid uptake of L-His in brain parenchyma was observed in WT mice due to PHT1-mediated uptake, a process characterized by a Michaelis-Menten component (Vmax = 0.051 nmol/min and Km = 34.94 μM). A semi-mechanistic population PK model was successfully developed, for the first time, to quantitatively characterize the disposition kinetics of L-His in brain under in vivo conditions. This model may prove a useful tool in predicting the uptake of L-His, and possibly other PHT1 peptide/mimetic substrates, for drug delivery to the brain.
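
    The core of such a model is a set of compartmental mass balances in which brain uptake has a saturable, PHT1-mediated component. The sketch below uses the Michaelis-Menten constants quoted above, but the compartment structure, all other rate constants, the volume and the dose are assumed purely for illustration.

```python
from scipy.integrate import solve_ivp

# Michaelis-Menten constants quoted in the abstract; everything else is assumed
Vmax, Km = 0.051, 34.94   # nmol/min, uM
V_plasma = 1.0            # assumed plasma distribution volume (mL)
k_el = 0.05               # assumed first-order plasma elimination (1/min)
k_out = 0.01              # assumed brain-to-plasma efflux (1/min)
dose = 50.0               # assumed IV bolus (nmol)

def odes(t, y):
    A_plasma, A_brain = y                        # amounts (nmol)
    C_plasma = A_plasma / V_plasma               # nmol/mL, numerically equal to uM
    uptake = Vmax * C_plasma / (Km + C_plasma)   # saturable PHT1-mediated brain uptake
    return [-k_el * A_plasma - uptake + k_out * A_brain,
            uptake - k_out * A_brain]

sol = solve_ivp(odes, (0.0, 120.0), [dose, 0.0], max_step=1.0)   # 120 min post-dose
print(f"brain amount at 120 min: {sol.y[1, -1]:.3f} nmol")
```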

  20. A new mechanistic and engineering fission gas release model for a uranium dioxide fuel

    International Nuclear Information System (INIS)

    Lee, Chan Bock; Yang, Yong Sik; Kim, Dae Ho; Kim, Sun Ki; Bang, Je Geun

    2008-01-01

    A mechanistic and engineering fission gas release model (MEGA) for uranium dioxide (UO2) fuel was developed. It was based upon the diffusional release of fission gases from inside the grain to the grain boundary and the release of fission gases from the grain boundary to the external surface by the interconnection of the fission gas bubbles in the grain boundary. The capability of the MEGA model was validated by a comparison with the fission gas release data base and the sensitivity analyses of the parameters. It was found that the MEGA model correctly predicts the fission gas release in the broad range of fuel burnups up to 98 MWd/kgU. Especially, the enhancement of fission gas release in a high-burnup fuel, and the reduction of fission gas release at a high burnup by increasing the UO2 grain size were found to be correctly predicted by the MEGA model without using any artificial factor. (author)

  1. Integrity: A semi-mechanistic model for stress corrosion cracking of fuel

    Energy Technology Data Exchange (ETDEWEB)

    Tayal, M; Hallgrimson, K; Macquarrie, J; Alavi, P [Atomic Energy of Canada Ltd., Mississauga, ON (Canada); Sato, S; Kinoshita, Y; Nishimura, T [Electric Power Development Co. Ltd., Tokyo (Japan)

    1997-08-01

    In this paper we describe the features, validation, and illustrative applications of a semi-mechanistic model, INTEGRITY, which calculates the probability of fuel defects due to stress corrosion cracking. The model expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The assessments of defect probability continue to reflect the influence of conventional parameters like ramped power, power-ramp, burnup and Canlub coating. In addition, the INTEGRITY model provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation. Some examples of the latter include pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, coolant temperature and pressure, etc. The model has been fitted to a database of 554 power-ramp irradiations of CANDU fuel with and without Canlub. For this database the INTEGRITY model calculates 75 defects vs 75 actual defects. Similarly good agreements were noted in the different sub-groups of the data involving non-Canlub, thin-Canlub, and thick-Canlub fuel. Moreover, the shapes and the locations of the defect thresholds were consistent with all the above defects as well as with additional 14 ripple defects that were not in the above database. Two illustrative examples demonstrate how the defect thresholds are influenced by changes in the internal design of the fuel element and by extended burnup. (author). 19 refs, 7 figs.

  2. Integrity: A semi-mechanistic model for stress corrosion cracking of fuel

    International Nuclear Information System (INIS)

    Tayal, M.; Hallgrimson, K.; Macquarrie, J.; Alavi, P.; Sato, S.; Kinoshita, Y.; Nishimura, T.

    1997-01-01

    In this paper we describe the features, validation, and illustrative applications of a semi-mechanistic model, INTEGRITY, which calculates the probability of fuel defects due to stress corrosion cracking. The model expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The assessments of defect probability continue to reflect the influence of conventional parameters like ramped power, power-ramp, burnup and Canlub coating. In addition, the INTEGRITY model provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation. Some examples of the latter include pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, coolant temperature and pressure, etc. The model has been fitted to a database of 554 power-ramp irradiations of CANDU fuel with and without Canlub. For this database the INTEGRITY model calculates 75 defects vs 75 actual defects. Similarly good agreements were noted in the different sub-groups of the data involving non-Canlub, thin-Canlub, and thick-Canlub fuel. Moreover, the shapes and the locations of the defect thresholds were consistent with all the above defects as well as with additional 14 ripple defects that were not in the above database. Two illustrative examples demonstrate how the defect thresholds are influenced by changes in the internal design of the fuel element and by extended burnup. (author). 19 refs, 7 figs

  3. Application of a Mechanistic Model as a Tool for On-line Monitoring of Pilot Scale Filamentous Fungal Fermentation Processes - The Importance of Evaporation Effects

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    A mechanistic model-based soft sensor is developed and validated for 550L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor is comprised of a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation...... a historical dataset of eleven batches from the fermentation pilot plant (550L) at Novozymes A/S. The model is then implemented on-line in 550L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on fourteen new batches utilizing a new strain. The product...... block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed using standard, available on-line measurements. This parameter estimation block, is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate...

  4. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition

    Science.gov (United States)

    Woodward, Bill

    2016-01-01

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition. PMID:27077845

  5. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition.

    Science.gov (United States)

    Woodward, Bill

    2016-04-11

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition.

  6. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    Science.gov (United States)

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  7. Incorporation of FcRn-mediated disposition model to describe the population pharmacokinetics of therapeutic monoclonal IgG antibody in clinical patients.

    Science.gov (United States)

    Ng, Chee M

    2016-03-01

    The two-compartment linear model used to describe the population pharmacokinetics (PK) of many therapeutic monoclonal antibodies (TMAbs) offered little biological insight into antibody disposition in humans. The purpose of this study is to develop a semi-mechanistic FcRn-mediated IgG disposition model to describe the population PK of TMAbs in clinical patients. A standard two-compartment linear PK model from a previously published population PK model of pertuzumab was used to simulate intensive PK data of 100 subjects for model development. Two different semi-mechanistic FcRn-mediated IgG disposition models were developed, and First Order Conditional Estimation (FOCE) with the interaction method in NONMEM was used to obtain the final model estimates. The performance of these models was then compared with the two-compartment linear PK model used to simulate the data for model development. A semi-mechanistic FcRn-mediated IgG disposition model consisting of a peripheral tissue compartment and FcRn-containing endosomes in the central compartment best describes the simulated pertuzumab population PK data. This semi-mechanistic population PK model had the same number of model parameters and produced very similar concentration-time profiles, but provided additional biological insight into FcRn-mediated IgG disposition in human subjects compared with the standard two-compartment linear PK model. This first reported semi-mechanistic model may serve as an important model framework for developing future population PK models of TMAbs in clinical patients. Copyright © 2015 John Wiley & Sons, Ltd.
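
    As context for the comparison above, the sketch below shows the conventional two-compartment linear PK structure that the semi-mechanistic FcRn model was benchmarked against. It is a generic illustration only: the clearance, inter-compartmental flow, volumes, and dose are hypothetical placeholders, not the pertuzumab estimates used in the study.

```python
# Minimal sketch (not the paper's model): a standard two-compartment linear PK
# model of the kind the study uses as its reference. Parameter values are
# illustrative placeholders, not pertuzumab estimates.
import numpy as np
from scipy.integrate import solve_ivp

CL, Q, V1, V2 = 0.2, 0.5, 3.0, 2.0   # L/day, L/day, L, L (hypothetical)

def two_compartment(t, y):
    a1, a2 = y                        # drug amounts (mg) in central/peripheral
    c1, c2 = a1 / V1, a2 / V2         # concentrations (mg/L)
    da1 = -CL * c1 - Q * c1 + Q * c2  # elimination plus inter-compartmental exchange
    da2 = Q * c1 - Q * c2
    return [da1, da2]

# Single IV bolus of 420 mg (hypothetical), simulated over 28 days
sol = solve_ivp(two_compartment, (0, 28), y0=[420.0, 0.0], dense_output=True)
print(sol.y[0, -1] / V1)              # central concentration at day 28 (mg/L)
```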

  8. Projected Climate Impacts to South African Maize and Wheat Production in 2055: A Comparison of Empirical and Mechanistic Modeling Approaches

    Science.gov (United States)

    Estes, Lyndon D.; Beukes, Hein; Bradley, Bethany A.; Debats, Stephanie R.; Oppenheimer, Michael; Ruane, Alex C.; Schulze, Roland; Tadross, Mark

    2013-01-01

    Crop model-specific biases are a key uncertainty affecting our understanding of climate change impacts on agriculture. There is increasing research focus on intermodel variation, but comparisons between mechanistic (MMs) and empirical models (EMs) are rare despite both being used widely in this field. We combined MMs and EMs to project future (2055) changes in the potential distribution (suitability) and productivity of maize and spring wheat in South Africa under 18 downscaled climate scenarios (9 models run under 2 emissions scenarios). EMs projected larger yield losses or smaller gains than MMs. The EMs' median-projected maize and wheat yield changes were 3.6% and 6.2%, respectively, compared to 6.5% and 15.2% for the MM. The EM projected a 10% reduction in the potential maize growing area, where the MM projected a 9% gain. Both models showed increases in the potential spring wheat production region (EM = 48%, MM = 20%), but these results were more equivocal because both models (particularly the EM) substantially overestimated the extent of current suitability. The substantial water-use efficiency gains simulated by the MMs under elevated CO2 accounted for much of the EM-MM difference, but EMs may have more accurately represented crop temperature sensitivities. Our results align with earlier studies showing that EMs may show larger climate change losses than MMs. Crop forecasting efforts should expand to include EM-MM comparisons to provide a fuller picture of crop-climate response uncertainties.

  9. Mechanistic model for Sr and Ba release from severely damaged fuel

    International Nuclear Information System (INIS)

    Rest, J.; Cronenberg, A.W.

    1985-11-01

    Among radionuclides associated with fission product release during severe accidents, the primary ones with health consequences are the volatile species of I, Te, and Cs, and the next most important are Sr, Ba, and Ru. Considerable progress has been made in the mechanistic understanding of I, Cs, Te, and noble gas release; however, no capability presently exists for estimating the release of Sr, Ba, and Ru. This paper presents a description of the primary physical/chemical models recently incorporated into the FASTGRASS-VFP (volatile fission product) code for the estimation of Sr and Ba release. FASTGRASS-VFP release predictions are compared with two data sets: (1) data from out-of-reactor induction-heating experiments on declad low-burnup (1000 and 4000 MWd/t) pellets, and (2) data from the more recent in-reactor PBF Severe Fuel Damage Tests, in which one-meter-long, trace-irradiated (89 MWd/t) and normally irradiated (approximately 35,000 MWd/t) fuel rods were tested under accident conditions. 10 refs

  10. PBDOWN - a computer code for simulating core material discharge and thermal to mechanical energy conversion in LMFBR hypothetical accidents

    International Nuclear Information System (INIS)

    Royl, P.

    1981-01-01

    PBDOWN is a computer code that simulates the blowdown of confined boiling materials ('pools') into a colder upper coolant plenum as time-dependent ejection and expansion, with consideration of a few selected exchange processes. Its application is restricted to situations resulting from hypothetical loss of flow (LOF) accidents in LMFBRs, where enough voiding has occurred that in-core sodium vapor pressures become negligible. PBDOWN considers one working fluid for the discharge process (either fuel or steel) and a maximum of two working fluids (either fuel and sodium or steel and sodium) for the expansion process in the upper coolant plenum. Entrainment of sodium at the accelerated bubble liquid interfaces is mechanistically calculated by a Taylor instability entrainment model. Simulation of a hemispherical expansion form together with this mechanistic entrainment model gives a new integrated calculation of the time-dependent sodium mass in the bubble. The paper summarizes the basic equations and assumptions of this computer model. Sample results compare different heat transfer and Na entrainment models during steel- and fuel-driven discharge processes. Mechanistic sodium entrainment simulation for SNR-type reactors coupled with a realistic heat transfer model is shown to reduce the integral mechanical work potential by a factor of 1.3 to 2.0 relative to the isentropic energy of the discharge working fluids. (orig.)

  11. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  12. Study of n-Butyl Acrylate Self-Initiation Reaction Experimentally and via Macroscopic Mechanistic Modeling

    Directory of Open Access Journals (Sweden)

    Ahmad Arabi Shamsabadi

    2016-04-01

    This paper presents an experimental study of the self-initiation reaction of n-butyl acrylate (n-BA) in free-radical polymerization. For the first time, the frequency factor and activation energy of the monomer self-initiation reaction are estimated from measurements of n-BA conversion in free-radical homo-polymerization initiated only by the monomer. The estimation was carried out using a macroscopic mechanistic mathematical model of the reactor. In addition to already-known reactions that contribute to the polymerization, the model considers an n-BA self-initiation reaction mechanism that is based on our previous electronic-level first-principles theoretical study of the self-initiation reaction. Reaction rate equations are derived using the method of moments. The reaction-rate parameter estimates obtained from conversion measurements agree well with estimates obtained via our purely theoretical quantum chemical calculations.

  13. MODELING AND SIMULATION OF RELIEF INFLUENCE ON EUCALYPTUS FORESTS: INTERACTION BETWEEN SOLAR IRRADIANCE AND PRODUCTIVITY

    Directory of Open Access Journals (Sweden)

    Yhasmin Paiva Rody

    2016-04-01

    This study aimed to verify the differences in radiation intensity as a function of distinct relief exposure surfaces and to quantify these effects on the leaf area index (LAI) and other variables expressing eucalyptus forest productivity, for simulations in a process-based growth model. The study was carried out at two contrasting edaphoclimatic locations in the Rio Doce basin in Minas Gerais, Brazil. Two stands with 32-year-old plantations were used, allocating fixed plots in locations with northern and southern exposure surfaces. The meteorological data were obtained from two automated weather stations located near the study sites. Solar radiation was corrected for terrain inclination and exposure surface, since it is measured on a horizontal plane, perpendicular to the local vertical. The LAI values collected in the field were used. For the comparative simulations of productivity variation, the mechanistic 3PG model was used, considering the relief exposure surfaces. During most of the year, the southern surfaces showed lower availability of incident solar radiation, with losses of up to 66% compared to the same surface treated as horizontal, probably related to their geographical location and higher declivity. Higher values were obtained for the plantings located on the northern surface for the variables LAI, volume and mean annual wood increment, and this tendency was repeated in the 3PG model simulations.
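
    The correction of horizontally measured solar radiation to an inclined surface mentioned above can be illustrated with the standard direct-beam geometry below. This is a generic sketch, not the procedure used in the study; the slope, aspect, and solar angles are arbitrary example values.

```python
# Minimal sketch of a beam-irradiance tilt correction: ratio of direct-beam
# irradiance on a sloped surface to that on a horizontal plane, from the angle
# of incidence. All angles and example values are illustrative assumptions.
import numpy as np

def beam_tilt_ratio(slope, aspect, zenith, sun_azimuth):
    """All angles in degrees; aspect/azimuth measured clockwise from north."""
    b, g = np.radians(slope), np.radians(aspect)
    z, gs = np.radians(zenith), np.radians(sun_azimuth)
    cos_inc = np.cos(b) * np.cos(z) + np.sin(b) * np.sin(z) * np.cos(gs - g)
    return max(cos_inc, 0.0) / np.cos(z)   # clipped: the surface may be self-shaded

# North- vs south-facing 20-degree slopes, sun at 40-degree zenith due north
# (a southern-hemisphere noon, as in Minas Gerais)
print(beam_tilt_ratio(20, 0, 40, 0), beam_tilt_ratio(20, 180, 40, 0))
```

    With these example angles the north-facing slope receives more beam radiation than a horizontal plane and the south-facing slope receives substantially less, which is qualitatively the pattern reported above.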

  14. Modeling of isothermal bubbly flow with interfacial area transport equation and bubble number density approach

    Energy Technology Data Exchange (ETDEWEB)

    Sari, Salih [Hacettepe University, Department of Nuclear Engineering, Beytepe, 06800 Ankara (Turkey); Erguen, Sule [Hacettepe University, Department of Nuclear Engineering, Beytepe, 06800 Ankara (Turkey); Barik, Muhammet; Kocar, Cemil; Soekmen, Cemal Niyazi [Hacettepe University, Department of Nuclear Engineering, Beytepe, 06800 Ankara (Turkey)

    2009-03-15

    In this study, isothermal turbulent bubbly flow is mechanistically modeled. For the modeling, Fluent version 6.3.26 is used as the computational fluid dynamics solver. First, the mechanistic models that simulate the interphase momentum transfer between the gas (bubbles) and liquid (continuous) phases are investigated, and proper models for the known flow conditions are selected. Second, an interfacial area transport equation (IATE) solution is added to Fluent's solution scheme in order to model the interphase momentum transfer mechanisms. In addition to solving the IATE, a bubble number density (BND) approach is also added to Fluent and used in the simulations. Different source/sink models derived for the IATE and BND models are also investigated. Simulations of experiments based on the available data in the literature are performed using the IATE and BND models in two and three dimensions. The results show that the simulations performed with the IATE and BND models agree with each other and with the experimental data. The simulations performed in three dimensions give better agreement with the experimental data.
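
    As a minimal illustration of how the bubble number density and interfacial area descriptions relate, the sketch below applies the geometric identities for spherical bubbles; it is not part of the IATE or BND source/sink models investigated in the study, and the void fraction and number density are arbitrary example values.

```python
# Minimal sketch of the geometric link between the bubble-number-density (BND)
# and interfacial-area views of a bubbly flow: for locally spherical bubbles,
# void fraction and number density fix the interfacial area concentration.
import numpy as np

def interfacial_area_from_bnd(alpha, n):
    """alpha: void fraction [-], n: bubble number density [1/m^3]."""
    d = (6.0 * alpha / (n * np.pi)) ** (1.0 / 3.0)   # equivalent bubble diameter [m]
    a_i = n * np.pi * d ** 2                          # interfacial area concentration [1/m]
    return d, a_i

d, a_i = interfacial_area_from_bnd(alpha=0.05, n=2.0e7)
print(f"d = {d*1e3:.2f} mm, a_i = {a_i:.1f} 1/m, check 6*alpha/d = {6*0.05/d:.1f}")
```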

  15. Modeling of isothermal bubbly flow with interfacial area transport equation and bubble number density approach

    International Nuclear Information System (INIS)

    Sari, Salih; Erguen, Sule; Barik, Muhammet; Kocar, Cemil; Soekmen, Cemal Niyazi

    2009-01-01

    In this study, isothermal turbulent bubbly flow is mechanistically modeled. For the modeling, Fluent version 6.3.26 is used as the computational fluid dynamics solver. First, the mechanistic models that simulate the interphase momentum transfer between the gas (bubbles) and liquid (continuous) phases are investigated, and proper models for the known flow conditions are selected. Second, an interfacial area transport equation (IATE) solution is added to Fluent's solution scheme in order to model the interphase momentum transfer mechanisms. In addition to solving the IATE, a bubble number density (BND) approach is also added to Fluent and used in the simulations. Different source/sink models derived for the IATE and BND models are also investigated. Simulations of experiments based on the available data in the literature are performed using the IATE and BND models in two and three dimensions. The results show that the simulations performed with the IATE and BND models agree with each other and with the experimental data. The simulations performed in three dimensions give better agreement with the experimental data.

  16. Social deprivation and burden of influenza: Testing hypotheses and gaining insights from a simulation model for the spread of influenza

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    2015-06-01

    Factors associated with the burden of influenza among vulnerable populations have mainly been identified using statistical methodologies. Complex simulation models provide mechanistic explanations, in terms of spatial heterogeneity and contact rates, while controlling other factors, and may be used to better understand statistical patterns and, ultimately, design optimal population-level interventions. We extended a sophisticated simulation model, which was applied to forecast epidemics and validated for predictive ability, to identify mechanisms for the empirical relationship between social deprivation and the burden of influenza. Our modeled scenarios and associated epidemic metrics systematically assessed whether neighborhood composition and/or spatial arrangement could qualitatively replicate this empirical relationship. We further used the model to determine the consequences of local-scale heterogeneities for larger-scale disease spread. Our findings indicated that both neighborhood composition and spatial arrangement were critical to qualitatively match the empirical relationship of interest. Also, when social deprivation was fully included in the model, we observed lower age-based attack rates and a greater delay in epidemic peak week in the most socially deprived neighborhoods. Insights from simulation models complement current understanding from statistical association studies. Additional insights from our study are: (1) heterogeneous spatial arrangement of neighborhoods is a necessary condition for simulating observed disparities in the burden of influenza, and (2) unmeasured factors may lead to a better quantitative match between simulated and observed rate ratios in the burden of influenza between the most and least socially deprived populations.

  17. Simulating the effects of climate change on the distribution of an invasive plant, using a high resolution, local scale, mechanistic approach: challenges and insights.

    Science.gov (United States)

    Fennell, Mark; Murphy, James E; Gallagher, Tommy; Osborne, Bruce

    2013-04-01

    The growing economic and ecological damage associated with biological invasions, which will likely be exacerbated by climate change, necessitates improved projections of invasive spread. Generally, potential changes in species distribution are investigated using climate envelope models; however, the reliability of such models has been questioned and they are not suitable for use at local scales. At this scale, mechanistic models are more appropriate. This paper discusses some key requirements for mechanistic models and utilises a newly developed model (PSS[gt]) that incorporates the influence of habitat type and related features (e.g., roads and rivers), as well as demographic processes and propagule dispersal dynamics, to model climate induced changes in the distribution of an invasive plant (Gunnera tinctoria) at a local scale. A new methodology is introduced, dynamic baseline benchmarking, which distinguishes climate-induced alterations in species distributions from other potential drivers of change. Using this approach, it was concluded that climate change, based on IPCC and C4i projections, has the potential to increase the spread-rate and intensity of G. tinctoria invasions. Increases in the number of individuals were primarily due to intensification of invasion in areas already invaded or in areas projected to be invaded in the dynamic baseline scenario. Temperature had the largest influence on changes in plant distributions. Water availability also had a large influence and introduced the most uncertainty in the projections. Additionally, due to the difficulties of parameterising models such as this, the process has been streamlined by utilising methods for estimating unknown variables and selecting only essential parameters. © 2012 Blackwell Publishing Ltd.

  18. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  19. Assessing the potential value for an automated dairy cattle body condition scoring system through stochastic simulation

    NARCIS (Netherlands)

    Bewley, J.M.; Boehlje, M.D.; Gray, A.W.; Hogeveen, H.; Kenyon, S.J.; Eicher, S.D.; Schutz, M.M.

    2010-01-01

    Purpose – The purpose of this paper is to develop a dynamic, stochastic, mechanistic simulation model of a dairy business to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economic complexities of a dairy farm

  20. Responses to atmospheric CO2 concentrations in crop simulation models: a review of current simple and semicomplex representations and options for model development.

    Science.gov (United States)

    Vanuytrecht, Eline; Thorburn, Peter J

    2017-05-01

    Elevated atmospheric CO2 concentrations ([CO2]) cause direct changes in crop physiological processes (e.g. photosynthesis and stomatal conductance). To represent these CO2 responses, commonly used crop simulation models have been amended, using simple and semicomplex representations of the processes involved. Yet, there is no standard approach to and often poor documentation of these developments. This study used a bottom-up approach (starting with the APSIM framework as case study) to evaluate modelled responses in a consortium of commonly used crop models and illuminate whether variation in responses reflects true uncertainty in our understanding compared to arbitrary choices of model developers. Diversity in simulated CO2 responses and limited validation were common among models, both within the APSIM framework and more generally. Whereas production responses show some consistency up to moderately high [CO2] (around 700 ppm), transpiration and stomatal responses vary more widely in nature and magnitude (e.g. a decrease in stomatal conductance varying between 35% and 90% among models was found for [CO2] doubling to 700 ppm). Most notably, nitrogen responses were found to be included in few crop models despite being commonly observed and critical for the simulation of photosynthetic acclimation, crop nutritional quality and carbon allocation. We suggest harmonization and consideration of more mechanistic concepts in particular subroutines, for example, for the simulation of N dynamics, as a way to improve our predictive understanding of CO2 responses and capture secondary processes. Intercomparison studies could assist in this aim, provided that they go beyond simple output comparison and explicitly identify the representations and assumptions that are causal for intermodel differences. Additionally, validation and proper documentation of the representation of CO2 responses within models should be prioritized. © 2017 John Wiley & Sons Ltd.
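
    To make the notion of a "simple representation" concrete, the sketch below implements one generic form such a subroutine can take: a linear multiplier on radiation-use efficiency between a reference and a doubled [CO2]. The anchor concentrations and the 15% gain at doubling are illustrative assumptions, not values taken from APSIM or any other model in the consortium.

```python
# Minimal sketch of a simple CO2-response representation: a unitless multiplier
# applied to radiation-use (or transpiration) efficiency as a linear function
# of [CO2]. All anchor points and the gain are illustrative placeholders.
def co2_rue_multiplier(co2_ppm, ref_ppm=350.0, doubled_ppm=700.0, gain_at_doubling=0.15):
    """Return a factor >= 1 scaling potential growth under elevated CO2."""
    frac = (co2_ppm - ref_ppm) / (doubled_ppm - ref_ppm)
    return 1.0 + gain_at_doubling * max(frac, 0.0)

for c in (350, 550, 700):
    print(c, round(co2_rue_multiplier(c), 3))
```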

  1. Using a Mechanistic Reactive Transport Model to Represent Soil Organic Matter Dynamics and Climate Sensitivity

    Science.gov (United States)

    Guerry, N.; Riley, W. J.; Maggi, F.; Torn, M. S.; Kleber, M.

    2011-12-01

    The nature of long term Soil Organic Matter (SOM) dynamics is uncertain and the mechanisms involved are crudely represented in site, regional, and global models. Recent work challenging the paradigm that SOM is stabilized because of its sequential transformations to more intrinsically recalcitrant compounds motivated us to develop a mechanistic modeling framework that can be used to test hypotheses of SOM dynamics. We developed our C cycling model in TOUGHREACT, an established 3-dimensional reactive transport solver that accounts for multiple phases (aqueous, gaseous, sorbed), multiple species, advection and diffusion, and multiple microbial populations. Energy and mass exchange through the soil boundaries are accounted for via ground heat flux, rainfall, C sources (e.g., exudation, woody, leaf, root litter) and C losses (e.g., CO2 emissions and DOC deep percolation). SOM is categorized according to the various types of compounds commonly found in the above mentioned C sources and microbial byproducts, including poly- and monosaccharides, lignin, amino compounds, organic acids, nucleic acids, lipids, and phenols. Each of these compounds is accounted for by one or more representative species in the model. A reaction network was developed to describe the microbially-mediated processes and chemical interactions of these species, including depolymerization, microbial assimilation, respiration and deposition of byproducts, and incorporation of dead biomass into SOM stocks. Enzymatic reactions are characterized by Michaelis-Menten kinetics, with maximum reaction rates determined by the species' O/C ratio. Microbial activity is further regulated by soil moisture content, O2 availability, pH, and temperature. For the initial set of simulations, literature values were used to constrain microbial Monod parameters, Michaelis-Menten parameters, sorption parameters, physical protection, partitioning of microbial byproducts, and partitioning of litter inputs, although there is
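
    A minimal sketch of the type of kinetics described, Michaelis-Menten (Monod-type) depolymerization modulated by simple moisture and temperature factors, is given below; the functional forms of the modifiers and all parameter values are illustrative assumptions rather than the TOUGHREACT implementation.

```python
# Minimal illustrative sketch of microbially mediated depolymerization kinetics:
# a Michaelis-Menten saturation term scaled by simple moisture and temperature
# modifiers. Parameter values are placeholders, not those used in the model.
def depolymerization_rate(substrate, biomass, vmax=1e-6, km=0.5,
                          theta=0.3, theta_opt=0.4, temp_c=15.0, q10=2.0):
    """Rate of substrate consumption (illustrative units, e.g. mol/kg soil/s)."""
    mm_term = substrate / (km + substrate)            # Michaelis-Menten saturation
    f_water = min(theta / theta_opt, 1.0)             # simple moisture limitation
    f_temp = q10 ** ((temp_c - 20.0) / 10.0)          # Q10 temperature response
    return vmax * biomass * mm_term * f_water * f_temp

print(depolymerization_rate(substrate=2.0, biomass=0.01))
```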

  2. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    Science.gov (United States)

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease in computing time. With a single, multicore compute node (bottom result), the computing time indicated an 81.8% decrease relative to using serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
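
    The "embarrassingly parallel" structure exploited in the study can be sketched generically as below: each Monte Carlo replicate is independent, so replicates map cleanly onto separate worker processes or compute nodes. The toy run_replicate() function only stands in for a real SyncroSim model run.

```python
# Minimal sketch of embarrassingly parallel Monte Carlo replicates: independent
# runs are farmed out to worker processes and their results merged afterwards.
# run_replicate() is a hypothetical stand-in for a real model invocation.
from concurrent.futures import ProcessPoolExecutor
import random

def run_replicate(seed, n_cells=10_000, n_steps=50, p_transition=0.02):
    """Toy state-and-transition replicate: count cells that changed state."""
    rng = random.Random(seed)
    transitioned = 0
    for _ in range(n_cells):
        if any(rng.random() < p_transition for _ in range(n_steps)):
            transitioned += 1
    return transitioned

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_replicate, range(20)))   # 20 Monte Carlo runs
    print(sum(results) / len(results), "cells transitioned on average")
```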

  3. Fast charging technique for high power LiFePO4 batteries: A mechanistic analysis of aging

    Science.gov (United States)

    Anseán, D.; Dubarry, M.; Devie, A.; Liaw, B. Y.; García, V. M.; Viera, J. C.; González, M.

    2016-07-01

    One of the major issues hampering the acceptance of electric vehicles (EVs) is the anxiety associated with long charging times. Hence, the ability to fast-charge lithium-ion battery (LIB) systems is gaining notable interest. However, fast charging is not tolerated by all LIB chemistries because it affects battery functionality and accelerates aging processes. Here, we investigate the long-term effects of multistage fast charging on a commercial high-power LiFePO4-based cell and compare it to another cell tested under standard charging. Coupling incremental capacity (IC) and IC peak area analysis together with mechanistic model simulations ('Alawa' toolbox with harvested half-cell data), we quantify the degradation modes that cause aging of the tested cells. The results show that the proposed fast charging technique caused aging effects similar to those of standard charging. The degradation is caused by a linear loss of lithium inventory, coupled with a lesser degree of linear loss of active material on the negative electrode. This study validates fast charging as a feasible means of operation for this particular LIB chemistry and cell architecture. It also illustrates the benefits of a mechanistic approach to understanding cell degradation in commercial cells.
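
    For readers unfamiliar with incremental capacity analysis, the sketch below shows the basic dQ/dV computation on a synthetic charge curve; peak positions and areas extracted this way are the quantities tracked over ageing. The voltage curve here is a made-up placeholder, not LiFePO4 data.

```python
# Minimal sketch of incremental capacity (IC) analysis: estimate dQ/dV
# numerically from a charge curve and locate the IC peak. The curve below is
# synthetic and only stands in for measured voltage/capacity data.
import numpy as np

capacity = np.linspace(0.0, 1.1, 500)                       # Ah (synthetic)
voltage = 3.2 + 0.25 * np.tanh(6 * (capacity - 0.55)) + 0.05 * capacity  # V

dq_dv = np.gradient(capacity, voltage)                      # incremental capacity [Ah/V]
peak_idx = np.argmax(dq_dv)
print(f"IC peak: {dq_dv[peak_idx]:.2f} Ah/V at {voltage[peak_idx]:.3f} V")
```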

  4. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  5. Population PK modelling and simulation based on fluoxetine and norfluoxetine concentrations in milk: a milk concentration-based prediction model.

    Science.gov (United States)

    Tanoshima, Reo; Bournissen, Facundo Garcia; Tanigawara, Yusuke; Kristensen, Judith H; Taddio, Anna; Ilett, Kenneth F; Begg, Evan J; Wallach, Izhar; Ito, Shinya

    2014-10-01

    Population pharmacokinetic (pop PK) modelling can be used for PK assessment of drugs in breast milk. However, complex mechanistic modelling of a parent and an active metabolite using both blood and milk samples is challenging. We aimed to develop a simple predictive pop PK model for milk concentration-time profiles of a parent and a metabolite, using data on fluoxetine (FX) and its active metabolite, norfluoxetine (NFX), in milk. Using a previously published data set of drug concentrations in milk from 25 women treated with FX, a pop PK model predictive of milk concentration-time profiles of FX and NFX was developed. Simulation was performed with the model to generate FX and NFX concentration-time profiles in milk of 1000 mothers. This milk concentration-based pop PK model was compared with the previously validated plasma/milk concentration-based pop PK model of FX. Milk FX and NFX concentration-time profiles were described reasonably well by a one compartment model with a FX-to-NFX conversion coefficient. Median values of the simulated relative infant dose on a weight basis (sRID: weight-adjusted daily doses of FX and NFX through breastmilk to the infant, expressed as a fraction of therapeutic FX daily dose per body weight) were 0.028 for FX and 0.029 for NFX. The FX sRID estimates were consistent with those of the plasma/milk-based pop PK model. A predictive pop PK model based on only milk concentrations can be developed for simultaneous estimation of milk concentration-time profiles of a parent (FX) and an active metabolite (NFX). © 2014 The British Pharmacological Society.
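
    A minimal sketch of a one-compartment parent/metabolite structure of the kind described, with the parent eliminated partly through conversion to the metabolite, is given below. The absorption and elimination rate constants, the conversion fraction, and the dose are hypothetical placeholders, not the published FX/NFX estimates.

```python
# Minimal sketch (not the published model): one-compartment parent/metabolite
# kinetics with a conversion coefficient linking parent elimination to
# metabolite formation. All parameters are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

ka, ke_fx, ke_nfx, f_conv = 1.0, 0.06, 0.04, 0.5   # 1/h, 1/h, 1/h, fraction (hypothetical)

def milk_pk(t, y):
    depot, fx, nfx = y
    d_depot = -ka * depot
    d_fx = ka * depot - ke_fx * fx
    d_nfx = f_conv * ke_fx * fx - ke_nfx * nfx      # conversion coefficient f_conv
    return [d_depot, d_fx, d_nfx]

sol = solve_ivp(milk_pk, (0, 72), y0=[20.0, 0.0, 0.0], t_eval=np.linspace(0, 72, 7))
print(sol.y[1])   # parent (FX) amounts over 72 h
print(sol.y[2])   # metabolite (NFX) amounts over 72 h
```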

  6. Biomeasures and mechanistic modeling highlight PK/PD risks for a monoclonal antibody targeting Fn14 in kidney disease.

    Science.gov (United States)

    Chen, Xiaoying; Farrokhi, Vahid; Singh, Pratap; Ocana, Mireia Fernandez; Patel, Jenil; Lin, Lih-Ling; Neubert, Hendrik; Brodfuehrer, Joanne

    2018-01-01

    Discovery of the upregulation of the fibroblast growth factor-inducible-14 (Fn14) receptor following tissue injury has prompted investigation into biotherapeutic targeting of the Fn14 receptor for the treatment of conditions such as chronic kidney diseases. In the development of monoclonal antibody (mAb) therapeutics, there is an increasing trend to use biomeasures combined with mechanistic pharmacokinetic/pharmacodynamic (PK/PD) modeling to enable decision making in early discovery. With the aim of guiding preclinical efforts on designing an antibody with optimized properties, we developed a mechanistic site-of-action (SoA) PK/PD model for human application. This model incorporates experimental biomeasures, including the concentration of soluble Fn14 (sFn14) in human plasma and membrane Fn14 (mFn14) in human kidney tissue, and the turnover rate of human sFn14. Pulse-chase studies using stable isotope-labeled amino acids and mass spectrometry indicated the sFn14 half-life to be approximately 5 hours in healthy volunteers. The biomeasures (concentration, turnover) of sFn14 in plasma reveal a significant hurdle in designing an antibody against Fn14 with the desired characteristics. The projected dose (>1 mg/kg/wk for 90% target coverage) derived from the human PK/PD model revealed potentially high and frequent dosing requirements under certain conditions. The PK/PD model suggested a unique bell-shaped relationship between target coverage and antibody affinity for the anti-Fn14 mAb, which could be applied to direct antibody engineering towards an optimized affinity. This investigation highlighted potential applications, including assessment of PK/PD risks during early target validation, human dose prediction and drug candidate optimization.

  7. Assessing the Role of Climate Variability on Liver Fluke Risk in the UK Through Mechanistic Hydro-Epidemiological Modelling

    Science.gov (United States)

    Beltrame, L.; Dunne, T.; Rose, H.; Walker, J.; Morgan, E.; Vickerman, P.; Wagener, T.

    2016-12-01

    Liver fluke is a flatworm parasite infecting grazing animals worldwide. In the UK, it causes considerable production losses to the cattle and sheep industries and costs farmers millions of pounds each year through reduced growth rates and lower milk yields. A large part of the parasite life cycle takes place outside the host, with its survival and development strongly controlled by climatic and hydrologic conditions. Evidence of climate-driven changes in the distribution and seasonality of fluke disease already exists, as the infection is increasingly expanding to new areas and becoming a year-round problem. It is therefore crucial to assess current and potential future impacts of climate variability on the disease to guide interventions at the farm scale and mitigate risk. Climate-based fluke risk models have been available since the 1950s; however, they are based on empirical relationships derived between historical climate and incidence data, and thus are unlikely to be robust for simulating risk under changing conditions. Moreover, they are not dynamic, but estimate risk over large regions in the UK based on monthly average climate conditions, so they cannot be used to investigate the effects of climate variability in support of farmers' decisions. In this study, we introduce a mechanistic model for fluke, which represents habitat suitability for disease development at 25m resolution with a daily time step, explicitly linking the parasite life-cycle to key hydro-climate conditions. The model is used on a case study in the UK and sensitivity analysis is performed to better understand the role of climate variability in the space-time dynamics of the disease, while explicitly accounting for uncertainties. Comparisons are presented with experts' knowledge and a widely used empirical model.

  8. Aviation Safety Simulation Model

    Science.gov (United States)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.
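
    The core clearance check described above can be sketched as a simple comparison between path altitude and terrain height at sampled along-track positions; the terrain profile, flight path, and minimum-clearance threshold below are synthetic stand-ins, not values from the tool.

```python
# Minimal sketch of a terrain-clearance check along a flight path: sample
# points along the track, look up terrain height beneath each point, and flag
# positions closer to the ground than a minimum clearance. All values are
# synthetic and for illustration only.
import numpy as np

MIN_CLEARANCE = 300.0                                   # metres (assumed threshold)

def terrain_height(x):                                  # synthetic ridge line
    return 800.0 * np.exp(-((x - 5000.0) / 1500.0) ** 2)

path_x = np.linspace(0.0, 10_000.0, 200)                # along-track distance [m]
path_alt = np.full_like(path_x, 950.0)                  # constant cruise altitude [m]

clearance = path_alt - terrain_height(path_x)
violations = path_x[clearance < MIN_CLEARANCE]
print(f"{violations.size} samples violate the {MIN_CLEARANCE} m minimum clearance")
```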

  9. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  10. Exposure factors for marine eutrophication impacts assessment based on a mechanistic biological model

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Koski, Marja; Hauschild, Michael Zwicky

    2015-01-01

    marine ecosystem (LME), five climate zones, and site-generic. The XFs obtained range from 0.45 (Central Arctic Ocean) to 15.9 kg O2 kg N-1 (Baltic Sea). While LME resolution is recommended, aggregated PE or XF per climate zone can be adopted, but not global aggregation due to high variability. The XF......Emissions of nitrogen (N) from anthropogenic sources enrich marine waters and promote planktonic growth. This newly synthesised organic carbon is eventually exported to benthic waters where aerobic respiration by heterotrophic bacteria results in the consumption of dissolved oxygen (DO......). This pathway is typical of marine eutrophication. A model is proposed to mechanistically estimate the response of coastal marine ecosystems to N inputs. It addresses the biological processes of nutrient-limited primary production (PP), metazoan consumption, and bacterial degradation, in four distinct sinking...

  11. A probabilistic model-based soft sensor to monitor lactic acid bacteria fermentations

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2018-01-01

    A probabilistic soft sensor based on a mechanistic model was designed to monitor S. thermophilus fermentations, and validated with experimental lab-scale data. It considered uncertainties in the initial conditions, on-line measurements, and model parameters by performing Monte Carlo simulations...... the model parameters that were then used as input to the mechanistic model. The soft sensor predicted both the current state variables, as well as the future course of the fermentation, e.g. with a relative mean error of the biomass concentration of 8 %. This successful implementation of a process...... within the monitoring system. It predicted, therefore, the probability distributions of the unmeasured states, such as biomass, lactose, and lactic acid concentrations. To this end, a mechanistic model was developed first, and a statistical parameter estimation was performed in order to assess parameter...

  12. Toward a Mechanistic Understanding of Deuterium Excess as a Tracer for Evapotranspiration

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Chun-Ta [Department of Biology, San Diego State University, San Diego, CA (United States)

    2013-07-15

    An understanding of atmospheric water vapour and its isotopic composition is useful for modelling effects of terrestrial evapotranspiration on regional hydrologic cycles. Previous studies showed diurnal and vertical patterns of water vapour isotope ratios (δ2Hv and δ18Ov) consistently observed in an old growth coniferous forest. Using a box model and a mass balance approach to simulate the 'isoflux of d-excess', the effect of evapotranspiration on the d-excess in atmospheric water vapour is quantitatively demonstrated. The results suggest that d-excess can be mechanistically utilized to identify processes that contribute to the diurnal variation in atmospheric moisture. These new findings have implications for larger-scale predictions of precipitation across the terrestrial landscape. In this paper, I report the initial results of the δ2Hv and δ18Ov measurements using a cavity enhanced spectroscopy instrument. These recent data are consistent with the pattern observed by the conventional sampling method, providing new opportunities for studying d-excess as a tracer for evapotranspiration. (author)

  13. Polymerization kinetics of wheat gluten upon thermosetting. A mechanistic model.

    Science.gov (United States)

    Domenek, Sandra; Morel, Marie-Hélène; Bonicel, Joëlle; Guilbert, Stéphane

    2002-10-09

    Size exclusion high-performance liquid chromatography analysis was carried out on wheat gluten-glycerol blends subjected to different heat treatments. The elution profiles were analyzed in order to follow the solubility loss of protein fractions with specific molecular size. Owing to the known biochemical changes involved during the heat denaturation of gluten, a mechanistic mathematical model was developed, which divided the protein denaturation into two distinct reaction steps: (i) reversible change in protein conformation and (ii) protein precipitation through disulfide bonding between initially SDS-soluble and SDS-insoluble reaction partners. Activation energies of gluten unfolding, refolding, and precipitation were calculated with the Arrhenius law as 53.9 kJ/mol, 29.5 kJ/mol, and 172 kJ/mol, respectively. The rate of protein solubility loss decreased as the cross-linking reaction proceeded, which may be attributed to the formation of a three-dimensional network progressively hindering the reaction. The enhanced susceptibility to aggregation of large molecules was assigned to an increased reaction probability due to their higher number of cysteine residues and to the increased percentage of unfolded, and thereby activated, proteins, as complete protein refolding seemed to be an anticooperative process.
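
    With the activation energies quoted above, the Arrhenius law already indicates how differently the three steps respond to temperature, since the ratio of rate constants between two temperatures depends only on Ea (the pre-exponential factors cancel). The sketch below computes such ratios; the 70 °C and 90 °C temperatures are arbitrary illustrative choices, not conditions from the study.

```python
# Minimal sketch of the Arrhenius relation: with Ea known, the ratio of rate
# constants at two temperatures follows directly. The quoted activation
# energies are used; the temperature pair is an illustrative assumption.
import numpy as np

R = 8.314  # J/(mol K)

def rate_ratio(ea_kj_mol, t1_c, t2_c):
    """k(T2)/k(T1) from the Arrhenius law; Ea in kJ/mol, temperatures in deg C."""
    t1, t2 = t1_c + 273.15, t2_c + 273.15
    return np.exp(-(ea_kj_mol * 1e3 / R) * (1.0 / t2 - 1.0 / t1))

for name, ea in [("unfolding", 53.9), ("refolding", 29.5), ("precipitation", 172.0)]:
    print(f"{name}: k(90C)/k(70C) = {rate_ratio(ea, 70, 90):.1f}")
```

    The high activation energy of the precipitation step makes its rate far more temperature-sensitive than unfolding or refolding, which is consistent with a heat-set cross-linking reaction.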

  14. Mechanistic Mathematical Modeling Tests Hypotheses of the Neurovascular Coupling in fMRI.

    Directory of Open Access Journals (Sweden)

    Karin Lundengård

    2016-06-01

    Functional magnetic resonance imaging (fMRI) measures brain activity by detecting the blood-oxygen-level dependent (BOLD) response to neural activity. The BOLD response depends on the neurovascular coupling, which connects cerebral blood flow, cerebral blood volume, and deoxyhemoglobin level to neuronal activity. The exact mechanisms behind this neurovascular coupling are not yet fully investigated. There are at least three different ways in which these mechanisms are being discussed. Firstly, mathematical models involving the so-called Balloon model describe the relation between oxygen metabolism, cerebral blood volume, and cerebral blood flow. However, the Balloon model does not describe cellular and biochemical mechanisms. Secondly, the metabolic feedback hypothesis is based on experimental findings on metabolism associated with brain activation, and thirdly, the neurotransmitter feed-forward hypothesis describes intracellular pathways leading to vasoactive substance release. Both the metabolic feedback and the neurotransmitter feed-forward hypotheses have been extensively studied, but only experimentally. These two hypotheses have never been implemented as mathematical models. Here we investigate these two hypotheses by mechanistic mathematical modeling using a systems biology approach; these methods have been used in biological research for many years but have never been applied to the BOLD response in fMRI. In the current work, model structures describing the metabolic feedback and the neurotransmitter feed-forward hypotheses were applied to measured BOLD responses in the visual cortex of 12 healthy volunteers. Evaluating each hypothesis separately shows that neither hypothesis alone can describe the data in a biologically plausible way. However, by adding metabolism to the neurotransmitter feed-forward model structure, we obtained a new model structure which is able to fit the estimation data and successfully predict new

  15. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    Science.gov (United States)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need for better understanding of the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates only explained 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
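
    The statistical side of the comparison, a random forest with solar radiation and NDVI as covariates, can be sketched as below. The data are synthetic and generated only to show the workflow; they do not reproduce the study's measurements or its 25% explained variance.

```python
# Minimal sketch of a random-forest regression of carbon stocks on solar
# radiation and NDVI. The covariates, units, and the generating relationship
# are illustrative assumptions, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
solar = rng.uniform(2, 8, 300)              # MJ m-2 day-1 (synthetic)
ndvi = rng.uniform(0.2, 0.8, 300)           # unitless (synthetic)
carbon = 2.0 + 0.8 * ndvi - 0.3 * solar + rng.normal(0, 0.5, 300)  # kg m-2 (synthetic)

X = np.column_stack([solar, ndvi])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, carbon)
print("feature importances (solar, NDVI):", model.feature_importances_)
```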

  16. A mechanistic model for long-term nuclear waste glass dissolution integrating chemical affinity and interfacial diffusion barrier

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Teqi [Northwest Institute of Nuclear Technology, No. 28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China); Mechanics and Physics of Solids Research Group, Modelling and Simulation Centre, The University of Manchester, Oxford Road, Manchester, M13 9PL (United Kingdom); Jivkov, Andrey P., E-mail: andrey.jivkov@manchester.ac.uk [Mechanics and Physics of Solids Research Group, Modelling and Simulation Centre, The University of Manchester, Oxford Road, Manchester, M13 9PL (United Kingdom); Li, Weiping; Liang, Wei; Wang, Yu; Xu, Hui [Northwest Institute of Nuclear Technology, No. 28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China); Han, Xiaoyuan, E-mail: xyhan_nint@sina.cn [Northwest Institute of Nuclear Technology, No. 28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China)

    2017-04-01

    Understanding the alteration of nuclear waste glass in geological repository conditions is a critical element of the analysis of the repository retention function. Experimental observations of glass alteration provide general agreement on the following regimes: inter-diffusion, hydrolysis, rate drop, residual rate and, under very particular conditions, resumption of alteration. Of these, the mechanisms controlling the rate drop and the residual rate remain a subject of dispute. This paper offers a critical review of the two most competitive models related to these regimes: affinity-limited dissolution and diffusion barrier. The limitations of these models are highlighted by comparison of their predictions with available experimental evidence. Based on a comprehensive discussion of the existing models, a new mechanistic model is proposed as a combination of the chemical affinity and diffusion barrier concepts. It is demonstrated how the model can explain experimental phenomena and data for which the existing models are shown to be not fully adequate.

  17. Mechanistic pathways of recognition of a solvent-inaccessible cavity of protein by a ligand

    Science.gov (United States)

    Mondal, Jagannath; Pandit, Subhendu; Dandekar, Bhupendra; Vallurupalli, Pramodh

    One of the puzzling questions in the realm of protein-ligand recognition is how a solvent-inaccessible hydrophobic cavity of a protein gets recognized by a ligand. We address this topic by simulating, for the first time, the complete binding process of benzene from aqueous solution to the well-known buried cavity of L99A T4 lysozyme at atomistic resolution. Our multiple unbiased microsecond-long trajectories, which were completely blind to the location of the target binding site, unequivocally identify the kinetic pathways along which the benzene molecule meanders through the solvent and protein and ultimately, spontaneously, recognizes the deeply buried cavity of L99A T4 lysozyme with high precision. Our simulations, combined with analysis based on a Markov state model and free energy calculations, reveal that there is more than one distinct ligand-binding pathway. Intriguingly, each of the identified pathways involves the transient opening of a channel in the protein prior to ligand binding. The work also deciphers rich mechanistic details of the unbinding kinetics of the ligand, obtained from enhanced sampling techniques.
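
    One building block of the analysis, the Markov state model, can be illustrated with the minimal sketch below: transitions between discretized conformational states are counted at a fixed lag time and row-normalized into a transition probability matrix whose eigenvalues encode the slowest relaxation processes. The state trajectory here is synthetic, not simulation output from the study.

```python
# Minimal sketch of building a Markov state model transition matrix from a
# discretized trajectory. The trajectory below is a synthetic stand-in for
# clustered molecular-dynamics frames.
import numpy as np

def transition_matrix(states, n_states, lag=1):
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-lag], states[lag:]):   # pairs separated by the lag time
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
traj = rng.choice(3, size=5000, p=[0.6, 0.3, 0.1])  # stand-in discretized trajectory
T = transition_matrix(traj, n_states=3, lag=10)
eigvals = np.sort(np.linalg.eigvals(T).real)[::-1]
print("second eigenvalue (slowest resolved process):", eigvals[1])
```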

  18. Modeling and simulation of xylitol production in bioreactor by Debaryomyces nepalensis NCYC 3413 using unstructured and artificial neural network models.

    Science.gov (United States)

    Pappu, J Sharon Mano; Gummadi, Sathyanarayana N

    2016-11-01

    This study examines the use of an unstructured kinetic model and artificial neural networks as predictive tools for xylitol production by Debaryomyces nepalensis NCYC 3413 in a bioreactor. An unstructured kinetic model was proposed in order to assess the influence of pH (4, 5 and 6), temperature (25°C, 30°C and 35°C) and volumetric oxygen transfer coefficient kLa (0.14 h-1, 0.28 h-1 and 0.56 h-1) on growth and xylitol production. A feed-forward back-propagation artificial neural network (ANN) has been developed to investigate the effect of process conditions on xylitol production. An ANN configuration of 6-10-3 layers was selected and trained with 339 experimental data points from bioreactor studies. Results showed that the simulation and prediction accuracy of the ANN was apparently higher when compared to the unstructured mechanistic model under varying operational conditions. The ANN was found to be an efficient data-driven tool for predicting the optimal harvest time in xylitol production. Copyright © 2016 Elsevier Ltd. All rights reserved.
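
    A minimal sketch of the 6-10-3 feed-forward topology described, built with scikit-learn on synthetic inputs and outputs, is shown below; the choice of input/output variables and all data are assumptions for illustration, not the 339 experimental points used in the study.

```python
# Minimal sketch of a 6-10-3 feed-forward ANN (6 inputs, one hidden layer of
# 10 neurons, 3 outputs) trained on synthetic data. Variable names in the
# comments are hypothetical examples of process inputs/outputs.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(339, 6))          # e.g. time, pH, temperature, kLa, feed, DO (assumed)
Y = np.column_stack([                   # e.g. biomass, xylitol, substrate (synthetic targets)
    X[:, 0] * 2.0, X[:, 1] + X[:, 2], 1.0 - X[:, 0]
])

ann = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0).fit(X, Y)
print("training R^2:", round(ann.score(X, Y), 3))
```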

  19. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  20. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    Science.gov (United States)

    Rest, J.

    1989-12-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well-characterized database for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism.

  1. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    International Nuclear Information System (INIS)

    Rest, J.

    1989-01-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects, which need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism. (orig.)

  2. A review of hydrological/water-quality models

    Directory of Open Access Journals (Sweden)

    Liangliang GAO,Daoliang LI

    2014-12-01

    Full Text Available Water quality models are important in predicting the changes in surface water quality for environmental management. A range of water quality models are widely used, but every model has its advantages and limitations for specific situations. The aim of this review is to provide a guide to researchers for selecting a suitable water quality model. Eight well-known water quality models were selected for this review: SWAT, WASP, QUALs, MIKE 11, HSPF, CE-QUAL-W2, ELCOM-CAEDYM and EFDC. Each model is described according to its intended use, development, simulation elements, basic principles and applicability (e.g., for rivers, lakes, reservoirs and estuaries). Currently, the most important trends for future model development are: (1) combination models: individual models cannot completely solve complex situations, so combined models are needed to obtain the most appropriate results; (2) application of artificial intelligence and mechanistic models combined with non-mechanistic models, which will provide more accurate results because of the realistic parameters derived from non-mechanistic models; and (3) integration with remote sensing, geographical information and global positioning systems (3S), since 3S can solve problems requiring large amounts of data.

  3. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  4. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
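
    The paper's argument concerns spreadsheet implementations, but the underlying discrete-event mechanics are the same in any language. A minimal single-server queue is sketched below; the arrival and service rates are illustrative assumptions, not values from the paper.

    ```python
    # Minimal discrete-event sketch: a single-server queue processed in
    # arrival order. Rates are illustrative assumptions.
    import random

    random.seed(1)
    busy_until, waits = 0.0, []

    t, arrivals = 0.0, []
    for _ in range(1000):
        t += random.expovariate(1.0)        # mean inter-arrival time 1.0
        arrivals.append(t)

    for arrival in arrivals:
        start = max(arrival, busy_until)    # wait if the server is still busy
        service = random.expovariate(1.25)  # mean service time 0.8
        busy_until = start + service
        waits.append(start - arrival)

    print("mean wait:", sum(waits) / len(waits))
    ```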

  5. Inferring spatial memory and spatiotemporal scaling from GPS data: comparing red deer Cervus elaphus movements with simulation models.

    Science.gov (United States)

    Gautestad, Arild O; Loe, Leif E; Mysterud, Atle

    2013-05-01

    1. Increased inference regarding underlying behavioural mechanisms of animal movement can be achieved by comparing GPS data with statistical mechanical movement models, such as random walk and Lévy walk, with known underlying behaviour and statistical properties. 2. GPS data are typically collected at intervals of ≥ 1 h, not exactly tracking every mechanistic step along the movement path, so a statistical mechanical model approach rather than a mechanistic approach is appropriate. However, comparisons require a coherent framework involving both scaling and memory aspects of the underlying process. Thus, simulation models have recently been extended to include memory-guided returns to previously visited patches, that is, site fidelity. 3. We define four main classes of movement, differing in incorporation of memory and scaling (based on respective intervals of the statistical fractal dimension D and presence/absence of site fidelity). Using three statistical protocols to estimate D and site fidelity, we compare these main movement classes with patterns observed in GPS data from 52 female red deer (Cervus elaphus). 4. The results show best compliance with a scale-free and memory-enhanced kind of space use; that is, a power law distribution of step lengths, a fractal distribution of the spatial scatter of fixes and site fidelity. 5. Our study thus demonstrates how inference regarding memory effects and a hierarchical pattern of space use can be derived from analysis of GPS data. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
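
    To make the scaling distinction concrete, the sketch below draws step lengths from a heavy-tailed (scale-free, Lévy-like) distribution and from a Brownian-like distribution with a characteristic scale; the exponents and sample size are illustrative assumptions rather than values fitted to the deer data.

    ```python
    # Contrast of scale-free versus scale-specific step-length distributions.
    # Parameters are illustrative assumptions only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    levy_like = 1.0 + rng.pareto(a=1.0, size=n)   # heavy-tailed, power-law steps
    brownian = rng.rayleigh(scale=1.0, size=n)    # steps with a typical scale

    for name, steps in [("Levy-like", levy_like), ("Brownian", brownian)]:
        print(f"{name:9s} mean={steps.mean():6.2f} max={steps.max():9.1f}")
    ```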

  6. Malaria's Missing Number: Calculating the Human Component of R0 by a Within-Host Mechanistic Model of Plasmodium falciparum Infection and Transmission

    OpenAIRE

    Johnston, Geoffrey L.; Smith, David L.; Fidock, David A.

    2013-01-01

    Human infection by malarial parasites of the genus Plasmodium begins with the bite of an infected Anopheles mosquito. Current estimates place malaria mortality at over 650,000 individuals each year, mostly in African children. Efforts to reduce disease burden can benefit from the development of mathematical models of disease transmission. To date, however, comprehensive modeling of the parameters defining human infectivity to mosquitoes has remained elusive. Here, we describe a mechanistic wi...

  7. Fractal growth of tumors and other cellular populations: Linking the mechanistic to the phenomenological modeling and vice versa

    International Nuclear Information System (INIS)

    D'Onofrio, Alberto

    2009-01-01

    In this paper we study and extend the mechanistic mean field theory of growth of cellular populations proposed by Mombach et al. [Mombach JCM, Lemke N, Bodmann BEJ, Idiart MAP. A mean-field theory of cellular growth. Europhys Lett 2002;59:923-928] (MLBI model), and we demonstrate that the original model and our generalizations lead to inferences of biological interest. In the first part of this paper, we show that the model under study is widely general since it admits, as particular cases, the main phenomenological models of cellular growth. In the second part of this work, we generalize the MLBI model to a wider family of models by allowing the cells to have a generic, unspecified, biologically plausible interaction. Then, we derive a relationship between this generic microscopic interaction function and the growth rate of the corresponding macroscopic model. Finally, we propose to use this relationship to aid the investigation of the biological plausibility of phenomenological models of cancer growth.

  8. An idealized radiative transfer scheme for use in a mechanistic general circulation model from the surface up to the mesopause region

    International Nuclear Information System (INIS)

    Knoepfel, Rahel; Becker, Erich

    2011-01-01

    A new and numerically efficient method to compute radiative flux densities and heating rates in a general atmospheric circulation model is presented. Our method accommodates the fundamental differences between the troposphere and middle atmosphere in the long-wave regime within a single parameterization that extends continuously from the surface up to the mesopause region and takes the deviations from the gray limit and from local thermodynamic equilibrium into account. For this purpose, frequency-averaged Eddington-type transfer equations are derived for four broad absorber bands. The frequency variation inside each band is parameterized by application of the Elsasser band model extended by a slowly varying envelope function. This yields additional transfer equations for the perturbation amplitudes that are solved numerically along with the mean transfer equations. Deviations from local thermodynamic equilibrium are included in terms of isotropic scattering, calculating the single scattering albedo from the two-level model for each band. Solar radiative flux densities are computed for four energetically defined bands using the simple Beer-Bouguer-Lambert relation for absorption within the atmosphere. The new scheme is implemented in a mechanistic general circulation model from the surface up to the mesopause region. A test simulation with prescribed concentrations of the radiatively active constituents shows quite reasonable results. In particular, since we take the full surface energy budget into account by means of a swamp ocean, and since the internal dynamics and turbulent diffusion of the model are formulated in accordance with the conservation laws, an equilibrated climatological radiation budget is obtained both at the top of the atmosphere and at the surface.
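
    The short-wave part of the scheme reduces to simple exponential attenuation. The sketch below applies the Beer-Bouguer-Lambert relation to one solar band over a few model layers; the band flux, absorption coefficient and absorber paths are illustrative assumptions, not the parameterization's actual values.

    ```python
    # Beer-Bouguer-Lambert attenuation of one solar band through model layers.
    # All numbers are illustrative assumptions.
    import numpy as np

    flux_top = 340.0                        # W m^-2 entering the band at the top
    k = 0.05                                # band-mean absorption coefficient
    u = np.array([1.0, 2.5, 4.0, 6.0])      # cumulative absorber path at layer bottoms

    flux = flux_top * np.exp(-k * u)        # F = F_top * exp(-k * u)
    absorbed = -np.diff(np.concatenate(([flux_top], flux)))   # per-layer absorption
    print(flux.round(1), absorbed.round(1))
    ```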

  9. Verification of a mechanistic model for the strain rate of zircaloy-4 fuel sheaths during transient heating

    International Nuclear Information System (INIS)

    Hunt, C.E.L.

    1980-10-01

    A mechanistic strain rate model for Zircaloy-4, named NIRVANA, was tested against experiments where pressurized fuel sheaths were strained during complex temperature-stress-time histories. The same histories were then examined to determine the spread in calculated strain which may be expected because of variations in dimensions, chemical content and mechanical properties which are allowed in the fuel sheath specifications. It was found that the variations allowed by the specifications could result in a probable spread in the predicted strain of plus or minus a factor of two from the mean value. The experimental results were well within this range. (auth)

  10. A Three-Stage Mechanistic Model for Solidification Cracking During Welding of Steel

    Science.gov (United States)

    Aucott, L.; Huang, D.; Dong, H. B.; Wen, S. W.; Marsden, J.; Rack, A.; Cocks, A. C. F.

    2018-03-01

    A three-stage mechanistic model for solidification cracking during TIG welding of steel is proposed from in situ synchrotron X-ray imaging of solidification cracking and subsequent analysis of fracture surfaces. Stage 1—Nucleation of inter-granular hot cracks: cracks nucleate inter-granularly in sub-surface where maximum volumetric strain is localized and volume fraction of liquid is less than 0.1; the crack nuclei occur at solute-enriched liquid pockets which remain trapped in increasingly impermeable semi-solid skeleton. Stage 2—Coalescence of cracks via inter-granular fracture: as the applied strain increases, cracks coalesce through inter-granular fracture; the coalescence path is preferential to the direction of the heat source and propagates through the grain boundaries to solidifying dendrites. Stage 3—Propagation through inter-dendritic hot tearing: inter-dendritic hot tearing occurs along the boundaries between solidifying columnar dendrites with higher liquid fraction. It is recommended that future solidification cracking criterion shall be based on the application of multiphase mechanics and fracture mechanics to the failure of semi-solid materials.

  11. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  12. Melanie Klein's metapsychology: phenomenological and mechanistic perspective.

    Science.gov (United States)

    Mackay, N

    1981-01-01

    Freud's metapsychology is the subject of an important debate. This is over whether psychoanalysis is best construed as a science of the natural science type or as a special human science. The same debate applies to Melanie Klein's work. In Klein's metapsychology are two different and incompatible models of explanation. One is taken over from Freud's structural theory and appears to be similarly mechanistic. The other is clinically based and phenomenological. These two are discussed with special reference to the concepts of "phantasy" and "internal object".

  13. Chemical kinetic mechanistic models to investigate cancer biology and impact cancer medicine

    International Nuclear Information System (INIS)

    Stites, Edward C

    2013-01-01

    Traditional experimental biology has provided a mechanistic understanding of cancer in which the malignancy develops through the acquisition of mutations that disrupt cellular processes. Several drugs developed to target such mutations have now demonstrated clinical value. These advances are unequivocal testaments to the value of traditional cellular and molecular biology. However, several features of cancer may limit the pace of progress that can be made with established experimental approaches alone. The mutated genes (and resultant mutant proteins) function within large biochemical networks. Biochemical networks typically have a large number of component molecules and are characterized by a large number of quantitative properties. Responses to a stimulus or perturbation are typically nonlinear and can display qualitative changes that depend upon the specific values of variable system properties. Features such as these can complicate the interpretation of experimental data and the formulation of logical hypotheses that drive further research. Mathematical models based upon the molecular reactions that define these networks combined with computational studies have the potential to deal with these obstacles and to enable currently available information to be more completely utilized. Many of the pressing problems in cancer biology and cancer medicine may benefit from a mathematical treatment. As work in this area advances, one can envision a future where such models may meaningfully contribute to the clinical management of cancer patients. (paper)

  14. Comparison of Two Mechanistic Microbial Growth Models to Estimate Shelf Life of Perishable Food Package under Dynamic Temperature Conditions

    Directory of Open Access Journals (Sweden)

    Dong Sun Lee

    2014-01-01

    Full Text Available Two mechanistic microbial growth models (Huang's model and the model of Baranyi and Roberts), given in differential and integrated equation forms, were compared in predicting microbial growth and shelf life under dynamic temperature storage and distribution conditions. Published studies consistently reporting microbial growth data under constant and changing temperature conditions were selected to obtain the primary model parameters, set up the secondary models, and apply them to predict microbial growth and shelf life under fluctuating temperatures. When evaluated by general estimation behavior, bias factor, accuracy factor, and root-mean-square error, Huang's model was comparable to Baranyi and Roberts' model in its capability to estimate microbial growth under dynamic temperature conditions. Its simple form of a single differential equation, directly incorporating the growth rate and lag time, may work as an advantage for online shelf life estimation using an electronic device.
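
    As a sketch of how either primary model is driven by a time-varying temperature, the code below integrates the Baranyi and Roberts model in its differential form with an assumed Ratkowsky-type (square-root) secondary model and an assumed storage temperature profile; Huang's model would be integrated the same way with its own rate expression. None of the parameter values are taken from the paper.

    ```python
    # Baranyi and Roberts primary model in differential form under a
    # fluctuating temperature. Secondary model and parameters are assumptions.
    import numpy as np
    from scipy.integrate import solve_ivp

    def mu_max(T, b=0.03, T_min=2.0):
        return (b * max(T - T_min, 0.0)) ** 2       # assumed square-root model

    def temperature(t):
        return 8.0 + 4.0 * np.sin(2 * np.pi * t / 24.0)   # assumed profile, deg C

    def rhs(t, state, y_max=21.0):
        y, q = state                                # y = ln(count), q tracks lag
        mu = mu_max(temperature(t))
        dy = mu * q / (1.0 + q) * (1.0 - np.exp(y - y_max))
        dq = mu * q
        return [dy, dq]

    sol = solve_ivp(rhs, (0.0, 120.0), [np.log(1e3), 0.1],
                    t_eval=np.linspace(0.0, 120.0, 7))
    print(np.exp(sol.y[0]).round(0))                # predicted counts over 120 h
    ```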

  15. Steady-State Simulation of Steam Reforming of INEEL Tank Farm Waste

    International Nuclear Information System (INIS)

    Nichols, T.T.; Taylor, D.D.; Wood, R.A.; Barnes, C.M.

    2002-01-01

    A steady-state model of the Sodium-Bearing Waste steam reforming process at the Idaho National Engineering and Environmental Laboratory has been developed using the commercial ASPEN Plus process simulator. The preliminary process configuration and its representation in ASPEN are described. An assessment of the capability of the model to mechanistically predict product stream compositions was made, and fidelity gaps and opportunities for model enhancement were identified, resulting in the following conclusions: (1) Appreciable benefit is derived from using an activity coefficient model for electrolyte solution thermodynamics rather than assuming ideality (unity assumed for all activity coefficients). The concentrations of fifteen percent of the species present in the primary output stream were changed by more than 50%, relative to Electrolyte NRTL, when ideality was assumed; (2) The current baseline model provides a good start for estimating mass balances and performing integrated process optimization because it contains several key species, uses a mechanistic electrolyte thermodynamic model, and is based on a reasonable process configuration; and (3) Appreciable improvement to model fidelity can be realized by expanding the species list and the list of chemical and phase transformations. A path forward is proposed focusing on the use of an improved electrolyte thermodynamic property method, the addition of chemical and phase transformations for key species currently absent from the model, and the combination of RGibbs and Flash blocks to simulate simultaneous phase and chemical equilibria in the off-gas treatment train.

  16. Applied stochastic modelling

    CERN Document Server

    Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P

    2008-01-01

    Introduction and Examples Introduction Examples of data sets Basic Model Fitting Introduction Maximum-likelihood estimation for a geometric model Maximum-likelihood for the beta-geometric model Modelling polyspermy Which model? What is a model for? Mechanistic models Function Optimisation Introduction MATLAB: graphs and finite differences Deterministic search methods Stochastic search methods Accuracy and a hybrid approach Basic Likelihood ToolsIntroduction Estimating standard errors and correlations Looking at surfaces: profile log-likelihoods Confidence regions from profiles Hypothesis testing in model selectionScore and Wald tests Classical goodness of fit Model selection biasGeneral Principles Introduction Parameterisation Parameter redundancy Boundary estimates Regression and influence The EM algorithm Alternative methods of model fitting Non-regular problemsSimulation Techniques Introduction Simulating random variables Integral estimation Verification Monte Carlo inference Estimating sampling distributi...

  17. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  18. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  19. Thermal unit availability modeling in a regional simulation model

    International Nuclear Information System (INIS)

    Yamayee, Z.A.; Port, J.; Robinett, W.

    1983-01-01

    The System Analysis Model (SAM) developed under the umbrella of PNUCC's System Analysis Committee is capable of simulating the operation of a given load/resource scenario. This model employs a Monte-Carlo simulation to incorporate uncertainties. Among the uncertainties modeled is thermal unit availability, both for energy simulations (seasonal) and capacity simulations (hourly). This paper presents the availability modeling in the capacity and energy models. The use of regional and national data in deriving the two availability models, the interaction between the two, and the modifications made to the capacity model in order to reflect regional practices are presented. A sample problem is presented to show the modification process. Results for modeling a nuclear unit using NERC-GADS are presented.
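
    A minimal illustration of hourly availability sampling of the kind a capacity model performs is sketched below as a two-state (up/down) Markov chain; the hourly failure and repair probabilities are illustrative assumptions, not NERC-GADS statistics.

    ```python
    # Two-state Monte Carlo model of hourly thermal unit availability.
    # Transition probabilities are illustrative assumptions.
    import random

    random.seed(7)
    p_fail, p_repair = 0.002, 0.02   # assumed hourly failure / repair probabilities
    up, hours_up = True, 0

    for _ in range(8760):            # one simulated year, hour by hour
        if up and random.random() < p_fail:
            up = False
        elif not up and random.random() < p_repair:
            up = True
        hours_up += up

    print("simulated availability:", round(hours_up / 8760, 3))
    ```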

  20. A computational systems biology software platform for multiscale modeling and simulation: Integrating whole-body physiology, disease biology, and molecular reaction networks

    Directory of Open Access Journals (Sweden)

    Thomas eEissing

    2011-02-01

    Full Text Available Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multi-scale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug-drug or drug-metabolite interactions can be addressed using this mechanistic, insight driven multiscale modeling approach.

  1. Mechanistic phenotypes: an aggregative phenotyping strategy to identify disease mechanisms using GWAS data.

    Directory of Open Access Journals (Sweden)

    Jonathan D Mosley

    Full Text Available A single mutation can alter cellular and global homeostatic mechanisms and give rise to multiple clinical diseases. We hypothesized that these disease mechanisms could be identified using low minor allele frequency (MAF<0.1) non-synonymous SNPs (nsSNPs) associated with "mechanistic phenotypes", comprised of collections of related diagnoses. We studied two mechanistic phenotypes: (1) thrombosis, evaluated in a population of 1,655 African Americans; and (2) four groupings of cancer diagnoses, evaluated in 3,009 white European Americans. We tested associations between nsSNPs represented on GWAS platforms and mechanistic phenotypes ascertained from electronic medical records (EMRs), and sought enrichment in functional ontologies across the top-ranked associations. We used a two-step analytic approach whereby nsSNPs were first sorted by the strength of their association with a phenotype. We tested associations using two reverse genetic models and standard additive and recessive models. In the second step, we employed a hypothesis-free ontological enrichment analysis using the sorted nsSNPs to identify functional mechanisms underlying the diagnoses comprising the mechanistic phenotypes. The thrombosis phenotype was solely associated with ontologies related to blood coagulation (Fisher's p = 0.0001, FDR p = 0.03), driven by the F5, P2RY12 and F2RL2 genes. For the cancer phenotypes, the reverse genetics models were enriched in DNA repair functions (p = 2×10⁻⁵, FDR p = 0.03) (POLG/FANCI, SLX4/FANCP, XRCC1, BRCA1, FANCA, CHD1L), while the additive model showed enrichment related to chromatid segregation (p = 4×10⁻⁶, FDR p = 0.005) (KIF25, PINX1). We were able to replicate nsSNP associations for POLG/FANCI, BRCA1, FANCA and CHD1L in independent data sets. Mechanism-oriented phenotyping using collections of EMR-derived diagnoses can elucidate fundamental disease mechanisms.
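
    The enrichment step described above amounts to testing a 2x2 contingency table per ontology term. A hedged sketch of such a test is shown below; the counts are invented purely to show the calculation and do not come from the study.

    ```python
    # Fisher's exact test on an assumed 2x2 table:
    # rows = gene in ontology term (yes/no), columns = gene in top-ranked nsSNP set (yes/no).
    from scipy.stats import fisher_exact

    table = [[6, 14],     # ontology genes: top-ranked / not top-ranked
             [40, 940]]   # all other genes: top-ranked / not top-ranked
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"odds ratio = {odds_ratio:.2f}, enrichment p = {p_value:.4f}")
    ```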

  2. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    trials. However, if simulation models are to be used, good quality input data must be available. To model FMD, several disease spread models are available. For this project, we chose three simulation models: Davis Animal Disease Spread (DADS), which has been upgraded to DTU-DADS, InterSpread Plus (ISP......Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field...... trials to investigate the effect of alternative conditions or actions on a specific system. Nonetheless, field trials are expensive and sometimes not possible to conduct, as in the case of foot-and-mouth disease (FMD). Instead, simulation models can be a good and cheap substitute for experiments and field...

  3. Mechanistically-Based Field-Scale Models of Uranium Biogeochemistry from Upscaling Pore-Scale Experiments and Models

    International Nuclear Information System (INIS)

    Tim Scheibe; Alexandre Tartakovsky; Brian Wood; Joe Seymour

    2007-01-01

    Effective environmental management of DOE sites requires reliable prediction of reactive transport phenomena. A central issue in prediction of subsurface reactive transport is the impact of multiscale physical, chemical, and biological heterogeneity. Heterogeneity manifests itself through incomplete mixing of reactants at scales below those at which concentrations are explicitly defined (i.e., the numerical grid scale). This results in a mismatch between simulated reaction processes (formulated in terms of average concentrations) and actual processes (controlled by local concentrations). At the field scale, this results in apparent scale-dependence of model parameters and inability to utilize laboratory parameters in field models. Accordingly, most field modeling efforts are restricted to empirical estimation of model parameters by fitting to field observations, which renders extrapolation of model predictions beyond fitted conditions unreliable. The objective of this project is to develop a theoretical and computational framework for (1) connecting models of coupled reactive transport from pore-scale processes to field-scale bioremediation through a hierarchy of models that maintain crucial information from the smaller scales at the larger scales; and (2) quantifying the uncertainty that is introduced by both the upscaling process and uncertainty in physical parameters. One of the challenges of addressing scale-dependent effects of coupled processes in heterogeneous porous media is the problem-specificity of solutions. Much effort has been aimed at developing generalized scaling laws or theories, but these require restrictive assumptions that render them ineffective in many real problems. We propose instead an approach that applies physical and numerical experiments at small scales (specifically the pore scale) to a selected model system in order to identify the scaling approach appropriate to that type of problem. Although the results of such studies will

  4. Mechanistically-Based Field-Scale Models of Uranium Biogeochemistry from Upscaling Pore-Scale Experiments and Models

    Energy Technology Data Exchange (ETDEWEB)

    Tim Scheibe; Alexandre Tartakovsky; Brian Wood; Joe Seymour

    2007-04-19

    Effective environmental management of DOE sites requires reliable prediction of reactive transport phenomena. A central issue in prediction of subsurface reactive transport is the impact of multiscale physical, chemical, and biological heterogeneity. Heterogeneity manifests itself through incomplete mixing of reactants at scales below those at which concentrations are explicitly defined (i.e., the numerical grid scale). This results in a mismatch between simulated reaction processes (formulated in terms of average concentrations) and actual processes (controlled by local concentrations). At the field scale, this results in apparent scale-dependence of model parameters and inability to utilize laboratory parameters in field models. Accordingly, most field modeling efforts are restricted to empirical estimation of model parameters by fitting to field observations, which renders extrapolation of model predictions beyond fitted conditions unreliable. The objective of this project is to develop a theoretical and computational framework for (1) connecting models of coupled reactive transport from pore-scale processes to field-scale bioremediation through a hierarchy of models that maintain crucial information from the smaller scales at the larger scales; and (2) quantifying the uncertainty that is introduced by both the upscaling process and uncertainty in physical parameters. One of the challenges of addressing scale-dependent effects of coupled processes in heterogeneous porous media is the problem-specificity of solutions. Much effort has been aimed at developing generalized scaling laws or theories, but these require restrictive assumptions that render them ineffective in many real problems. We propose instead an approach that applies physical and numerical experiments at small scales (specifically the pore scale) to a selected model system in order to identify the scaling approach appropriate to that type of problem. Although the results of such studies will

  5. Pathophysiology of white-nose syndrome in bats: a mechanistic model linking wing damage to mortality.

    Science.gov (United States)

    Warnecke, Lisa; Turner, James M; Bollinger, Trent K; Misra, Vikram; Cryan, Paul M; Blehert, David S; Wibbelt, Gudrun; Willis, Craig K R

    2013-08-23

    White-nose syndrome is devastating North American bat populations but we lack basic information on disease mechanisms. Altered blood physiology owing to epidermal invasion by the fungal pathogen Geomyces destructans (Gd) has been hypothesized as a cause of disrupted torpor patterns of affected hibernating bats, leading to mortality. Here, we present data on blood electrolyte concentration, haematology and acid-base balance of hibernating little brown bats, Myotis lucifugus, following experimental inoculation with Gd. Compared with controls, infected bats showed electrolyte depletion (i.e. lower plasma sodium), changes in haematology (i.e. increased haematocrit and decreased glucose) and disrupted acid-base balance (i.e. lower CO2 partial pressure and bicarbonate). These findings indicate hypotonic dehydration, hypovolaemia and metabolic acidosis. We propose a mechanistic model linking tissue damage to altered homeostasis and morbidity/mortality.

  6. An effective convectivity model for simulation of in-vessel core melt progression in a boiling water reactor

    International Nuclear Information System (INIS)

    Tran, C.T.; Dinh, T.N.

    2007-01-01

    The present paper is concerned with the development and application of a so-called Effective Convection Model (ECM), which aims to provide a detailed, mechanistic description of heat transfer processes in a BWR lower plenum. The ECM is a Computational Fluid Dynamics (CFD)-like tool which employs a simpler and more effective approach, computing heat transfer by solving only the energy conservation equation instead of the full set of Navier-Stokes and energy equations solved by a CFD code. We implement the ECM in a CFD code (Fluent), with a detailed description of the ECM development, implementation and validation. A dual approach is used to validate the ECM, namely validation against experimental data and against heat transfer results obtained by CFD predictions in the same geometries and conditions. Insights gained from CFD simulations are also used to improve the ECM. The ECM capability as an effective tool to simulate heat transfer of an internally heated volume in 3-dimensional complex geometry is demonstrated through examples of heat transfer analysis in a BWR lower plenum being cooled by coolant flow in Control Rod Guide Tubes. Simulation results and key findings of this case are reported and discussed. (authors)
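
    The core ECM idea, solving a single energy equation with an effective transport property in place of the full Navier-Stokes system, can be illustrated with a one-dimensional transient conduction sketch; the geometry, effective conductivity and volumetric heat source below are assumptions for demonstration only.

    ```python
    # 1-D transient energy equation with an effective conductivity standing in
    # for convective mixing. All properties and dimensions are assumptions.
    import numpy as np

    nx, dx, dt = 50, 0.02, 0.05                    # 1 m domain, 50 nodes, 0.05 s step
    k_eff, rho_cp, q_vol = 50.0, 4.0e6, 2.0e5      # W/m-K, J/m^3-K, W/m^3
    T = np.full(nx, 400.0)                         # initial melt temperature, K
    alpha = k_eff / rho_cp

    for _ in range(2000):                          # explicit time marching
        interior = T[1:-1] + dt * (alpha * (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
                                   + q_vol / rho_cp)
        T = np.concatenate(([400.0], interior, [400.0]))   # cooled boundaries
    print("peak temperature (K):", round(T.max(), 1))
    ```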

  7. Emergence of nutrient limitation in tropical dry forests: hypotheses from simulation models

    Science.gov (United States)

    Medvigy, D.; Waring, B. G.; Xu, X.; Trierweiler, A.; Werden, L. K.; Wang, G.; Zhu, Q.; Powers, J. S.

    2017-12-01

    It is unclear to what extent tropical dry forest productivity may be limited by nutrients. Direct assessment of nutrient limitation through fertilization experiments has been rare, and paradigms pertaining to other ecosystems may not extend to tropical dry forests. For example, because dry tropical forests have a lower water supply than moist tropical forests, dry forests can have lower decomposition rates, higher soil carbon and nitrogen concentrations, and a more open nitrogen cycle than moist forests. We used a mechanistic, numerical model to generate hypotheses about nutrient limitation in tropical dry forests. The model dynamically couples ED2 (vegetation dynamics), MEND (biogeochemistry), and N-COM (plant-microbe competition for nutrients). Here, the MEND-component of the model has been extended to include nitrogen (N) and phosphorus (P) cycles. We focus on simulation of sixteen 25m x 25m plots in Costa Rica where a fertilization experiment has been underway since 2015. Baseline simulations are characterized by both nitrogen and phosphorus limitation of vegetation. Fertilization with N and P increased vegetation biomass, with N fertilization having a somewhat stronger effect. Nutrient limitation was also sensitive to climate and was more pronounced during drought periods. Overflow respiration was identified as a key process that mitigated nutrient limitation. These results suggest that, despite often having richer soils than tropical moist forests, tropical dry forests can also become nutrient-limited. If the climate becomes drier in the next century, as is expected for Central America, drier soils may decrease microbial activity and exacerbate nutrient limitation. The importance of overflow respiration underscores the need for appropriate treatment of microbial dynamics in ecosystem models. Ongoing and new nutrient fertilization experiments will present opportunities for testing whether, and how, nutrient limitation may indeed be emerging in tropical dry

  8. ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL

    OpenAIRE

    Климак, М.С.; Войтко, С.В.

    2016-01-01

    Theoretical and applied aspects of the development of simulation models to predict the optimal development of production systems that create tangible products and services are considered. It is argued that the process of inventory control requires economic and mathematical modeling in view of the complexity of theoretical studies. A simulation model of stock control that allows management decisions to be made in production logistics is presented.
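
    A minimal stock-control simulation in the spirit of the article is sketched below as an (s, S) replenishment policy with random daily demand; the demand distribution, reorder point and order-up-to level are illustrative assumptions.

    ```python
    # Simple (s, S) inventory-control simulation with random daily demand.
    # All parameter values are illustrative assumptions.
    import random

    random.seed(3)
    s, S = 20, 100                    # reorder point and order-up-to level
    stock, lost_sales, orders = S, 0, 0

    for day in range(365):
        demand = random.randint(0, 10)
        sold = min(stock, demand)
        lost_sales += demand - sold
        stock -= sold
        if stock <= s:                # assumed zero-lead-time replenishment
            orders += 1
            stock = S

    print("orders placed:", orders, "units of lost sales:", lost_sales)
    ```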

  9. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  10. Models of iodine behavior in reactor containments

    Energy Technology Data Exchange (ETDEWEB)

    Weber, C.F.; Beahm, E.C.; Kress, T.S.

    1992-10-01

    Models are developed for many phenomena of interest concerning iodine behavior in reactor containments during severe accidents. Processes include speciation in both gas and liquid phases, reactions with surfaces, airborne aerosols, and other materials, and gas-liquid interface behavior. Although some models are largely empirical formulations, every effort has been made to construct mechanistic and rigorous descriptions of relevant chemical processes. All are based on actual experimental data generated at the Oak Ridge National Laboratory (ORNL) or elsewhere, and, hence, considerable data evaluation and parameter estimation are contained in this study. No application or encoding is attempted, but each model is stated in terms of rate processes, with the intention of allowing mechanistic simulation. Taken together, this collection of models represents a best estimate iodine behavior and transport in reactor accidents.

  11. Models of iodine behavior in reactor containments

    International Nuclear Information System (INIS)

    Weber, C.F.; Beahm, E.C.; Kress, T.S.

    1992-10-01

    Models are developed for many phenomena of interest concerning iodine behavior in reactor containments during severe accidents. Processes include speciation in both gas and liquid phases, reactions with surfaces, airborne aerosols, and other materials, and gas-liquid interface behavior. Although some models are largely empirical formulations, every effort has been made to construct mechanistic and rigorous descriptions of relevant chemical processes. All are based on actual experimental data generated at the Oak Ridge National Laboratory (ORNL) or elsewhere, and, hence, considerable data evaluation and parameter estimation are contained in this study. No application or encoding is attempted, but each model is stated in terms of rate processes, with the intention of allowing mechanistic simulation. Taken together, this collection of models represents a best estimate iodine behavior and transport in reactor accidents

  12. Mechanistic model to predict colostrum intake based on deuterium oxide dilution technique data and impact of gestation and prefarrowing diets on piglet intake and sow yield of colostrum.

    Science.gov (United States)

    Theil, P K; Flummer, C; Hurley, W L; Kristensen, N B; Labouriau, R L; Sørensen, M T

    2014-12-01

    The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how the composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of the first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI (g) = -106 + 2.26 WG + 200 BWB + 0.111 D - 1,414 WG/D + 0.0182 WG/BWB (R² = 0.944). This model was used to predict the CI for all colostrum-suckling piglets within the 40 litters (n=500, mean=437 g, SD=153 g) and was compared with the CI predicted by a previous empirical predictive model (mean=305 g, SD=140 g). The previous empirical model underestimated the CI by 30% compared with that obtained by the new mechanistic model. The sows were fed 1 of 4 gestation diets (n=10 per diet) based on different fiber sources (low fiber [17%] or potato pulp, pectin residue, or sugarbeet pulp [32 to 40%]) from mating until d 108 of gestation. From d 108 of gestation until parturition, sows were fed 1 of 5 prefarrowing diets (n=8 per diet) varying in supplemented fat (3% animal fat, 8% coconut oil, 8% sunflower oil, 8% fish oil, or 4% fish oil+4% octanoic acid). Sows fed diets with pectin residue or sugarbeet pulp during gestation produced colostrum with lower protein, fat, DM, and energy concentrations and higher lactose concentrations, and their piglets had greater CI as compared with sows fed potato pulp or the low-fiber diet (P ...). Coconut oil decreased lactose and increased DM concentrations of colostrum compared with other prefarrowing diets (P
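
    The prediction equation quoted in the record can be applied directly; the sketch below transcribes it as a function and evaluates it for a made-up piglet, so the example inputs (not the coefficients) are assumptions.

    ```python
    # Colostrum intake (CI, g) from 24-h weight gain WG (g), birth weight BWB (kg)
    # and suckling duration D (min), as quoted in the record. Example inputs are made up.
    def colostrum_intake(WG, BWB, D):
        return (-106 + 2.26 * WG + 200 * BWB + 0.111 * D
                - 1414 * WG / D + 0.0182 * WG / BWB)

    print(round(colostrum_intake(WG=100, BWB=1.4, D=1440), 1))  # g over the first 24 h
    ```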

  13. Progress in modeling and simulation.

    Science.gov (United States)

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are used more and more while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented in a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with true definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), an outline of their applications and of their further development. In relation to the fact that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.

  14. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available -animal interactions. An additional two models, which expand aspects of the FYNBOS model, are described: a model for simulating canopy processes; and a Fire Recovery Simulator. The canopy process model will simulate ecophysiological processes in more detail than FYNBOS...

  15. Modelling water fluxes in a pine wood soil-vegetation-atmosphere system. Comparison of a water budget and water flow model using different parameter data sources

    International Nuclear Information System (INIS)

    Schneider, S.; Jacques, D.; Mallants, D.

    2010-01-01

    For modelling complex hydrological problems, realistic models and accurate hydraulic properties are needed. A mechanistic model (HYDRUS-1D) and a compartment model are evaluated for simulating the water balance in a soil-vegetation-atmosphere system using time series of measured water content at several depths in two lysimeters in a podzol soil with Scots Pine vegetation. Ten calibration scenarios are used to investigate the impact of the model type and the number of horizons in the profile on the calibration accuracy. The main results are: (i) with a large number of soil layers, both models describe the water contents at all depths accurately; (ii) the number of soil layers is the major factor that controls the quality of the calibration. The compartment model is an abstracted model and the mechanistic model is our reference model. Drainage values are the output considered. Drainage values simulated by the abstracted model were close to those of the reference model when averaged over a sufficiently long period (about 9 months). This result suggests that drainage values obtained with an abstracted model are reliable when averaged over sufficiently long periods; the abstracted model needs less computational time without an important loss of accuracy.
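
    A compartment (water-budget) abstraction of the kind compared here can be reduced to a single-bucket balance; the sketch below is such an abstraction with made-up storage capacity, rainfall and evapotranspiration series, not the lysimeter data.

    ```python
    # One-bucket water-budget abstraction: storage gains rain, loses ET,
    # and spills drainage above capacity. All numbers are assumptions.
    storage, capacity = 120.0, 200.0                # mm of water in the root zone
    drainage_total = 0.0

    rain = [0, 12, 3, 0, 25, 0, 7, 0, 0, 18]        # mm per day (assumed)
    et   = [2, 2, 3, 3, 2, 4, 3, 3, 2, 2]           # mm per day (assumed)

    for p, e in zip(rain, et):
        storage = max(storage + p - e, 0.0)
        if storage > capacity:                      # excess leaves as drainage
            drainage_total += storage - capacity
            storage = capacity

    print("drainage over period (mm):", drainage_total)
    ```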

  16. Modelling water fluxes in a pine wood soil-vegetation-atmosphere system. Comparison of a water budget and water flow model using different parameter data sources

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, S.; Jacques, D.; Mallants, D.

    2010-02-15

    For modelling complex hydrological problems, realistic models and accurate hydraulic properties are needed. A mechanistic model (HYDRUS-1D) and a compartment model are evaluated for simulating the water balance in a soil-vegetation-atmosphere system using time series of measured water content at several depths in two lysimeters in a podzol soil with Scots Pine vegetation. Ten calibration scenarios are used to investigate the impact of the model type and the number of horizons in the profile on the calibration accuracy. The main results are: (i) with a large number of soil layers, both models describe the water contents at all depths accurately; (ii) the number of soil layers is the major factor that controls the quality of the calibration. The compartment model is an abstracted model and the mechanistic model is our reference model. Drainage values are the output considered. Drainage values simulated by the abstracted model were close to those of the reference model when averaged over a sufficiently long period (about 9 months). This result suggests that drainage values obtained with an abstracted model are reliable when averaged over sufficiently long periods; the abstracted model needs less computational time without an important loss of accuracy.

  17. General cracked-hinge model for simulation of low-cycle damage in cemented beams on soil

    DEFF Research Database (Denmark)

    Skar, Asmus; Poulsen, Peter Noe; Olesen, John Forbes

    2017-01-01

    The need for mechanistic constitutive models to evaluate the complex interaction between concrete crack propagation, geometry and soil foundation in concrete- and composite pavement systems has been recognized. Several models developed are either too complex or designed to solve relatively simple...

  18. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Science.gov (United States)

    Mulugeta, Lealem; Drach, Andrew; Erdemir, Ahmet; Hunt, C. A.; Horner, Marc; Ku, Joy P.; Myers Jr., Jerry G.; Vadigepalli, Rajanikanth; Lytton, William W.

    2018-01-01

    Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations. PMID:29713272

  19. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Directory of Open Access Journals (Sweden)

    Lealem Mulugeta

    2018-04-01

    Full Text Available Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations.

  20. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of a popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  1. A Mechanistic Model of Human Recall of Social Network Structure and Relationship Affect.

    Science.gov (United States)

    Omodei, Elisa; Brashears, Matthew E; Arenas, Alex

    2017-12-07

    The social brain hypothesis argues that the need to deal with social challenges was key to our evolution of high intelligence. Research with non-human primates as well as experimental and fMRI studies in humans produce results consistent with this claim, leading to an estimate that human primary groups should consist of roughly 150 individuals. Gaps between this prediction and empirical observations can be partially accounted for using "compression heuristics", or schemata that simplify the encoding and recall of social information. However, little is known about the specific algorithmic processes used by humans to store and recall social information. We describe a mechanistic model of human network recall and demonstrate its sufficiency for capturing human recall behavior observed in experimental contexts. We find that human recall is predicated on accurate recall of a small number of high degree network nodes and the application of heuristics for both structural and affective information. This provides new insight into human memory, social network evolution, and demonstrates a novel approach to uncovering human cognitive operations.

  2. Gross margin losses due to Salmonella Dublin infection in Danish dairy cattle herds estimated by simulation modelling

    DEFF Research Database (Denmark)

    Nielsen, Torben Dahl; Kudahl, Anne Braad; Østergaard, S.

    2013-01-01

    Salmonella Dublin affects production and animal health in cattle herds. The objective of this study was to quantify the gross margin (GM) losses following introduction and spread of S. Dublin within dairy herds. The GM losses were estimated using an age-structured stochastic, mechanistic and dynamic simulation model. The model incorporated six age groups (neonatal, pre-weaned calves, weaned calves, growing heifers, breeding heifers and cows) and five infection stages (susceptible, acutely infected, carrier, super shedder and resistant). The effects of introducing one S. Dublin infectious ... with poorer management and herd size, e.g. average annual GM losses were estimated to 49 euros per stall for the first year after infection, and to 8 euros per stall annually averaged over the 10 years after herd infection for a 200 cow stall herd with very good management. In contrast, a 200 cow stall herd...

  3. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    Science.gov (United States)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.

  4. A 3-D CFD approach to the mechanistic prediction of forced convective critical heat flux at low quality

    International Nuclear Information System (INIS)

    Jean-Marie Le Corre; Cristina H Amon; Shi-Chune Yao

    2005-01-01

    Full text of publication follows: The prediction of the Critical Heat Flux (CHF) in a heat flux controlled boiling heat exchanger is important to assess the maximal thermal capability of the system. In the case of a nuclear reactor, CHF margin gain (using an improved mixing vane grid design, for instance) can allow power up-rate and enhanced operating flexibility. In general, current nuclear core design procedures use a quasi-1D approach to model the coolant thermal-hydraulic conditions within the fuel bundles, coupled with fully empirical CHF prediction methods. In addition, several CHF mechanistic models have been developed in the past and coupled with 1D and quasi-1D thermal-hydraulic codes. These mechanistic models have demonstrated reasonable CHF prediction characteristics and, more remarkably, correct parametric trends over a wide range of fluid conditions. However, since the phenomena leading to CHF are localized near the heater, models are needed to relate local quantities of interest to area-averaged quantities. As a consequence, large CHF prediction uncertainties may be introduced and 3D fluid characteristics (such as swirling flow) cannot be properly accounted for. Therefore, a fully mechanistic approach to CHF prediction is, in general, not possible using the current approach. The development of CHF-enhanced fuel assembly designs requires the use of more advanced 3D coolant properties computations coupled with CHF mechanistic modeling. In the present work, the commercial CFD code CFX-5 is used to compute 3D coolant conditions in a vertical heated tube with upward flow. Several CHF mechanistic models at low quality available in the literature are coupled with the CFD code by developing adequate models between local coolant properties and local parameters of interest to predict CHF. The prediction performances of these models are assessed using CHF databases available in the open literature and the 1995 CHF look-up table. Since CFD can reasonably capture 3D fluid

  5. Mechanistic Modelling of Biodiesel Production using a Liquid Lipase Formulation

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Hofmann, Björn; Silva, Vanessa T. L.

    2014-01-01

    ... with respect to the industrial production of biodiesel. The developed kinetic model, coupled with a mass balance of the system, was fitted to and validated on experimental results for the fed-batch transesterification of rapeseed oil. The confidence intervals of the parameter estimates, along ... that constrains the amount of methanol in the reactor was computed and the predictions experimentally validated. Monte-Carlo simulations were then used to characterize the effect of the parameter uncertainty on the model outputs, giving a biodiesel yield, based on the mass of oil, of 90.8 ± 0.55 mass %. © 2014...

  6. Causation at Different Levels: Tracking the Commitments of Mechanistic Explanations

    DEFF Research Database (Denmark)

    Fazekas, Peter; Kertész, Gergely

    2011-01-01

    ... connections transparent. These general commitments get confronted with two claims made by certain proponents of the mechanistic approach: William Bechtel often argues that within the mechanistic framework it is possible to balance between reducing higher levels and maintaining their autonomy at the same time ... their autonomy at the same time than standard reductive accounts are, and that what mechanistic explanations are able to do at best is showing that downward causation does not exist.

  7. A mechanistic model of heat transfer for gas-liquid flow in vertical wellbore annuli.

    Science.gov (United States)

    Yin, Bang-Tang; Li, Xiang-Fang; Liu, Gang

    2018-01-01

    The most prominent aspect of multiphase flow is the variation in the physical distribution of the phases in the flow conduit known as the flow pattern. Several different flow patterns can exist under different flow conditions which have significant effects on liquid holdup, pressure gradient and heat transfer. Gas-liquid two-phase flow in an annulus can be found in a variety of practical situations. In high rate oil and gas production, it may be beneficial to flow fluids vertically through the annulus configuration between well tubing and casing. The flow patterns in annuli are different from pipe flow. There are both casing and tubing liquid films in slug flow and annular flow in the annulus. Multiphase heat transfer depends on the hydrodynamic behavior of the flow. There are very limited research results that can be found in the open literature for multiphase heat transfer in wellbore annuli. A mechanistic model of multiphase heat transfer is developed for different flow patterns of upward gas-liquid flow in vertical annuli. The required local flow parameters are predicted by use of the hydraulic model of steady-state multiphase flow in wellbore annuli recently developed by Yin et al. The modified heat-transfer model for single gas or liquid flow is verified by comparison with Manabe's experimental results. For different flow patterns, it is compared with modified unified Zhang et al. model based on representative diameters.

  8. Blinded prospective evaluation of computer-based mechanistic schizophrenia disease model for predicting drug response.

    Directory of Open Access Journals (Sweden)

    Hugo Geerts

    Full Text Available The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published 'Quantitative Systems Pharmacology' computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D(2) antagonist, and ocaperidone, a very high affinity dopamine D(2) antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects and can be a helpful tool for drug discovery and development.

  9. On the closed form mechanistic modeling of milling: Specific cutting energy, torque, and power

    Science.gov (United States)

    Bayoumi, A. E.; Yücesan, G.; Hutton, D. V.

    1994-02-01

    Specific energy in metal cutting, defined as the energy expended in removing a unit volume of workpiece material, is formulated and determined using a previously developed closed form mechanistic force model for milling operations. Cutting power is computed from the cutting torque, cutting force, kinematics of the cutter, and the volumetric material removal rate. Closed form expressions for specific cutting energy were formulated and found to be functions of the process parameters: pressure and friction for both rake and flank surfaces and chip flow angle at the rake face of the tool. Friction is found to play a very important role in cutting torque and power. Experiments were carried out to determine the effects of feedrate, cutting speed, workpiece material, and flank wear land width on specific cutting energy. It was found that the specific cutting energy increases with a decrease in the chip thickness and with an increase in flank wear land.

  10. Bacterial transformation and biodegradation processes simulation in horizontal subsurface flow constructed wetlands using CWM1-RETRASO.

    Science.gov (United States)

    Llorens, Esther; Saaltink, Maarten W; Poch, Manel; García, Joan

    2011-01-01

    The performance and reliability of the CWM1-RETRASO model for simulating processes in horizontal subsurface flow constructed wetlands (HSSF CWs) and the relative contribution of different microbial reactions to organic matter (COD) removal in a HSSF CW treating urban wastewater were evaluated. Various different approaches with diverse influent configurations were simulated. According to the simulations, anaerobic processes were more widespread in the simulated wetland and contributed to a higher COD removal rate [72-79%] than anoxic [0-1%] and aerobic reactions [20-27%] did. In all the cases tested, the reaction that most contributed to COD removal was methanogenesis [58-73%]. All results provided by the model were in consonance with literature and experimental field observations, suggesting a good performance and reliability of CWM1-RETRASO. According to the good simulation predictions, CWM1-RETRASO is the first mechanistic model able to successfully simulate the processes described by the CWM1 model in HSSF CWs. Copyright © 2010 Elsevier Ltd. All rights reserved.

  11. Viscoplastic discontinuum model of time-dependent fracture and seismicity effects in brittle rock

    CSIR Research Space (South Africa)

    Napier, JAL

    1997-10-01

    Full Text Available A model is proposed for the direct mechanistic simulation of seismic activity and stress transfer effects in deep level mines. The model uses a discontinuum viscoplastic formulation to relate the rate of slip on a crack to the shear stress acting...

  12. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility room, flow path geometry, and heat conductors, user-defined non-ideal vapor and aerosol species, pressure- and density-driven gas flows, aerosol transport and deposition, and structure to accommodate facility-specific source terms. Example applications are presented here.

  13. Computational simulations of direct contact condensation as the driving force for water hammer

    International Nuclear Information System (INIS)

    Ceuca, Sabin-Cristian

    2015-01-01

    An analysis, based on Computer Simulations of the Direct Contact Condensation as the Driving Force for the Condensation Induced Water Hammer phenomenon is performed within this thesis. The goal of the work is to develop a mechanistic HTC model, with predictive capabilities for the simulation of horizontal or nearly horizontal two-phase flows with complex patterns including the effect of interfacial heat and mass transfer. The newly developed HTC model was implemented into the system code ATHLET and into the CFD tools ANSYS CFX and OpenFOAM. Validation calculations have been performed for horizontal or nearly horizontal flows, where simulation results have been compared against the local measurement data such as void and temperature or area averaged data delivered by a wire mesh sensor.

  14. Computational simulations of direct contact condensation as the driving force for water hammer

    Energy Technology Data Exchange (ETDEWEB)

    Ceuca, Sabin-Cristian

    2015-04-27

    An analysis, based on Computer Simulations of the Direct Contact Condensation as the Driving Force for the Condensation Induced Water Hammer phenomenon is performed within this thesis. The goal of the work is to develop a mechanistic HTC model, with predictive capabilities for the simulation of horizontal or nearly horizontal two-phase flows with complex patterns including the effect of interfacial heat and mass transfer. The newly developed HTC model was implemented into the system code ATHLET and into the CFD tools ANSYS CFX and OpenFOAM. Validation calculations have been performed for horizontal or nearly horizontal flows, where simulation results have been compared against the local measurement data such as void and temperature or area averaged data delivered by a wire mesh sensor.

  15. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine the validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  16. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogeneous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
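
    As a hedged illustration of the agent-based idea described above (describing individual components rather than homogeneous populations), the toy Python sketch below tracks individual agents carrying an infection state and lets infected agents contact randomly chosen others at each time step. The contact, transmission and recovery parameters are invented for illustration and are not drawn from the tutorial.

```python
import random

class Agent:
    """A single individual with an infection state ('S' susceptible or 'I' infected)."""
    def __init__(self, state="S"):
        self.state = state

def step(agents, contacts_per_agent=3, p_transmit=0.05, p_recover=0.1, rng=random):
    """One time step: each infected agent contacts a few random others, then may recover."""
    infected = [a for a in agents if a.state == "I"]
    for agent in infected:
        for _ in range(contacts_per_agent):
            other = rng.choice(agents)
            if other.state == "S" and rng.random() < p_transmit:
                other.state = "I"
        if rng.random() < p_recover:
            agent.state = "S"

if __name__ == "__main__":
    random.seed(1)
    population = [Agent("I") if i < 5 else Agent("S") for i in range(500)]
    for t in range(100):
        step(population)
    print("infected after 100 steps:", sum(a.state == "I" for a in population))
```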

  17. Mechanistic applicability domain classification of a local lymph node assay dataset for skin sensitization.

    Science.gov (United States)

    Roberts, David W; Patlewicz, Grace; Kern, Petra S; Gerberick, Frank; Kimber, Ian; Dearman, Rebecca J; Ryan, Cindy A; Basketter, David A; Aptula, Aynur O

    2007-07-01

    The goal of eliminating animal testing in the predictive identification of chemicals with the intrinsic ability to cause skin sensitization is an important target, the attainment of which has recently been brought into even sharper relief by the EU Cosmetics Directive and the requirements of the REACH legislation. Development of alternative methods requires that the chemicals used to evaluate and validate novel approaches comprise not only confirmed skin sensitizers and non-sensitizers but also substances that span the full chemical mechanistic spectrum associated with skin sensitization. To this end, a recently published database of more than 200 chemicals tested in the mouse local lymph node assay (LLNA) has been examined in relation to various chemical reaction mechanistic domains known to be associated with sensitization. It is demonstrated here that the dataset does cover the main reaction mechanistic domains. In addition, it is shown that assignment to a reaction mechanistic domain is a critical first step in a strategic approach to understanding, ultimately on a quantitative basis, how chemical properties influence the potency of skin sensitizing chemicals. This understanding is necessary if reliable non-animal approaches, including (quantitative) structure-activity relationships (Q)SARs, read-across, and experimental chemistry based models, are to be developed.

  18. Mechanistic model to predict colostrum intake based on deuterium oxide dilution technique data and impact of gestation and prefarrowing diets on piglet intake and sow yield of colostrum

    DEFF Research Database (Denmark)

    Theil, Peter Kappel; Flummer, Christine; Hurley, W L

    2014-01-01

    The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI, g = –106 + 2.26 WG + 200 BWB + 0.111 D – 1,414 WG/D + 0.0182 WG/BWB (R2 = 0.944). This model was used to predict the CI for all colostrum...
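
    The regression reported in the abstract can be applied directly. The short Python sketch below implements that published equation; the example piglet values (weight gain, birth weight, suckling duration) are hypothetical and only meant to show the calculation.

```python
def colostrum_intake(wg_g, bwb_kg, d_min):
    """Predicted colostrum intake (g) from the regression reported in the abstract.

    wg_g   : 24-h weight gain of the piglet (g)
    bwb_kg : body weight at birth (kg)
    d_min  : duration of colostrum intake (min)
    """
    return (-106 + 2.26 * wg_g + 200 * bwb_kg + 0.111 * d_min
            - 1414 * wg_g / d_min + 0.0182 * wg_g / bwb_kg)

# Hypothetical piglet: 100 g gained in 24 h, 1.4 kg at birth, 1380 min of suckling.
print(round(colostrum_intake(100, 1.4, 1380)), "g predicted colostrum intake")
```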

  19. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    Science.gov (United States)

    ... of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  20. Disruption of steroidogenesis: Cell models for mechanistic investigations and as screening tools.

    Science.gov (United States)

    Odermatt, Alex; Strajhar, Petra; Engeli, Roger T

    2016-04-01

    In the modern world, humans are exposed during their whole life to a large number of synthetic chemicals. Some of these chemicals have the potential to disrupt endocrine functions and contribute to the development and/or progression of major diseases. Every year approximately 1000 novel chemicals, used in industrial production, agriculture, consumer products or as pharmaceuticals, are reaching the market, often with limited safety assessment regarding potential endocrine activities. Steroids are essential endocrine hormones, and the importance of the steroidogenesis pathway as a target for endocrine disrupting chemicals (EDCs) has been recognized by leading scientists and authorities. Cell lines have a prominent role in the initial stages of toxicity assessment, i.e. for mechanistic investigations and for the medium to high throughput analysis of chemicals for potential steroidogenesis disrupting activities. Nevertheless, the users have to be aware of the limitations of the existing cell models in order to apply them properly, and there is a great demand for improved cell-based testing systems and protocols. This review intends to provide an overview of the available cell lines for studying effects of chemicals on gonadal and adrenal steroidogenesis, their use and limitations, as well as the need for future improvements of cell-based testing systems and protocols. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
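
    As a minimal illustration of sampling a stochastic process (the report discusses such sample-generation algorithms in general terms), the sketch below draws sample paths of a zero-mean stationary Gaussian process with an exponential covariance by Cholesky-factorizing the covariance matrix. The covariance model and parameter values are illustrative choices, not those of the report.

```python
import numpy as np

def gaussian_process_samples(n_samples=5, n_points=200, length=1.0,
                             corr_len=0.1, sigma=1.0, seed=0):
    """Draw sample paths of a zero-mean stationary Gaussian process.

    Covariance: C(t, s) = sigma^2 * exp(-|t - s| / corr_len).
    Samples are generated via a Cholesky factorization of the covariance matrix.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, length, n_points)
    cov = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)
    # Small jitter keeps the factorization numerically stable.
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_points))
    return t, L @ rng.standard_normal((n_points, n_samples))

t, paths = gaussian_process_samples()
print(paths.shape)  # (200, 5): five independent sample paths on a 200-point grid
```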

  2. Wear-dependent specific coefficients in a mechanistic model for turning of nickel-based superalloy with ceramic tools

    Science.gov (United States)

    López de Lacalle, Luis Norberto; Urbicain Pelayo, Gorka; Fernández-Valdivielso, Asier; Alvarez, Alvaro; González, Haizea

    2017-09-01

    Difficult-to-cut materials such as nickel and titanium alloys are used in the aeronautical industry, the former due to their heat-resistant behavior and the latter for their high strength-to-weight ratio. Ceramic tools made of alumina reinforced with SiC whiskers are a common choice for roughing and semifinishing turning stages. The wear rate is high when machining these alloys, and consequently cutting forces tend to increase over the course of an operation. This paper establishes the cutting force relation between workpiece and tool in the turning of such difficult-to-cut alloys by means of a mechanistic cutting force model that considers the tool wear effect. The cutting force model demonstrates the sensitivity of the force to the cutting engagement parameters (ap, f) when ceramic inserts are used and wear is considered. Wear is introduced through a cutting-time factor, which is useful under real conditions given that wear appears quickly when machining these alloys. Good accuracy in the cutting force model coefficients is the key issue for an accurate prediction of turning forces, which could be used as a criterion for tool replacement or as input for chatter or other models.
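
    A hedged sketch of the kind of wear-dependent mechanistic force model described above: the tangential force is taken as a specific cutting coefficient times the chip cross-section (ap × f), with the coefficient growing linearly with cutting time to mimic flank wear. The functional form and all numerical values are hypothetical placeholders, not the paper's fitted coefficients.

```python
def tangential_cutting_force(ap_mm, f_mm_rev, t_min, kc0=2600.0, wear_factor=0.004):
    """Mechanistic turning force F = Kc(t) * ap * f with a wear-dependent coefficient.

    Kc(t) = kc0 * (1 + wear_factor * t) lets the specific cutting coefficient
    (N/mm^2) rise as flank wear develops with cutting time t (min).
    All numbers are illustrative, not the paper's fitted coefficients.
    """
    kc_t = kc0 * (1.0 + wear_factor * t_min)
    return kc_t * ap_mm * f_mm_rev  # force in newtons

# Force at the start of the cut vs. after 20 min of cutting (hypothetical conditions).
print(tangential_cutting_force(1.5, 0.15, 0.0), tangential_cutting_force(1.5, 0.15, 20.0))
```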

  3. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  4. Mechanistic Prediction of the Effect of Microstructural Coarsening on Creep Response of SnAgCu Solder Joints

    Science.gov (United States)

    Mukherjee, S.; Chauhan, P.; Osterman, M.; Dasgupta, A.; Pecht, M.

    2016-07-01

    Mechanistic microstructural models have been developed to capture the effect of isothermal aging on time dependent viscoplastic response of Sn3.0Ag0.5Cu (SAC305) solders. SnAgCu (SAC) solders undergo continuous microstructural coarsening during both storage and service because of their high homologous temperature. The microstructures of these low melting point alloys continuously evolve during service. This results in evolution of creep properties of the joint over time, thereby influencing the long term reliability of microelectronic packages. It is well documented that isothermal aging degrades the creep resistance of SAC solder. SAC305 alloy is aged for (24-1000) h at (25-100)°C (~0.6-0.8 × T melt). Cross-sectioning and image processing techniques were used to periodically quantify the effect of isothermal aging on phase coarsening and evolution. The parameters monitored during isothermal aging include size, area fraction, and inter-particle spacing of nanoscale Ag3Sn intermetallic compounds (IMCs) and the volume fraction of micronscale Cu6Sn5 IMCs, as well as the area fraction of pure tin dendrites. Effects of microstructural evolution on secondary creep constitutive response of SAC305 solder joints were then modeled using a mechanistic multiscale creep model. The mechanistic phenomena modeled include: (1) dispersion strengthening by coarsened nanoscale Ag3Sn IMCs in the eutectic phase; and (2) load sharing between pro-eutectic Sn dendrites and the surrounding coarsened eutectic Sn-Ag phase and microscale Cu6Sn5 IMCs. The coarse-grained polycrystalline Sn microstructure in SAC305 solder was not captured in the above model because isothermal aging does not cause any significant change in the initial grain size and orientation of SAC305 solder joints. The above mechanistic model can successfully capture the drop in creep resistance due to the influence of isothermal aging on SAC305 single crystals. Contribution of grain boundary sliding to the creep strain of

  5. Coupling a three-dimensional subsurface flow and transport model with a land surface model to simulate stream–aquifer–land interactions (CP v1.0

    Directory of Open Access Journals (Sweden)

    G. Bisht

    2017-12-01

    Full Text Available A fully coupled three-dimensional surface and subsurface land model is developed and applied to a site along the Columbia River to simulate three-way interactions among river water, groundwater, and land surface processes. The model features the coupling of the Community Land Model version 4.5 (CLM4.5) and a massively parallel multiphysics reactive transport model (PFLOTRAN). The coupled model, named CP v1.0, is applied to a 400 m × 400 m study domain instrumented with groundwater monitoring wells along the Columbia River shoreline. CP v1.0 simulations are performed at three spatial resolutions (i.e., 2, 10, and 20 m) over a 5-year period to evaluate the impact of hydroclimatic conditions and spatial resolution on simulated variables. Results show that the coupled model is capable of simulating groundwater–river-water interactions driven by river stage variability along managed river reaches, which are of global significance as a result of over 30 000 dams constructed worldwide during the past half-century. Our numerical experiments suggest that the land-surface energy partitioning is strongly modulated by groundwater–river-water interactions through expanding the periodically inundated fraction of the riparian zone, and enhancing moisture availability in the vadose zone via capillary rise in response to the river stage change. Meanwhile, CLM4.5 fails to capture the key hydrologic process (i.e., groundwater–river-water exchange) at the site, and consequently simulates drastically different water and energy budgets. Furthermore, spatial resolution is found to significantly impact the accuracy of the estimated mass exchange rates at the boundaries of the aquifer, and it becomes critical when surface and subsurface become more tightly coupled with the groundwater table within 6 to 7 meters below the surface. Inclusion of lateral subsurface flow influenced both the surface energy budget and subsurface transport processes as a result

  6. Coupling a three-dimensional subsurface flow and transport model with a land surface model to simulate stream-aquifer-land interactions (CP v1.0)

    Science.gov (United States)

    Bisht, Gautam; Huang, Maoyi; Zhou, Tian; Chen, Xingyuan; Dai, Heng; Hammond, Glenn E.; Riley, William J.; Downs, Janelle L.; Liu, Ying; Zachara, John M.

    2017-12-01

    A fully coupled three-dimensional surface and subsurface land model is developed and applied to a site along the Columbia River to simulate three-way interactions among river water, groundwater, and land surface processes. The model features the coupling of the Community Land Model version 4.5 (CLM4.5) and a massively parallel multiphysics reactive transport model (PFLOTRAN). The coupled model, named CP v1.0, is applied to a 400 m × 400 m study domain instrumented with groundwater monitoring wells along the Columbia River shoreline. CP v1.0 simulations are performed at three spatial resolutions (i.e., 2, 10, and 20 m) over a 5-year period to evaluate the impact of hydroclimatic conditions and spatial resolution on simulated variables. Results show that the coupled model is capable of simulating groundwater-river-water interactions driven by river stage variability along managed river reaches, which are of global significance as a result of over 30 000 dams constructed worldwide during the past half-century. Our numerical experiments suggest that the land-surface energy partitioning is strongly modulated by groundwater-river-water interactions through expanding the periodically inundated fraction of the riparian zone, and enhancing moisture availability in the vadose zone via capillary rise in response to the river stage change. Meanwhile, CLM4.5 fails to capture the key hydrologic process (i.e., groundwater-river-water exchange) at the site, and consequently simulates drastically different water and energy budgets. Furthermore, spatial resolution is found to significantly impact the accuracy of the estimated mass exchange rates at the boundaries of the aquifer, and it becomes critical when surface and subsurface become more tightly coupled with the groundwater table within 6 to 7 meters below the surface. Inclusion of lateral subsurface flow influenced both the surface energy budget and subsurface transport processes as a result of river-water intrusion into the

  7. A VRLA battery simulation model

    International Nuclear Information System (INIS)

    Pascoe, Phillip E.; Anbuky, Adnan H.

    2004-01-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, either for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet

  8. Kinetics from Replica Exchange Molecular Dynamics Simulations.

    Science.gov (United States)

    Stelzl, Lukas S; Hummer, Gerhard

    2017-08-08

    Transitions between metastable states govern many fundamental processes in physics, chemistry and biology, from nucleation events in phase transitions to the folding of proteins. The free energy surfaces underlying these processes can be obtained from simulations using enhanced sampling methods. However, their altered dynamics makes kinetic and mechanistic information difficult or impossible to extract. Here, we show that, with replica exchange molecular dynamics (REMD), one can not only sample equilibrium properties but also extract kinetic information. For systems that strictly obey first-order kinetics, the procedure to extract rates is rigorous. For actual molecular systems whose long-time dynamics are captured by kinetic rate models, accurate rate coefficients can be determined from the statistics of the transitions between the metastable states at each replica temperature. We demonstrate the practical applicability of the procedure by constructing master equation (Markov state) models of peptide and RNA folding from REMD simulations.
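
    A minimal sketch of the kinetic bookkeeping described above: given a trajectory already discretized into metastable-state labels, transitions are counted at a chosen lag time and normalized into a row-stochastic (Markov state) transition matrix. This is only the counting step, not the paper's full REMD rate-extraction procedure, and the toy trajectory is invented for illustration.

```python
import numpy as np

def transition_matrix(state_traj, n_states, lag=1):
    """Row-stochastic transition matrix estimated from a discretized trajectory.

    state_traj : sequence of integer state labels (one metastable state per frame)
    lag        : lag time in frames at which transitions are counted
    """
    counts = np.zeros((n_states, n_states))
    for i, j in zip(state_traj[:-lag], state_traj[lag:]):
        counts[i, j] += 1
    counts += 1e-12  # avoids division by zero for unvisited states
    return counts / counts.sum(axis=1, keepdims=True)

# Toy two-state trajectory; a real application would use the per-replica
# sequences of metastable-state assignments extracted from the REMD run.
traj = [0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0]
print(transition_matrix(traj, n_states=2))
```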

  9. Mechanistic study of aerosol dry deposition on vegetated canopies

    International Nuclear Information System (INIS)

    Petroff, A.

    2005-04-01

    The dry deposition of aerosols onto vegetated canopies is modelled through a mechanistic approach. The interaction between aerosols and vegetation is first formulated using a set of parameters defined at the local scale of one surface. The overall deposition is then deduced at the canopy scale through an up-scaling procedure based on the statistical distribution of these parameters. The model takes into account the canopy structural and morphological properties and the main characteristics of the turbulent flow. The deposition mechanisms considered are Brownian diffusion, interception, and inertial and turbulent impaction; the model is applied first to coniferous branches and then to entire canopies of different roughness, such as grass, crop fields and forest. (author)

  10. Fluid mechanics in dentinal microtubules provides mechanistic insights into the difference between hot and cold dental pain.

    Science.gov (United States)

    Lin, Min; Luo, Zheng Yuan; Bai, Bo Feng; Xu, Feng; Lu, Tian Jian

    2011-03-23

    Dental thermal pain is a significant health problem in daily life and dentistry. There is a long-standing question regarding the phenomenon that cold stimulation evokes sharper and more shooting pain sensations than hot stimulation. This phenomenon, however, outlives the well-known hydrodynamic theory used to explain dental thermal pain mechanism. Here, we present a mathematical model based on the hypothesis that hot or cold stimulation-induced different directions of dentinal fluid flow and the corresponding odontoblast movements in dentinal microtubules contribute to different dental pain responses. We coupled a computational fluid dynamics model, describing the fluid mechanics in dentinal microtubules, with a modified Hodgkin-Huxley model, describing the discharge behavior of intradental neuron. The simulated results agreed well with existing experimental measurements. We thence demonstrated theoretically that intradental mechano-sensitive nociceptors are not "equally sensitive" to inward (into the pulp) and outward (away from the pulp) fluid flows, providing mechanistic insights into the difference between hot and cold dental pain. The model developed here could enable better diagnosis in endodontics which requires an understanding of pulpal histology, neurology and physiology, as well as their dynamic response to the thermal stimulation used in dental practices.
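
    The neural half of the coupled model is a modified Hodgkin-Huxley description. As a hedged, self-contained illustration, the Python sketch below integrates the classic single-compartment Hodgkin-Huxley equations (standard squid-axon parameters, forward Euler) and counts spikes under a constant stimulus; it is not the paper's modified intradental-neuron model nor its coupling to dentinal fluid flow.

```python
import math

def alpha_beta(v):
    """Classic Hodgkin-Huxley rate constants (squid-axon parameters, V in mV)."""
    an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def simulate(i_ext=10.0, t_end=50.0, dt=0.01):
    """Forward-Euler integration of a single-compartment HH neuron; returns spike count."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3        # mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.387    # mV
    c_m = 1.0                                # uF/cm^2
    v, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    t = 0.0
    while t < t_end:
        an, bn, am, bm, ah, bh = alpha_beta(v)
        i_na = g_na * m**3 * h * (v - e_na)
        i_k = g_k * n**4 * (v - e_k)
        i_l = g_l * (v - e_l)
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        if v > 0.0 and not above:            # crude upward-crossing spike detector
            spikes += 1
        above = v > 0.0
        t += dt
    return spikes

print("spikes in 50 ms at 10 uA/cm^2:", simulate())
```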

  11. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half a century, physics-based global computer simulations have become a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems

  12. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  13. Data-based mechanistic modeling of dissolved organic carbon load through storms using continuous 15-minute resolution observations within UK upland watersheds

    Science.gov (United States)

    Jones, T.; Chappell, N. A.

    2013-12-01

    ... Furthermore, this allows a data-based mechanistic (DBM) modelling philosophy to be followed, where no assumptions about processes are defined a priori (given that dominant processes are often not known before analysis) & where the information contained in the time-series is used to identify multiple structures of models that are statistically robust. Within the final stage of DBM, biogeochemical & hydrological processes are interpreted from those models that are observable from the available stream time-series. We show that this approach can simulate the key features of DOC dynamics within & between storms & that some of the resultant response characteristics change with varying DOC processes in different seasons. Through the use of MISO (multiple-input single-output) models we demonstrate the relative importance of different variables (e.g., rainfall, temperature) in controlling DOC responses. The contrasting behaviour of the six experimental catchments is also reflected in differing response characteristics. These characteristics are shown to contribute to understanding of basin-integrated DOC export processes & to the ecosystem service impacts of DOC & color on commercial water treatment within the surrounding water supply basins.

  14. Putting mechanisms into crop production models.

    Science.gov (United States)

    Boote, Kenneth J; Jones, James W; White, Jeffrey W; Asseng, Senthold; Lizaso, Jon I

    2013-09-01

    Crop growth models dynamically simulate processes of C, N and water balance on daily or hourly time-steps to predict crop growth and development and at season-end, final yield. Their ability to integrate effects of genetics, environment and crop management have led to applications ranging from understanding gene function to predicting potential impacts of climate change. The history of crop models is reviewed briefly, and their level of mechanistic detail for assimilation and respiration, ranging from hourly leaf-to-canopy assimilation to daily radiation-use efficiency is discussed. Crop models have improved steadily over the past 30-40 years, but much work remains. Improvements are needed for the prediction of transpiration response to elevated CO₂ and high temperature effects on phenology and reproductive fertility, and simulation of root growth and nutrient uptake under stressful edaphic conditions. Mechanistic improvements are needed to better connect crop growth to genetics and to soil fertility, soil waterlogging and pest damage. Because crop models integrate multiple processes and consider impacts of environment and management, they have excellent potential for linking research from genomics and allied disciplines to crop responses at the field scale, thus providing a valuable tool for deciphering genotype by environment by management effects. © 2013 John Wiley & Sons Ltd.

  15. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four classes, namely Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration and emigration occurring in the population. The SEIR model yields a non-linear 4-D system of Ordinary Differential Equations (ODEs), which is then reduced to 3-D. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using the number of cases in Makassar also found a basic reproduction number less than one, which means that Makassar city is not an endemic area for Hepatitis B.
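
    A minimal sketch of a basic SEIR integration is shown below (forward Euler, population fractions). The transmission, latency and recovery parameters are illustrative placeholders, not the values fitted to the Makassar Hepatitis B data, and the vaccination and migration terms of the paper's extended model are omitted.

```python
def simulate_seir(beta=0.3, sigma=1/8, gamma=1/14, days=365, dt=0.1,
                  s0=0.99, e0=0.005, i0=0.005, r0=0.0):
    """Forward-Euler integration of a basic SEIR model (population fractions).

    beta : transmission rate, sigma : 1/latent period, gamma : 1/infectious period.
    Values are illustrative placeholders, not fitted Hepatitis B parameters.
    """
    s, e, i, r = s0, e0, i0, r0
    history = []
    for step in range(int(days / dt)):
        new_infections = beta * s * i
        ds = -new_infections
        de = new_infections - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        s, e, i, r = s + dt * ds, e + dt * de, i + dt * di, r + dt * dr
        history.append((step * dt, s, e, i, r))
    return history

# Day at which the infectious fraction peaks, under the illustrative parameters.
peak_day, _, _, peak_i, _ = max(simulate_seir(), key=lambda row: row[3])
print(f"infectious fraction peaks at ~{peak_i:.2f} around day {peak_day:.0f}")
```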

  16. Climate simulations for 1880-2003 with GISS modelE

    International Nuclear Information System (INIS)

    Hansen, J.; Lacis, A.; Miller, R.; Schmidt, G.A.; Russell, G.; Canuto, V.; Del Genio, A.; Hall, T.; Hansen, J.; Sato, M.; Kharecha, P.; Nazarenko, L.; Aleinov, I.; Bauer, S.; Chandler, M.; Faluvegi, G.; Jonas, J.; Ruedy, R.; Lo, K.; Cheng, Y.; Lacis, A.; Schmidt, G.A.; Del Genio, A.; Miller, R.; Cairns, B.; Hall, T.; Baum, E.; Cohen, A.; Fleming, E.; Jackman, C.; Friend, A.; Kelley, M.

    2007-01-01

    We carry out climate simulations for 1880-2003 with GISS modelE driven by ten measured or estimated climate forcings. An ensemble of climate model runs is carried out for each forcing acting individually and for all forcing mechanisms acting together. We compare side-by-side simulated climate change for each forcing, all forcings, observations, unforced variability among model ensemble members, and, if available, observed variability. Discrepancies between observations and simulations with all forcings are due to model deficiencies, inaccurate or incomplete forcing, and imperfect observations. Although there are notable discrepancies between model and observations, the fidelity is sufficient to encourage use of the model for simulations of future climate change. By using a fixed well-documented model and accurately defining the 1880-2003 forcing, we aim to provide a benchmark against which the effect of improvements in the model, climate forcing, and observations can be tested. Principal model deficiencies include unrealistically weak tropical El Nino-like variability and a poor distribution of sea ice, with too much sea ice in the Northern Hemisphere and too little in the Southern Hemisphere. Greatest uncertainties in the forcing are the temporal and spatial variations of anthropogenic aerosols and their indirect effects on clouds. (authors)

  17. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without relations to any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high level interface to describe FASTBUS operations, are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  18. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power through simulations, allow scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  19. Generative mechanistic explanation building in undergraduate molecular and cellular biology

    Science.gov (United States)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-09-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among scientists, we created and applied a theoretical framework to explore the strategies students use to construct explanations for 'novel' biological phenomena. Specifically, we explored how students navigated the multi-level nature of complex biological systems using generative mechanistic reasoning. Interviews were conducted with introductory and upper-division biology students at a large public university in the United States. Results of qualitative coding revealed key features of students' explanation building. Students used modular thinking to consider the functional subdivisions of the system, which they 'filled in' to varying degrees with mechanistic elements. They also hypothesised the involvement of mechanistic entities and instantiated abstract schema to adapt their explanations to unfamiliar biological contexts. Finally, we explored the flexible thinking that students used to hypothesise the impact of mutations on multi-leveled biological systems. Results revealed a number of ways that students drew mechanistic connections between molecules, functional modules (sets of molecules with an emergent function), cells, tissues, organisms and populations.

  20. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  1. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    Science.gov (United States)

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.

  2. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models, and to take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the devi

  3. Simulation of a SAGD well blowout using a reservoir-wellbore coupled simulator

    Energy Technology Data Exchange (ETDEWEB)

    Walter, J.; Vanegas, P.; Cunha, L.B. [Alberta Univ., Edmonton, AB (Canada); Worth, D.J. [C-FER Technologies, Edmonton, AB (Canada); Crepin, S. [Petrocedeno, Caracas (Venezuela)

    2008-10-15

    Single barrier completion systems are typically used in SAGD projects due to the lack of equipment suitable for high temperature SAGD downhole environments. This study used a wellbore and reservoir coupled thermal simulator tool to investigate the blowout behaviour of a steam assisted gravity drainage (SAGD) well pair when the safety barrier has failed. Fluid flow pressure drop through the wellbore and heat losses between the wellbore and the reservoir were modelled using a discretized wellbore option and a semi-analytical model. The fully coupled mechanistic model accounted for the simultaneous transient pressure and temperature variations along the wellbore and the reservoir. The simulations were used to predict flowing potential and fluid compositions of both wells in a SAGD well pair under various flowing conditions. Blowout scenarios were created for 3 different points in the well pair's life. Three flow paths during the blowout were evaluated for both the production and injection wells. Results of the study were used to conduct a comparative risk assessment between a double barrier and a single barrier completion. The modelling study confirmed that both the injection and production wells had the potential for blowouts lasting significant periods of time, with liquid rates over 50 times the normal production liquid rates. The model successfully predicted the blowout flow potential of the SAGD well pairs. 8 refs., 3 tabs., 18 figs.

  4. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  5. Simulation of radionuclide retardation at Yucca Mountain using a stochastic mineralogical/geochemical model

    International Nuclear Information System (INIS)

    Birdsell, K.H.; Campbell, K.; Eggert, K.; Travis, B.J.

    1990-01-01

    This paper presents preliminary transport calculations for radionuclide movement at Yucca Mountain. Several different realizations of spatially distributed sorption coefficients are used to study the sensitivity of radionuclide migration. These sorption coefficients are assumed to be functions of the mineralogic assemblages of the underlying rock. The simulations were run with TRACRN 1 , a finite-difference porous flow and radionuclide transport code developed for the Yucca Mountain Project. Approximately 30,000 nodes are used to represent the unsaturated and saturated zones underlying the repository in three dimensions. Transport calculations for a representative radionuclide cation, 135 Cs, and anion, 99 Tc, are presented. Calculations such as these will be used to study the effectiveness of the site's geochemical barriers at a mechanistic level and to help guide the geochemical site characterization program. The preliminary calculations should be viewed as a demonstration of the modeling methodology rather than as a study of the effectiveness of the geochemical barriers. The model provides a method for examining the integration of flow scenarios with transport and retardation processes as currently understood for the site. The effects on transport of many of the processes thought to be active at Yucca Mountain may be examined using this approach. 11 refs., 14 figs., 1 tab
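
    The link between sorption and transport in such calculations is commonly expressed through a linear-sorption retardation factor. The sketch below computes R = 1 + (ρ_b/θ)·Kd and the corresponding retarded velocity for a strongly sorbing cation versus a nearly non-sorbing anion; the bulk density, porosity, velocity and Kd values are hypothetical illustrations, not Yucca Mountain site-characterization data.

```python
def retardation_factor(bulk_density_kg_L, porosity, kd_L_kg):
    """Linear-sorption retardation factor R = 1 + (rho_b / theta) * Kd."""
    return 1.0 + (bulk_density_kg_L / porosity) * kd_L_kg

def retarded_velocity(pore_velocity_m_yr, r_factor):
    """Effective migration velocity of a sorbing solute."""
    return pore_velocity_m_yr / r_factor

# Hypothetical rock properties and sorption coefficients (L/kg): the Cs cation
# sorbs strongly, while Tc (as pertechnetate) is nearly non-sorbing and moves
# almost with the water.
rho_b, theta, v = 1.6, 0.15, 0.5          # kg/L, porosity (-), pore velocity m/yr
for nuclide, kd in [("Cs-135", 100.0), ("Tc-99", 0.1)]:
    r = retardation_factor(rho_b, theta, kd)
    print(nuclide, "R =", round(r, 1), " v_eff =", round(retarded_velocity(v, r), 4), "m/yr")
```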

  6. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
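
    Of the four approaches listed, discrete-event simulation is the one most often applied to patient-flow questions; the standard-library Python sketch below illustrates the idea on a single-queue, multi-bed emergency department. The arrival rate, treatment rate and bed count are illustrative assumptions, not figures from the consensus conference.

    import heapq
    import random

    def simulate_ed(n_patients=1000, arrival_rate=6.0, service_rate=2.0,
                    n_beds=4, seed=1):
        """Minimal discrete-event simulation of an emergency department queue.

        Patients arrive as a Poisson process (exponential inter-arrival times),
        wait for one of `n_beds` treatment spaces, and occupy it for an
        exponentially distributed treatment time.  Rates are per hour and
        purely illustrative.  Returns the mean waiting time in hours.
        """
        rng = random.Random(seed)
        t = 0.0
        arrivals = []
        for _ in range(n_patients):
            t += rng.expovariate(arrival_rate)
            arrivals.append(t)

        bed_free_at = [0.0] * n_beds               # time each bed next becomes free
        heapq.heapify(bed_free_at)
        waits = []
        for arr in arrivals:
            free = heapq.heappop(bed_free_at)      # earliest-available bed
            start = max(arr, free)                 # start when patient and bed are both ready
            waits.append(start - arr)
            heapq.heappush(bed_free_at, start + rng.expovariate(service_rate))
        return sum(waits) / len(waits)

    if __name__ == "__main__":
        print(f"mean wait: {simulate_ed():.2f} h")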

  7. Mechanistic simulation of batch acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping using Aspen Plus™.

    Science.gov (United States)

    Darkwah, Kwabena; Nokes, Sue E; Seay, Jeffrey R; Knutson, Barbara L

    2018-05-22

    Process simulations of batch fermentations with in situ product separation traditionally decouple these interdependent steps by simulating separate "steady state" continuous fermentation and separation units. In this study, an integrated batch fermentation and separation process was simulated for a model system of acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping, such that the fermentation kinetics are linked in real-time to the gas stripping process. Time-dependent cell growth, substrate utilization, and product formation are translated to an Aspen Plus batch reactor. This approach capitalizes on the phase equilibria calculations of Aspen Plus to predict the effect of stripping on the ABE fermentation kinetics. The product profiles of the integrated fermentation and separation are shown to be sensitive to gas flow rate, unlike separate steady state fermentation and separation simulations. This study demonstrates the importance of coupled fermentation and separation simulation approaches for the systematic analyses of unsteady state processes.
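
    A minimal way to see why the product profile becomes sensitive to the stripping rate once fermentation and separation are solved together is to add a first-order removal term to the batch kinetics, as in the Python sketch below. The Monod and yield parameters and the stripping constant are invented placeholders; they stand in for what the Aspen Plus phase-equilibria calculations would provide.

    def simulate_abe_with_stripping(t_end=72.0, dt=0.01, k_strip=0.05):
        """Toy batch ABE fermentation with in situ gas stripping.

        Monod-type growth on glucose with butanol as the tracked product;
        stripping removes butanol in proportion to its broth concentration.
        All parameters are illustrative placeholders, not fitted values.
        """
        mu_max, Ks, Yxs, Ypx = 0.3, 2.0, 0.4, 0.5   # 1/h, g/L, g/g, g/g
        X, S, P = 0.1, 60.0, 0.0                    # biomass, glucose, butanol (g/L)
        for _ in range(int(t_end / dt)):
            mu = mu_max * S / (Ks + S) if S > 0 else 0.0
            r_growth = mu * X                       # g/L/h
            dP = Ypx * r_growth - k_strip * P       # production minus stripping
            X += r_growth * dt
            S = max(S - r_growth / Yxs * dt, 0.0)
            P = max(P + dP * dt, 0.0)
        return X, S, P

    if __name__ == "__main__":
        for k in (0.0, 0.05, 0.1):
            X, S, P = simulate_abe_with_stripping(k_strip=k)
            print(f"k_strip={k:.2f} 1/h -> broth butanol {P:.2f} g/L")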

  8. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency R-E models are ideally suitable for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  9. Crop growth and two dimensional modeling of soil water transport in drip irrigated potatoes

    DEFF Research Database (Denmark)

    Plauborg, Finn; Iversen, Bo Vangsø; Mollerup, Mikkel

    2009-01-01

    of abscisic acid (ABA). Model outputs from the mechanistic simulation model Daisy, in SAFIR developed to include 2D soil processes and gas exchange processes based on Ball et al. and Farquhar were compared with measured crop dynamics, final DM yield and volumetric water content in the soil measured by TDR...

  10. Protective role of OH scavengers and DNA/chromatin organization in the induction of DNA breaks: mechanistic models and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Ballarini, F.; Rossetti, M.; Scannicchio, D.; Jacob, P.; Molinelli, S.; Ottolenghi, A.; Volata, A.

    2003-01-01

    Radiation-induced DNA damage can be modulated by various factors, including the environment scavenging capacity (SC) and the DNA organization within the cell nucleus (chromatin compactness, DNA-binding proteins etc.). In this context the induction of ssb and dsb by photons and light ions of different energies impinging on different DNA structures (e.g. linear DNA, SV40 'minichromosomes' and cellular DNA) at different OH-radical SC values was modelled with the Monte Carlo PARTRAC code. Presently PARTRAC can transport electrons, photons, protons and alpha particles in liquid water with an 'event-by-event' approach, and can simulate the DNA content of mammalian cells with an 'atom-by-atom' description, from nucleotide pairs to chromatin fibre loops and chromosome territories. Energy depositions in the sugar-phosphate were considered as potential (direct) ssb. The production, diffusion and reaction of chemical species were explicitly simulated; reactions of OH radicals with the sugar-phosphate were assumed to lead to 'indirect' ssb with probability 65%. Two ssb on opposite strands within 10 bp were considered as a dsb. Yields of ssb and dsb/Gy/Dalton were calculated for different DNA structures as a function of the OH mean life time. In general, also depending on radiation quality, linear DNA was found to be more susceptible to strand breakage than SV40 minichromosomes, which in turn showed higher damage yields with respect to cellular DNA. The very good agreement found with available experimental data provided a validation of the model and allowed us to quantify separately the protective effect of OH scavengers and DNA/chromatin organization. Comparisons with data on nucleoids (DNA unfolded and depleted of histones) suggested that the experimental procedures used to obtain such targets might lower the environment SC, due to the loss of cellular scavenging compounds.
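
    The break-scoring rules quoted in the abstract (direct ssb from energy depositions in the sugar-phosphate, indirect ssb from OH attack with probability 65%, and a dsb whenever two ssb on opposite strands fall within 10 bp) can be illustrated with the small Monte Carlo bookkeeping sketch below. The numbers of hits and the target length are arbitrary and carry no dosimetric meaning.

    import random

    def count_breaks(n_direct=200, n_oh_hits=800, length_bp=2_000_000,
                     p_indirect=0.65, dsb_window=10, seed=42):
        """Toy ssb/dsb bookkeeping following the scoring rules in the abstract.

        Direct hits always yield an ssb; OH-radical hits yield one with
        probability 0.65.  Two ssb on opposite strands within 10 bp are
        scored as one dsb.  Hit counts and target length are arbitrary.
        """
        rng = random.Random(seed)
        ssb = [(rng.randrange(length_bp), rng.randrange(2)) for _ in range(n_direct)]
        for _ in range(n_oh_hits):
            if rng.random() < p_indirect:
                ssb.append((rng.randrange(length_bp), rng.randrange(2)))

        ssb.sort()                                   # sort by genomic position
        paired = [False] * len(ssb)
        dsb = 0
        for i, (pos_i, strand_i) in enumerate(ssb):
            if paired[i]:
                continue
            for j in range(i + 1, len(ssb)):
                pos_j, strand_j = ssb[j]
                if pos_j - pos_i > dsb_window:
                    break
                if not paired[j] and strand_j != strand_i:
                    dsb += 1
                    paired[i] = paired[j] = True
                    break
        return len(ssb), dsb

    if __name__ == "__main__":
        n_ssb, n_dsb = count_breaks()
        print(f"ssb scored: {n_ssb}, dsb scored: {n_dsb}")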

  11. "Ratio via Machina": Three Standards of Mechanistic Explanation in Sociology

    Science.gov (United States)

    Aviles, Natalie B.; Reed, Isaac Ariail

    2017-01-01

    Recently, sociologists have expended much effort in attempts to define social mechanisms. We intervene in these debates by proposing that sociologists in fact have a choice to make between three standards of what constitutes a good mechanistic explanation: substantial, formal, and metaphorical mechanistic explanation. All three standards are…

  12. Mechanistic evidence for a ring-opening pathway in the Pd-catalyzed direct arylation of benzoxazoles

    DEFF Research Database (Denmark)

    Sanchez, R.S.; Zhuravlev, Fedor

    2007-01-01

    The direct Pd-catalyzed arylation of 5-substituted benzoxazoles, used as a mechanistic model for 1,3-azoles, was investigated experimentally and computationally. The results of the primary deuterium kinetic isotope effect, Hammett studies, and H/D exchange were shown to be inconsistent with the rate-limiting electrophilic or concerted palladation. A mechanism, proposed on the basis of kinetic and computational studies, includes generation of isocyanophenolate as the key step. The DFT calculations suggest that the overall catalytic cycle is facile and is largely controlled by the C-H acidity...

  13. Toward mechanistic classification of enzyme functions.

    Science.gov (United States)

    Almonacid, Daniel E; Babbitt, Patricia C

    2011-06-01

    Classification of enzyme function should be quantitative, computationally accessible, and informed by sequences and structures to enable use of genomic information for functional inference and other applications. Large-scale studies have established that divergently evolved enzymes share conserved elements of structure and common mechanistic steps and that convergently evolved enzymes often converge to similar mechanisms too, suggesting that reaction mechanisms could be used to develop finer-grained functional descriptions than provided by the Enzyme Commission (EC) system currently in use. Here we describe how evolution informs these structure-function mappings and review the databases that store mechanisms of enzyme reactions along with recent developments to measure ligand and mechanistic similarities. Together, these provide a foundation for new classifications of enzyme function. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  15. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  16. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    International Nuclear Information System (INIS)

    Goodman, Julie; Lynch, Heather

    2017-01-01

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  17. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Goodman, Julie, E-mail: jgoodman@gradientcorp.com; Lynch, Heather

    2017-03-15

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  18. Uncovering molecular processes in crystal nucleation and growth by using molecular simulation.

    Science.gov (United States)

    Anwar, Jamshed; Zahn, Dirk

    2011-02-25

    Exploring nucleation processes by molecular simulation provides a mechanistic understanding at the atomic level and also enables kinetic and thermodynamic quantities to be estimated. However, whilst the potential for modeling crystal nucleation and growth processes is immense, there are specific technical challenges to modeling. In general, rare events, such as nucleation cannot be simulated using a direct "brute force" molecular dynamics approach. The limited time and length scales that are accessible by conventional molecular dynamics simulations have inspired a number of advances to tackle problems that were considered outside the scope of molecular simulation. While general insights and features could be explored from efficient generic models, new methods paved the way to realistic crystal nucleation scenarios. The association of single ions in solvent environments, the mechanisms of motif formation, ripening reactions, and the self-organization of nanocrystals can now be investigated at the molecular level. The analysis of interactions with growth-controlling additives gives a new understanding of functionalized nanocrystals and the precipitation of composite materials. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed-site and mobile blood collection with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  20. Innovative mathematical modeling in environmental remediation

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Gour T. [Taiwan Typhoon and Flood Research Institute (Taiwan); National Central Univ. (Taiwan); Univ. of Central Florida (United States); Gwo, Jin Ping [Nuclear Regulatory Commission (NRC), Rockville, MD (United States); Siegel, Malcolm D. [Sandia National Laboratories, Albuquerque, NM (United States); Li, Ming-Hsu [National Central Univ. (Taiwan); Fang, Yilin [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Zhang, Fan [Inst. of Tibetan Plateau Research, Chinese Academy of Sciences (China); Luo, Wensui [Inst. of Tibetan Plateau Research, Chinese Academy of Sciences (China); Yabusaki, Steven B. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2013-05-01

    There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been mainly used in scientific communities for elucidating mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic-based models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model and example-applications to environmental remediation problems. Theoretical bases are sufficiently described. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first one involved the application of a 56-species uranium tailing problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model. Simulations were made to demonstrate the potential mobilization of uranium and other chelating agents in the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration. It showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially-mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment. The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models

  1. Innovative mathematical modeling in environmental remediation

    International Nuclear Information System (INIS)

    Yeh, Gour T.; Gwo, Jin Ping; Siegel, Malcolm D.; Li, Ming-Hsu; Fang, Yilin; Zhang, Fan; Luo, Wensui; Yabusaki, Steven B.

    2013-01-01

    There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been mainly used in scientific communities for elucidating mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic-based models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model and example-applications to environmental remediation problems. Theoretical bases are sufficiently described. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first one involved the application of a 56-species uranium tailing problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model. Simulations were made to demonstrate the potential mobilization of uranium and other chelating agents in the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration. It showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially-mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment. The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models
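
    For readers unfamiliar with the Kd simplification criticized in both versions of this abstract, the textbook linear-equilibrium sorption form it refers to is written out below in LaTeX; this is standard background, not an equation taken from the reviewed model.

    % Linear-equilibrium (K_d) sorption folds into a single retardation factor R
    % in the 1-D advection-dispersion equation (textbook form):
    \begin{align}
      R &= 1 + \frac{\rho_b K_d}{\theta}, \\
      R\,\frac{\partial C}{\partial t} &= D\,\frac{\partial^2 C}{\partial x^2}
          - v\,\frac{\partial C}{\partial x} - \lambda R C,
    \end{align}
    % where \rho_b is the bulk density, \theta the volumetric water content,
    % C the aqueous concentration, D the dispersion coefficient, v the pore
    % velocity and \lambda a first-order decay constant.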

  2. Predicting interactions from mechanistic information: Can omic data validate theories?

    International Nuclear Information System (INIS)

    Borgert, Christopher J.

    2007-01-01

    To address the most pressing and relevant issues for improving mixture risk assessment, researchers must first recognize that risk assessment is driven by both regulatory requirements and scientific research, and that regulatory concerns may expand beyond the purely scientific interests of researchers. Concepts of 'mode of action' and 'mechanism of action' are used in particular ways within the regulatory arena, depending on the specific assessment goals. The data requirements for delineating a mode of action and predicting interactive toxicity in mixtures are not well defined from a scientific standpoint due largely to inherent difficulties in testing certain underlying assumptions. Understanding the regulatory perspective on mechanistic concepts will be important for designing experiments that can be interpreted clearly and applied in risk assessments without undue reliance on extrapolation and assumption. In like fashion, regulators and risk assessors can be better equipped to apply mechanistic data if the concepts underlying mechanistic research and the limitations that must be placed on interpretation of mechanistic data are understood. This will be critically important for applying new technologies to risk assessment, such as functional genomics, proteomics, and metabolomics. It will be essential not only for risk assessors to become conversant with the language and concepts of mechanistic research, including new omic technologies, but also, for researchers to become more intimately familiar with the challenges and needs of risk assessment

  3. Explanation and inference: Mechanistic and functional explanations guide property generalization

    Directory of Open Access Journals (Sweden)

    Tania eLombrozo

    2014-09-01

    Full Text Available The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  4. Explanation and inference: mechanistic and functional explanations guide property generalization.

    Science.gov (United States)

    Lombrozo, Tania; Gwynne, Nicholas Z

    2014-01-01

    The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  5. CFD simulation of hydrogen mixing and mitigation by means of passive auto-catalytic recombiners

    International Nuclear Information System (INIS)

    Kelm, S.; Reinecke, E-A.; Jahn, W.; Allelein, H-J.

    2011-01-01

    Modeling of passive auto-catalytic recombiner (PAR) operation in containment geometries involves a large variety of scales; thus, a CFD calculation resolving all these scales would be much too expensive. Therefore, the mechanistic PAR model REKO-DIREKT, developed at Forschungszentrum Juelich, has been coupled with the commercial CFD code ANSYS CFX in order to simulate PAR operation as well as the induced flow and transport phenomena. After a short introduction of REKO-DIREKT, its interface to CFX and the explicit coupling scheme are discussed. The paper closes with a first demonstration of the simulation capabilities on the basis of the ThAI PAR-4 experiment (Becker Technologies GmbH, Eschborn, Germany). (author)

  6. Models and simulations

    International Nuclear Information System (INIS)

    Lee, M.J.; Sheppard, J.C.; Sullenberger, M.; Woodley, M.D.

    1983-09-01

    On-line mathematical models have been used successfully for computer controlled operation of SPEAR and PEP. The same model control concept is being implemented for the operation of the LINAC and for the Damping Ring, which will be part of the Stanford Linear Collider (SLC). The purpose of this paper is to describe the general relationships between models, simulations and the control system for any machine at SLAC. The work we have done on the development of the empirical model for the Damping Ring will be presented as an example

  7. The physicochemical process of bacterial attachment to abiotic surfaces: Challenges for mechanistic studies, predictability and the development of control strategies.

    Science.gov (United States)

    Wang, Yi; Lee, Sui Mae; Dykes, Gary

    2015-01-01

    Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.

  8. Simulation model of a PWR power plant

    International Nuclear Information System (INIS)

    Larsen, N.

    1987-03-01

    A simulation model of a hypothetical PWR power plant is described. A large number of disturbances and failures in plant function can be simulated. The model is written as seven modules to the modular simulation system for continuous processes DYSIM and serves also as a user example of this system. The model runs in Fortran 77 on the IBM-PC-AT. (author)

  9. Mechanistic simulation of normal-tissue damage in radiotherapy-implications for dose-volume analyses

    International Nuclear Information System (INIS)

    Rutkowska, Eva; Baker, Colin; Nahum, Alan

    2010-01-01

    A radiobiologically based 3D model of normal tissue has been developed in which complications are generated when 'irradiated'. The aim is to provide insight into the connection between dose-distribution characteristics, different organ architectures and complication rates beyond that obtainable with simple DVH-based analytical NTCP models. In this model the organ consists of a large number of functional subunits (FSUs), populated by stem cells which are killed according to the LQ model. A complication is triggered if the density of FSUs in any 'critical functioning volume' (CFV) falls below some threshold. The (fractional) CFV determines the organ architecture and can be varied continuously from small (series-like behaviour) to large (parallel-like). A key feature of the model is its ability to account for the spatial dependence of dose distributions. Simulations were carried out to investigate correlations between dose-volume parameters and the incidence of 'complications' using different pseudo-clinical dose distributions. Correlations between dose-volume parameters and outcome depended on characteristics of the dose distributions and on organ architecture. As anticipated, the mean dose and V20 correlated most strongly with outcome for a parallel organ, and the maximum dose for a serial organ. Interestingly, better correlation was obtained between the 3D computer model and the LKB model with dose distributions typical for serial organs than with those typical for parallel organs. This work links the results of dose-volume analyses to dataset characteristics typical for serial and parallel organs, and it may help investigators interpret the results from clinical studies.
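
    The organ model described above lends itself to a compact Monte Carlo sketch: kill stem cells with LQ survival, declare an FSU dead when all of its stem cells are killed, and score a complication when the density of surviving FSUs inside any critical functioning volume falls below a threshold. The Python sketch below follows that recipe in one dimension with invented parameter values; it is illustrative only and is not the authors' 3D code.

    import math
    import random

    def complication_probability(dose_map, n_trials=200, cells_per_fsu=100,
                                 alpha=0.15, beta=0.05, cfv_fraction=0.3,
                                 density_threshold=0.5, seed=0):
        """Toy FSU-based NTCP estimate.

        `dose_map` lists the dose (Gy) delivered to each FSU.  An FSU survives
        if at least one of its stem cells survives LQ cell kill; a complication
        is scored when the surviving-FSU fraction inside any contiguous CFV
        window drops below the threshold.  All values are illustrative.
        """
        rng = random.Random(seed)
        n_fsu = len(dose_map)
        cfv_size = max(1, int(cfv_fraction * n_fsu))
        complications = 0
        for _ in range(n_trials):
            alive = []
            for d in dose_map:
                p_cell = math.exp(-alpha * d - beta * d * d)   # LQ cell survival
                p_fsu_dead = (1.0 - p_cell) ** cells_per_fsu   # all stem cells killed
                alive.append(rng.random() > p_fsu_dead)
            # slide the CFV window and test the local FSU density
            for start in range(n_fsu - cfv_size + 1):
                window = alive[start:start + cfv_size]
                if sum(window) / cfv_size < density_threshold:
                    complications += 1
                    break
        return complications / n_trials

    if __name__ == "__main__":
        doses = [40.0] * 50 + [0.0] * 50      # half the organ irradiated (illustrative)
        print(f"toy NTCP: {complication_probability(doses):.2f}")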

  10. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  11. Modeling and Simulation for Safeguards

    International Nuclear Information System (INIS)

    Swinhoe, Martyn T.

    2012-01-01

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R and D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) Calculate amounts of material (plant modeling); (2) Calculate signatures of nuclear material etc. (source terms); and (3) Detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amount of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
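
    The MUF bookkeeping mentioned at the end is conventionally computed as (beginning inventory + additions) minus (removals + ending inventory), and the expected uncertainty follows from propagating the individual measurement errors. A minimal Python sketch, assuming independent measurements that share a common relative standard uncertainty (the 0.5% figure and the masses are illustrative):

    import math

    def muf_and_sigma(begin_inv, additions, removals, end_inv, rel_uncertainty=0.005):
        """Material Unaccounted For (MUF) with a simple uncertainty estimate.

        MUF = (beginning inventory + additions) - (removals + ending inventory).
        Every measured mass (kg) is treated as independent with the same
        relative standard uncertainty, so sigma_MUF follows by quadrature.
        """
        muf = (begin_inv + sum(additions)) - (sum(removals) + end_inv)
        terms = [begin_inv, end_inv] + list(additions) + list(removals)
        sigma = math.sqrt(sum((rel_uncertainty * m) ** 2 for m in terms))
        return muf, sigma

    if __name__ == "__main__":
        muf, sigma = muf_and_sigma(100.0, [250.0, 250.0], [495.0], 104.2)
        print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma:.2f} kg")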

  12. Development of the simulation system IMPACT for analysis of nuclear power plant severe accidents

    International Nuclear Information System (INIS)

    Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi

    1997-01-01

    The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system IMPACT for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of an initial one year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.

  13. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  14. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  15. Mechanistic modelling of Middle Eocene atmospheric carbon dioxide using fossil plant material

    Science.gov (United States)

    Grein, Michaela; Roth-Nebelsick, Anita; Wilde, Volker; Konrad, Wilfried; Utescher, Torsten

    2010-05-01

    Various proxies (such as pedogenic carbonates, boron isotopes or phytoplankton) and geochemical models were applied in order to reconstruct palaeoatmospheric carbon dioxide, partially providing conflicting results. Another promising proxy is the frequency of stomata (pores on the leaf surface used for gaseous exchange). In this project, fossil plant material from the Messel Pit (Hesse, Germany) is used to reconstruct atmospheric carbon dioxide concentration in the Middle Eocene by analyzing stomatal density. We applied the novel mechanistic-theoretical approach of Konrad et al. (2008) which provides a quantitative derivation of the stomatal density response (number of stomata per leaf area) to varying atmospheric carbon dioxide concentration. The model couples 1) C3-photosynthesis, 2) the process of diffusion and 3) an optimisation principle providing maximum photosynthesis (via carbon dioxide uptake) and minimum water loss (via stomatal transpiration). These three sub-models also include data of the palaeoenvironment (temperature, water availability, wind velocity, atmospheric humidity, precipitation) and anatomy of leaf and stoma (depth, length and width of stomatal porus, thickness of assimilation tissue, leaf length). In order to calculate curves of stomatal density as a function of atmospheric carbon dioxide concentration, various biochemical parameters have to be borrowed from extant representatives. The necessary palaeoclimate data are reconstructed from the whole Messel flora using Leaf Margin Analysis (LMA) and the Coexistence Approach (CA). In order to obtain a significant result, we selected three species from which a large number of well-preserved leaves is available (at least 20 leaves per species). Palaeoclimate calculations for the Middle Eocene Messel Pit indicate a warm and humid climate with mean annual temperature of approximately 22°C, up to 2540 mm mean annual precipitation and the absence of extended periods of drought. Mean relative air

  16. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  17. Simulating the Epidemiological and Economic Impact of Paratuberculosis Control Actions in Dairy Cattle

    DEFF Research Database (Denmark)

    Kirkeby, Carsten Thure; Græsbøll, Kaare; Nielsen, Søren Saxmose

    2016-01-01

    We describe a new mechanistic bioeconomic model for simulating the spread of Mycobacterium avium subsp. paratuberculosis (MAP) within a dairy cattle herd. The model includes age-dependent susceptibility for infection; age-dependent sensitivity for detection; environmental MAP build up in five...... control actions from the Danish MAP control program, it was not economically attractive since the expenses for the control actions outweigh the benefits. Furthermore, the three most popular control actions against the spread of MAP on the farm were found to be costly and inefficient in lowering...

  18. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    Science.gov (United States)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interaction Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  19. Numerical simulation of Higgs models

    International Nuclear Information System (INIS)

    Jaster, A.

    1995-10-01

    The SU(2) Higgs and the Schwinger model on the lattice were analysed. Numerical simulations of the SU(2) Higgs model were performed to study the finite temperature electroweak phase transition. With the help of the multicanonical method the distribution of an order parameter at the phase transition point was measured. This was used to obtain the order of the phase transition and the value of the interface tension with the histogram method. Numerical simulations were also performed at zero temperature to perform renormalization. The measured values for the Wilson loops were used to determine the static potential and from this the renormalized gauge coupling. The Schwinger model was simulated at different gauge couplings to analyse the properties of the Kaplan-Shamir fermions. The prediction that the mass parameter gets only multiplicative renormalization was tested and verified. (orig.)

  20. Mechanistic formulation of a lineal-quadratic-linear (LQL) model: Split-dose experiments and exponentially decaying sources

    International Nuclear Information System (INIS)

    Guerrero, Mariana; Carlone, Marco

    2010-01-01

    Purpose: In recent years, several models were proposed that modify the standard linear-quadratic (LQ) model to make the predicted survival curve linear at high doses. Most of these models are purely phenomenological and can only be applied in the particular case of acute doses per fraction. The authors consider a mechanistic formulation of a linear-quadratic-linear (LQL) model in the case of split-dose experiments and exponentially decaying sources. This model provides a comprehensive description of radiation response for arbitrary dose rate and fractionation with only one additional parameter. Methods: The authors use a compartmental formulation of the LQL model from the literature. They analytically solve the model's differential equations for the case of a split-dose experiment and for an exponentially decaying source. They compare the solutions of the survival fraction with the standard LQ equations and with the lethal-potentially lethal (LPL) model. Results: In the case of the split-dose experiment, the LQL model predicts a recovery ratio as a function of dose per fraction that deviates from the square law of the standard LQ. The survival fraction as a function of time between fractions follows a similar exponential law as the LQ but adds a multiplicative factor to the LQ parameter β. The LQL solution for the split-dose experiment is very close to the LPL prediction. For the decaying source, the differences between the LQL and the LQ solutions are negligible when the half-life of the source is much larger than the characteristic repair time, which is the clinically relevant case. Conclusions: The compartmental formulation of the LQL model can be used for arbitrary dose rates and provides a comprehensive description of dose response. When the survival fraction for acute doses is linear for high dose, a deviation of the square law formula of the recovery ratio for split doses is also predicted.
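
    As background for the "square law" that the LQL recovery ratio is said to deviate from, the standard LQ split-dose expression with mono-exponential sublethal-damage repair is written out below in LaTeX; it is textbook material, not the LQL solution derived in the paper.

    % Two equal fractions d separated by an interval \Delta t, with sublethal
    % damage repaired at rate \mu (standard incomplete-repair LQ form):
    \begin{equation}
      -\ln S(2d,\Delta t) \;=\; 2\alpha d + 2\beta d^{2} + 2\beta d^{2}\,e^{-\mu \Delta t}.
    \end{equation}
    % A zero interval recovers the acute value \alpha(2d)+\beta(2d)^{2}, while a
    % long interval leaves 2\alpha d + 2\beta d^{2}; the log recovery ratio thus
    % grows as 2\beta d^{2}, the square law referred to in the abstract.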

  1. Model improvements to simulate charging in SEM

    Science.gov (United States)

    Arat, K. T.; Klimpel, T.; Hagen, C. W.

    2018-03-01

    Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte-Carlo simulator to more accurately simulate samples that charge. The improvements include both modelling of low energy electron scattering and charging of insulators. The new first-principle scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements on charging models mainly focus on redistribution of the charge carriers in the material with an induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.

  2. Censored rainfall modelling for estimation of fine-scale extremes

    Science.gov (United States)

    Cross, David; Onof, Christian; Winter, Hugo; Bernardara, Pietro

    2018-01-01

    Reliable estimation of rainfall extremes is essential for drainage system design, flood mitigation, and risk quantification. However, traditional techniques lack physical realism and extrapolation can be highly uncertain. In this study, we improve the physical basis for short-duration extreme rainfall estimation by simulating the heavy portion of the rainfall record mechanistically using the Bartlett-Lewis rectangular pulse (BLRP) model. Mechanistic rainfall models have had a tendency to underestimate rainfall extremes at fine temporal scales. Despite this, the simple process representation of rectangular pulse models is appealing in the context of extreme rainfall estimation because it emulates the known phenomenology of rainfall generation. A censored approach to Bartlett-Lewis model calibration is proposed and performed for single-site rainfall from two gauges in the UK and Germany. Extreme rainfall estimation is performed for each gauge at the 5, 15, and 60 min resolutions, and considerations for censor selection discussed.
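
    The rectangular-pulse phenomenology referred to above can be illustrated with a stripped-down Poisson-cluster generator: storms arrive as a Poisson process, each storm spawns a few rain cells, and every cell deposits a constant intensity over an exponentially distributed duration. The Python sketch below is a didactic simplification with invented parameters; it is not the censored BLRP calibration used in the study.

    import random

    def rectangular_pulse_rainfall(t_end_h=24.0 * 30, dt_h=0.25, storm_rate=0.02,
                                   max_cells=4, cell_spread_h=2.0,
                                   mean_duration_h=0.5, mean_intensity=4.0, seed=7):
        """Simplified Poisson-cluster rectangular-pulse rainfall generator.

        Storm origins arrive as a Poisson process (rate per hour); each storm
        spawns 1..max_cells rain cells near its origin; each cell delivers a
        constant intensity (mm/h) over an exponentially distributed duration.
        Returns rainfall depth (mm) per 15-minute step.  Parameters invented.
        """
        rng = random.Random(seed)
        n_steps = int(t_end_h / dt_h)
        rain = [0.0] * n_steps

        t = rng.expovariate(storm_rate)
        while t < t_end_h:
            for _ in range(1 + rng.randrange(max_cells)):
                start = t + rng.expovariate(1.0 / cell_spread_h)
                duration = rng.expovariate(1.0 / mean_duration_h)
                intensity = rng.expovariate(1.0 / mean_intensity)
                i0, i1 = int(start / dt_h), int((start + duration) / dt_h)
                for i in range(i0, min(i1 + 1, n_steps)):
                    rain[i] += intensity * dt_h     # constant-intensity pulse
            t += rng.expovariate(storm_rate)
        return rain

    if __name__ == "__main__":
        series = rectangular_pulse_rainfall()
        print(f"max 15-min depth {max(series):.1f} mm, monthly total {sum(series):.0f} mm")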

  3. WE-H-BRA-07: Mechanistic Modelling of the Relative Biological Effectiveness of Heavy Charged Particles

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, S [Massachusetts General Hospital, Boston, MA (United States); Queen’s University, Belfast, Belfast (United Kingdom); McNamara, A; Schuemann, J; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States); Prise, K [Queen’s University, Belfast, Belfast (United Kingdom)

    2016-06-15

    Purpose: Uncertainty in the Relative Biological Effectiveness (RBE) of heavy charged particles compared to photons remains one of the major uncertainties in particle therapy. As RBEs depend strongly on clinical variables such as tissue type, dose, and radiation quality, more accurate individualised models are needed to fully optimise treatments. Methods: We have developed a model of DNA damage and repair following X-ray irradiation in a number of settings, incorporating mechanistic descriptions of DNA repair pathways, geometric effects on DNA repair, cell cycle effects and cell death. Our model has previously been shown to accurately predict a range of biological endpoints including chromosome aberrations, mutations, and cell death. This model was combined with nanodosimetric models of individual ion tracks to calculate the additional probability of lethal damage forming within a single track. These lethal damage probabilities can be used to predict survival and RBE for cells irradiated with ions of different Linear Energy Transfer (LET). Results: By combining the X-ray response model with nanodosimetry information, predictions of RBE can be made without cell-line specific fitting. The model's RBE predictions were found to agree well with empirical proton RBE models (Mean absolute difference between models of 1.9% and 1.8% for cells with α/β ratios of 9 and 1.4, respectively, for LETs between 0 and 15 keV/µm). The model also accurately recovers the impact of high-LET carbon ion exposures, showing both the reduced efficacy of ions at extremely high LET and the impact of defects in non-homologous end joining on RBE values in Chinese Hamster Ovary cells. Conclusion: Our model predicts RBE without the inclusion of empirical LET fitting parameters for a range of experimental conditions. This approach has the potential to deliver improved personalisation of particle therapy, with future developments allowing for the calculation of individualised RBEs. SJM is
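
    For orientation, the RBE at a given effect level is the ratio of the reference (photon) dose to the ion dose that produces the same log cell kill; with both responses written in LQ form this reduces to solving a quadratic for the photon dose, as in the Python sketch below. The LQ parameter values in the demo are illustrative and are not outputs of the model described in the abstract.

    import math

    def rbe(alpha_ref, beta_ref, alpha_ion, beta_ion, dose_ion):
        """RBE of an ion dose relative to a reference (photon) beam.

        Both dose responses are taken as linear-quadratic; the RBE is the
        photon dose giving the same -ln(survival), divided by the ion dose.
        """
        effect = alpha_ion * dose_ion + beta_ion * dose_ion ** 2
        dose_ref = (-alpha_ref + math.sqrt(alpha_ref ** 2 + 4.0 * beta_ref * effect)) / (2.0 * beta_ref)
        return dose_ref / dose_ion

    if __name__ == "__main__":
        # photon alpha=0.15, beta=0.05 (alpha/beta = 3 Gy); ion alpha raised ~40%
        print(f"RBE at 2 Gy ion dose: {rbe(0.15, 0.05, 0.21, 0.05, 2.0):.2f}")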

  4. Mechanistic modeling of sulfur-deprived photosynthesis and hydrogen production in suspensions of Chlamydomonas reinhardtii.

    Science.gov (United States)

    Williams, C R; Bees, M A

    2014-02-01

    The ability of unicellular green algal species such as Chlamydomonas reinhardtii to produce hydrogen gas via iron-hydrogenase is well known. However, the oxygen-sensitive hydrogenase is closely linked to the photosynthetic chain in such a way that hydrogen and oxygen production need to be separated temporally for sustained photo-production. Under illumination, sulfur-deprivation has been shown to accommodate the production of hydrogen gas by partially-deactivating O2 evolution activity, leading to anaerobiosis in a sealed culture. As these facets are coupled, and the system complex, mathematical approaches potentially are of significant value since they may reveal improved or even optimal schemes for maximizing hydrogen production. Here, a mechanistic model of the system is constructed from consideration of the essential pathways and processes. The role of sulfur in photosynthesis (via PSII) and the storage and catabolism of endogenous substrate, and thus growth and decay of culture density, are explicitly modeled in order to describe and explore the complex interactions that lead to H2 production during sulfur-deprivation. As far as possible, functional forms and parameter values are determined or estimated from experimental data. The model is compared with published experimental studies and, encouragingly, qualitative agreement for trends in hydrogen yield and initiation time are found. It is then employed to probe optimal external sulfur and illumination conditions for hydrogen production, which are found to differ depending on whether a maximum yield of gas or initial production rate is required. The model constitutes a powerful theoretical tool for investigating novel sulfur cycling regimes that may ultimately be used to improve the commercial viability of hydrogen gas production from microorganisms. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.

  5. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  6. Comment on Chiesi et al. (2011): “Use of BIOME-BGC to simulate Mediterranean forest carbon stocks”

    OpenAIRE

    Eastaugh CS

    2011-01-01

    The mechanistic forest growth model BIOME-BGC utilizes a “spin-up” procedure to estimate site parameters for forests in a steady-state condition, as they may have been expected to be prior to anthropogenic influence. Forests in this condition have no net growth, as living biomass accumulation is balanced by mortality. To simulate current ecosystems it is necessary to reset the model to reflect a forest of the correct development stage. The alternative approach of simply post-adjus...

  7. Modeling salmonella Dublin into the dairy herd simulation model Simherd

    DEFF Research Database (Denmark)

    Kudahl, Anne Braad

    2010-01-01

    Infection with Salmonella Dublin in the dairy herd and effects of the infection and relevant control measures are currently being modeled into the dairy herd simulation model called Simherd. The aim is to compare the effects of different control strategies against Salmonella Dublin on both within-herd prevalence and economy by simulations. The project is a part of a larger national project "Salmonella 2007 - 2011" with the main objective to reduce the prevalence of Salmonella Dublin in Danish dairy herds. Results of the simulations will therefore be used for decision support in the national surveillance and eradication program against Salmonella Dublin. Basic structures of the model are programmed and will be presented at the workshop. The model is in a phase of face-validation by a group of Salmonella ...

  8. Application of a mechanistic model as a tool for on-line monitoring of pilot scale filamentous fungal fermentation processes-The importance of evaporation effects.

    Science.gov (United States)

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-03-01

    A mechanistic model-based soft sensor is developed and validated for 550L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor comprises a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, which is a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550L) at Novozymes A/S. The model is then implemented on-line in 550L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data. A parameter estimation uncertainty analysis is also carried out. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters which cannot be monitored on-line. With successful application of a soft sensor at this scale, this allows for improved process monitoring, as well as opening up further possibilities for on-line control algorithms utilizing these on-line model outputs.

  9. Leaf chlorophyll constraint on model simulated gross primary productivity in agricultural systems

    KAUST Repository

    Houborg, Rasmus

    2015-05-05

    Leaf chlorophyll content (Chll) may serve as an observational proxy for the maximum rate of carboxylation (Vmax), which describes leaf photosynthetic capacity and represents the single most important control on modeled leaf photosynthesis within most Terrestrial Biosphere Models (TBMs). The parameterization of Vmax is associated with great uncertainty as it can vary significantly between plants and in response to changes in leaf nitrogen (N) availability, plant phenology and environmental conditions. Houborg et al. (2013) outlined a semi-mechanistic relationship between Vmax25 (Vmax normalized to 25 °C) and Chll based on inter-linkages between Vmax25, Rubisco enzyme kinetics, N and Chll. Here, these relationships are parameterized for a wider range of important agricultural crops and embedded within the leaf photosynthesis-conductance scheme of the Community Land Model (CLM), bypassing the questionable use of temporally invariant and broadly defined plant functional type (PFT) specific Vmax25 values. In this study, the new Chll constrained version of CLM is refined with an updated parameterization scheme for specific application to soybean and maize. The benefit of using in-situ measured and satellite retrieved Chll for constraining model simulations of Gross Primary Productivity (GPP) is evaluated over fields in central Nebraska, U.S.A between 2001 and 2005. Landsat-based Chll time-series records derived from the Regularized Canopy Reflectance model (REGFLEC) are used as forcing to the CLM. Validation of simulated GPP against 15 site-years of flux tower observations demonstrates the utility of Chll as a model constraint, with the coefficient of efficiency increasing from 0.91 to 0.94 and from 0.87 to 0.91 for maize and soybean, respectively. Model performances particularly improve during the late reproductive and senescence stage, where the largest temporal variations in Chll (averaging 35–55 μg cm−2 for maize and 20–35 μg cm−2 for soybean) are
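
    As a minimal sketch of how a chlorophyll constraint can replace a fixed PFT value, the snippet below maps leaf chlorophyll to Vmax25 through an assumed linear relationship and applies a standard Arrhenius temperature adjustment. Both the slope/intercept and the activation energy are hypothetical placeholders, not the semi-mechanistic parameterization of Houborg et al. or the CLM code.

```python
import math

def vmax25_from_chlorophyll(chl_ug_cm2, slope=1.2, intercept=10.0):
    """Hypothetical linear mapping from leaf chlorophyll (ug cm-2) to Vmax25
    (umol m-2 s-1); the published relationship is semi-mechanistic and crop-specific."""
    return slope * chl_ug_cm2 + intercept

def vmax_at_leaf_temperature(vmax25, t_leaf_c, activation_energy=65330.0):
    """Arrhenius-type temperature scaling of Vmax from 25 degC to t_leaf_c."""
    r_gas = 8.314  # J mol-1 K-1
    t_leaf_k, t_ref_k = t_leaf_c + 273.15, 298.15
    return vmax25 * math.exp(activation_energy * (t_leaf_k - t_ref_k)
                             / (r_gas * t_ref_k * t_leaf_k))

# Example: a maize leaf with Chll = 45 ug cm-2 at a leaf temperature of 30 degC.
v25 = vmax25_from_chlorophyll(45.0)
print(f"Vmax25 = {v25:.1f}, Vmax(30C) = {vmax_at_leaf_temperature(v25, 30.0):.1f}")
```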

  10. A metabonomic approach for mechanistic exploration of pre-clinical toxicology.

    Science.gov (United States)

    Coen, Muireann

    2010-12-30

    Metabonomics involves the application of advanced analytical tools to profile the diverse metabolic complement of a given biofluid or tissue. Subsequent statistical modelling of the complex multivariate spectral profiles enables discrimination between phenotypes of interest and identifies panels of discriminatory metabolites that represent candidate biomarkers. This review article presents an overview of recent developments in the field of metabonomics with a focus on application to pre-clinical toxicology studies. Recent research investigations carried out as part of the international COMET 2 consortium project on the hepatotoxic action of the aminosugar, galactosamine (galN) are presented. The application of advanced, high-field NMR spectroscopy is demonstrated, together with complementary application of a targeted mass spectrometry platform coupled with ultra-performance liquid chromatography. Much novel mechanistic information has been gleaned on both the mechanism of galN hepatotoxicity in multiple biofluids and tissues, and on the protection afforded by co-administration of glycine and uridine. The simultaneous identification of both the metabolic fate of galN and its associated endogenous consequences in spectral profiles is demonstrated. Furthermore, metabonomic assessment of inter-animal variability in response to galN presents enhanced mechanistic insight on variable response phenotypes and is relevant to understanding wider aspects of individual variability in drug response. This exemplar highlights the analytical and statistical tools commonly applied in metabonomic studies and notably, the approach is applicable to the study of any toxin/drug or intervention of interest. The metabonomic approach holds considerable promise and potential to significantly advance our understanding of the mechanistic bases for adverse drug reactions. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  11. Progress toward bridging from atomistic to continuum modeling to predict nuclear waste glass dissolution.

    Energy Technology Data Exchange (ETDEWEB)

    Zapol, Peter (Argonne National Laboratory, Argonne, IL); Bourg, Ian (Lawrence Berkeley National Laboratories, Berkeley, CA); Criscenti, Louise Jacqueline; Steefel, Carl I. (Lawrence Berkeley National Laboratories, Berkeley, CA); Schultz, Peter Andrew

    2011-10-01

    This report summarizes research performed for the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Subcontinuum and Upscaling Task. The work conducted focused on developing a roadmap to include molecular scale, mechanistic information in continuum-scale models of nuclear waste glass dissolution. This information is derived from molecular-scale modeling efforts that are validated through comparison with experimental data. In addition to developing a master plan to incorporate a subcontinuum mechanistic understanding of glass dissolution into continuum models, methods were developed to generate constitutive dissolution rate expressions from quantum calculations, force field models were selected to generate multicomponent glass structures and gel layers, classical molecular modeling was used to study diffusion through nanopores analogous to those in the interfacial gel layer, and a micro-continuum model (KµC) was developed to study coupled diffusion and reaction at the glass-gel-solution interface.

  12. [Mechanistic modelling allows to assess pathways of DNA lesion interactions underlying chromosome aberration formation].

    Science.gov (United States)

    Eĭdel'man, Iu A; Slanina, S V; Sal'nikov, I V; Andreev, S G

    2012-12-01

    The knowledge of radiation-induced chromosomal aberration (CA) mechanisms is required in many fields of radiation genetics, radiation biology, biodosimetry, etc. However, these mechanisms are yet to be quantitatively characterised. One of the reasons is that the relationships between primary lesions of DNA/chromatin/chromosomes and dose-response curves for CA are unknown because the pathways of lesion interactions in an interphase nucleus are currently inaccessible for direct experimental observation. This article aims at a comparative analysis of two principally different scenarios of formation of simple and complex interchromosomal exchange aberrations: by lesion interactions at chromosome territories' surface vs. in the whole space of the nucleus. The analysis was based on quantitative mechanistic modelling of different levels of structures and processes involved in CA formation: chromosome structure in an interphase nucleus, induction, repair and interactions of DNA lesions. It was shown that the restricted diffusion of chromosomal loci, predicted by computational modelling of chromosome organization, makes lesion interactions in the whole space of the nucleus impossible. At the same time, the predicted features of subchromosomal dynamics agree well with in vivo observations and do not contradict the mechanism of CA formation at the surface of chromosome territories. On the other hand, the "surface mechanism" of CA formation, despite having certain qualities, proved to be insufficient to explain the high frequency of complex exchange aberrations observed by the mFISH technique. The alternative mechanism, CA formation on nuclear centres, is expected to be sufficient to explain frequent complex exchanges.

  13. Managing mechanistic and organic structure in health care organizations.

    Science.gov (United States)

    Olden, Peter C

    2012-01-01

    Managers at all levels in a health care organization must organize work to achieve the organization's mission and goals. This requires managers to decide the organization structure, which involves dividing the work among jobs and departments and then coordinating them all toward the common purpose. Organization structure, which is reflected in an organization chart, may range on a continuum from very mechanistic to very organic. Managers must decide how mechanistic versus how organic to make the entire organization and each of its departments. To do this, managers should carefully consider 5 factors for the organization and for each individual department: external environment, goals, work production, size, and culture. Some factors may push toward more mechanistic structure, whereas others may push in the opposite direction toward more organic structure. Practical advice can help managers at all levels design appropriate structure for their departments and organization.

  14. Asymmetric Responses of Primary Productivity to Altered Precipitation Simulated by Land Surface Models across Three Long-term Grassland Sites

    Science.gov (United States)

    Wu, D.; Ciais, P.; Viovy, N.; Knapp, A.; Wilcox, K.; Bahn, M.; Smith, M. D.; Ito, A.; Arneth, A.; Harper, A. B.; Ukkola, A.; Paschalis, A.; Poulter, B.; Peng, C.; Reick, C. H.; Hayes, D. J.; Ricciuto, D. M.; Reinthaler, D.; Chen, G.; Tian, H.; Helene, G.; Zscheischler, J.; Mao, J.; Ingrisch, J.; Nabel, J.; Pongratz, J.; Boysen, L.; Kautz, M.; Schmitt, M.; Krohn, M.; Zeng, N.; Meir, P.; Zhang, Q.; Zhu, Q.; Hasibeder, R.; Vicca, S.; Sippel, S.; Dangal, S. R. S.; Fatichi, S.; Sitch, S.; Shi, X.; Wang, Y.; Luo, Y.; Liu, Y.; Piao, S.

    2017-12-01

    Changes in precipitation variability including the occurrence of extreme events strongly influence plant growth in grasslands. Field measurements of aboveground net primary production (ANPP) in temperate grasslands suggest a positive asymmetric response with wet years resulting in ANPP gains larger than ANPP declines in dry years. Whether land surface models used for historical simulations and future projections of the coupled carbon-water system in grasslands are capable of simulating such non-symmetrical ANPP responses remains an important open research question. In this study, we evaluate the simulated responses of grassland primary productivity to altered precipitation with fourteen land surface models at the three sites of Colorado Shortgrass Steppe (SGS), Konza prairie (KNZ) and Stubai Valley meadow (STU) along a rainfall gradient from dry to wet. Our results suggest that: (i) Gross primary production (GPP), NPP, ANPP and belowground NPP (BNPP) show nonlinear response curves (concave-down) in all the models, but with different curvatures and mean values. In contrast, across the sites primary production increases and then saturates with increasing precipitation, flattening at the wetter site. (ii) Slopes of spatial relationships between modeled primary production and precipitation are steeper than the temporal slopes (obtained from inter-annual variations). (iii) Asymmetric responses under the nominal precipitation range in modeled inter-annual primary production show large uncertainties, and the model-ensemble median generally suggests negative asymmetry (greater declines in dry years than increases in wet years) across the three sites. (iv) Primary production at the drier site is predicted to be more sensitive to precipitation than at the wetter site, and median sensitivity consistently indicates greater negative impacts of reduced precipitation than positive effects of increased precipitation under extreme conditions. This study implies that most models

  15. Leatherbacks swimming in silico: modeling and verifying their momentum and heat balance using computational fluid dynamics.

    Science.gov (United States)

    Dudley, Peter N; Bonazza, Riccardo; Jones, T Todd; Wyneken, Jeanette; Porter, Warren P

    2014-01-01

    As global temperatures increase throughout the coming decades, species ranges will shift. New combinations of abiotic conditions will make predicting these range shifts difficult. Biophysical mechanistic niche modeling places bounds on an animal's niche through analyzing the animal's physical interactions with the environment. Biophysical mechanistic niche modeling is flexible enough to accommodate these new combinations of abiotic conditions. However, this approach is difficult to implement for aquatic species because of complex interactions among thrust, metabolic rate and heat transfer. We use contemporary computational fluid dynamic techniques to overcome these difficulties. We model the complex 3D motion of a swimming neonate and juvenile leatherback sea turtle to find power and heat transfer rates during the stroke. We combine the results from these simulations and a numerical model to accurately predict the core temperature of a swimming leatherback. These results are the first steps in developing a highly accurate mechanistic niche model, which can assist paleontologists in understanding biogeographic shifts as well as aid contemporary species managers in anticipating potential range shifts over the coming decades.

  16. Leatherbacks swimming in silico: modeling and verifying their momentum and heat balance using computational fluid dynamics.

    Directory of Open Access Journals (Sweden)

    Peter N Dudley

    Full Text Available As global temperatures increase throughout the coming decades, species ranges will shift. New combinations of abiotic conditions will make predicting these range shifts difficult. Biophysical mechanistic niche modeling places bounds on an animal's niche through analyzing the animal's physical interactions with the environment. Biophysical mechanistic niche modeling is flexible enough to accommodate these new combinations of abiotic conditions. However, this approach is difficult to implement for aquatic species because of complex interactions among thrust, metabolic rate and heat transfer. We use contemporary computational fluid dynamic techniques to overcome these difficulties. We model the complex 3D motion of a swimming neonate and juvenile leatherback sea turtle to find power and heat transfer rates during the stroke. We combine the results from these simulations and a numerical model to accurately predict the core temperature of a swimming leatherback. These results are the first steps in developing a highly accurate mechanistic niche model, which can assist paleontologists in understanding biogeographic shifts as well as aid contemporary species managers in anticipating potential range shifts over the coming decades.

  17. Cognitive science as an interface between rational and mechanistic explanation.

    Science.gov (United States)

    Chater, Nick

    2014-04-01

    Cognitive science views thought as computation; and computation, by its very nature, can be understood in both rational and mechanistic terms. In rational terms, a computation solves some information processing problem (e.g., mapping sensory information into a description of the external world; parsing a sentence; selecting among a set of possible actions). In mechanistic terms, a computation corresponds to a causal chain of events in a physical device (in an engineering context, a silicon chip; in a biological context, the nervous system). The discipline is thus at the interface between two very different styles of explanation--as the papers in the current special issue well illustrate, it explores the interplay of rational and mechanistic forces. Copyright © 2014 Cognitive Science Society, Inc.

  18. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  19. ReactionPredictor: prediction of complex chemical reactions at the mechanistic level using machine learning.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-10-22

    Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields and verified by inclusion in the literature or textbooks is derived from an existing rule-based system and expanded upon with manual curation from graduate level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor using constrained tree-search to discover a set of reasonable mechanistic steps from given reactants to given products. Webserver implementations of both the single step and pathway versions of Reaction

  20. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  1. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Normally, most shipbuilding processes consist of job shop production, and the modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulties adopting simulation systems, regardless of the necessity for the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of the panel block production line. By implementing the initial simulation model generation process, which was performed in the past with a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes the standardization of the simulation model quality possible.

  2. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of posterior experiments. Three steps can be well differentiated: Sensitivity analysis. It can be made with a DSA, differential sensitivity analysis, and with a MCSA, Monte-Carlo sensitivity analysis. Finding the optimal domains of the input parameters. A procedure based on the Monte-Carlo methods and Cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis. This analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, Esp., is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs

  3. Modeling phosphorus capture by plants growing in a multi-species riparian buffer

    Science.gov (United States)

    The NST 3.0 mechanistic nutrient uptake model was used to explore phosphorus (P) uptake to a depth of 120 cm over a 126-d growing season in simulated buffer communities composed of mixtures of cottonwood (Populus deltoids Bartr.), switchgrass (Panicum virgatum L.), and smooth brome (Bromis inermis L...

  4. The development of an industrial-scale fed-batch fermentation simulation.

    Science.gov (United States)

    Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry

    2015-01-10

    This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented on this facility. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
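
    The benchmark simulator described above is built on a full structured model extended with environmental effects. As a much smaller, hypothetical illustration of the fed-batch mechanism itself (Monod growth, growth- and non-growth-associated product formation, dilution by feed), one could write something like the following; none of the constants come from the published model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical kinetic constants; the published structured model is far richer.
MU_MAX, K_S, Y_XS = 0.09, 0.15, 0.45     # 1/h, g/L, g biomass per g substrate
ALPHA, BETA = 0.005, 0.001               # Luedeking-Piret product formation terms
S_FEED, FEED_RATE = 400.0, 50.0          # g/L substrate in feed, L/h

def fed_batch(t, y):
    x, s, p, v = y                        # biomass, substrate, product (g/L), volume (L)
    mu = MU_MAX * s / (K_S + s)           # Monod specific growth rate
    dilution = FEED_RATE / v
    dx = mu * x - dilution * x
    ds = -mu * x / Y_XS + dilution * (S_FEED - s)
    dp = ALPHA * mu * x + BETA * x - dilution * p
    dv = FEED_RATE
    return [dx, ds, dp, dv]

sol = solve_ivp(fed_batch, (0.0, 200.0), [1.0, 10.0, 0.0, 60000.0], max_step=1.0)
print(f"final product titre: {sol.y[2, -1]:.2f} g/L in {sol.y[3, -1]:.0f} L")
```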

  5. A satellite simulator for TRMM PR applied to climate model simulations

    Science.gov (United States)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR in order to evaluate simulations performed with MPI-ESM (Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) performed within the MiKlip project (https://www.fona-miklip.de/, funded by Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, the application of the satellite simulator enables an evaluation in the instrument's parameter space thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for the application of satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions being chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. In order to constrain uncertainties a comprehensive comparison with sub-grid scale convective precipitation variability which is deduced from TRMM PR observations is carried out.

  6. High Fidelity In Situ Shoulder Dystocia Simulation

    Directory of Open Access Journals (Sweden)

    Andrew Pelikan, MD

    2018-04-01

    Full Text Available Audience: Resident physicians, emergency department (ED) staff Introduction: Precipitous deliveries are high-acuity, low-occurrence events in most emergency departments. Shoulder dystocia is a rare but potentially fatal complication of labor that can be relieved by specific maneuvers that must be implemented in a timely manner. This simulation is designed to educate resident learners on the critical management steps in a shoulder dystocia presenting to the emergency department. A special aspect of this simulation is the unique utilization of the “Noelle” model with an instructing physician at bedside maneuvering the fetus through the stations of labor and providing subtle adjustments to fetal positioning not possible through a mechanized model. A literature search of “shoulder dystocia simulation” returns primarily obstetrics and midwifery journals, many of which utilize various mannequin models. None of the reviewed articles utilized a bedside provider maneuvering the fetus with the Noelle model, making this method unique. While the Noelle model is equipped with a remote-controlled motor that automatically rotates and delivers the baby either to the head or to the shoulders, can produce a turtle sign, and will prevent delivery of the baby until signaled to do so by the instructor, using the bedside instructor method allows this simulation to be reproduced with less mechanistically advanced and lower cost models.1-5 Objectives: At the end of this simulation, learners will: (1) Recognize impending delivery and mobilize appropriate resources (ie, both obstetrics [OB] and NICU/pediatrics); (2) Identify risk factors for shoulder dystocia based on history and physical; (3) Recognize shoulder dystocia during delivery; (4) Demonstrate maneuvers to relieve shoulder dystocia; (5) Communicate with team members and nursing staff during resuscitation of a critically ill patient. Method: High-fidelity simulation. Topics: High fidelity, in situ, Noelle model

  7. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    International Nuclear Information System (INIS)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang; Byung Hwan, Bae

    2006-01-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of the process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirements analysis based on use cases and the validation results of the PCS model development. The use-case-based PCS simulation model will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. The use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  8. Mechanistic kinetic modeling generates system-independent P-glycoprotein mediated transport elementary rate constants for inhibition and, in combination with 3D SIM microscopy, elucidates the importance of microvilli morphology on P-glycoprotein mediated efflux activity.

    Science.gov (United States)

    Ellens, Harma; Meng, Zhou; Le Marchand, Sylvain J; Bentz, Joe

    2018-06-01

    In vitro transporter kinetics are typically analyzed by steady-state Michaelis-Menten approximations. However, no clear evidence exists that these approximations, applied to multiple transporters in biological membranes, yield system-independent mechanistic parameters needed for reliable in vivo hypothesis generation and testing. Areas covered: The classical mass action model has been developed for P-glycoprotein (P-gp) mediated transport across confluent polarized cell monolayers. Numerical integration of the mass action equations for transport using a stable global optimization program yields fitted elementary rate constants that are system-independent. The efflux active P-gp was defined by the rate at which P-gp delivers drugs to the apical chamber, since as much as 90% of drugs effluxed by P-gp partition back into nearby microvilli prior to reaching the apical chamber. The efflux active P-gp concentration was 10-fold smaller than the total expressed P-gp for Caco-2 cells, due to their microvilli membrane morphology. The mechanistic insights from this analysis are readily extrapolated to P-gp mediated transport in vivo. Expert opinion: In vitro system-independent elementary rate constants for transporters are essential for the generation and validation of robust mechanistic PBPK models. Our modeling approach and programs have broad application potential. They can be used for any drug transporter with minor adaptations.
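
    To illustrate the distinction the authors draw between steady-state Michaelis-Menten fitting and direct numerical integration of mass action equations, the sketch below integrates a generic single-carrier binding/translocation cycle and compares it with the corresponding Michaelis-Menten initial rate. This is a generic textbook cycle with hypothetical rate constants, not the authors' confluent-monolayer model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative elementary rate constants for a single carrier cycle
# (binding, unbinding, translocation); values are hypothetical.
K_ON, K_OFF, K_CAT = 1.0e6, 10.0, 5.0     # 1/(M*s), 1/s, 1/s
T_TOTAL = 1.0e-7                          # total transporter concentration, M

def mass_action(t, y):
    s_in, st, s_out = y                   # free substrate, bound carrier, effluxed substrate
    t_free = T_TOTAL - st
    bind = K_ON * s_in * t_free - K_OFF * st
    flux = K_CAT * st
    return [-bind, bind - flux, flux]

s0 = 1.0e-5                               # initial intracellular substrate, M
sol = solve_ivp(mass_action, (0.0, 600.0), [s0, 0.0, 0.0], max_step=0.5)

# Steady-state Michaelis-Menten approximation of the same cycle.
km = (K_OFF + K_CAT) / K_ON
v_mm = K_CAT * T_TOTAL * s0 / (km + s0)   # initial efflux rate, M/s
print(f"Km = {km:.2e} M, initial MM rate = {v_mm:.2e} M/s, "
      f"effluxed after 10 min = {sol.y[2, -1]:.2e} M")
```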

  9. Modeling process-structure-property relationships for additive manufacturing

    Science.gov (United States)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analyses, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  10. Plasma modelling and numerical simulation

    International Nuclear Information System (INIS)

    Van Dijk, J; Kroesen, G M W; Bogaerts, A

    2009-01-01

    Plasma modelling is an exciting subject in which virtually all physical disciplines are represented. Plasma models combine the electromagnetic, statistical and fluid dynamical theories that have their roots in the 19th century with the modern insights concerning the structure of matter that were developed throughout the 20th century. The present cluster issue consists of 20 invited contributions, which are representative of the state of the art in plasma modelling and numerical simulation. These contributions provide an in-depth discussion of the major theories and modelling and simulation strategies, and their applications to contemporary plasma-based technologies. In this editorial review, we introduce and complement those papers by providing a bird's eye perspective on plasma modelling and discussing the historical context in which it has surfaced. (editorial review)

  11. A Mechanistic Model of Intermittent Gastric Emptying and Glucose-Insulin Dynamics following a Meal Containing Milk Components.

    Directory of Open Access Journals (Sweden)

    Priska Stahel

    Full Text Available To support decision-making around diet selection choices to manage glycemia following a meal, a novel mechanistic model of intermittent gastric emptying and plasma glucose-insulin dynamics was developed. Model development was guided by postprandial timecourses of plasma glucose, insulin and the gastric emptying marker acetaminophen in infant calves fed meals of 2 or 4 L milk replacer. Assigning a fast, slow or zero first-order gastric emptying rate to each interval between plasma samples fit acetaminophen curves with prediction errors equal to 9% of the mean observed acetaminophen concentration. Those gastric emptying parameters were applied to glucose appearance in conjunction with minimal models of glucose disposal and insulin dynamics to describe postprandial glycemia and insulinemia. The final model contains 20 parameters, 8 of which can be obtained by direct measurement and 12 by fitting to observations. The minimal model of intestinal glucose delivery contains 2 gastric emptying parameters and a third parameter describing the time lag between emptying and appearance of glucose in plasma. Sensitivity analysis of the aggregate model revealed that gastric emptying rate influences area under the plasma insulin curve but has little effect on area under the plasma glucose curve. This result indicates that pancreatic responsiveness is influenced by gastric emptying rate as a consequence of the quasi-exponential relationship between plasma glucose concentration and pancreatic insulin release. The fitted aggregate model was able to reproduce the multiple postprandial rises and falls in plasma glucose concentration observed in calves consuming a normal-sized meal containing milk components.
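
    As a compact, hypothetical illustration of the mechanism described above (not the fitted 20-parameter calf model), the sketch below switches a first-order gastric emptying rate between fast, off and slow phases and feeds the emptied glucose into a simplified glucose-insulin feedback; every parameter value is invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# All parameter values are illustrative, not the fitted calf parameters.
EMPTYING_PHASES = [(0, 30, 0.05), (30, 90, 0.0), (90, 180, 0.02)]  # t_start, t_end, k (1/min)
S_G, S_I, K_INS, GAMMA = 0.02, 5.0e-4, 0.1, 2.0e-4
G_BASAL, I_BASAL, V_GLUCOSE = 4.5, 10.0, 0.18   # mmol/L, mU/L, L/kg

def emptying_rate(t):
    """Interval-wise first-order emptying rate constant."""
    for t0, t1, k in EMPTYING_PHASES:
        if t0 <= t < t1:
            return k
    return 0.0

def meal_model(t, y):
    stomach, glucose, insulin = y
    emptied = emptying_rate(t) * stomach            # glucose leaving the stomach, mmol/kg/min
    dstomach = -emptied
    dglucose = emptied / V_GLUCOSE - S_G * (glucose - G_BASAL) \
               - S_I * insulin * glucose
    dinsulin = GAMMA * max(glucose - G_BASAL, 0.0) * 1000.0 \
               - K_INS * (insulin - I_BASAL)
    return [dstomach, dglucose, dinsulin]

sol = solve_ivp(meal_model, (0.0, 300.0), [3.0, G_BASAL, I_BASAL], max_step=1.0)
print(f"peak glucose: {sol.y[1].max():.2f} mmol/L")
```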

  12. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  13. Validation of the simulator neutronics model

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1984-01-01

    The neutronics model in the SRP reactor training simulator computes the variation with time of the neutron population in the reactor core. The power output of a reactor is directly proportional to the neutron population, thus in a very real sense the neutronics model determines the response of the simulator. The geometrical complexity of the reactor control system in SRP reactors requires the neutronics model to provide a detailed, 3D representation of the reactor core. Existing simulator technology does not allow such a detailed representation to run in real-time in a minicomputer environment, thus an entirely different approach to the problem was required. A prompt jump method has been developed in answer to this need

  14. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model with a high system order is produced. Model reduction techniques are applied to these to obtain low order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure a valid simulation result.

  15. Modeling and Simulation of U-tube Steam Generator

    Science.gov (United States)

    Zhang, Mingming; Fu, Zhongguang; Li, Jinyao; Wang, Mingfei

    2018-03-01

    This article focuses mainly on the modeling and simulation of a U-tube natural circulation steam generator. The research is based on the simuworks system simulation software platform. By analyzing the structural characteristics and the operating principle of the U-tube steam generator, 14 control volumes are defined in the model, including the primary side, secondary side, down channel and steam plenum, etc. The model depends completely on conservation laws, and it is applied in several simulation tests. The results show that the model is capable of properly simulating the dynamic response of the U-tube steam generator.

  16. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
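
    The snippet below sketches the core of such a procedure under simplifying assumptions: a Karhunen-Loève (proper orthogonal decomposition) basis is estimated from an ensemble covariance matrix and a new realization is synthesized from Gaussian random coefficients scaled by the eigenvalues. The surrogate ensemble and the Gaussian-coefficient assumption are illustrative; the published method also applies spectral shaping and treats the three velocity components simultaneously.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "measured" ensemble: 200 records of a correlated random process.
n_time, n_records = 256, 200
records = np.array([np.convolve(rng.standard_normal(n_time),
                                np.exp(-np.arange(40) / 8.0), mode="same")
                    for _ in range(n_records)])

# Karhunen-Loeve (POD) basis from the ensemble covariance matrix.
mean = records.mean(axis=0)
cov = np.cov(records - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Synthesize a new realization from the leading modes with Gaussian coefficients.
n_modes = 20
coeffs = rng.standard_normal(n_modes) * np.sqrt(np.clip(eigvals[:n_modes], 0.0, None))
simulated = mean + eigvecs[:, :n_modes] @ coeffs
print(f"variance captured by {n_modes} modes: "
      f"{eigvals[:n_modes].sum() / eigvals.sum():.1%}")
```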

  17. System modeling and simulation at EBR-II

    International Nuclear Information System (INIS)

    Dean, E.M.; Lehto, W.K.; Larson, H.A.

    1986-01-01

    The codes being developed and verified using EBR-II data are the NATDEMO, DSNP and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the EBR-II core, including power, temperature, control-rod movement reactivity effects and flow and is used primarily to model reactivity induced power transients. The Dynamic Simulator for Nuclear Power Plants (DSNP) allows a whole plant, thermal-hydraulic simulation using specific component and system models called from libraries. It has been used to simulate flow coastdown transients, reactivity insertion events and balance-of-plant perturbations

  18. The development of a fully-integrated immune response model (FIRM) simulator of the immune response through integration of multiple subset models.

    Science.gov (United States)

    Palsson, Sirus; Hickling, Timothy P; Bradshaw-Pierce, Erica L; Zager, Michael; Jooss, Karin; O'Brien, Peter J; Spilker, Mary E; Palsson, Bernhard O; Vicini, Paolo

    2013-09-28

    The complexity and multiscale nature of the mammalian immune response provides an excellent test bed for the potential of mathematical modeling and simulation to facilitate mechanistic understanding. Historically, mathematical models of the immune response focused on subsets of the immune system and/or specific aspects of the response. Mathematical models have been developed for the humoral side of the immune response, or for the cellular side, or for cytokine kinetics, but rarely have they been proposed to encompass the overall system complexity. We propose here a framework for integration of subset models, based on a system biology approach. A dynamic simulator, the Fully-integrated Immune Response Model (FIRM), was built in a stepwise fashion by integrating published subset models and adding novel features. The approach used to build the model includes the formulation of the network of interacting species and the subsequent introduction of rate laws to describe each biological process. The resulting model represents a multi-organ structure, comprised of the target organ where the immune response takes place, circulating blood, lymphoid T, and lymphoid B tissue. The cell types accounted for include macrophages, a few T-cell lineages (cytotoxic, regulatory, helper 1, and helper 2), and B-cell activation to plasma cells. Four different cytokines were accounted for: IFN-γ, IL-4, IL-10 and IL-12. In addition, generic inflammatory signals are used to represent the kinetics of IL-1, IL-2, and TGF-β. Cell recruitment, differentiation, replication, apoptosis and migration are described as appropriate for the different cell types. The model is a hybrid structure containing information from several mammalian species. The structure of the network was built to be physiologically and biochemically consistent. Rate laws for all the cellular fate processes, growth factor production rates and half-lives, together with antibody production rates and half-lives, are provided. The

  19. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...

  20. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...
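
    Both wave-load records above simulate loads from a Gaussian model of the wave particle velocity. As a generic illustration (not the authors' simulation procedure), the sketch below generates a discrete Gauss-Markov velocity record and evaluates a Morison-type drag-plus-inertia load on a slender cylinder; all constants are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete-time Gauss-Markov (AR(1)) surrogate for the wave particle velocity.
dt, n_steps = 0.1, 6000                   # time step (s), number of samples
tau, sigma_u = 4.0, 1.2                   # correlation time (s) and std. dev. (m/s)
phi = np.exp(-dt / tau)
u = np.empty(n_steps)
u[0] = 0.0
for i in range(1, n_steps):
    u[i] = phi * u[i - 1] + sigma_u * np.sqrt(1.0 - phi**2) * rng.standard_normal()

# Morison-type load per unit length on a slender cylinder (illustrative constants).
rho, diameter, cd, cm = 1025.0, 1.0, 1.0, 2.0
dudt = np.gradient(u, dt)
load = (0.5 * rho * cd * diameter * u * np.abs(u)
        + rho * cm * np.pi * diameter**2 / 4.0 * dudt)

print(f"extreme load over the record: {np.abs(load).max():.0f} N/m")
```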

  1. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
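
    As a toy illustration of the Monte Carlo method the article refers to, the sketch below samples arrival and service times for a single-server clinic and estimates patient waiting times; the distributions and parameter values are arbitrary examples, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_clinic_day(n_patients=100, mean_interarrival=6.0, mean_service=5.0):
    """Single-server queue: exponential arrivals, lognormal service times (minutes)."""
    arrivals = np.cumsum(rng.exponential(mean_interarrival, n_patients))
    service = rng.lognormal(mean=np.log(mean_service), sigma=0.4, size=n_patients)
    start = np.zeros(n_patients)
    finish = np.zeros(n_patients)
    for i in range(n_patients):
        start[i] = arrivals[i] if i == 0 else max(arrivals[i], finish[i - 1])
        finish[i] = start[i] + service[i]
    return (start - arrivals).mean()      # average wait before being seen

waits = [simulate_clinic_day() for _ in range(1000)]
print(f"mean wait {np.mean(waits):.1f} min, "
      f"95th percentile {np.percentile(waits, 95):.1f} min")
```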

  2. In vitro solubility, dissolution and permeability studies combined with semi-mechanistic modeling to investigate the intestinal absorption of desvenlafaxine from an immediate- and extended release formulation.

    Science.gov (United States)

    Franek, F; Jarlfors, A; Larsen, F; Holm, P; Steffansen, B

    2015-09-18

    Desvenlafaxine is a biopharmaceutics classification system (BCS) class 1 (high solubility, high permeability) and biopharmaceutical drug disposition classification system (BDDCS) class 3, (high solubility, poor metabolism; implying low permeability) compound. Thus the rate-limiting step for desvenlafaxine absorption (i.e. intestinal dissolution or permeation) is not fully clarified. The aim of this study was to investigate whether dissolution and/or intestinal permeability rate-limit desvenlafaxine absorption from an immediate-release formulation (IRF) and Pristiq(®), an extended release formulation (ERF). Semi-mechanistic models of desvenlafaxine were built (using SimCyp(®)) by combining in vitro data on dissolution and permeation (mechanistic part of model) with clinical data (obtained from literature) on distribution and clearance (non-mechanistic part of model). The model predictions of desvenlafaxine pharmacokinetics after IRF and ERF administration were compared with published clinical data from 14 trials. Desvenlafaxine in vivo dissolution from the IRF and ERF was predicted from in vitro solubility studies and biorelevant dissolution studies (using the USP3 dissolution apparatus), respectively. Desvenlafaxine apparent permeability (Papp) at varying apical pH was investigated using the Caco-2 cell line and extrapolated to effective intestinal permeability (Peff) in human duodenum, jejunum, ileum and colon. Desvenlafaxine pKa-values and octanol-water partition coefficients (Do:w) were determined experimentally. Due to predicted rapid dissolution after IRF administration, desvenlafaxine was predicted to be available for permeation in the duodenum. Desvenlafaxine Do:w and Papp increased approximately 13-fold when increasing apical pH from 5.5 to 7.4. Desvenlafaxine Peff thus increased with pH down the small intestine. Consequently, desvenlafaxine absorption from an IRF appears rate-limited by low Peff in the upper small intestine, which "delays" the predicted
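
    The roughly 13-fold increase in Do:w and Papp between pH 5.5 and 7.4 reported above is the qualitative behaviour expected for a basic drug whose neutral (more permeable) fraction grows with pH. The sketch below evaluates the Henderson-Hasselbalch neutral fraction of a monoprotic base across intestinal pH values; the pKa used is an assumed placeholder, not a measured desvenlafaxine value.

```python
def neutral_fraction_base(pka, ph):
    """Henderson-Hasselbalch: fraction of a monoprotic base in the neutral form."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

pka_assumed = 9.5   # hypothetical basic pKa, for illustration only
for ph in (5.5, 6.5, 7.4):
    print(f"pH {ph}: neutral fraction = {neutral_fraction_base(pka_assumed, ph):.2e}")

# The ratio of neutral fractions between pH 7.4 and 5.5 shows a one to two
# order-of-magnitude rise, the qualitative trend behind the reported
# increase in Do:w and Papp with pH.
print(neutral_fraction_base(pka_assumed, 7.4) / neutral_fraction_base(pka_assumed, 5.5))
```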

  3. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  4. Hysteresis in simulations of malaria transmission

    Science.gov (United States)

    Yamana, Teresa K.; Qiu, Xin; Eltahir, Elfatih A. B.

    2017-10-01

    Malaria transmission is a complex system and in many parts of the world is closely related to climate conditions. However, studies on environmental determinants of malaria generally consider only concurrent climate conditions and ignore the historical or initial conditions of the system. Here, we demonstrate the concept of hysteresis in malaria transmission, defined as non-uniqueness of the relationship between malaria prevalence and concurrent climate conditions. We show the dependence of simulated malaria transmission on initial prevalence and the initial level of human immunity in the population. Using realistic time series of environmental variables, we quantify the effect of hysteresis in a modeled population. In a set of numerical experiments using HYDREMATS, a field-tested mechanistic model of malaria transmission, the simulated maximum malaria prevalence depends on both the initial prevalence and the initial level of human immunity in the population. We found the effects of initial conditions to be of comparable magnitude to the effects of interannual variability in environmental conditions in determining malaria prevalence. The memory associated with this hysteresis effect is longer in high transmission settings than in low transmission settings. Our results show that efforts to simulate and forecast malaria transmission must consider the exposure history of a location as well as the concurrent environmental drivers.
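
    To illustrate the initial-condition dependence described above with something far simpler than HYDREMATS, the toy sketch below runs an immunity-structured transmission model from two different initial states under identical seasonal forcing; all parameters are invented for illustration and the model is not the authors'.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy immunity-structured model with seasonal transmission; parameters are
# illustrative and unrelated to HYDREMATS.
def transmission(t, y):
    s, i, r = y                                   # susceptible, infected, immune fractions
    beta = 0.35 * (1.0 + 0.8 * np.sin(2.0 * np.pi * t / 365.0))
    recovery, waning, protection = 1.0 / 80.0, 1.0 / 300.0, 0.7
    new_infections = beta * i * (s + (1.0 - protection) * r)
    ds = -beta * i * s + waning * r
    di = new_infections - recovery * i
    dr = recovery * i - waning * r - beta * i * (1.0 - protection) * r
    return [ds, di, dr]

for label, y0 in [("low initial prevalence/immunity", [0.98, 0.02, 0.00]),
                  ("high initial prevalence/immunity", [0.40, 0.20, 0.40])]:
    sol = solve_ivp(transmission, (0.0, 365.0), y0, max_step=1.0)
    print(f"{label}: peak prevalence = {sol.y[1].max():.2f}")
```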

  5. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  6. Simulation as a vehicle for enhancing collaborative practice models.

    Science.gov (United States)

    Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A

    2008-12-01

    Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.

  7. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  8. Regional model simulations of New Zealand climate

    Science.gov (United States)

    Renwick, James A.; Katzfey, Jack J.; Nguyen, Kim C.; McGregor, John L.

    1998-03-01

    Simulation of New Zealand climate is examined through the use of a regional climate model nested within the output of the Commonwealth Scientific and Industrial Research Organisation nine-level general circulation model (GCM). R21 resolution GCM output is used to drive a regional model run at 125 km grid spacing over the Australasian region. The 125 km run is used in turn to drive a simulation at 50 km resolution over New Zealand. Simulations with a full seasonal cycle are performed for 10 model years. The focus is on the quality of the simulation of present-day climate, but results of a doubled-CO2 run are discussed briefly. Spatial patterns of mean simulated precipitation and surface temperatures improve markedly as horizontal resolution is increased, through the better resolution of the country's orography. However, increased horizontal resolution leads to a positive bias in precipitation. At 50 km resolution, simulated frequency distributions of daily maximum/minimum temperatures are statistically similar to those of observations at many stations, while frequency distributions of daily precipitation appear to be statistically different to those of observations at most stations. Modeled daily precipitation variability at 125 km resolution is considerably less than observed, but is comparable to, or exceeds, observed variability at 50 km resolution. The sensitivity of the simulated climate to changes in the specification of the land surface is discussed briefly. Spatial patterns of the frequency of extreme temperatures and precipitation are generally well modeled. Under a doubling of CO2, the frequency of precipitation extremes changes only slightly at most locations, while air frosts become virtually unknown except at high-elevation sites.

  9. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose...... of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  10. Polypropyleneimine and polyamidoamine dendrimer mediated enhanced solubilization of bortezomib: Comparison and evaluation of mechanistic aspects by thermodynamics and molecular simulations

    International Nuclear Information System (INIS)

    Chaudhary, Sonam; Gothwal, Avinash; Khan, Iliyas; Srivastava, Shubham; Malik, Ruchi; Gupta, Umesh

    2017-01-01

    Bortezomib (BTZ) is the first proteasome inhibitor approved by the US-FDA is majorly used for the treatment of newly diagnosed and relapsed multiple myeloma including mantle cell lymphoma. BTZ is hydrophobic in nature and is a major cause for its minimal presence as marketed formulations. The present study reports the design, development and characterization of dendrimer based formulation for the improved solubility and effectivity of bortezomib. The study also equally focuses on the mechanistic elucidation of solubilization by two types of dendrimers i.e. fourth generation of poly (amidoamine) dendrimers (G4-PAMAM-NH 2 ) and fifth generation of poly (propylene) imine dendrimers (G5-PPI-NH 2 ). It was observed that aqueous solubility of BTZ was concentration and pH dependent. At 2 mM G5-PPI-NH 2 concentration, the fold increase in bortezomib solubility was 1152.63 times in water, while approximately 3426.69 folds increase in solubility was observed at pH 10.0, respectively (p < 0.05). The solubility of the drug was increased to a greater extent with G5-PPI-NH 2 dendrimers because it has more hydrophobic interior than G4-PAMAM-NH 2 dendrimers. The release of BTZ from G5-PPI-NH 2 complex was comparatively slower than G4-PAMAM-NH 2 . The thermodynamic treatment of data proved that dendrimer drug complexes were stable at all pH with values of ΔG always negative. The experimental findings were also proven by molecular simulation studies and by calculating RMSD and intermolecular hydrogen bonding through Schrodinger software. It was concluded that PPI dendrimers were able to solubilize the drug more effectively than PAMAM dendrimers through electrostatic interactions. - Highlights: • The present study reports the application of PAMAM and PPI dendrimers in solubilizing bortezomib with possible mechanism. • Improved solubility of bortezomib through dendrimers could significantly contribute its successful anticancer potential. • Molecular simulation and thermodynamic

  11. Polypropyleneimine and polyamidoamine dendrimer mediated enhanced solubilization of bortezomib: Comparison and evaluation of mechanistic aspects by thermodynamics and molecular simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chaudhary, Sonam; Gothwal, Avinash; Khan, Iliyas; Srivastava, Shubham; Malik, Ruchi; Gupta, Umesh, E-mail: umeshgupta175@gmail.com

    2017-03-01

    Bortezomib (BTZ) is the first proteasome inhibitor approved by the US-FDA is majorly used for the treatment of newly diagnosed and relapsed multiple myeloma including mantle cell lymphoma. BTZ is hydrophobic in nature and is a major cause for its minimal presence as marketed formulations. The present study reports the design, development and characterization of dendrimer based formulation for the improved solubility and effectivity of bortezomib. The study also equally focuses on the mechanistic elucidation of solubilization by two types of dendrimers i.e. fourth generation of poly (amidoamine) dendrimers (G4-PAMAM-NH{sub 2}) and fifth generation of poly (propylene) imine dendrimers (G5-PPI-NH{sub 2}). It was observed that aqueous solubility of BTZ was concentration and pH dependent. At 2 mM G5-PPI-NH{sub 2} concentration, the fold increase in bortezomib solubility was 1152.63 times in water, while approximately 3426.69 folds increase in solubility was observed at pH 10.0, respectively (p < 0.05). The solubility of the drug was increased to a greater extent with G5-PPI-NH{sub 2} dendrimers because it has more hydrophobic interior than G4-PAMAM-NH{sub 2} dendrimers. The release of BTZ from G5-PPI-NH{sub 2} complex was comparatively slower than G4-PAMAM-NH{sub 2}. The thermodynamic treatment of data proved that dendrimer drug complexes were stable at all pH with values of ΔG always negative. The experimental findings were also proven by molecular simulation studies and by calculating RMSD and intermolecular hydrogen bonding through Schrodinger software. It was concluded that PPI dendrimers were able to solubilize the drug more effectively than PAMAM dendrimers through electrostatic interactions. - Highlights: • The present study reports the application of PAMAM and PPI dendrimers in solubilizing bortezomib with possible mechanism. • Improved solubility of bortezomib through dendrimers could significantly contribute its successful anticancer potential.

  12. Application of the coastal generalized ecosystem model (CGEM) to assess the impacts of a potential future climate scenario on northern Gulf of Mexico hypoxia

    Science.gov (United States)

    Mechanistic hypoxia models for the northern Gulf of Mexico are being used to guide policy goals for Mississippi River nutrient loading reductions. However, to date, these models have not examined the effects of both nutrient loads and future climate. Here, we simulate a future c...

  13. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    Science.gov (United States)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed EDL trajectory modeling and simulation. This paper summarizes how the MER EDL sequence of events are modeled, verification of the methods used, and the inputs. This simulation is built upon a multibody parachute trajectory simulation tool that has been developed in POST I1 that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 Degree-of-Freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several Entry, Descent, Landing (EDL) events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations will remain numerically stable during Monte-Carlo simulations. This paper also summarizes how the events have been modeled, the numerical issues, and modeling challenges.

  14. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  15. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  16. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  17. Elements of complexity in subsurface modeling, exemplified with three case studies

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Truex, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rockhold, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Freshley, Mark D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wellman, Dawn M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-04-03

    There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this paper, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: 1) modeling approach, 2) description of process, and 3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil vapor extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.

  18. Repository simulation model: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This report documents the application of computer simulation for the design analysis of the nuclear waste repository's waste handling and packaging operations. The Salt Repository Simulation Model was used to evaluate design alternatives during the conceptual design phase of the Salt Repository Project. Code development and verification was performed by the Office of Nuclear Waste Isolation (ONWL). The focus of this report is to relate the experience gained during the development and application of the Salt Repository Simulation Model to future repository design phases. Design of the repository's waste handling and packaging systems will require sophisticated analysis tools to evaluate complex operational and logistical design alternatives. Selection of these design alternatives in the Advanced Conceptual Design (ACD) and License Application Design (LAD) phases must be supported by analysis to demonstrate that the repository design will cost effectively meet DOE's mandated emplacement schedule and that uncertainties in the performance of the repository's systems have been objectively evaluated. Computer simulation of repository operations will provide future repository designers with data and insights that no other analytical form of analysis can provide. 6 refs., 10 figs

  19. Stochastic models to simulate paratuberculosis in dairy herds

    DEFF Research Database (Denmark)

    Nielsen, Søren Saxmose; Weber, M.F.; Kudahl, Anne Margrethe Braad

    2011-01-01

    Stochastic simulation models are widely accepted as a means of assessing the impact of changes in daily management and the control of different diseases, such as paratuberculosis, in dairy herds. This paper summarises and discusses the assumptions of four stochastic simulation models and their use...... the models are somewhat different in their underlying principles and do put slightly different values on the different strategies, their overall findings are similar. Therefore, simulation models may be useful in planning paratuberculosis strategies in dairy herds, although as with all models caution...

  20. Modelling and simulation of superalloys. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Rogal, Jutta; Hammerschmidt, Thomas; Drautz, Ralf (eds.)

    2014-07-01

    Superalloys are multi-component materials with complex microstructures that offer unique properties for high-temperature applications. The complexity of the superalloy materials makes it particularly challenging to obtain fundamental insight into their behaviour from the atomic structure to turbine blades. Recent advances in modelling and simulation of superalloys contribute to a better understanding and prediction of materials properties and therefore offer guidance for the development of new alloys. This workshop will give an overview of recent progress in modelling and simulation of materials for superalloys, with a focus on single crystal Ni-base and Co-base alloys. Topics will include electronic structure methods, atomistic simulations, microstructure modelling and modelling of microstructural evolution, solidification and process simulation as well as the modelling of phase stability and thermodynamics.

  1. Mechanistic characterization and molecular modeling of hepatitis B virus polymerase resistance to entecavir.

    Science.gov (United States)

    Walsh, Ann W; Langley, David R; Colonno, Richard J; Tenney, Daniel J

    2010-02-12

    Entecavir (ETV) is a deoxyguanosine analog competitive inhibitor of hepatitis B virus (HBV) polymerase that exhibits delayed chain termination of HBV DNA. A high barrier to entecavir-resistance (ETVr) is observed clinically, likely due to its potency and a requirement for multiple resistance changes to overcome suppression. Changes in the HBV polymerase reverse-transcriptase (RT) domain involve lamivudine-resistance (LVDr) substitutions in the conserved YMDD motif (M204V/I +/- L180M), plus an additional ETV-specific change at residues T184, S202 or M250. These substitutions surround the putative dNTP binding site or primer grip regions of the HBV RT. To determine the mechanistic basis for ETVr, wildtype, lamivudine-resistant (M204V, L180M) and ETVr HBVs were studied using in vitro RT enzyme and cell culture assays, as well as molecular modeling. Resistance substitutions significantly reduced ETV incorporation and chain termination in HBV DNA and increased the ETV-TP inhibition constant (K(i)) for HBV RT. Resistant HBVs exhibited impaired replication in culture and reduced enzyme activity (k(cat)) in vitro. Molecular modeling of the HBV RT suggested that ETVr residue T184 was adjacent to and stabilized S202 within the LVDr YMDD loop. ETVr arose through steric changes at T184 or S202 or by disruption of hydrogen-bonding between the two, both of which repositioned the loop and reduced the ETV-triphosphate (ETV-TP) binding pocket. In contrast to T184 and S202 changes, ETVr at primer grip residue M250 was observed during RNA-directed DNA synthesis only. Experimentally, M250 changes also impacted the dNTP-binding site. Modeling suggested a novel mechanism for M250 resistance, whereby repositioning of the primer-template component of the dNTP-binding site shifted the ETV-TP binding pocket. No structural data are available to confirm the HBV RT modeling, however, results were consistent with phenotypic analysis of comprehensive substitutions of each ETVr position

  2. Mechanistic characterization and molecular modeling of hepatitis B virus polymerase resistance to entecavir.

    Directory of Open Access Journals (Sweden)

    Ann W Walsh

    Full Text Available BACKGROUND: Entecavir (ETV is a deoxyguanosine analog competitive inhibitor of hepatitis B virus (HBV polymerase that exhibits delayed chain termination of HBV DNA. A high barrier to entecavir-resistance (ETVr is observed clinically, likely due to its potency and a requirement for multiple resistance changes to overcome suppression. Changes in the HBV polymerase reverse-transcriptase (RT domain involve lamivudine-resistance (LVDr substitutions in the conserved YMDD motif (M204V/I +/- L180M, plus an additional ETV-specific change at residues T184, S202 or M250. These substitutions surround the putative dNTP binding site or primer grip regions of the HBV RT. METHODS/PRINCIPAL FINDINGS: To determine the mechanistic basis for ETVr, wildtype, lamivudine-resistant (M204V, L180M and ETVr HBVs were studied using in vitro RT enzyme and cell culture assays, as well as molecular modeling. Resistance substitutions significantly reduced ETV incorporation and chain termination in HBV DNA and increased the ETV-TP inhibition constant (K(i for HBV RT. Resistant HBVs exhibited impaired replication in culture and reduced enzyme activity (k(cat in vitro. Molecular modeling of the HBV RT suggested that ETVr residue T184 was adjacent to and stabilized S202 within the LVDr YMDD loop. ETVr arose through steric changes at T184 or S202 or by disruption of hydrogen-bonding between the two, both of which repositioned the loop and reduced the ETV-triphosphate (ETV-TP binding pocket. In contrast to T184 and S202 changes, ETVr at primer grip residue M250 was observed during RNA-directed DNA synthesis only. Experimentally, M250 changes also impacted the dNTP-binding site. Modeling suggested a novel mechanism for M250 resistance, whereby repositioning of the primer-template component of the dNTP-binding site shifted the ETV-TP binding pocket. No structural data are available to confirm the HBV RT modeling, however, results were consistent with phenotypic analysis of

  3. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  4. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    2013-11-01

    Full Text Available Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks.

  5. Developing Cognitive Models for Social Simulation from Survey Data

    Science.gov (United States)

    Alt, Jonathan K.; Lieberman, Stephen

    The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to population social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.

  6. Modeling and simulation with operator scaling

    OpenAIRE

    Cohen, Serge; Meerschaert, Mark M.; Rosiński, Jan

    2010-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical application...

  7. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  8. Protein Simulation Data in the Relational Model.

    Science.gov (United States)

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost-significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.

  9. Why did Jacques Monod make the choice of mechanistic determinism?

    Science.gov (United States)

    Loison, Laurent

    2015-06-01

    The development of molecular biology placed in the foreground a mechanistic and deterministic conception of the functioning of macromolecules. In this article, I show that this conception was neither obvious, nor necessary. Taking Jacques Monod as a case study, I detail the way he gradually came loose from a statistical understanding of determinism to finally support a mechanistic understanding. The reasons of the choice made by Monod at the beginning of the 1950s can be understood only in the light of the general theoretical schema supported by the concept of mechanistic determinism. This schema articulates three fundamental notions for Monod, namely that of the rigidity of the sequence of the genetic program, that of the intrinsic stability of macromolecules (DNA and proteins), and that of the specificity of molecular interactions. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  10. Roughness Versus Charge Contributions to Representative Discrete Heterogeneity Underlying Mechanistic Prediction of Colloid Attachment, Detachment and Breakthrough-Elution Behavior Under Environmental Conditions.

    Science.gov (United States)

    Johnson, William; Farnsworth, Anna; Vanness, Kurt; Hilpert, Markus

    2017-04-01

    The key element of a mechanistic theory to predict colloid attachment in porous media under environmental conditions where colloid-collector repulsion exists (unfavorable conditions for attachment) is representation of the nano-scale surface heterogeneity (herein called discrete heterogeneity) that drives colloid attachment under unfavorable conditions. The observed modes of colloid attachment under unfavorable conditions emerge from simulations that incorporate discrete heterogeneity. Quantitative prediction of attachment (and detachment) requires capturing the sizes, spatial frequencies, and other properties of roughness asperities and charge heterodomains in discrete heterogeneity representations of different surfaces. The fact that a given discrete heterogeneity representation will interact differently with different-sized colloids as well as different ionic strengths for a given sized colloid allows backing out representative discrete heterogeneity via comparison of simulations to experiments performed across a range of colloid size, solution IS, and fluid velocity. This has been achieved on unfavorable smooth surfaces yielding quantitative prediction of attachment, and qualitative prediction of detachment in response to ionic strength or flow perturbations. Extending this treatment to rough surfaces, and representing the contributions of nanoscale roughness as well as charge heterogeneity is a focus of this talk. Another focus of this talk is the upscaling the pore scale simulations to produce contrasting breakthrough-elution behaviors at the continuum (column) scale that are observed, for example, for different-sized colloids, or same-sized colloids under different ionic strength conditions. The outcome of mechanistic pore scale simulations incorporating discrete heterogeneity and subsequent upscaling is that temporal processes such as blocking and ripening will emerge organically from these simulations, since these processes fundamentally stem from the

  11. Arsenic Exposure and Type 2 Diabetes: MicroRNAs as Mechanistic Links?

    OpenAIRE

    Beck, Rowan; Styblo, Miroslav; Sethupathy, Praveen

    2017-01-01

    Purpose of Review The goal of this review is to delineate the following: (1) the primary means of inorganic arsenic (iAs) exposure for human populations, (2) the adverse public health outcomes associated with chronic iAs exposure, (3) the pathophysiological connection between arsenic and type 2 diabetes (T2D), and (4) the incipient evidence for microRNAs as candidate mechanistic links between iAs exposure and T2D. Recent Findings Exposure to iAs in animal models has been associated with the d...

  12. Simulated impacts of insect defoliation on forest carbon dynamics

    International Nuclear Information System (INIS)

    Medvigy, D; Clark, K L; Skowronski, N S; Schäfer, K V R

    2012-01-01

    Many temperate and boreal forests are subject to insect epidemics. In the eastern US, over 41 million meters squared of tree basal area are thought to be at risk of gypsy moth defoliation. However, the decadal-to-century scale implications of defoliation events for ecosystem carbon dynamics are not well understood. In this study, the effects of defoliation intensity, periodicity and spatial pattern on the carbon cycle are investigated in a set of idealized model simulations. A mechanistic terrestrial biosphere model, ecosystem demography model 2, is driven with observations from a xeric oak–pine forest located in the New Jersey Pine Barrens. Simulations indicate that net ecosystem productivity (equal to photosynthesis minus respiration) decreases linearly with increasing defoliation intensity. However, because of interactions between defoliation and drought effects, aboveground biomass exhibits a nonlinear decrease with increasing defoliation intensity. The ecosystem responds strongly with both reduced productivity and biomass loss when defoliation periodicity varies from 5 to 15 yr, but exhibits a relatively weak response when defoliation periodicity varies from 15 to 60 yr. Simulations of spatially heterogeneous defoliation resulted in markedly smaller carbon stocks than simulations with spatially homogeneous defoliation. These results show that gypsy moth defoliation has a large effect on oak–pine forest biomass dynamics, functioning and its capacity to act as a carbon sink. (letter)

  13. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation provides for huge possibilities if broadly taken up by engineers as a working method. However, when considering the launching of modelling and simulation tools in an engineering design project, they shall be easy to learn and use. Then, there is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to be in the integration of easy to use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections, as provided for by e.g. CAD software and product databases. The software technology, including required specification and communication standards, is already available. Internet based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. There is a market niche evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piece vice continuous, without extensive manual interventions of human experts. An additional challenge, to solve the arising equations fast and reliable, is dealt with, as well. (orig.)

  14. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  15. Nuclear reactor core modelling in multifunctional simulators

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis on this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  16. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis on this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  17. Mechanistic Indicators of Childhood Asthma (MICA) Study

    Science.gov (United States)

    The Mechanistic Indicators of Childhood Asthma (MICA) Study has been designed to incorporate state-of-the-art technologies to examine the physiological and environmental factors that interact to increase the risk of asthmatic responses. MICA is primarily a clinically-bases obser...

  18. Modeling greenhouse gas emissions from dairy farms.

    Science.gov (United States)

    Rotz, C Alan

    2017-11-15

    Dairy farms have been identified as an important source of greenhouse gas emissions. Within the farm, important emissions include enteric CH 4 from the animals, CH 4 and N 2 O from manure in housing facilities during long-term storage and during field application, and N 2 O from nitrification and denitrification processes in the soil used to produce feed crops and pasture. Models using a wide range in level of detail have been developed to represent or predict these emissions. They include constant emission factors, variable process-related emission factors, empirical or statistical models, mechanistic process simulations, and life cycle assessment. To fully represent farm emissions, models representing the various emission sources must be integrated to capture the combined effects and interactions of all important components. Farm models have been developed using relationships across the full scale of detail, from constant emission factors to detailed mechanistic simulations. Simpler models, based upon emission factors and empirical relationships, tend to provide better tools for decision support, whereas more complex farm simulations provide better tools for research and education. To look beyond the farm boundaries, life cycle assessment provides an environmental accounting tool for quantifying and evaluating emissions over the full cycle, from producing the resources used on the farm through processing, distribution, consumption, and waste handling of the milk and dairy products produced. Models are useful for improving our understanding of farm processes and their interacting effects on greenhouse gas emissions. Through better understanding, they assist in the development and evaluation of mitigation strategies for reducing emissions and improving overall sustainability of dairy farms. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article

  19. Stochastic simulation of time-series models combined with geostatistics to predict water-table scenarios in a Guarani Aquifer System outcrop area, Brazil

    Science.gov (United States)

    Manzione, Rodrigo L.; Wendland, Edson; Tanikawa, Diego H.

    2012-11-01

    Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Monitoring water-level networks can give information about the dynamic of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in a Guarani Aquifer System (GAS) outcrop area located in the southeastern part of Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative, when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage like the GAS.

  20. Physiologically induced color-pattern changes in butterfly wings: mechanistic and evolutionary implications.

    Science.gov (United States)

    Otaki, Joji M

    2008-07-01

    A mechanistic understanding of the butterfly wing color-pattern determination can be facilitated by experimental pattern changes. Here I review physiologically induced color-pattern changes in nymphalid butterflies and their mechanistic and evolutionary implications. A type of color-pattern change can be elicited by elemental changes in size and position throughout the wing, as suggested by the nymphalid groundplan. These changes of pattern elements are bi-directional and bi-sided dislocation toward or away from eyespot foci and in both proximal and distal sides of the foci. The peripheral elements are dislocated even in the eyespot-less compartments. Anterior spots are more severely modified, suggesting the existence of an anterior-posterior gradient. In one species, eyespots are transformed into white spots with remnant-like orange scales, and such patterns emerge even at the eyespot-less "imaginary" foci. A series of these color-pattern modifications probably reveal "snap-shots" of a dynamic morphogenic signal due to heterochronic uncoupling between the signaling and reception steps. The conventional gradient model can be revised to account for these observed color-pattern changes.

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54. Fulltext. Click here to view fulltext PDF. Permanent link:

  2. ODE constrained mixture modelling: a method for unraveling subpopulation structures and dynamics.

    Directory of Open Access Journals (Sweden)

    Jan Hasenauer

    2014-07-01

    Full Text Available Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity.

  3. ODE constrained mixture modelling: a method for unraveling subpopulation structures and dynamics.

    Science.gov (United States)

    Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J

    2014-07-01

    Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity.

  4. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real-time using the coupled code system THOR/S3R. This code system models the all fluids systems in detail using an advanced, two-phase thermal-hydraulic a model. The nuclear core is modeled using an advanced, three-dimensional nodal method and also by using cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or handware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  5. Deriving simulators for hybrid Chi models

    NARCIS (Netherlands)

    Beek, van D.A.; Man, K.L.; Reniers, M.A.; Rooda, J.E.; Schiffelers, R.R.H.

    2006-01-01

    The hybrid Chi language is formalism for modeling, simulation and verification of hybrid systems. The formal semantics of hybrid Chi allows the definition of provably correct implementations for simulation, verification and realtime control. This paper discusses the principles of deriving an

  6. A reduced-order modeling approach to represent subgrid-scale hydrological dynamics for land-surface simulations: application in a polygonal tundra landscape

    Science.gov (United States)

    Pau, G. S. H.; Bisht, G.; Riley, W. J.

    2014-09-01

    Existing land surface models (LSMs) describe physical and biological processes that occur over a wide range of spatial and temporal scales. For example, biogeochemical and hydrological processes responsible for carbon (CO2, CH4) exchanges with the atmosphere range from the molecular scale (pore-scale O2 consumption) to tens of kilometers (vegetation distribution, river networks). Additionally, many processes within LSMs are nonlinearly coupled (e.g., methane production and soil moisture dynamics), and therefore simple linear upscaling techniques can result in large prediction error. In this paper we applied a reduced-order modeling (ROM) technique known as "proper orthogonal decomposition mapping method" that reconstructs temporally resolved fine-resolution solutions based on coarse-resolution solutions. We developed four different methods and applied them to four study sites in a polygonal tundra landscape near Barrow, Alaska. Coupled surface-subsurface isothermal simulations were performed for summer months (June-September) at fine (0.25 m) and coarse (8 m) horizontal resolutions. We used simulation results from three summer seasons (1998-2000) to build ROMs of the 4-D soil moisture field for the study sites individually (single-site) and aggregated (multi-site). The results indicate that the ROM produced a significant computational speedup (> 103) with very small relative approximation error (training the ROM. We also demonstrate that our approach: (1) efficiently corrects for coarse-resolution model bias and (2) can be used for polygonal tundra sites not included in the training data set with relatively good accuracy (< 1.7% relative error), thereby allowing for the possibility of applying these ROMs across a much larger landscape. By coupling the ROMs constructed at different scales together hierarchically, this method has the potential to efficiently increase the resolution of land models for coupled climate simulations to spatial scales consistent with

  7. Proceedings of the 17. IASTED international conference on modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wamkeue, R. (comp.) [Quebec Univ., Abitibi-Temiscaminque, PQ (Canada)

    2006-07-01

    The International Association of Science and Technology for Development (IASTED) hosted this conference to provide a forum for international researchers and practitioners interested in all areas of modelling and simulation. The conference featured 12 sessions entitled: (1) automation, control and robotics, (2) hydraulic and hydrologic modelling, (3) applications in processes and design optimization, (4) environmental systems, (5) biomedicine and biomechanics, (6) communications, computers and informatics 1, (7) economics, management and operations research 1, (8) modelling and simulation methodologies 1, (9) economics, management and operations research 2, (10) modelling, optimization, identification and simulation, (11) communications, computers and informatics 2, and, (12) modelling and simulation methodologies 2. Participants took the opportunity to present the latest research, results, and ideas in mathematical modelling; physically-based modelling; agent-based modelling; dynamic modelling; 3-dimensional modelling; computational geometry; time series analysis; finite element methods; discrete event simulation; web-based simulation; Monte Carlo simulation; simulation optimization; simulation uncertainty; fuzzy systems; data modelling; computer aided design; and, visualization. Case studies in engineering design were also presented along with simulation tools and languages. The conference also highlighted topical issues in environmental systems modelling such as air modelling and simulation, atmospheric modelling, hazardous materials, mobile source emissions, ecosystem modelling, hydrological modelling, aquatic ecosystems, terrestrial ecosystems, biological systems, agricultural modelling, terrain analysis, meteorological modelling, earth system modelling, climatic modelling, and natural resource management. The conference featured 110 presentations, of which 3 have been catalogued separately for inclusion in this database. refs., tabs., figs.

  8. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-11-01

    Nanoindentation is a hardness test method applied to small volumes of material; it can reveal unique size effects and has sparked many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with these computational approaches because of limitations in their accessible length scales, predictive capability, and accuracy. This article reviews recent progress and challenges in the modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  9. Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation

    International Nuclear Information System (INIS)

    Bakic, Predrag R.; Albert, Michael; Brzakovic, Dragana; Maidment, Andrew D. A.

    2002-01-01

    A method is proposed for generating synthetic mammograms based upon simulations of breast tissue and the mammographic imaging process. A computer breast model has been designed with a realistic distribution of large and medium scale tissue structures. Parameters controlling the size and placement of simulated structures (adipose compartments and ducts) provide a method for consistently modeling images of the same simulated breast with modified position or acquisition parameters. The mammographic imaging process is simulated using a compression model and a model of the x-ray image acquisition process. The compression model estimates breast deformation using tissue elasticity parameters found in the literature and clinical force values. The synthetic mammograms were generated by a mammogram acquisition model using a monoenergetic parallel beam approximation applied to the synthetically compressed breast phantom

  10. COMPARISON OF RF CAVITY TRANSPORT MODELS FOR BBU SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Ilkyoung Shin, Byung Yunn, Todd Satogata, Shahid Ahmed

    2011-03-01

    The transverse focusing effect in RF cavities plays a considerable role in beam dynamics for low-energy beamline sections and can contribute to beam breakup (BBU) instability. The purpose of this analysis is to examine RF cavity models in simulation codes which will be used for BBU experiments at Jefferson Lab and to improve BBU simulation results. We review two RF cavity models in the simulation codes elegant and TDBBU (a BBU simulation code developed at Jefferson Lab). elegant can include the Rosenzweig-Serafini (R-S) model for the RF focusing effect, whereas TDBBU uses a model from the code TRANSPORT that accounts for adiabatic damping but not for RF focusing. Quantitative comparisons are discussed for the CEBAF beamline. We also compare the R-S model with the results from numerical simulations for a CEBAF-type 5-cell superconducting cavity to validate the use of the R-S model as an improved low-energy RF cavity transport model in TDBBU. We have implemented the R-S model in TDBBU; this brings BBU simulation results into closer agreement with analytic calculations and experimental results.
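
    To make the difference between the two transport models concrete, the sketch below contrasts an adiabatic-damping-only matrix (acceleration treated as a drift with momentum growth, as in TRANSPORT) with the Rosenzweig-Serafini cavity-body matrix that adds ponderomotive RF focusing. Both matrices are quoted in the ultrarelativistic, pure pi-mode approximation and should be checked against Rosenzweig and Serafini (1994); the code is an illustrative sketch, not the TDBBU or elegant implementation.

        import numpy as np

        def adiabatic_damping_matrix(L, gamma_i, gamma_f):
            """Acceleration treated as a drift with momentum growth; no RF focusing."""
            gprime = (gamma_f - gamma_i) / L                 # energy gradient d(gamma)/dz
            return np.array([[1.0, (gamma_i / gprime) * np.log(gamma_f / gamma_i)],
                             [0.0, gamma_i / gamma_f]])

        def rosenzweig_serafini_matrix(L, gamma_i, gamma_f, phi=0.0):
            """R-S matrix for the cavity body (phi = 0 means on-crest acceleration)."""
            gprime = (gamma_f - gamma_i) / L
            alpha = np.log(gamma_f / gamma_i) / (np.sqrt(8.0) * np.cos(phi))
            c, s = np.cos(alpha), np.sin(alpha)
            return np.array([
                [c - np.sqrt(2.0) * np.cos(phi) * s,
                 np.sqrt(8.0) * (gamma_i / gprime) * np.cos(phi) * s],
                [-(gprime / gamma_f) * (np.cos(phi) / np.sqrt(2.0)
                                        + 1.0 / (np.sqrt(8.0) * np.cos(phi))) * s,
                 (gamma_i / gamma_f) * (c + np.sqrt(2.0) * np.cos(phi) * s)],
            ])

        # example: a 0.5 m cavity taking gamma from 10 to 20; both determinants equal gamma_i/gamma_f
        for M in (adiabatic_damping_matrix(0.5, 10, 20), rosenzweig_serafini_matrix(0.5, 10, 20)):
            print(M, np.linalg.det(M))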

  11. Comparison Of RF Cavity Transport Models For BBU Simulations

    International Nuclear Information System (INIS)

    Shin, Ilkyoung; Yunn, Byung; Satogata, Todd; Ahmed, Shahid

    2011-01-01

    The transverse focusing effect in RF cavities plays a considerable role in beam dynamics for low-energy beamline sections and can contribute to beam breakup (BBU) instability. The purpose of this analysis is to examine RF cavity models in simulation codes which will be used for BBU experiments at Jefferson Lab and to improve BBU simulation results. We review two RF cavity models in the simulation codes elegant and TDBBU (a BBU simulation code developed at Jefferson Lab). elegant can include the Rosenzweig-Serafini (R-S) model for the RF focusing effect, whereas TDBBU uses a model from the code TRANSPORT that accounts for adiabatic damping but not for RF focusing. Quantitative comparisons are discussed for the CEBAF beamline. We also compare the R-S model with the results from numerical simulations for a CEBAF-type 5-cell superconducting cavity to validate the use of the R-S model as an improved low-energy RF cavity transport model in TDBBU. We have implemented the R-S model in TDBBU; this brings BBU simulation results into closer agreement with analytic calculations and experimental results.

  12. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction that couples the spins of one system to those of the other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.
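
    A minimal sketch of this kind of coupled-spin construction is shown below: two Bornholdt-style spin lattices are updated by heat-bath dynamics, with each lattice feeling a field proportional to the other lattice's magnetization; "returns" are taken as changes of magnetization. The coupling form and all parameter values are assumptions for illustration, not the model specified in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def sweep(spins, other_mag, J=1.0, alpha=4.0, kappa=0.5, beta=2.0):
            """One heat-bath sweep over a 2-D lattice of +/-1 spins."""
            n = spins.shape[0]
            for _ in range(n * n):
                i, j = rng.integers(n, size=2)
                nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
                      + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
                mag = spins.mean()
                # local field: neighbours, a global minority-game term, and
                # a coupling to the other market's magnetization
                h = J * nb - alpha * spins[i, j] * abs(mag) + kappa * other_mag
                spins[i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1
            return spins

        n, n_sweeps = 20, 200
        a = rng.choice([-1, 1], size=(n, n))
        b = rng.choice([-1, 1], size=(n, n))
        mags_a, mags_b = [], []
        for _ in range(n_sweeps):
            a = sweep(a, b.mean())
            b = sweep(b, a.mean())
            mags_a.append(a.mean())
            mags_b.append(b.mean())
        returns_a = np.diff(mags_a)     # volatility clustering shows up in abs(returns_a)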

  13. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  14. Modeling and simulation of chillers with Dymola/Modelica; Modellierung und Simulation von Kaeltemaschinen mit Dymola/Modelica

    Energy Technology Data Exchange (ETDEWEB)

    Rettich, Daniel [Hochschule Biberach (Germany). Inst. fuer Gebaeude- und Energiesysteme (IGE)

    2012-07-01

    Within this contribution, a chiller was modeled and simulated with the program package Dymola/Modelica using the TIL Toolbox. An existing refrigeration technology test bench at the University of Biberach (Germany) serves as the reference for the chiller represented in the simulation. The aim of the simulation is the future use of the models in a hardware-in-the-loop (HIL) test bench in order to test different controllers with respect to their function and logic under identical boundary conditions. Furthermore, determining the energy efficiency according to the guideline VDMA 24247 is a focus both at the test bench and within the simulation. Following the completion of the test bench, the models will be validated against it, and the chiller model will be connected to a detailed room model. Individual models were taken from the TIL Toolbox, adapted for the application and parameterized with the design values of the laboratory chiller. Modifications to the TIL models were necessary in order to reproduce the dynamic effects of the chiller in detail; for this purpose, the dynamic behaviour of the various components was investigated. After the modeling, each model was tested on the basis of design values and manufacturer documentation. First simulation studies showed that the simulation in Dymola with the developed models provides plausible results. In the course of modeling and parameterizing these modified models, a component library was developed from which models for future simulation studies can be extracted.

  15. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    Science.gov (United States)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually performed with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid-scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of several natural gas storage facilities, including Aliso Canyon, Honor Rancho and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in-situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains. Simulated integrated methane enhancements will be presented and
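
    The nudging step mentioned above can be sketched as a simple relaxation of the LES field toward the mesoscale field interpolated onto the LES grid. The relaxation form, the time scale and the array shapes below are illustrative assumptions, not the FDS or WRF implementation.

        import numpy as np

        def nudge(u_les, u_meso_interp, dt, tau=60.0):
            """Relax the LES field toward the interpolated mesoscale field with time scale tau (s)."""
            return u_les + (dt / tau) * (u_meso_interp - u_les)

        # usage: inside the LES time loop, after interpolating the WRF wind to the LES grid
        u = np.zeros((64, 64, 32))        # LES u-velocity component (placeholder field)
        u_wrf = np.full_like(u, 5.0)      # WRF wind interpolated to the LES grid (placeholder)
        for _ in range(100):
            u = nudge(u, u_wrf, dt=1.0)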

  16. A virtual laboratory notebook for simulation models.

    Science.gov (United States)

    Winfield, A J

    1998-01-01

    In this paper we describe how we have adopted the laboratory notebook as a metaphor for interacting with computer simulation models. This 'virtual' notebook stores the simulation output and meta-data (which is used to record the scientist's interactions with the simulation). The meta-data stored consists of annotations (equivalent to marginal notes in a laboratory notebook), a history tree and a log of user interactions. The history tree structure records when, in 'simulation' time, and from what starting point in the tree, the user changes parameters. Typically these changes define a new run of the simulation model (which is represented as a new branch of the history tree). The tree shows the structure of the changes made to the simulation, and the log is required to keep the order in which the changes occurred. Together they form the kind of record you would normally find in a laboratory notebook. The history tree is plotted in simulation parameter space. This shows the scientist's interactions with the simulation visually and allows direct manipulation of the parameter information presented, which in turn is used to control directly the state of the simulation. The interactions with the system are graphical and usually involve directly selecting or dragging data markers and other graphical control devices around in parameter space. If the graphical manipulators do not provide precise enough control, textual manipulation is still available, which allows numerical values to be entered by hand. By providing these interactions with the visual view of the history tree, the Virtual Laboratory Notebook gives the user complex and novel ways of interacting with biological computer simulation models.
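
    The history-tree bookkeeping described above can be sketched as a simple tree of parameter states, where each branch records the simulation time and the parameter changes that define a new run. This is an illustrative data-structure sketch only; the names and fields are assumptions, not the paper's implementation.

        from dataclasses import dataclass, field
        from typing import Any, Dict, List, Optional

        @dataclass
        class HistoryNode:
            sim_time: float                          # simulation time at which the change was made
            params: Dict[str, Any]                   # parameter values defining this branch
            annotation: str = ""                     # marginal note, as in a paper notebook
            parent: Optional["HistoryNode"] = None
            children: List["HistoryNode"] = field(default_factory=list)

            def branch(self, sim_time: float, changes: Dict[str, Any], note: str = "") -> "HistoryNode":
                """Fork a new run from this state, recording what changed and when."""
                child = HistoryNode(sim_time, {**self.params, **changes}, note, parent=self)
                self.children.append(child)
                return child

        # usage: a baseline run and a branch with a modified parameter
        root = HistoryNode(0.0, {"growth_rate": 0.10, "mortality": 0.02})
        run2 = root.branch(5.0, {"growth_rate": 0.15}, note="test higher growth")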

  17. Turbulence modeling for Francis turbine water passages simulation

    International Nuclear Information System (INIS)

    Maruzewski, P; Munch, C; Mombelli, H P; Avellan, F; Hayashi, H; Yamaishi, K; Hashii, T; Sugow, Y

    2010-01-01

    The application of Computational Fluid Dynamics, CFD, to hydraulic machines requires the ability to handle turbulent flows and to take into account the effects of turbulence on the mean flow. Nowadays, Direct Numerical Simulation, DNS, is still not a good candidate for hydraulic machine simulations because of its prohibitive computational cost. Large Eddy Simulation, LES, although in the same category as DNS, could be an alternative whereby only the small-scale turbulent fluctuations are modeled and the larger-scale fluctuations are computed directly. Nevertheless, Reynolds-Averaged Navier-Stokes, RANS, models have become the widespread standard basis for numerous hydraulic machine design procedures. However, for many applications involving wall-bounded flows and attached boundary layers, various hybrid combinations of LES and RANS are being considered, such as Detached Eddy Simulation, DES, whereby the RANS approximation is kept in the regions where the boundary layers are attached to the solid walls. Furthermore, the accuracy of CFD simulations is highly dependent on grid quality, in terms of grid uniformity in complex configurations. Moreover, any successful structured or unstructured CFD code has to offer a wide range of turbulence closures, from classic RANS models to hybrid models. The aim of this study is to compare the behavior of turbulent simulations for both structured and unstructured grid topologies with two different CFD codes applied to the same Francis turbine. Hence, the study is intended to outline the discrepancies encountered in predicting the wake of the turbine blades when using either the standard k-ε model or the SST shear stress transport model in steady CFD simulations. Finally, comparisons are made with experimental data from reduced-scale model measurements at the EPFL Laboratory for Hydraulic Machines.
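
    For reference, the standard k-ε closure named above solves two transport equations for the turbulent kinetic energy k and its dissipation rate ε; a commonly quoted high-Reynolds-number form, with the usual model constants, is

        \begin{aligned}
        \frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
          &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon, \\
        \frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
          &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
             + C_{\varepsilon 1}\frac{\varepsilon}{k}P_k - C_{\varepsilon 2}\frac{\varepsilon^2}{k}, \\
        \nu_t &= C_\mu \frac{k^2}{\varepsilon}, \qquad
        C_\mu = 0.09,\; C_{\varepsilon 1} = 1.44,\; C_{\varepsilon 2} = 1.92,\; \sigma_k = 1.0,\; \sigma_\varepsilon = 1.3 .
        \end{aligned}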

  18. Turbulence modeling for Francis turbine water passages simulation

    Energy Technology Data Exchange (ETDEWEB)

    Maruzewski, P; Munch, C; Mombelli, H P; Avellan, F [Ecole polytechnique federale de Lausanne, Laboratory of Hydraulic Machines Avenue de Cour 33 bis, CH-1007 Lausanne (Switzerland); Hayashi, H; Yamaishi, K; Hashii, T; Sugow, Y, E-mail: pierre.maruzewski@epfl.c [Nippon KOEI Power Systems, 1-22 Doukyu, Aza, Morijyuku, Sukagawa, Fukushima Pref. 962-8508 (Japan)

    2010-08-15

    The application of Computational Fluid Dynamics, CFD, to hydraulic machines requires the ability to handle turbulent flows and to take into account the effects of turbulence on the mean flow. Nowadays, Direct Numerical Simulation, DNS, is still not a good candidate for hydraulic machine simulations because of its prohibitive computational cost. Large Eddy Simulation, LES, although in the same category as DNS, could be an alternative whereby only the small-scale turbulent fluctuations are modeled and the larger-scale fluctuations are computed directly. Nevertheless, Reynolds-Averaged Navier-Stokes, RANS, models have become the widespread standard basis for numerous hydraulic machine design procedures. However, for many applications involving wall-bounded flows and attached boundary layers, various hybrid combinations of LES and RANS are being considered, such as Detached Eddy Simulation, DES, whereby the RANS approximation is kept in the regions where the boundary layers are attached to the solid walls. Furthermore, the accuracy of CFD simulations is highly dependent on grid quality, in terms of grid uniformity in complex configurations. Moreover, any successful structured or unstructured CFD code has to offer a wide range of turbulence closures, from classic RANS models to hybrid models. The aim of this study is to compare the behavior of turbulent simulations for both structured and unstructured grid topologies with two different CFD codes applied to the same Francis turbine. Hence, the study is intended to outline the discrepancies encountered in predicting the wake of the turbine blades when using either the standard k-ε model or the SST shear stress transport model in steady CFD simulations. Finally, comparisons are made with experimental data from reduced-scale model measurements at the EPFL Laboratory for Hydraulic Machines.

  19. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichirou; Dershowitz, W.

    2005-01-01

    During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the Äspö Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for the analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region and developing improved simulation procedures. Updates to the conceptual model included incorporation of the 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process, including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Tasks 6AB, 6D and 6E of the Äspö Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use of site characterization data in safety assessment. This approach will aid in understanding how site characterization can progressively reduce site characterization uncertainty. (author)

  20. Turbulence modeling for Francis turbine water passages simulation

    Science.gov (United States)

    Maruzewski, P.; Hayashi, H.; Munch, C.; Yamaishi, K.; Hashii, T.; Mombelli, H. P.; Sugow, Y.; Avellan, F.

    2010-08-01

    The application of Computational Fluid Dynamics, CFD, to hydraulic machines requires the ability to handle turbulent flows and to take into account the effects of turbulence on the mean flow. Nowadays, Direct Numerical Simulation, DNS, is still not a good candidate for hydraulic machine simulations because of its prohibitive computational cost. Large Eddy Simulation, LES, although in the same category as DNS, could be an alternative whereby only the small-scale turbulent fluctuations are modeled and the larger-scale fluctuations are computed directly. Nevertheless, Reynolds-Averaged Navier-Stokes, RANS, models have become the widespread standard basis for numerous hydraulic machine design procedures. However, for many applications involving wall-bounded flows and attached boundary layers, various hybrid combinations of LES and RANS are being considered, such as Detached Eddy Simulation, DES, whereby the RANS approximation is kept in the regions where the boundary layers are attached to the solid walls. Furthermore, the accuracy of CFD simulations is highly dependent on grid quality, in terms of grid uniformity in complex configurations. Moreover, any successful structured or unstructured CFD code has to offer a wide range of turbulence closures, from classic RANS models to hybrid models. The aim of this study is to compare the behavior of turbulent simulations for both structured and unstructured grid topologies with two different CFD codes applied to the same Francis turbine. Hence, the study is intended to outline the discrepancies encountered in predicting the wake of the turbine blades when using either the standard k-ε model or the SST shear stress transport model in steady CFD simulations. Finally, comparisons are made with experimental data from reduced-scale model measurements at the EPFL Laboratory for Hydraulic Machines.

  1. Opening the black box—Development, testing and documentation of a mechanistically rich agent-based model

    DEFF Research Database (Denmark)

    Topping, Chris J.; Høye, Toke; Olesen, Carsten Riis

    2010-01-01

    Although increasingly widely used in biology, complex adaptive simulation models such as agent-based models have been criticised for being difficult to communicate and test. This study demonstrates the application of pattern-oriented model testing, and a novel documentation procedure to present...... accessible description of the processes included in the model. Application of the model to a comprehensive historical data set supported the hypothesis that interference competition is the primary population regulating factor in the absence of mammal predators in the brown hare, and that the effect works...

  2. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details, and it is therefore very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.

  3. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  4. Numerical simulation code for combustion of sodium liquid droplet and its verification

    International Nuclear Information System (INIS)

    Okano, Yasushi

    1997-11-01

    Computer programs for sodium leak and burning phenomena have been developed based on a mechanistic approach. A direct numerical simulation code for sodium liquid droplet burning was developed for the numerical analysis of droplet combustion in a forced-convection air flow. Distributions of heat generation and temperature and the reaction rates of chemical products, such as sodium oxide and hydroxide, are calculated and evaluated using this numerical code. The extended MAC method coupled with a higher-order upwind scheme had previously been used for combustion simulation of a methane-air mixture. In the numerical simulation code for combustion of a sodium liquid droplet, a chemical reaction model for sodium was coupled with the extended MAC method. Combustion of a single sodium liquid droplet was simulated in this report to verify the developed numerical simulation code. The changes of burning rate and reaction products with droplet diameter and inlet wind velocity were investigated. The calculation results conformed qualitatively and quantitatively to experimental and computational observations in combustion engineering. It was confirmed that the numerical simulation code is suitable for the calculation of sodium liquid droplet burning. (author)

  5. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  6. Evolution Model and Simulation of Profit Model of Agricultural Products Logistics Financing

    Science.gov (United States)

    Yang, Bo; Wu, Yan

    2018-03-01

    The agricultural products logistics financial warehousing business mainly involves three parties: agricultural production and processing enterprises, third-party logistics enterprises and financial institutions. To enable the three parties to achieve a win-win situation, the article first derives the replication dynamics and evolutionary stability strategies for the three parties' business participation, then uses the NetLogo simulation platform and the multi-agent modeling and simulation method to establish an evolutionary game simulation model, runs the model under different revenue parameters and, finally, analyzes the simulation results. The aim is to achieve mutually beneficial, win-win participation of the three parties in the agricultural products logistics financing warehouse business, thus promoting the smooth flow of the agricultural products logistics business.
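
    The replicator dynamics referred to above can be sketched for a three-population game in which each party chooses whether to participate, with each strategy share growing in proportion to its payoff advantage. The payoff structure and the numbers below are placeholders for illustration, not the paper's revenue parameters.

        import numpy as np
        from scipy.integrate import solve_ivp

        def replicator(t, z, a, b, c):
            """z = (x, y, w): probabilities that producer, logistics firm and bank participate."""
            x, y, w = z
            # payoff advantage of participating vs. not, for each party (gain term minus cost term)
            fx = a[0] * y * w - a[1]
            fy = b[0] * x * w - b[1]
            fw = c[0] * x * y - c[1]
            return [x * (1 - x) * fx, y * (1 - y) * fy, w * (1 - w) * fw]

        sol = solve_ivp(replicator, (0.0, 200.0), [0.3, 0.4, 0.5],
                        args=((2.0, 0.5), (1.5, 0.4), (1.8, 0.6)))
        print(sol.y[:, -1])   # strategy shares approached in the long run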

  7. Off-gas adsorption model and simulation - OSPREY

    Energy Technology Data Exchange (ETDEWEB)

    Rutledge, V.J. [Idaho National Laboratory, P. O. Box 1625, Idaho Falls, ID (United States)

    2013-07-01

    A capability to accurately simulate the dynamic behavior of advanced fuel cycle separation processes is expected to provide substantial cost savings and many technical benefits. To support this capability, a modeling effort focused on the off-gas treatment system of a used nuclear fuel recycling facility is in progress. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved, so that each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within the Multiphysics Object-Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and Recovery (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas composition, sorbent and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time, from which breakthrough data can be obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. In addition to concentration data, the model predicts temperature along the column length as a function of time and pressure drop along the column length. A description of the OSPREY model, results from krypton adsorption modeling, and plans for modeling the behavior of iodine, xenon, and tritium will be discussed. (author)
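
    The kind of breakthrough calculation described above can be sketched as a one-dimensional dispersed-plug-flow column with a linear isotherm and linear-driving-force adsorption kinetics, solved by the method of lines. This is an illustrative sketch with invented parameters, not the OSPREY/MOOSE implementation.

        import numpy as np
        from scipy.integrate import solve_ivp

        nz, L = 100, 1.0                      # grid cells, column length (m)
        dz = L / nz
        v, D = 0.05, 1.0e-4                   # interstitial velocity (m/s), axial dispersion (m2/s)
        K, k_ldf, rho_eps = 5.0, 0.01, 100.0  # isotherm slope, LDF rate (1/s), bed factor rho_b/eps
        c_in = 1.0                            # inlet gas-phase concentration (arbitrary units)

        def rhs(t, y):
            c, q = y[:nz], y[nz:]
            c_up = np.concatenate(([c_in], c[:-1]))    # upwind value (Dirichlet inlet)
            c_dn = np.concatenate((c[1:], [c[-1]]))    # zero-gradient outlet
            dqdt = k_ldf * (K * c - q)                 # linear-driving-force uptake
            dcdt = (D * (c_dn - 2.0 * c + c_up) / dz**2
                    - v * (c - c_up) / dz
                    - rho_eps * dqdt)
            return np.concatenate((dcdt, dqdt))

        sol = solve_ivp(rhs, (0.0, 3.0e4), np.zeros(2 * nz), method="BDF")
        breakthrough = sol.y[nz - 1]          # outlet concentration versus sol.t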

  8. Confinement effects and mechanistic aspects for montmorillonite nanopores.

    Science.gov (United States)

    Li, Xiong; Zhu, Chang; Jia, Zengqiang; Yang, Gang

    2018-08-01

    Owing to their ubiquity, critical importance and special properties, confined microenvironments have recently triggered overwhelming interest. In this work, all-atom molecular dynamics simulations have been conducted to address the confinement effects and ion-specific effects for electrolyte solutions within montmorillonite nanopores, where the pore widths vary over a wide range. The adsorption number, structure, dynamics and stability of inner- and outer-sphere metal ions are affected by the change of pore width (confinement effects), while the extents are significantly dependent on the type of adsorbed species. The type of adsorbed species is, however, not altered by the magnitude of confinement effects, and confinement effects are similar for different electrolyte concentrations. Ion-specific effects are pronounced for all magnitudes of confinement effects (from non-confined to strongly confined conditions), and the Hofmeister sequences of outer-sphere species are closely associated with the magnitude of confinement effects while those of inner-sphere species remain consistent. In addition, mechanistic aspects of confinement have been proposed using electrical double layer theories, and the results can be generalized to other confined systems that are ubiquitous in biology, chemistry, geology and nanotechnology. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  10. A preliminary study of mechanistic approach in pavement design to accommodate climate change effects

    Science.gov (United States)

    Harnaeni, S. R.; Pramesti, F. P.; Budiarto, A.; Setyawan, A.

    2018-03-01

    Road damage is caused by several factors, including climate change, overloading, and inappropriate procedures for materials and construction. Meanwhile, climate change is a phenomenon which cannot be avoided. Its effects include rising air temperature, sea level rise, changes in rainfall, and a greater intensity of extreme weather phenomena. Previous studies have shown the impacts of climate change on road damage. Therefore, several measures to anticipate the damage should be considered during planning and construction in order to reduce the cost of road maintenance. There are three approaches generally applied in the design of flexible pavement thickness, namely the mechanistic approach, the mechanistic-empirical (ME) approach and the empirical approach. The advantages of applying the mechanistic or mechanistic-empirical (ME) approach are its efficiency and reliability in the design of flexible pavement thickness as well as its capacity to accommodate climate change, compared to the empirical approach. However, the design of flexible pavement thickness in Indonesia generally still applies the empirical approach. This preliminary study aims to emphasize the importance of shifting towards a mechanistic approach in the design of flexible pavement thickness.

  11. Dynamic and accurate assessment of acetaminophen-induced hepatotoxicity by integrated photoacoustic imaging and mechanistic biomarkers in vivo.

    Science.gov (United States)

    Brillant, Nathalie; Elmasry, Mohamed; Burton, Neal C; Rodriguez, Josep Monne; Sharkey, Jack W; Fenwick, Stephen; Poptani, Harish; Kitteringham, Neil R; Goldring, Christopher E; Kipar, Anja; Park, B Kevin; Antoine, Daniel J

    2017-10-01

    The prediction and understanding of acetaminophen (APAP)-induced liver injury (APAP-ILI) and the response to therapeutic interventions is complex. This is due in part to sensitivity and specificity limitations of currently used assessment techniques. Here we sought to determine the utility of integrating translational non-invasive photoacoustic imaging of liver function with mechanistic circulating biomarkers of hepatotoxicity and histological assessment to facilitate a more accurate and precise characterization of APAP-ILI and of the efficacy of therapeutic intervention. Perturbation of liver function and cellular viability was assessed in C57BL/6J male mice by indocyanine green (ICG) clearance, using multispectral optoacoustic tomography (MSOT), and by measurement of mechanistic (miR-122, HMGB1) and established (ALT, bilirubin) circulating biomarkers in response to acetaminophen and to its treatment with acetylcysteine (NAC) in vivo. We utilised a 60% partial hepatectomy model as a situation of defined hepatic functional mass loss against which acetaminophen-induced changes could be compared. Integration of these mechanistic markers correlated with histological features of APAP hepatotoxicity in a time-dependent manner. They accurately reflected the onset of and recovery from hepatotoxicity compared to traditional biomarkers and also reported the efficacy of NAC with high sensitivity. ICG clearance kinetics correlated with histological scores for acute liver damage for APAP (i.e. 3 h timepoint; r=0.90, P<0.0001) and with elevations in both of the mechanistic biomarkers, miR-122 (e.g. 6 h timepoint; r=0.70, P=0.005) and HMGB1 (e.g. 6 h timepoint; r=0.56, P=0.04). For the first time we report the utility of this non-invasive longitudinal imaging approach to provide direct visualisation of liver function coupled with mechanistic biomarkers, in the same animal, allowing the investigation of the toxicological and pharmacological aspects of APAP-ILI and hepatic regeneration. Copyright © 2017

  12. Modelling insights on the partition of evapotranspiration components across biomes

    Science.gov (United States)

    Fatichi, Simone; Pappas, Christoforos

    2017-04-01

    Recent studies using various methodologies have found a large variability (from 35 to 90%) in the ratio of transpiration to total evapotranspiration (denoted as T:ET) across biomes and even at the global scale. Concurrently, previous results suggest that T:ET is independent of mean precipitation and has a positive correlation with Leaf Area Index (LAI). We used the mechanistic ecohydrological model T&C, with a refined process-based description of soil resistance and a detailed treatment of canopy biophysics and ecophysiology, to investigate T:ET across multiple biomes. Contrary to observation-based estimates, simulation results highlight a well-constrained range of mean T:ET across biomes that is also robust to perturbations of the most sensitive parameters. Simulated T:ET was confirmed to be independent of average precipitation, while it was found to be uncorrelated with LAI across biomes. Higher values of LAI increase evaporation from interception but suppress ground evaporation, with the two effects largely cancelling each other at many sites. These results offer mechanistic, model-based evidence for the ongoing research about the range of T:ET and the factors affecting its magnitude across biomes.
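
    For clarity, the ratio discussed above follows from the usual decomposition of total evapotranspiration into transpiration, ground (soil) evaporation and evaporation of canopy-intercepted water (component names as commonly defined, not taken from the paper):

        T{:}ET \;=\; \frac{T}{T + E_{\mathrm{soil}} + E_{\mathrm{int}}}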

  13. Modelling pesticide volatilization after soil application using the mechanistic model Volt'Air

    Science.gov (United States)

    Bedos, Carole; Génermont, Sophie; Le Cadre, Edith; Garcia, Lucas; Barriuso, Enrique; Cellier, Pierre

    Volatilization of pesticides contributes to atmospheric contamination and affects ecosystems and human welfare. Modelling at relevant time and spatial scales is needed to better understand the complex processes involved in pesticide volatilization. Volt'Air-Pesticides has been developed following a two-step procedure to study pesticide volatilization at the field scale and at a quarter-hour time step. Firstly, Volt'Air-NH3 was adapted by extending the initial transfer of solutes to pesticides and by adding specific calculations for physico-chemical equilibria as well as for the degradation of pesticides in soil. Secondly, the model was evaluated for three pesticides applied on bare soil (atrazine, alachlor, and trifluralin), which display a wide range of volatilization rates. A sensitivity analysis confirmed the relevance of tuning K_h. Then, the Volt'Air-Pesticides simulations of environmental conditions and pesticide emission fluxes were compared to measurements obtained under two sets of environmental conditions. The model described fairly well the temporal dynamics of soil water, the soil surface temperature, and the energy budget. Overall, the Volt'Air-Pesticides estimates of the order of magnitude of the volatilization flux of all three compounds were in good agreement with the field measurements. The model also satisfactorily simulated the decrease in the volatilization rate of the three pesticides during night-time as well as the decrease in the soil surface residue of trifluralin before and after incorporation. However, the timing of the maximum flux rate during the day was not correctly described, which is thought to be linked to increased adsorption under dry soil conditions. Thanks to Volt'Air's capacity to deal with pedo-climatic conditions, several existing parameterizations describing adsorption as a function of soil water content could be tested. However, this point requires further investigation. Practically speaking, Volt'Air-Pesticides can be a useful tool to make

  14. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacities increase, DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on
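
    At its core, a DES kernel of the kind such tools implement is a priority queue of time-stamped events processed in chronological order, with handlers that may schedule further events. The sketch below is purely illustrative and is not taken from the book.

        import heapq
        import itertools

        def run(initial_events, horizon):
            """initial_events: iterable of (time, handler); handlers may return new (time, handler) pairs."""
            counter = itertools.count()                  # tie-breaker so the heap never compares handlers
            queue = [(t, next(counter), h) for t, h in initial_events]
            heapq.heapify(queue)
            while queue:
                t, _, handler = heapq.heappop(queue)     # always the earliest pending event
                if t > horizon:
                    break
                for new_t, new_h in handler(t) or []:
                    heapq.heappush(queue, (new_t, next(counter), new_h))

        def finish(t):                                   # a machine that finishes a job every 5 time units
            print(f"job finished at t={t}")
            return [(t + 5.0, finish)]

        run([(5.0, finish)], horizon=20.0)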

  15. Modeling and simulation of large HVDC systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, H.; Sood, V.K.

    1993-01-01

    This paper addresses the complexity of, and the amount of work involved in, preparing simulation data and implementing various converter control schemes, as well as the excessive simulation time involved in the modelling and simulation of large HVDC systems. The Power Electronic Circuit Analysis program (PECAN) is used to address these problems, and a large HVDC system with two dc links is simulated using PECAN. A benchmark HVDC system is studied to compare the simulation results with those from other packages. The simulation time and results are provided in the paper.

  16. Mechanistic aspects of ionic reactions in flames

    DEFF Research Database (Denmark)

    Egsgaard, H.; Carlsen, L.

    1993-01-01

    Some fundamentals of the ion chemistry of flames are summarized. Mechanistic aspects of ionic reactions in flames have been studied using a VG PlasmaQuad, the ICP-system being substituted by a simple quartz burner. Simple hydrocarbon flames as well as sulfur-containing flames have been investigated...

  17. Simulation platform to model, optimize and design wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    This report is a general overview of the results obtained in the project 'Electrical Design and Control. Simulation Platform to Model, Optimize and Design Wind Turbines'. The motivation for this research project is the ever-increasing wind energy penetration into the power network. The main goal of the project is therefore to create a model database in different simulation tools for system optimization of wind turbine systems. Using this model database, a simultaneous optimization of the aerodynamic, mechanical, electrical and control systems over the whole range of wind speeds and grid characteristics can be achieved. The report is structured in six chapters. First, the background of this project and the main goals, as well as the structure of the simulation platform, are given. The main topologies for wind turbines, which have been taken into account during the project, are briefly presented. Then, the simulation tools considered in this platform, namely HAWC, DIgSILENT, Saber and Matlab/Simulink, are described. The focus here is on the modelling and simulation time-scale aspects. The abilities of these tools are complementary, and together they can cover all the modelling aspects of wind turbines, e.g. mechanical loads, power quality, switching, control and grid faults. However, other simulation packages, e.g. PSCAD/EMTDC, can easily be added to the simulation platform. New models and new control algorithms for wind turbine systems have been developed and tested in these tools. All these models are collected in dedicated libraries in Matlab/Simulink as well as in Saber. Some simulation results from the considered tools are presented for MW wind turbines. These simulation results focus on fixed-speed and variable-speed/pitch wind turbines. A good agreement with the real behaviour of these systems is obtained for each simulation tool. These models can easily be extended to model different kinds of wind turbines or large wind

  18. Simulation of a flowing bed kiln for the production of uranium tetrafluoride; Simulation d'un four a lit coulant pour la production de tetrafluorure d'uranium

    Energy Technology Data Exchange (ETDEWEB)

    Dussoubs, B.; Patisson, F.; Ablitzer, D. [Ecole des Mines de Nancy, Lab. de Science et Genie des Materiaux et de Metallurgie, UMR 7584, 54 (France); Jourde, J. [Comurhex, Usine de Malvesi, 11 - Narbonne (France); Houzelot, J.L. [Ecole Nationale Superieure des Industries Chimiques (ENSIC), UPR 6811, 54 - Villers-les-Nancy (France)

    2001-07-01

    A flowing bed kiln is a gas-solid reactor used in the civil nuclear fuel cycle for the successive conversion of uranium trioxide (UO3) into uranium dioxide (UO2) and then into uranium tetrafluoride (UF4). A numerical model has been developed which simulates the steady-state behaviour of this reactor. This model describes the physico-chemical phenomena involved and combines a mechanistic approach in the vertical region of the kiln (solved by the finite volume method) with a systemic approach in the horizontal region, as in a cascade-of-mixers model. The first results have been obtained for the reference operating conditions of the industrial kiln. Some possible improvements of the optimum temperature progression inside the kiln are discussed. (J.S.)

  19. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  20. Mechanistic interpretation of glass reaction: Input to kinetic model development

    International Nuclear Information System (INIS)

    Bates, J.K.; Ebert, W.L.; Bradley, J.P.; Bourcier, W.L.

    1991-05-01

    Actinide-doped SRL 165 type glass was reacted in J-13 groundwater at 90°C for times up to 278 days. The reaction was characterized by both solution and solid analyses. The glass was seen to react nonstoichiometrically, with preferred leaching of alkali metals and boron. High-resolution electron microscopy revealed the formation of a complex layer structure which became separated from the underlying glass as the reaction progressed. The formation of the layer and its effect on continued glass reaction are discussed with respect to the current model for glass reaction used in the EQ3/6 computer simulation. It is concluded that the layer formed after 278 days is not protective and may eventually become fractured and generate particulates that may be transported by liquid water. 5 refs., 5 figs., 3 tabs