WorldWideScience

Sample records for mechanistically based model

  1. Appropriateness of mechanistic and non-mechanistic models for the application of ultrafiltration to mixed waste

    International Nuclear Information System (INIS)

    Foust, Henry; Ghosehajra, Malay

    2007-01-01

    This study asks two questions: (1) How appropriate is the use of a basic filtration equation for the application of ultrafiltration to mixed waste, and (2) how appropriate are non-parametric models for permeate rates (volumes)? To answer these questions, mechanistic and non-mechanistic approaches are developed for the permeate rates and volumes of an ultrafiltration/mixed-waste system in dia-filtration mode. The mechanistic approach is based on a filtration equation which states that t/V vs. V is a linear relationship. The coefficients of this linear regression are composed of physical/chemical parameters of the system and are derived from the mass balance equation for the membrane and its developing cake layer. For several sets of data, a high correlation supports the assertion that t/V vs. V is a linear relationship. It is also shown that non-mechanistic approaches, i.e., the use of regression models, are not appropriate. One model considered is Q(p) = a*ln(Cb)+b. Regression models are inappropriate because the scale-up of permeate rates (volumes) from a bench-scale (pilot-scale) study to full scale is not simply the ratio of the two membrane surface areas. (authors)
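
    The linear form t/V = a + b·V invoked above is straightforward to check against data by ordinary linear regression. The sketch below uses made-up permeate volumes and times (not data from the study) to illustrate the fit and the correlation check:

```python
import numpy as np

# Hypothetical dia-filtration data: cumulative permeate volume V (L)
# and elapsed time t (h). Values are illustrative only.
V = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
t = np.array([0.30, 0.70, 1.20, 1.80, 2.50, 3.30])

# The basic filtration equation predicts t/V = a + b*V, i.e. a
# straight line when t/V is plotted against V.
y = t / V
b, a = np.polyfit(V, y, 1)          # slope b, intercept a

# Correlation coefficient indicates how well the linear form holds.
r = np.corrcoef(V, y)[0, 1]
print(f"intercept a = {a:.3f}, slope b = {b:.3f}, r = {r:.4f}")
```

    In the mechanistic reading, a and b are not free parameters: they are composed of physical/chemical properties of the membrane and cake layer, which is what makes the fitted line scalable in a way a purely empirical regression is not.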

  2. Modeling of the pyruvate production with Escherichia coli: comparison of mechanistic and neural networks-based models.

    Science.gov (United States)

    Zelić, B; Bolf, N; Vasić-Racki, D

    2006-06-01

    Three different models: the unstructured mechanistic black-box model, the input-output neural network-based model and the externally recurrent neural network model were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain. The experimental data were taken from the recently described batch and fed-batch experiments [Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003. (In English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al. Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built from the experimental data obtained in the fed-batch pyruvate production experiments with a constant glucose feed rate. Model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with a constant acetate feed rate. The dynamics of the substrate and product concentration changes were estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks can be used for the modeling of complex microbial fermentation processes, even under conditions in which mechanistic unstructured models cannot be applied.

  3. Application of mechanistic models to fermentation and biocatalysis for next-generation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Eliasson Lantz, Anna; Tufvesson, Pär

    2010-01-01

    Mechanistic models are based on deterministic principles, and recently, interest in them has grown substantially. Herein we present an overview of mechanistic models and their applications in biotechnology, including future perspectives. Model utility is highlighted with respect to selection of variables required for measurement, control and process design. In the near future, mechanistic models with a higher degree of detail will play key roles in the development of efficient next-generation fermentation and biocatalytic processes. Moreover, mechanistic models will be used increasingly...

  4. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  5. Evaluation of mechanistic DNB models using HCLWR CHF data

    International Nuclear Information System (INIS)

    Iwamura, Takamichi; Watanabe, Hironori; Okubo, Tsutomu; Araya, Fumimasa; Murao, Yoshio.

    1992-03-01

    The onset of departure from nucleate boiling (DNB) in light water reactors (LWRs) has generally been predicted with empirical correlations. Since these correlations have little physical basis and contain adjustable empirical constants determined by best fits to test data, the applicable geometries and flow conditions are limited to the original experimental ranges. In order to obtain a more universal prediction method, several mechanistic DNB models based on physical approaches have been proposed in recent years. However, the predictive capabilities of mechanistic DNB models have not been verified successfully, especially for advanced LWR design purposes. In this report, typical mechanistic DNB models are reviewed and compared with critical heat flux (CHF) data for a high conversion light water reactor (HCLWR). The experiments were performed using a triangular 7-rod array with a non-uniform axial heat flux distribution. Test pressure was 16 MPa, mass velocities ranged from 800 to 3100 kg/m²·s and exit qualities from -0.07 to 0.19. The evaluated models are: 1) Weisman-Pei, 2) Chang-Lee, 3) Lee-Mudawwar, 4) Lin-Lee-Pei, and 5) Katto. The first two models are based on a near-wall bubble crowding model and the other three on a sublayer dryout model. The comparison with experimental data indicated that the Weisman-Pei model agreed relatively well with the CHF data. The effects of the empirical constants in each model on the CHF calculation were clarified by sensitivity studies. It was also found that the magnitudes of the physical quantities obtained in the course of the calculation differed significantly between models. Therefore, microscopic observation of the onset of DNB on the heated surface is essential to clarify the DNB mechanism and establish a general mechanistic DNB model based on the physical phenomena. (author)

  6. Mechanistic model for microbial growth on hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Mallee, F M; Blanch, H W

    1977-12-01

    Based on available information describing the transport and consumption of insoluble alkanes, a mechanistic model is proposed for microbial growth on hydrocarbons. The model describes the atypical growth kinetics observed, and has implications in the design of large scale equipment for single cell protein (SCP) manufacture from hydrocarbons. The model presents a framework for comparison of the previously published experimental kinetic data.
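
    The abstract does not reproduce the model equations, but growth kinetics of this kind are commonly described by Monod-type expressions. A minimal, purely illustrative Euler simulation of Monod growth (the parameter values below are assumptions, not values from the paper) might look like:

```python
# Minimal Monod-type sketch of microbial growth on a substrate.
# mu_max, Ks and the yield Y are illustrative assumptions.
mu_max, Ks, Y = 0.5, 0.2, 0.4    # 1/h, g/L, g biomass per g substrate

def step(X, S, dt=0.01):
    """Advance biomass X and substrate S by one explicit Euler step."""
    mu = mu_max * S / (Ks + S)   # Monod specific growth rate (1/h)
    dX = mu * X * dt             # biomass produced in this step
    return X + dX, S - dX / Y    # substrate consumed per yield Y

X, S = 0.05, 10.0                # initial biomass and substrate (g/L)
for _ in range(2000):            # simulate 20 h
    X, S = step(X, S)
print(f"final biomass = {X:.2f} g/L, residual substrate = {S:.3f} g/L")
```

    Growth is near-exponential while substrate is plentiful and stops as it is exhausted; a hydrocarbon-specific model would replace the simple dissolved-substrate term with a description of transport from insoluble alkane droplets.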

  7. Mechanistic species distribution modelling as a link between physiology and conservation.

    Science.gov (United States)

    Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W

    2015-01-01

    Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and

  8. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot-scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentrations. This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied off-line as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking into account the oxygen transfer conditions as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development...

  9. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding.

  10. Testing mechanistic models of growth in insects.

    Science.gov (United States)

    Maino, James L; Kearney, Michael R

    2015-11-22

    Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory from many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature, whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg⁻¹) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).
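
    The qualitative effect described, faster near-exponential growth when assimilation scales with mass rather than with surface area, can be illustrated with a toy simulation. The rate constants below are assumptions for illustration, not the authors' parameters:

```python
# Contrast two growth assumptions (illustrative only): classic
# von-Bertalanffy-style growth, where assimilation scales with surface
# area (m**(2/3)), versus assimilation scaling with mass itself, which
# yields near-exponential growth.
a, b, dt, steps = 0.3, 0.05, 0.01, 1000   # assumed rate constants

m_surf, m_mass = 1.0, 1.0
for _ in range(steps):                     # simulate 10 time units
    m_surf += (a * m_surf ** (2 / 3) - b * m_surf) * dt
    m_mass += (a * m_mass - b * m_mass) * dt

print(f"surface-area scaling: {m_surf:.2f}")
print(f"mass scaling (near-exponential): {m_mass:.2f}")
```

    Starting from the same mass and net rate, the mass-scaling trajectory pulls away quickly, which is the signature the paper tests for in insect growth data.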

  11. Description and evaluation of a mechanistically based conceptual model for spall

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, F.D.; Knowles, M.K.; Thompson, T.W. [and others

    1997-08-01

    A mechanistically based model for a possible spall event at the WIPP site is developed and evaluated in this report. Release of waste material to the surface during an inadvertent borehole intrusion is possible if future states of the repository include high gas pressure and waste material consisting of fine particulates having low mechanical strength. The conceptual model incorporates the physics of wellbore hydraulics, coupled to transient gas flow to the intrusion borehole, and the mechanical response of the waste. Degraded waste properties are used in the model. The evaluations include both numerical and analytical implementations of the conceptual model. A tensile failure criterion is assumed appropriate for calculating the volumes of waste experiencing fragmentation. Calculations show that for repository gas pressures less than 12 MPa, no tensile failure occurs. Minimal volumes of material experience failure below gas pressures of 14 MPa. Repository conditions dictate that the probability of gas pressures exceeding 14 MPa is approximately 1%. For these conditions, a maximum failed volume of 0.25 m³ is calculated.

  12. Description and evaluation of a mechanistically based conceptual model for spall

    International Nuclear Information System (INIS)

    Hansen, F.D.; Knowles, M.K.; Thompson, T.W.

    1997-08-01

    A mechanistically based model for a possible spall event at the WIPP site is developed and evaluated in this report. Release of waste material to the surface during an inadvertent borehole intrusion is possible if future states of the repository include high gas pressure and waste material consisting of fine particulates having low mechanical strength. The conceptual model incorporates the physics of wellbore hydraulics, coupled to transient gas flow to the intrusion borehole, and the mechanical response of the waste. Degraded waste properties are used in the model. The evaluations include both numerical and analytical implementations of the conceptual model. A tensile failure criterion is assumed appropriate for calculating the volumes of waste experiencing fragmentation. Calculations show that for repository gas pressures less than 12 MPa, no tensile failure occurs. Minimal volumes of material experience failure below gas pressures of 14 MPa. Repository conditions dictate that the probability of gas pressures exceeding 14 MPa is approximately 1%. For these conditions, a maximum failed volume of 0.25 m³ is calculated.

  13. Profiling the biological activity of oxide nanomaterials with mechanistic models

    NARCIS (Netherlands)

    Burello, E.

    2013-01-01

    In this study we present three mechanistic models for profiling the potential biological and toxicological effects of oxide nanomaterials. The models attempt to describe the reactivity, protein adsorption and membrane adhesion processes of a large range of oxide materials and are based on properties

  14. An improved mechanistic critical heat flux model for subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Young Min [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Chang, Soon Heung [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Based on bubble coalescence adjacent to the heated wall as the flow structure at the CHF condition, Chang and Lee developed a mechanistic critical heat flux (CHF) model for subcooled flow boiling. In this paper, improvements to the Chang-Lee model are implemented with more solid theoretical bases for subcooled and low-quality flow boiling in tubes. Nedderman-Shearer's equations for the skin friction factor and universal velocity profile models are employed. The slip effect of the movable bubbly layer is implemented to improve the predictability at low mass flow. Also, a mechanistic subcooled flow boiling model is used to predict the flow quality and void fraction. The performance of the present model is verified using the KAIST CHF database of water in uniformly heated tubes. It is found that the present model agrees satisfactorily with the experimental data, to within a 9% RMS error. 9 refs., 5 figs. (Author)

  15. An improved mechanistic critical heat flux model for subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Young Min [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Chang, Soon Heung [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Based on bubble coalescence adjacent to the heated wall as the flow structure at the CHF condition, Chang and Lee developed a mechanistic critical heat flux (CHF) model for subcooled flow boiling. In this paper, improvements to the Chang-Lee model are implemented with more solid theoretical bases for subcooled and low-quality flow boiling in tubes. Nedderman-Shearer's equations for the skin friction factor and universal velocity profile models are employed. The slip effect of the movable bubbly layer is implemented to improve the predictability at low mass flow. Also, a mechanistic subcooled flow boiling model is used to predict the flow quality and void fraction. The performance of the present model is verified using the KAIST CHF database of water in uniformly heated tubes. It is found that the present model agrees satisfactorily with the experimental data, to within a 9% RMS error. 9 refs., 5 figs. (Author)
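
    The 9% RMS figure quoted in these records is the usual root-mean-square of relative prediction errors over a databank. A sketch with invented CHF values (not the KAIST data):

```python
import numpy as np

# RMS-error check of a CHF model against measurements.
# The values below are made up for illustration (MW/m²).
measured = np.array([2.10, 3.45, 1.80, 4.20, 2.95])
predicted = np.array([2.05, 3.60, 1.75, 4.05, 3.10])

rel_err = (predicted - measured) / measured
rms = np.sqrt(np.mean(rel_err ** 2))
print(f"RMS relative error = {100 * rms:.1f}%")
```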

  16. Mechanistic modeling of CHF in forced-convection subcooled boiling

    International Nuclear Information System (INIS)

    Podowski, M.Z.; Alajbegovic, A.; Kurul, N.; Drew, D.A.; Lahey, R.T. Jr.

    1997-05-01

    Because of the complexity of phenomena governing boiling heat transfer, the approach to solve practical problems has traditionally been based on experimental correlations rather than mechanistic models. The recent progress in computational fluid dynamics (CFD), combined with improved experimental techniques in two-phase flow and heat transfer, makes the use of rigorous physically-based models a realistic alternative to the current simplistic phenomenological approach. The objective of this paper is to present a new CFD model for critical heat flux (CHF) in low quality (in particular, in subcooled boiling) forced-convection flows in heated channels

  17. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    Science.gov (United States)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2007-05-01

    In this paper, we discuss a joint approach to calibration and uncertainty estimation for hydrologic systems that combines a top-down, data-based mechanistic (DBM) modelling methodology and a bottom-up, reductionist modelling methodology. The combined approach is applied to the modelling of the River Hodder catchment in North-West England. The top-down DBM model provides a well identified, statistically sound yet physically meaningful description of the rainfall-flow data, revealing important characteristics of the catchment-scale response, such as the nature of the effective rainfall nonlinearity and the partitioning of the effective rainfall into different flow pathways. These characteristics are defined inductively from the data without prior assumptions about the model structure, other than that it is within the generic class of nonlinear differential-delay equations. The bottom-up modelling is developed using the TOPMODEL, whose structure is assumed a priori and is evaluated by global sensitivity analysis (GSA) in order to identify the most sensitive and important parameters. The subsequent exercises in calibration and validation, performed with Generalized Likelihood Uncertainty Estimation (GLUE), are carried out in the light of the GSA and DBM analyses. This allows for the pre-calibration of the priors used for GLUE, in order to eliminate dynamical features of the TOPMODEL that have little effect on the model output and would be rejected at the structure identification phase of the DBM modelling analysis. In this way, the elements of meaningful subjectivity in the GLUE approach, which allow the modeler to interact in the modelling process by constraining the model to have a specific form prior to calibration, are combined with other more objective, data-based benchmarks for the final uncertainty estimation. GSA plays a major role in building a bridge between the hypothetico-deductive (bottom-up) and inductive (top-down) approaches and helps to improve the...
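
    The GSA step used to screen parameters can be sketched in miniature with a variance-based first-order index. The toy model, parameter ranges and binned estimator below are illustrative assumptions, not TOPMODEL or the authors' procedure:

```python
import numpy as np

# Crude variance-based global sensitivity analysis on a toy model.
rng = np.random.default_rng(0)

def model(k, n):
    # Toy response: strongly sensitive to k, weakly sensitive to n.
    return k ** 2 + 0.1 * n

N = 10000
k = rng.uniform(0.5, 1.5, N)
n = rng.uniform(0.5, 1.5, N)
y = model(k, n)

def first_order(x, y, bins=20):
    """First-order index estimate: Var(E[y|x]) / Var(y), via binning."""
    sums, _ = np.histogram(x, bins=bins, weights=y)
    counts, _ = np.histogram(x, bins=bins)
    return np.var(sums / counts) / np.var(y)

print(f"S_k ~ {first_order(k, y):.2f}, S_n ~ {first_order(n, y):.2f}")
```

    Parameters with near-zero first-order (and total) indices are the ones that GSA flags as candidates for fixing before GLUE calibration.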

  18. INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS

    Science.gov (United States)

    Incorporating Mechanistic Insights in a PBPK Model for Arsenic. Elaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc. A physiologically based pharmacokinetic...

  19. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    Science.gov (United States)

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
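
    The core idea, stochastic movement over an environmental potential landscape with a Boltzmann-style decision rule, can be sketched in a few lines. The grid, the quadratic potential and the "temperature" T below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

# Biased random walk on a potential landscape: at each step the agent
# picks a neighbouring cell with probability proportional to
# exp(-U/T), so low-potential (favourable) cells are preferred.
rng = np.random.default_rng(1)

size = 50
xs, ys = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
U = ((xs - 35) ** 2 + (ys - 35) ** 2) / 500.0   # potential well at (35, 35)

def walk(pos, T=0.05, steps=500):
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for _ in range(steps):
        cand = [((pos[0] + dx) % size, (pos[1] + dy) % size)
                for dx, dy in moves]
        w = np.array([np.exp(-U[c] / T) for c in cand])
        pos = cand[rng.choice(4, p=w / w.sum())]
    return pos

final = walk((5, 5))
print("final position:", final)
```

    With a real environmental potential derived from data, trajectories simulated this way can be compared against tracked migration routes, which is the qualitative test the paper performs.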

  20. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.

  1. Mechanistic model for void distribution in flashing flow

    International Nuclear Information System (INIS)

    Riznic, J.; Ishii, M.; Afgan, N.

    1987-01-01

    The problem of discharging an initially subcooled liquid from a high-pressure condition into a low-pressure environment is quite important in several industrial systems, such as nuclear reactors and chemical reactors. A new model for the flashing process is proposed here, based on wall nucleation theory, a bubble growth model and a drift-flux bubble transport model. In order to calculate the bubble number density, a bubble number transport equation with a distributed source from the wall nucleation sites is used. The model predictions in terms of the void fraction are compared to the Moby Dick and BNL experimental data. Satisfactory agreement is obtained from the present model without any floating parameter adjusted to the data. This result indicates that, at least for the experimental conditions considered here, mechanistic prediction of the flashing phenomenon is possible based on the present wall-nucleation-based model. 43 refs., 4 figs.

  2. Development of Improved Mechanistic Deterioration Models for Flexible Pavements

    DEFF Research Database (Denmark)

    Ullidtz, Per; Ertman, Hans Larsen

    1998-01-01

    The paper describes a pilot study in Denmark whose main objective was to develop improved mechanistic deterioration models for flexible pavements, based on an accelerated full-scale test on an instrumented pavement in the Danish Road Testing Machine. The study was the first in the "International Pavement Subgrade Performance Study" sponsored by the Federal Highway Administration (FHWA), USA. The paper describes in detail the data analysis and the resulting models for rutting and roughness, and a model for the plastic strain in the subgrade. The reader will get an understanding of the work needed...

  3. INTEGRATION OF QSAR AND SAR METHODS FOR THE MECHANISTIC INTERPRETATION OF PREDICTIVE MODELS FOR CARCINOGENICITY

    Directory of Open Access Journals (Sweden)

    Natalja Fjodorova

    2012-07-01

    The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter-propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals, using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using an intrinsic mapping technique, specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules, such as their shape and the electronic surroundings related to the reactivity of molecules. It was illustrated how the descriptors correlate with particular structural alerts (SAs) for carcinogenicity with a recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals sharing a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of the safety of chemicals.

  4. Mechanistic effect modeling for ecological risk assessment: where to go from here?

    Science.gov (United States)

    Grimm, Volker; Martin, Benjamin T

    2013-07-01

    Mechanistic effect models (MEMs) consider the mechanisms of how chemicals affect individuals and ecological systems such as populations and communities. There is an increasing awareness that MEMs have high potential to make risk assessment of chemicals more ecologically relevant than current standard practice. Here we discuss what kinds of MEMs are needed to improve scientific and regulatory aspects of risk assessment. To make valid predictions for a wide range of environmental conditions, MEMs need to include a sufficient amount of emergence, for example, population dynamics emerging from what individual organisms do. We present 1 example where the life cycle of individuals is described using Dynamic Energy Budget theory. The resulting individual-based population model is thus parameterized at the individual level but correctly predicts multiple patterns at the population level. This is the case for both control and treated populations. We conclude that the state-of-the-art in mechanistic effect modeling has reached a level where MEMs are robust and predictive enough to be used in regulatory risk assessment. Mechanistic effect models will thus be used to advance the scientific basis of current standard practice and will, if their development follows Good Modeling Practice, be included in a standardized way in future regulatory risk assessments. Copyright © 2013 SETAC.

  5. Calibrating mechanistic-empirical pavement performance models with an expert matrix

    Energy Technology Data Exchange (ETDEWEB)

    Tighe, S.; AlAssar, R.; Haas, R. [Waterloo Univ., ON (Canada). Dept. of Civil Engineering; Zhiwei, H. [Stantec Consulting Ltd., Cambridge, ON (Canada)

    2001-07-01

    Proper management of pavement infrastructure requires pavement performance modelling. For the past 20 years, the Ontario Ministry of Transportation has used the Ontario Pavement Analysis of Costs (OPAC) system for pavement design. Pavement needs, however, have changed substantially during that time. To address these changing needs, a new research contract is underway to enhance the model and verify its predictions, particularly at extreme points such as low and high traffic volume pavement design. This initiative included a complete evaluation of the existing OPAC pavement design method, the construction of a new set of pavement performance prediction models, and the development of a flexible pavement design procedure that incorporates reliability analysis. The design was also expanded to include rigid pavement designs and modification of the existing life-cycle cost analysis procedure, which includes both agency cost and road user cost. Performance prediction and life-cycle costs were developed based on several factors, including material properties, traffic loads and climate. Construction and maintenance schedules were also considered. The methodology for the calibration and validation of a mechanistic-empirical flexible pavement performance model was described. Mechanistic-empirical design methods combine theory-based design, such as calculated stresses, strains or deflections, with empirical methods, where a measured response is associated with thickness and pavement performance. Elastic layer analysis was used to determine pavement response and to identify the most effective design using cumulative Equivalent Single Axle Loads (ESALs), subgrade type and layer thickness. The new mechanistic-empirical model separates the environment and traffic effects on performance, which makes it possible to quantify regional differences between Southern and Northern Ontario. In addition, roughness can be calculated in terms of the International Roughness Index or Riding Comfort Index.
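    The ESAL-based comparison described above can be illustrated with a small sketch. This is a hedged example of the AASHTO fourth-power load-equivalency approximation commonly used to accumulate mixed axle loads into ESALs; the axle-load spectrum and traffic counts below are invented for illustration and are not from the study.

```python
# Hedged sketch: cumulative ESALs via the AASHTO fourth-power load-equivalency
# approximation. The axle-load spectrum below is illustrative, not study data.
STANDARD_AXLE_KN = 80.0  # 18-kip (80 kN) standard single axle


def load_equivalency(axle_load_kn):
    """Approximate equivalency factor for a single axle (fourth-power law)."""
    return (axle_load_kn / STANDARD_AXLE_KN) ** 4


def cumulative_esals(axle_spectrum):
    """Sum ESALs over an axle-load spectrum {load_kN: annual repetitions}."""
    return sum(load_equivalency(load) * reps
               for load, reps in axle_spectrum.items())


spectrum = {40.0: 120_000, 80.0: 30_000, 100.0: 5_000}
print(round(cumulative_esals(spectrum)))  # → 49707
```

    A design procedure would then compare candidate layer thicknesses and subgrade types against the cumulative ESALs forecast over the design life.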

  6. Specialists without spirit: limitations of the mechanistic biomedical model.

    Science.gov (United States)

    Hewa, S; Hetherington, R W

    1995-06-01

    This paper examines the origin and development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is a part of the broad process of rationalization that began in sixteenth-century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control, the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, a leading advocate of general systems theory, is one of the leading proponents of a new medical model which includes the general quality of life, a clean environment, and psychological or spiritual stability of life. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.

  7. Prediction of warfarin maintenance dose in Han Chinese patients using a mechanistic model based on genetic and non-genetic factors.

    Science.gov (United States)

    Lu, Yuan; Yang, Jinbo; Zhang, Haiyan; Yang, Jin

    2013-07-01

    Many attempts have been made to predict the warfarin maintenance dose in patients beginning warfarin therapy using a descriptive model based on multiple linear regression. Here we report the first attempt to develop a comprehensive mechanistic model integrating in vitro-in vivo extrapolation (IVIVE) with a pharmacokinetic-pharmacodynamic model to predict the warfarin maintenance dose in Han Chinese patients. The model incorporates demographic factors [sex, age, body weight (BW)] and the genetic polymorphisms of cytochrome P450 (CYP) 2C9 (CYP2C9) and vitamin K epoxide reductase complex subunit 1 (VKORC1). Information on the various factors, mean warfarin daily dose and International Normalized Ratio (INR) was available for a cohort of 197 Han Chinese patients. Based on in vitro enzyme kinetic parameters for S-warfarin metabolism, demographic data for Han Chinese and some scaling factors, the S-warfarin clearance (CL) was predicted for patients in the cohort with different CYP2C9 genotypes using IVIVE. The plasma concentration of S-warfarin after a single oral dose was simulated using a one-compartment pharmacokinetic model with first-order absorption and a lag time and was combined with a mechanistic coagulation model to simulate the INR response. The warfarin maintenance dose was then predicted based on the demographic data and genotypes of CYP2C9 and VKORC1 for each patient and using the observed steady-state INR (INRss) as a target value. Finally, sensitivity analysis was carried out to determine which factor(s) affect the warfarin maintenance dose most strongly. The predictive performance of this mechanistic model is not inferior to that of our previous descriptive model. There were significant differences in the mean warfarin daily dose in patients with different CYP2C9 and VKORC1 genotypes. Using IVIVE, the predicted mean CL of S-warfarin for patients with CYP2C9*1/*3 (0.092 l/h, n = 11) was 57 % less than for those with wild-type *1/*1 (0.215 l/h, n
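    The pharmacokinetic step described in this abstract, a one-compartment model with first-order absorption and a lag time, can be sketched as follows. The parameter values are illustrative only (0.2 L/h approximates the wild-type S-warfarin clearance quoted in the abstract); they are not the study's fitted estimates.

```python
import math

# Hedged sketch: one-compartment pharmacokinetic model with first-order
# absorption and a lag time, as described in the abstract. Parameter values
# are illustrative, not the study's estimates.
def conc(t, dose, F, ka, V, CL, tlag):
    """Plasma concentration at time t (h) after a single oral dose (mg)."""
    ke = CL / V  # first-order elimination rate constant (1/h)
    if t <= tlag:
        return 0.0
    tt = t - tlag
    return (F * dose * ka) / (V * (ka - ke)) * (
        math.exp(-ke * tt) - math.exp(-ka * tt))


# e.g. 5 mg dose, concentration (mg/L) 24 h post-dose
print(round(conc(24.0, dose=5.0, F=1.0, ka=1.0, V=10.0, CL=0.2, tlag=0.5), 3))  # → 0.319
```

    In the study this pharmacokinetic output feeds a mechanistic coagulation model to simulate the INR response; only the concentration step is sketched here.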

  8. Conceptual models for waste tank mechanistic analysis

    International Nuclear Information System (INIS)

    Allemann, R.T.; Antoniak, Z.I.; Eyler, L.L.; Liljegren, L.M.; Roberts, J.S.

    1992-02-01

    Pacific Northwest Laboratory (PNL) is conducting a study for Westinghouse Hanford Company (Westinghouse Hanford), a contractor for the US Department of Energy (DOE). The purpose of the work is to study possible mechanisms and fluid dynamics contributing to the periodic release of gases from double-shell waste storage tanks at the Hanford Site in Richland, Washington. This interim report emphasizing the modeling work follows two other interim reports, Mechanistic Analysis of Double-Shell Tank Gas Release Progress Report -- November 1990 and Collection and Analysis of Existing Data for Waste Tank Mechanistic Analysis Progress Report -- December 1990, that emphasized data correlation and mechanisms. The approach in this study has been to assemble and compile data that are pertinent to the mechanisms, analyze the data, evaluate physical properties and parameters, evaluate hypothetical mechanisms, and develop mathematical models of mechanisms

  9. Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation: Report of an FDA Public Workshop.

    Science.gov (United States)

    Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R

    2017-08-01

    On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and drug product information into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  10. Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices

    Science.gov (United States)

    Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.

    2017-12-01

    The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to represent more mechanistically and consistently the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates will also be made to account for factors, such as nitrite (NO2-) accumulation, that are not addressed in EPIC. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions, found to be most probable under high NO2- availability, will be based on observed ratios of HONO to NO emissions under different soil moistures, pH and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed.
Our updated framework will help to predict the diurnal and daily variability

  11. Mechanistic species distribution modeling reveals a niche shift during invasion.

    Science.gov (United States)

    Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M

    2017-06-01

    Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual

  12. The Combined Use of Correlative and Mechanistic Species Distribution Models Benefits Low Conservation Status Species.

    Directory of Open Access Journals (Sweden)

    Thibaud Rougier

    Full Text Available Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Both correlative and mechanistic SDMs have been built for only a few species, but allis shad (Alosa alosa), an endangered anadromous fish species, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change, and climate change could accordingly not be listed as a major threat for allis shad. The congruence in predicted range limits between SDM projections was the next point of interest. The difference, when noticed, required deepening our understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit between the two methods strongly suggested here that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local

  13. SITE-94. Adaptation of mechanistic sorption models for performance assessment calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.

    1996-10-01

    Sorption is considered in most predictive models of radionuclide transport in geologic systems. Most models simulate the effects of sorption in terms of empirical parameters, which however can be criticized because the data are only strictly valid under the experimental conditions at which they were measured. An alternative is to adopt a more mechanistic modeling framework based on recent advances in understanding the electrical properties of oxide mineral-water interfaces. It has recently been proposed that these 'surface-complexation' models may be directly applicable to natural systems. A possible approach for adapting mechanistic sorption models for use in performance assessments, using this 'surface-film' concept, is described in this report. Surface-acidity parameters in the Generalized Two-Layer surface complexation model are combined with surface-complexation constants for Np(V) sorption on hydrous ferric oxide to derive an analytical model enabling direct calculation of corresponding intrinsic distribution coefficients as a function of pH and Ca2+, Cl-, and HCO3- concentrations. The surface-film concept is then used to calculate whole-rock distribution coefficients for Np(V) sorption by altered granitic rocks coexisting with a hypothetical, oxidized Aespoe groundwater. The calculated results suggest that the distribution coefficients for Np adsorption on these rocks could range from 10 to 100 ml/g. Independent estimates of Kd for Np sorption in similar systems, based on an extensive review of experimental data, are consistent, though slightly conservative, with respect to the calculated values. 31 refs

  14. Mechanistic modeling of heat transfer process governing pressure tube-to-calandria tube contact and fuel channel failure

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2002-01-01

    Heat transfer behaviour and phenomena associated with ballooning deformation of a pressure tube into contact with a calandria tube have been analyzed and mechanistic models have been developed to describe the heat transfer and thermal-mechanical processes. These mechanistic models are applied to analyze experiments performed in various COG funded Contact Boiling Test series. Particular attention is given in the modeling to characterization of the conditions for which fuel channel failure may occur. Mechanistic models describing the governing heat transfer and thermal-mechanical processes are presented. The technical basis for characterizing parameters of the models from the general heat transfer literature is described. The validity of the models is demonstrated by comparison with experimental data. Fuel channel integrity criteria are proposed which are based upon three necessary and sequential mechanisms: Onset of CHF and local drypatch formation at contact; sustained film boiling in the post-contact period; and creep strain to failure of the calandria tube while in sustained film boiling. (author)

  15. Blinded prospective evaluation of computer-based mechanistic schizophrenia disease model for predicting drug response.

    Directory of Open Access Journals (Sweden)

    Hugo Geerts

    Full Text Available The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published 'Quantitative Systems Pharmacology' computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D(2) antagonist, and ocaperidone, a very high affinity dopamine D(2) antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine, and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects, and can be a helpful tool for drug discovery and development.

  16. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    Directory of Open Access Journals (Sweden)

    Rasmus Magnusson

    2017-06-01

    Full Text Available Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODEs) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady-state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high-performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific bindings and time-series data together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings.
In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models
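    The core-system idea behind LASSIM can be illustrated at toy scale: a small set of mechanistically defined ODEs for "core" regulators, with a peripheral gene driven by the core system. The network, Hill kinetics and parameter values below are invented for illustration and are not LASSIM's inferred Th2 model; a plain forward-Euler integrator keeps the sketch dependency-free.

```python
# Hedged, toy-scale sketch of the LASSIM idea: a small system of
# mechanistically defined ODEs in which "core" regulators drive a peripheral
# gene through Hill kinetics. All choices below are illustrative.
def hill(x, k=1.0, n=2):
    """Hill activation function."""
    return x ** n / (k ** n + x ** n)


def grn(y):
    core1, core2, periph = y
    dcore1 = 1.0 - 0.5 * core1            # constitutive production + decay
    dcore2 = hill(core1) - 0.5 * core2    # core2 activated by core1
    dperiph = 2.0 * hill(core2) - periph  # peripheral gene driven by core2
    return [dcore1, dcore2, dperiph]


def integrate(y, steps=20000, dt=0.01):
    """Plain forward-Euler integration to (near) steady state."""
    for _ in range(steps):
        y = [yi + dt * di for yi, di in zip(y, grn(y))]
    return y


print([round(v, 2) for v in integrate([0.0, 0.0, 0.0])])  # → [2.0, 1.6, 1.44]
```

    LASSIM's actual workflow would fit the core ODE parameters to expression data first, then optimize the peripheral-gene regulation in parallel; only the forward simulation step is sketched here.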

  17. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  18. Descriptive and mechanistic models of crop–weed competition

    NARCIS (Netherlands)

    Bastiaans, L.; Storkey, J.

    2017-01-01

    Crop-weed competitive relations are an important element of agroecosystems. Quantifying and understanding them helps to design appropriate weed management at operational, tactical and strategic level. This chapter presents and discusses simple descriptive and more mechanistic models for crop-weed

  19. Requirements on mechanistic NPP models used in CSS for diagnostics and predictions

    International Nuclear Information System (INIS)

    Juslin, K.

    1996-01-01

    Mechanistic models have for several years been used, with good experience, for operators' support in electric power dispatching centres. Some models of limited scope have already been in use at nuclear power plants. It is considered that advanced mechanistic models, in combination with present computer technology, could also preferably be used in Computerized Support Systems (CSS) for the assistance of Nuclear Power Plant (NPP) operators. Requirements with respect to accuracy, validity range, speed, flexibility and level of detail on the models used for such purposes are discussed. Quality Assurance, Verification and Validation efforts are considered. A long-term commitment in the field of mechanistic modelling and real-time simulation is considered the key to successful implementations. The Advanced PROcess Simulation (APROS) code system and simulation environment developed at the Technical Research Centre of Finland (VTT) is intended also for CSS applications in NPP control rooms. (author). 4 refs

  20. Mechanistic modeling for mammography screening risks

    International Nuclear Information System (INIS)

    Bijwaard, Harmen

    2008-01-01

    Full text: Western populations show a very high incidence of breast cancer, and in many countries mammography screening programs have been set up for the early detection of these cancers. Through these programs large numbers of women (in the Netherlands, 700,000 per year) are exposed to low but not insignificant X-ray doses. ICRP-based risk estimates indicate that the number of breast cancer casualties due to mammography screening can be as high as 50 per year in the Netherlands. The number of lives saved is estimated to be much higher, but for an accurate calculation of the benefits of screening a better estimate of these risks is indispensable. Here it is attempted to better quantify the radiological risks of mammography screening through the application of a biologically based model for breast tumor induction by X-rays. The model is applied to data obtained from the National Institutes of Health in the U.S. These concern epidemiological data of female TB patients who received high X-ray breast doses in the period 1930-1950 through frequent fluoroscopy of their lungs. The mechanistic model that is used to describe the increased breast cancer incidence is based on an earlier study by Moolgavkar et al. (1980), in which the natural background incidence of breast cancer was modeled. The model allows for a more sophisticated extrapolation of risks to the low-dose X-ray exposures that are common in mammography screening and to the higher ages that are usually involved. Furthermore, it allows for risk transfer to other (non-western) populations. The results have implications for decisions on the frequency of screening, the number of mammograms taken at each screening, minimum and maximum ages for screening and the transfer to digital equipment. (author)
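    A hedged sketch of the kind of biologically based model referred to here (a Moolgavkar-type two-stage model, in a crude deterministic approximation) is given below. All rates, the linear dose response and the cell counts are illustrative assumptions, not values fitted to the fluoroscopy cohort.

```python
import math

# Hedged sketch of a two-stage (Moolgavkar-type) carcinogenesis model in a
# deterministic approximation: radiation dose raises the first-mutation rate,
# initiated cells expand clonally, and the hazard is taken proportional to the
# expected number of initiated cells. All parameter values are illustrative.
def hazard(age, dose_gy, mu1=1e-7, mu2=1e-7, growth=0.1,
           stem_cells=1e7, dose_slope=1.0):
    mu1_d = mu1 * (1.0 + dose_slope * dose_gy)  # assumed linear dose response
    # expected initiated (intermediate) cells at this age
    m = mu1_d * stem_cells * (math.exp(growth * age) - 1.0) / growth
    return mu2 * m


# relative risk at age 60 for an illustrative 2 Gy cumulative breast dose
print(round(hazard(60, 2.0) / hazard(60, 0.0), 2))  # → 3.0
```

    In this crude form the relative risk simply tracks the dose-modified first-mutation rate; the mechanistic payoff is that low-dose extrapolation and age dependence follow from the cell kinetics rather than being imposed externally.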

  1. Behavioural Procedural Models – a multipurpose mechanistic account

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2012-05-01

    Full Text Available In this paper we outline an epistemological defence of what we call Behavioural Procedural Models (BPMs), which represent the processes of individual decisions that lead to relevant economic patterns as psychologically (rather than rationally) driven. Their general structure, and the way in which they may be incorporated into a multipurpose view of models, where the representational and interventionist goals are combined, is shown. It is argued that BPMs may provide “mechanistic-based explanations” in the sense defended by Hedström and Ylikoski (2010), which involve invariant regularities in Woodward's sense. Such mechanisms provide a causal sort of explanation of anomalous economic patterns, which allows for extra-market intervention and manipulability in order to correct and improve some key individual decisions. This capability sets the basis for the so-called libertarian paternalism (Sunstein and Thaler 2003).

  2. Incorporation of lysosomal sequestration in the mechanistic model for prediction of tissue distribution of basic drugs.

    Science.gov (United States)

    Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra

    2017-11-15

    The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48.) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8, 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for the lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% drugs within 2-fold or 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performances for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively. High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among
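    The lysosomal-sequestration component rests on pH partitioning of ionizable bases. A minimal sketch, assuming a monoprotic base whose neutral form alone crosses membranes, with typical literature pH values rather than the paper's collated data, is:

```python
# Hedged sketch of the pH-partition ("lysosomal trapping") component for a
# monoprotic base: assuming only the neutral species crosses membranes, the
# total lysosome-to-cytosol concentration ratio follows Henderson-Hasselbalch.
# The pH values are typical literature figures, not the paper's collated data.
def lysosome_to_cytosol_ratio(pka, ph_lysosome=4.7, ph_cytosol=7.2):
    return (1 + 10 ** (pka - ph_lysosome)) / (1 + 10 ** (pka - ph_cytosol))


# e.g. a strong base with pKa 9.0 (the dataset used pKa >= 7.8)
print(round(lysosome_to_cytosol_ratio(9.0), 1))  # → 311.3
```

    For strong bases this ratio saturates near 10**(ph_cytosol - ph_lysosome), about 316 here; the >1000-fold lysosome-to-cytosol ratios reported in the abstract additionally reflect binding to acidic phospholipids within the lysosome.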

  3. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with inter-subject variability influence estimation. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving the patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  4. Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models

    Science.gov (United States)

    Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.

    2011-01-01

    We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
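    A hedged, toy-scale sketch of the emulator idea follows: run a mechanistic model at a set of design parameters, decompose the stacked outputs with an SVD, and fit a first-order statistical model from parameters to the leading right-singular-vector coefficients. The "mechanistic model" here is a cheap stand-in, not the ocean ecosystem model from the abstract, and the linear regression is a simplification of the paper's statistical model.

```python
import numpy as np

# Hedged, toy-scale sketch of the emulator approach. Everything below
# (toy dynamics, design ranges, number of retained modes) is illustrative.
rng = np.random.default_rng(0)
t_grid = np.linspace(0.0, 1.0, 50)


def mechanistic_model(theta):
    """Toy dynamics standing in for an expensive computer model."""
    return np.exp(-theta[0] * t_grid) * np.sin(theta[1] * t_grid)


thetas = rng.uniform([0.5, 1.0], [2.0, 6.0], size=(30, 2))  # design points
Y = np.array([mechanistic_model(th) for th in thetas])      # 30 runs x 50 outputs

_, _, Vt = np.linalg.svd(Y, full_matrices=False)
k = 3                                  # retain the leading right singular vectors
coeffs = Y @ Vt[:k].T                  # project each run onto the leading modes

# first-order model: linear regression from parameters to mode coefficients
X = np.column_stack([np.ones(len(thetas)), thetas])
beta, *_ = np.linalg.lstsq(X, coeffs, rcond=None)


def emulate(theta):
    """Cheap approximation to mechanistic_model(theta)."""
    return (np.array([1.0, *theta]) @ beta) @ Vt[:k]


print(emulate(np.array([1.0, 3.0])).shape)  # → (50,)
```

    Parameter estimation then proceeds against the cheap emulator instead of the full mechanistic model; accuracy depends on the density of design points and the number of modes retained.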

  5. Problems in mechanistic theoretical models for cell transformation by ionizing radiation

    International Nuclear Information System (INIS)

    Chatterjee, Aloke; Holley, W.R.

    1992-01-01

    A mechanistic model based on yields of double strand breaks has been developed to determine the dose response curves for cell transformation frequencies. At its present stage the model is applicable to immortal cell lines and to various qualities (X-rays, Neon and Iron) of ionizing radiation. Presently, we have considered four types of processes which can lead to activation phenomena: (i) point mutation events on a regulatory segment of selected oncogenes, (ii) inactivation of suppressor genes, through point mutation, (iii) deletion of a suppressor gene by a single track, and (iv) deletion of a suppressor gene by two tracks. (author)

  6. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    International Nuclear Information System (INIS)

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-01-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between

  7. Fuel swelling importance in PCI mechanistic modelling

    International Nuclear Information System (INIS)

    Arimescu, V.I.

    2005-01-01

    Under certain conditions, fuel pellet swelling is the most important factor in determining the intensity of the pellet-to-cladding mechanical interaction (PCMI). This is especially true during power ramps, which lead to a temperature increase to a higher terminal plateau that is maintained for hours. The time-dependent gaseous swelling is proportional to temperature and is also enhanced by the increased gas atom migration to the grain boundary during the power ramp. On the other hand, gaseous swelling is inhibited by a compressive hydrostatic stress in the pellet. Therefore, PCMI is the net result of combining gaseous swelling and pellet thermal expansion with the opposing feedback from the cladding mechanical reaction. The coupling of the thermal and mechanical processes, mentioned above, with various feedback loops is best simulated by a mechanistic fuel code. This paper discusses a mechanistic swelling model that is coupled with a fission gas release model as well as a mechanical model of the fuel pellet. The role of fuel swelling is demonstrated for typical power ramps at different burn-ups. Also, fuel swelling plays a significant role in avoiding the thermal instability for larger gap fuel rods, by limiting the potentially exponentially increasing gap due to the positive feedback loop effect of increasing fission gas release and the associated over-pressure inside the cladding. (author)

  8. Rapid Discrimination Among Putative Mechanistic Models of Biochemical Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-08-31

An overarching goal in molecular biology is to gain an understanding of the mechanistic basis underlying biochemical systems. Success is critical if we are to predict effectively the outcome of drug treatments and the development of abnormal phenotypes. However, data from most experimental studies are typically noisy and sparse. As a result, multiple potential mechanisms can account for the experimental observations, and devising experiments to test each one is often not feasible. Here, we introduce a novel strategy that discriminates among putative models based on their repertoire of qualitatively distinct phenotypes, without relying on knowledge of specific values for rate constants and binding constants. As an illustration, we apply this strategy to two synthetic gene circuits exhibiting anomalous behaviors. Our results show that the conventional models, based on their well-characterized components, cannot account for the experimental observations. We examine a total of 40 alternative hypotheses and show that only 5 have the potential to reproduce the experimental data, and one can do so with biologically relevant parameter values.

  9. Development of a mechanistically based computer simulation of nitrogen oxide absorption in packed towers

    International Nuclear Information System (INIS)

    Counce, R.M.

    1981-01-01

A computer simulation for nitrogen oxide (NOx) scrubbing in packed towers was developed for use in process design and process control. This simulation implements a mechanistically based mathematical model, which was formulated from (1) an exhaustive literature review; (2) previous NOx scrubbing experience with sieve-plate towers; and (3) comparisons of sequential sets of experiments. Nitrogen oxide scrubbing is characterized by simultaneous absorption and desorption phenomena: the model development is based on experiments designed to feature these two phenomena. The model was then successfully tested in experiments designed to put it in jeopardy.

  10. A mechanistic model on methane oxidation in the rice rhizosphere

    NARCIS (Netherlands)

    Bodegom, van P.M.; Leffelaar, P.A.; Goudriaan, J.

    2001-01-01

    A mechanistic model is presented on the processes leading to methane oxidation in rice rhizosphere. The model is driven by oxygen release from a rice root into anaerobic rice soil. Oxygen is consumed by heterotrophic and methanotrophic respiration, described by double Monod kinetics, and by iron
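
Double Monod kinetics, as used here for heterotrophic and methanotrophic respiration, makes the consumption rate saturate in both substrates. A minimal sketch (parameter values are illustrative, not taken from the paper):

```python
def double_monod(v_max, o2, ch4, k_o2, k_ch4):
    """Consumption rate limited by both oxygen and methane (double Monod)."""
    return v_max * (o2 / (k_o2 + o2)) * (ch4 / (k_ch4 + ch4))

# Example: with both substrates at their half-saturation constants,
# the rate is one quarter of the maximum.
rate = double_monod(v_max=1.0, o2=5.0, ch4=3.0, k_o2=5.0, k_ch4=3.0)
```

The rate approaches v_max only when both substrates are well above their half-saturation constants, which is why the oxygen released by the root, rather than the methane supply, typically limits oxidation near the root surface.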

  11. Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model

    Science.gov (United States)

    Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten

    2016-04-01

    Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK, and have been often attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.

  12. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    Science.gov (United States)

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Mechanistic-empirical subgrade design model based on heavy vehicle simulator test results

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-06-01

    Full Text Available Although Accelerated Pavement Testing (APT) is often done with specific objectives, valuable pavement performance data is generated over the long-term that may be used to investigate pavement behaviour in general and calibrate mechanistic...

  14. A mechanistic modelling and data assimilation approach to estimate the carbon/chlorophyll and carbon/nitrogen ratios in a coupled hydrodynamical-biological model

    Directory of Open Access Journals (Sweden)

    B. Faugeras

    2004-01-01

Full Text Available The principal objective of hydrodynamical-biological models is to provide estimates of the main carbon fluxes such as total and export oceanic production. These models are nitrogen based, that is to say that the variables are expressed in terms of their nitrogen content. Moreover, models are calibrated using chlorophyll data sets. Therefore, carbon to chlorophyll (C:Chl) and carbon to nitrogen (C:N) ratios have to be assumed. This paper addresses the problem of the representation of these ratios. In a 1D framework at the DYFAMED station (NW Mediterranean Sea) we propose a model which enables the estimation of the basic biogeochemical fluxes and in which the spatio-temporal variability of the C:Chl and C:N ratios is fully represented in a mechanistic way. This is achieved through the introduction of new state variables coming from the embedding of a phytoplankton growth model in a more classical Redfieldian NNPZD-DOM model (in which the C:N ratio is assumed to be constant). Following this modelling step, the parameters of the model are estimated using the adjoint data assimilation method, which enables the assimilation of chlorophyll and nitrate data sets collected at DYFAMED in 1997. Comparing the predictions of the new Mechanistic model with those of the classical Redfieldian NNPZD-DOM model, which was calibrated with the same data sets, we find that both models reproduce the reference data in a comparable manner. Both fluxes and stocks can be equally well predicted by either model. However, while the models coincide on an average basis, they diverge from a variability prediction point of view. In the Mechanistic model, biology adapts much faster to its environment, giving rise to higher short-term variations. Moreover, the seasonal variability in total production differs between the Redfieldian NNPZD-DOM model and the Mechanistic model. In summer the Mechanistic model predicts higher production values in carbon units than the Redfieldian NNPZD

  15. Modeling Bird Migration under Climate Change: A Mechanistic Approach

    Science.gov (United States)

    Smith, James A.

    2009-01-01

How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios as forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental-scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individually based, biophysical, mechanistic bird migration model to simulate the movement of shorebirds in North America as a tool to study how such factors as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but presently predation is ignored. Results/Conclusions: We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs," that is, stop-over frequency with latitude, that are in agreement with the literature. Mean stop-over durations predicted by our model for nominal cases are also consistent with the limited, but available, data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions. However, the question remains as to whether this

  16. Mechanistic modeling of aberrant energy metabolism in human disease

    Directory of Open Access Journals (Sweden)

    Vineet eSangar

    2012-10-01

    Full Text Available Dysfunction in energy metabolism—including in pathways localized to the mitochondria—has been implicated in the pathogenesis of a wide array of disorders, ranging from cancer to neurodegenerative diseases to type II diabetes. The inherent complexities of energy and mitochondrial metabolism present a significant obstacle in the effort to understand the role that these molecular processes play in the development of disease. To help unravel these complexities, systems biology methods have been applied to develop an array of computational metabolic models, ranging from mitochondria-specific processes to genome-scale cellular networks. These constraint-based models can efficiently simulate aspects of normal and aberrant metabolism in various genetic and environmental conditions. Development of these models leverages—and also provides a powerful means to integrate and interpret—information from a wide range of sources including genomics, proteomics, metabolomics, and enzyme kinetics. Here, we review a variety of mechanistic modeling studies that explore metabolic functions, deficiency disorders, and aberrant biochemical pathways in mitochondria and related regions in the cell.
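
The constraint-based models mentioned above reduce metabolic simulation to a linear program: maximize an objective flux subject to steady-state stoichiometry and capacity bounds (flux balance analysis). The three-reaction network below is hypothetical, purely for illustration:

```python
from scipy.optimize import linprog

# Toy network: uptake (v1): A_ext -> A; conversion (v2): A -> B; sink (v3): B -> biomass.
# Steady state for internal metabolites A and B: v1 - v2 = 0 and v2 - v3 = 0.
S_eq = [[1, -1, 0],   # metabolite A balance
        [0, 1, -1]]   # metabolite B balance
b_eq = [0, 0]
c = [0, 0, -1]                            # linprog minimizes, so maximize v3 via -v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake capacity capped at 10 flux units

res = linprog(c, A_eq=S_eq, b_eq=b_eq, bounds=bounds, method="highs")
growth = -res.fun  # optimal biomass flux; here limited entirely by the uptake bound
```

Genome-scale models differ only in size: the stoichiometric matrix has thousands of reactions, but the optimization problem has exactly this shape.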

  17. A dynamic and mechanistic model of PCB bioaccumulation in the European hake ( Merluccius merluccius)

    Science.gov (United States)

    Bodiguel, Xavier; Maury, Olivier; Mellon-Duval, Capucine; Roupsard, François; Le Guellec, Anne-Marie; Loizeau, Véronique

    2009-08-01

    Bioaccumulation is difficult to document because responses differ among chemical compounds, with environmental conditions, and physiological processes characteristic of each species. We use a mechanistic model, based on the Dynamic Energy Budget (DEB) theory, to take into account this complexity and study factors impacting accumulation of organic pollutants in fish through ontogeny. The bioaccumulation model proposed is a comprehensive approach that relates evolution of hake PCB contamination to physiological information about the fish, such as diet, metabolism, reserve and reproduction status. The species studied is the European hake ( Merluccius merluccius, L. 1758). The model is applied to study the total concentration and the lipid normalised concentration of 4 PCB congeners in male and female hakes from the Gulf of Lions (NW Mediterranean sea) and the Bay of Biscay (NE Atlantic ocean). Outputs of the model compare consistently to measurements over the life span of fish. Simulation results clearly demonstrate the relative effects of food contamination, growth and reproduction on the PCB bioaccumulation in hake. The same species living in different habitats and exposed to different PCB prey concentrations exhibit marked difference in the body accumulation of PCBs. At the adult stage, female hakes have a lower PCB concentration compared to males for a given length. We successfully simulated these sex-specific PCB concentrations by considering two mechanisms: a higher energy allocation to growth for females and a transfer of PCBs from the female to its eggs when allocating lipids from reserve to eggs. Finally, by its mechanistic description of physiological processes, the model is relevant for other species and sets the stage for a mechanistic understanding of toxicity and ecological effects of organic contaminants in marine organisms.
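
The DEB-based model itself is not reproduced in the abstract; as a minimal stand-in (not the authors' model), a one-compartment balance with dietary uptake, elimination, and growth dilution captures the qualitative dynamics of body burden. All rate constants below are illustrative:

```python
def simulate_body_burden(c_food, k_uptake, k_elim, k_growth, days, dt=1.0):
    """Euler integration of dC/dt = k_uptake * c_food - (k_elim + k_growth) * C."""
    c = 0.0
    for _ in range(int(days / dt)):
        c += dt * (k_uptake * c_food - (k_elim + k_growth) * c)
    return c

# Illustrative PCB scenario: contaminated prey, slow elimination, growth dilution.
c_final = simulate_body_burden(c_food=50.0, k_uptake=0.02, k_elim=0.005,
                               k_growth=0.005, days=5000)
# c_final approaches the steady state k_uptake * c_food / (k_elim + k_growth) = 100
```

The sex difference reported in the paper would enter such a balance through a larger k_growth for females plus a discrete loss term at spawning, when lipids (and PCBs) are transferred to the eggs.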

  18. Mechanistic modelling of the corrosion behaviour of copper nuclear fuel waste containers

    Energy Technology Data Exchange (ETDEWEB)

    King, F; Kolar, M

    1996-10-01

    A mechanistic model has been developed to predict the long-term corrosion behaviour of copper nuclear fuel waste containers in a Canadian disposal vault. The model is based on a detailed description of the electrochemical, chemical, adsorption and mass-transport processes involved in the uniform corrosion of copper, developed from the results of an extensive experimental program. Predictions from the model are compared with the results of some of these experiments and with observations from a bronze cannon submerged in seawater saturated clay sediments. Quantitative comparisons are made between the observed and predicted corrosion potential, corrosion rate and copper concentration profiles adjacent to the corroding surface, as a way of validating the long-term model predictions. (author). 12 refs., 5 figs.

  19. Coupling machine learning with mechanistic models to study runoff production and river flow at the hillslope scale

    Science.gov (United States)

    Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.

    2016-12-01

Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff responses. The diversity of hillslopes (morphological shapes and geological structures) on the one hand, and the highly nonlinear runoff response on the other, make it difficult to transpose what has been learnt at one specific hillslope to another. Therefore, making reliable predictions of runoff generation or river flow for a given hillslope is a challenge. Classic model calibration (based on inverse-problem techniques) must be repeated for each specific hillslope and requires calibration data, so it is rarely practical when applied to thousands of cases. Here we propose a novel modeling framework that couples process-based models with a data-based approach. First, we develop a mechanistic model, based on the hillslope storage Boussinesq equations (Troch et al. 2003), able to model nonlinear runoff responses to rainfall at the hillslope scale. Second, we set up a model database representing thousands of non-calibrated simulations. These simulations investigate different hillslope shapes (real ones obtained by analyzing the 5 m digital elevation model of Brittany, and synthetic ones), different hillslope geological structures (i.e., different parametrizations), and different hydrologic forcing terms (i.e., different infiltration chronicles). Then, we use this model library to train a machine learning model on this physically based database. Machine learning model performance is then assessed by a classic validation phase (testing it on new hillslopes and comparing machine learning with mechanistic outputs). Finally, we use this machine learning model to learn which hillslope properties control runoff. This methodology will be further tested by combining synthetic datasets with real ones.
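
The emulation step described here (run a mechanistic model many times, then train a statistical learner on its outputs) can be sketched with a toy response surface standing in for the hillslope-storage Boussinesq model; the power-law "hillslope response" below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "model database": a toy power-law response in place of the
# mechanistic hillslope model (runoff ~ K * recharge^1.5 * slope).
n = 200
recharge = rng.uniform(1.0, 50.0, n)
slope = rng.uniform(0.01, 0.3, n)
K = 2.0
runoff = K * recharge**1.5 * slope

# Train a log-linear learner on the database; least squares recovers the
# exponents, i.e. "learns" which properties control runoff and how strongly.
X = np.column_stack([np.ones(n), np.log(recharge), np.log(slope)])
beta, *_ = np.linalg.lstsq(X, np.log(runoff), rcond=None)
# beta ~ [log K, 1.5, 1.0]
```

A real emulator would use a flexible learner (random forest, neural network) on non-monotonic responses, but the workflow, simulate then fit then validate on held-out hillslopes, is the same.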

  20. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).

  1. The coefficient of restitution of pressurized balls: a mechanistic model

    Science.gov (United States)

    Georgallas, Alex; Landry, Gaëtan

    2016-01-01

Pressurized, inflated balls used in professional sports are regulated so that their behaviour upon impact can be anticipated and allow the game to have its distinctive character. However, the dynamics governing the impacts of such balls, even on stationary hard surfaces, can be extremely complex. The energy transformations, which arise from the compression of the gas within the ball and from the shear forces associated with the deformation of the wall, are examined in this paper. We develop a simple mechanistic model of the dependence of the coefficient of restitution, e, upon both the gauge pressure, P_G, of the gas and the shear modulus, G, of the wall. The model is validated using the results from a simple series of experiments using three different sports balls. The fits to the data are extremely good for P_G > 25 kPa and consistent values are obtained for the value of G for the wall material. As far as the authors can tell, this simple, mechanistic model of the pressure dependence of the coefficient of restitution is the first in the literature.
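
The model's functional form is not given in the abstract, but the experimental side is standard: in a drop test onto a rigid surface, e follows directly from the drop and rebound heights. A minimal sketch:

```python
import math

def restitution_from_drop(h_drop, h_rebound):
    """Coefficient of restitution from a drop test: e = v_out / v_in = sqrt(h_rebound / h_drop)."""
    return math.sqrt(h_rebound / h_drop)

# A ball dropped from 1.0 m that rebounds to 0.64 m has e = 0.8; lowering the
# gauge pressure P_G reduces the rebound height and hence e.
e = restitution_from_drop(1.0, 0.64)
```

Repeating this measurement over a range of inflation pressures yields the e versus P_G curves against which a mechanistic model like the one described here can be fitted.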

  2. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    Science.gov (United States)

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
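
The benchmark dose (BMD) approach mentioned above is concrete enough to sketch: fit a dose-response model, then invert it at a fixed benchmark response (BMR). With the quantal-linear model P(d) = p0 + (1 - p0)(1 - exp(-b d)), one common choice, the extra risk is 1 - exp(-b d) and the BMD has a closed form. Parameter values below are illustrative:

```python
import math

def extra_risk(d, p0, b):
    """Extra risk over background for the quantal-linear dose-response model."""
    p = p0 + (1.0 - p0) * (1.0 - math.exp(-b * d))
    return (p - p0) / (1.0 - p0)

def benchmark_dose(b, bmr=0.10):
    """Dose at which extra risk equals the benchmark response: solve 1 - exp(-b d) = bmr."""
    return -math.log(1.0 - bmr) / b

bmd = benchmark_dose(b=0.05)               # dose giving 10% extra risk
check = extra_risk(bmd, p0=0.02, b=0.05)   # recovers the BMR by construction
```

BBDR and the proposed EBDR models would replace the empirical exponential term with kinetic and biochemical submodels, but the inversion at a fixed BMR works the same way.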

  3. Mechanistic model to predict colostrum intake based on deuterium oxide dilution technique data and impact of gestation and prefarrowing diets on piglet intake and sow yield of colostrum.

    Science.gov (United States)

    Theil, P K; Flummer, C; Hurley, W L; Kristensen, N B; Labouriau, R L; Sørensen, M T

    2014-12-01

The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of the first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI, g = -106 + 2.26 WG + 200 BWB + 0.111 D - 1,414 WG/D + 0.0182 WG/BWB (R2 = 0.944). This model was used to predict the CI for all colostrum-suckling piglets within the 40 litters (n=500, mean=437 g, SD=153 g) and was compared with the CI predicted by a previous empirical predictive model (mean=305 g, SD=140 g). The previous empirical model underestimated the CI by 30% compared with that obtained by the new mechanistic model. The sows were fed 1 of 4 gestation diets (n=10 per diet) based on different fiber sources (low fiber [17%] or potato pulp, pectin residue, or sugarbeet pulp [32 to 40%]) from mating until d 108 of gestation. From d 108 of gestation until parturition, sows were fed 1 of 5 prefarrowing diets (n=8 per diet) varying in supplemented fat (3% animal fat, 8% coconut oil, 8% sunflower oil, 8% fish oil, or 4% fish oil + 4% octanoic acid). 
Sows fed diets with pectin residue or sugarbeet pulp during gestation produced colostrum with lower protein, fat, DM, and energy concentrations and higher lactose concentrations, and their piglets had greater CI as compared with sows fed potato pulp or the low-fiber diet (P < 0.05). Feeding coconut oil decreased lactose and increased DM concentrations of colostrum compared with the other prefarrowing diets (P < 0.05).
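
The regression above is directly computable; the coefficients below are copied verbatim from the abstract (WG in g, BWB in kg, D in min):

```python
def predict_colostrum_intake(wg, bwb, d):
    """Predicted 24-h colostrum intake (g) from the mechanistic model (R2 = 0.944)."""
    return (-106 + 2.26 * wg + 200 * bwb + 0.111 * d
            - 1414 * wg / d + 0.0182 * wg / bwb)

# Example piglet (hypothetical values): 100 g weight gain, 1.4 kg at birth,
# 1,000 min suckling duration.
ci = predict_colostrum_intake(wg=100, bwb=1.4, d=1000)  # 370.9 g
```

The WG/D term makes the prediction sensitive to short suckling durations, which is one way this mechanistic form differs from the earlier bottle-fed empirical model.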

  4. Mechanistic Modeling of Water Replenishment Rate of Zeer Refrigerator

    Directory of Open Access Journals (Sweden)

    B. N. Nwankwojike

    2017-06-01

Full Text Available A model for predicting the water replenishment rate of the zeer pot refrigerator was developed in this study using a mechanistic modeling approach and evaluated at Obowo, Imo State, Nigeria using six fruits (tomatoes, guava, okra, banana, orange, and avocado pear). The developed model confirmed the zeer pot water replenishment rate as a function of ambient temperature, relative humidity, wind speed, thermal conductivity of the pot materials and sand, density of air and water vapor, permeability coefficient of clay, heat transfer coefficient of water into air, circumferential length and height of the pot, geometrical profile of the pot, heat load of the food preserved, heat flow into the device, and the gradient at which the pot is placed above ground level. Compared to the conventional approach of water replenishment, performance analysis results revealed 44% to 58% water economy when the zeer pot's water was replenished based on the model's prediction, while there was no significant difference in the shelf-life of the fruits preserved with either replenishment method. Application of the developed water replenishment model facilitates optimal water usage in this system, thereby reducing the operational cost of the zeer pot refrigerator.

  5. Conceptual models for waste tank mechanistic analysis. Status report, January 1991

    Energy Technology Data Exchange (ETDEWEB)

    Allemann, R. T.; Antoniak, Z. I.; Eyler, L. L.; Liljegren, L. M.; Roberts, J. S.

    1992-02-01

    Pacific Northwest Laboratory (PNL) is conducting a study for Westinghouse Hanford Company (Westinghouse Hanford), a contractor for the US Department of Energy (DOE). The purpose of the work is to study possible mechanisms and fluid dynamics contributing to the periodic release of gases from double-shell waste storage tanks at the Hanford Site in Richland, Washington. This interim report emphasizing the modeling work follows two other interim reports, Mechanistic Analysis of Double-Shell Tank Gas Release Progress Report -- November 1990 and Collection and Analysis of Existing Data for Waste Tank Mechanistic Analysis Progress Report -- December 1990, that emphasized data correlation and mechanisms. The approach in this study has been to assemble and compile data that are pertinent to the mechanisms, analyze the data, evaluate physical properties and parameters, evaluate hypothetical mechanisms, and develop mathematical models of mechanisms.

  6. Mechanistic models for the evaluation of biocatalytic reaction conditions and biosensor design optimization

    DEFF Research Database (Denmark)

    Semenova, Daria

In the first case study a mechanistic model was developed to describe the enzymatic reaction of glucose oxidase and glucose in the presence of catalase inside a commercial microfluidic platform with integrated oxygen sensor spots. The simplicity of the proposed model allowed an easy calibration of the reaction...... the microfluidic device. In the second case study the flexible microfluidic platform with integrated amperometric glucose biosensors was developed for continuous monitoring of glucose consumption rates. The integration of the mixing chamber inside the platform allowed performing sample dilutions which subsequently......BRs. In the third case study the mechanistic model of the cyclic voltammetry response of the first generation glucose biosensors was developed and applied for the biosensor design optimization. Furthermore the obtained qualitative and quantitative dependencies between the model output and experimental results were

  7. Exposure factors for marine eutrophication impacts assessment based on a mechanistic biological model

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Koski, Marja; Hauschild, Michael Zwicky

    2015-01-01

    marine ecosystem (LME), five climate zones, and site-generic. The XFs obtained range from 0.45 (Central Arctic Ocean) to 15.9kgO2kgN-1 (Baltic Sea). While LME resolution is recommended, aggregated PE or XF per climate zone can be adopted, but not global aggregation due to high variability. The XF......Emissions of nitrogen (N) from anthropogenic sources enrich marine waters and promote planktonic growth. This newly synthesised organic carbon is eventually exported to benthic waters where aerobic respiration by heterotrophic bacteria results in the consumption of dissolved oxygen (DO......). This pathway is typical of marine eutrophication. A model is proposed to mechanistically estimate the response of coastal marine ecosystems to N inputs. It addresses the biological processes of nutrient-limited primary production (PP), metazoan consumption, and bacterial degradation, in four distinct sinking...

  8. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    Science.gov (United States)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
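
A classical example of the mechanistic ingredients listed above is the Gerritsen-Strickler encounter-rate kernel for a cruising predator and swimming prey, which biomechanical plankton formulations of this kind build on (parameter values below are illustrative):

```python
import math

def encounter_rate(radius, prey_density, u_prey, v_pred):
    """Gerritsen-Strickler encounter kernel (valid for v_pred >= u_prey).

    radius: predator perception distance (m); prey_density: prey per m^3;
    u_prey, v_pred: prey and predator swimming speeds (m/s).
    Returns encounters per second per predator.
    """
    return (math.pi * radius**2 * prey_density
            * (u_prey**2 + 3.0 * v_pred**2) / (3.0 * v_pred))

# With stationary prey the kernel reduces to pi * R^2 * N * v: the predator
# simply sweeps a cylinder of perception cross-section through the water.
z = encounter_rate(radius=1e-3, prey_density=1e6, u_prey=0.0, v_pred=0.01)
```

Because the kernel depends only on sizes, speeds, and concentrations, grazing rates derived from it need no site-specific calibration, which is the advantage the comparison with the empirically tuned PPBES model is testing.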

  9. Mechanistic model to predict colostrum intake based on deuterium oxide dilution technique data and impact of gestation and prefarrowing diets on piglet intake and sow yield of colostrum

    DEFF Research Database (Denmark)

    Theil, Peter Kappel; Flummer, Christine; Hurley, W L

    2014-01-01

    The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how...... composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of first-born piglet was on average 443 g (SD 151). Based on measured...... CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]: CI, g = –106 + 2.26 WG + 200 BWB + 0.111 D – 1,414 WG/D + 0.0182 WG/BWB (R2 = 0.944). This model was used to predict the CI for all colostrum...
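    The regression equation quoted above can be applied directly once the piglet characteristics are known. A minimal sketch (the coefficients are those reported in the abstract; the function name and the example inputs are illustrative, not taken from the study):

    ```python
    def predict_colostrum_intake(wg, bwb, d):
        """Predict piglet colostrum intake (g) using the abstract's equation.

        wg  -- 24-h weight gain (g)
        bwb -- body weight at birth (kg)
        d   -- duration of colostrum intake (min)
        """
        return (-106 + 2.26 * wg + 200 * bwb + 0.111 * d
                - 1414 * wg / d + 0.0182 * wg / bwb)

    # Illustrative inputs: a 1.4 kg piglet gaining 100 g over ~1,000 min
    ci = predict_colostrum_intake(wg=100, bwb=1.4, d=1000)
    print(round(ci, 1))  # ≈ 370.9 g, of the same order as the reported mean of 443 g
    ```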

  10. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    NARCIS (Netherlands)

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; Boer, de I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on

  11. Mechanistic modelling of cancer: some reflections from software engineering and philosophy of science.

    Science.gov (United States)

    Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran

    2012-12-01

    There is a growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological shift in how research is conducted, in which models help to actively generate hypotheses instead of waiting for general principles to become apparent once sufficient data have accumulated. This paper applies recent research from philosophy of science to uncover three important problems of mechanistic modelling that may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence, and the need for an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can open new opportunities for further and profitable interdisciplinary research in the field.

  12. Comparative ecophysiology of two sympatric lizards. Laying the groundwork for mechanistic distribution models

    Directory of Open Access Journals (Sweden)

    Enrique García-Muñoz

    2013-12-01

    Distribution modelling usually makes inferences by correlating species presence with environmental variables, but does not take biotic relations into account. Alternative approaches based on a mechanistic understanding of biological processes are now being applied. Regarding lacertid lizards, physiological traits such as preferred body temperature (Tp) are well known to correlate with several physiological optima. Much less is known about their water ecology, although body temperature and evaporative water loss (Wl) may trade off. Two saxicolous lacertids, Algyroides marchi and Podarcis hispanica ss, are sympatric in the Subbetic Mountains (SE Spain), where they can be found in syntopy. Previous distribution modelling indicates that the first species is associated with mountains, low temperatures, high precipitation, and forest cover, whereas the second is more of a generalist. Here, we perform two ecophysiological tests with both species: a Tp experiment in a thermal gradient and a Wl experiment in sealed chambers. Although both species attained similar body temperatures, A. marchi lost more water, and more uniformly in time, than P. hispanica ss, which displayed an apparent response to dehydration. These results suggest that water loss rather than temperature is crucial to explaining the distribution pattern of A. marchi relative to P. hispanica ss, the former risking dehydration in dry areas regardless of temperature. Ecophysiological traits represent a promising tool for building future mechanistic models for (lacertid) lizards. Additionally, the implications for their biogeography and conservation are discussed.

  13. A mechanistic nitrogen limitation model for CLM(ED)

    Science.gov (United States)

    Ali, A. A.; Xu, C.; McDowell, N. G.; Rogers, A.; Wullschleger, S. D.; Fisher, R.; Vrugt, J. A.

    2014-12-01

    Photosynthetic capacity is a key plant trait that determines the rate of photosynthesis; however, in Earth System Models it is either a fixed value or derived from a linear function of leaf nitrogen content. A mechanistic leaf nitrogen allocation model has been developed for a DOE-sponsored Community Land Model coupled to the Ecosystem Demography model (CLM-ED) to predict the photosynthetic capacity [Vc,max25 (μmol CO2 m-2 s-1)] under different environmental conditions at the global scale. We collected more than 800 data points of photosynthetic capacity (Vc,max25) for 124 species from 57 studies, with the corresponding leaf nitrogen content and environmental conditions (temperature, radiation, humidity, and day length), from the literature and the NGEE Arctic site (Barrow). Based on these data, we found that the environmental control of Vc,max25 is about 4 times stronger than that of leaf nitrogen content. Using a Markov chain Monte Carlo simulation approach, we fitted the collected data to our newly developed nitrogen allocation model, which predicts the leaf nitrogen investment in different components, including structure, storage, respiration, light capture, carboxylation, and electron transport, under different environmental conditions. Our results showed that the nitrogen allocation model explained 52% of the variance in observed Vc,max25 and 65% of the variance in observed Jmax25 using a single set of fitted model parameters for all species. Across the growing season, the modeled Vc,max25 explained 49% of the variability in measured Vc,max25. In the context of future global warming, our model predicts that a temperature increase of 5 °C and a doubling of atmospheric carbon dioxide would reduce Vc,max25 by 5% and 11%, respectively.

  14. Application of a Mechanistic Model as a Tool for On-line Monitoring of Pilot Scale Filamentous Fungal Fermentation Processes - The Importance of Evaporation Effects

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    A mechanistic model-based soft sensor is developed and validated for 550 L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor comprises a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation...... a historical dataset of eleven batches from the fermentation pilot plant (550 L) at Novozymes A/S. The model is then implemented on-line in 550 L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on fourteen new batches utilizing a new strain. The product...... block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed, using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves for the current states of biomass, product, substrate...

  15. Mechanistic model of mass-specific basal metabolic rate: evaluation in healthy young adults.

    Science.gov (United States)

    Wang, Z; Bosy-Westphal, A; Schautz, B; Müller, M

    2011-12-01

    Mass-specific basal metabolic rate (mass-specific BMR), defined as the resting energy expenditure per unit body mass per day, is an important parameter in energy metabolism research. However, a mechanistic explanation for the magnitude of mass-specific BMR has been lacking. The objective of the present study was to validate the applicability of a proposed mass-specific BMR model in healthy adults. A mechanistic model was developed at the organ-tissue level, mass-specific BMR = Σ(Ki × Fi), where Fi is the fraction of body mass contributed by individual organs and tissues, and Ki is the specific resting metabolic rate of the major organs and tissues. The Fi values were measured by multiple MRI scans, and the Ki values were those suggested by Elia in 1992. A database of healthy non-elderly, non-obese adults (age 20-49 yr) was analyzed. The mean measured and predicted mass-specific BMR of all subjects was 21.6 ± 1.9 (mean ± SD) and 21.7 ± 1.6 kcal/kg per day, respectively. The measured mass-specific BMR was correlated with the predicted mass-specific BMR (r = 0.82), and agreement was assessed by plotting the difference between measured and predicted mass-specific BMR against their average. In conclusion, the proposed mechanistic model was validated in non-elderly non-obese adults and can help to elucidate the inherent relationship between mass-specific BMR and body composition.
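    The organ-tissue level model is a simple weighted sum and is straightforward to evaluate. A sketch under stated assumptions: the Ki values below are the specific metabolic rates commonly attributed to Elia (1992), and the organ-mass fractions are illustrative reference-adult values, not data measured in this study:

    ```python
    # Ki: specific resting metabolic rates (kcal per kg of tissue per day),
    # as commonly attributed to Elia (1992).
    K = {"liver": 200, "brain": 240, "heart": 440, "kidneys": 440,
         "muscle": 13, "adipose": 4.5, "residual": 12}

    # Fi: illustrative organ/tissue mass fractions for a ~70 kg adult
    # (assumed values for demonstration; the study measured Fi by MRI).
    F = {"liver": 0.026, "brain": 0.020, "heart": 0.005, "kidneys": 0.004,
         "muscle": 0.400, "adipose": 0.214, "residual": 0.331}

    assert abs(sum(F.values()) - 1.0) < 1e-9  # fractions must sum to body mass

    # mass-specific BMR = Σ(Ki × Fi), in kcal/kg per day
    mass_specific_bmr = sum(K[t] * F[t] for t in K)
    print(round(mass_specific_bmr, 1))  # ≈ 24.1, the same order as the ~21.6 reported
    ```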

  16. A new mechanistic and engineering fission gas release model for a uranium dioxide fuel

    International Nuclear Information System (INIS)

    Lee, Chan Bock; Yang, Yong Sik; Kim, Dae Ho; Kim, Sun Ki; Bang, Je Geun

    2008-01-01

    A mechanistic and engineering fission gas release model (MEGA) for uranium dioxide (UO2) fuel was developed. It is based upon the diffusional release of fission gases from inside the grain to the grain boundary, and the release of fission gases from the grain boundary to the external surface through interconnection of the fission gas bubbles at the grain boundary. The capability of the MEGA model was validated by comparison with the fission gas release database and by sensitivity analyses of the parameters. It was found that the MEGA model correctly predicts fission gas release over a broad range of fuel burnups, up to 98 MWd/kgU. In particular, the enhancement of fission gas release in high-burnup fuel, and the reduction of fission gas release at high burnup achieved by increasing the UO2 grain size, were correctly predicted by the MEGA model without using any artificial factor. (author)

  17. Rotary ultrasonic machining of CFRP: a mechanistic predictive model for cutting force.

    Science.gov (United States)

    Cong, W L; Pei, Z J; Sun, X; Zhang, C L

    2014-02-01

    Cutting force is one of the most important output variables in rotary ultrasonic machining (RUM) of carbon fiber reinforced plastic (CFRP) composites. Many experimental investigations on cutting force in RUM of CFRP have been reported. However, in the literature, there are no cutting force models for RUM of CFRP. This paper develops a mechanistic predictive model for cutting force in RUM of CFRP. The material removal mechanism of CFRP in RUM has been analyzed first. The model is based on the assumption that brittle fracture is the dominant mode of material removal. CFRP micromechanical analysis has been conducted to represent CFRP as an equivalent homogeneous material to obtain the mechanical properties of CFRP from its components. Based on this model, relationships between input variables (including ultrasonic vibration amplitude, tool rotation speed, feedrate, abrasive size, and abrasive concentration) and cutting force can be predicted. The relationships between input variables and important intermediate variables (indentation depth, effective contact time, and maximum impact force of single abrasive grain) have been investigated to explain predicted trends of cutting force. Experiments are conducted to verify the model, and experimental results agree well with predicted trends from this model.

  18. Mechanistic CHF modeling for natural circulation applications in SMR

    Energy Technology Data Exchange (ETDEWEB)

    Luitjens, Jeffrey [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Wu, Qiao, E-mail: qiao.wu@oregonstate.edu [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Greenwood, Scott; Corradini, Michael [Department of Engineering Physics, University of Wisconsin, 1415 Engineering Drive, Madison, WI 53706 (United States)

    2016-12-15

    A mechanistic critical heat flux correlation has been developed for a wide range of operating conditions, including low mass fluxes of 540–890 kg/m²·s, high pressures of 12–13 MPa, and critical heat fluxes of 835–1100 kW/m². Eleven experimental data points have been collected over these conditions to inform the development of the model using bundle geometry. Errors within 15% have been obtained with the proposed model for predicting the critical heat flux value, location, and critical pin power for a non-uniform heat flux applied to a 2 × 2 bundle configuration.

  19. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, D.; Wagenmakers, E.-J.; Romeijn, J.-W.

    2011-01-01

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  1. An Emphasis on Perception: Teaching Image Formation Using a Mechanistic Model of Vision.

    Science.gov (United States)

    Allen, Sue; And Others

    An effective way to teach the concept of image is to give students a model of human vision which incorporates a simple mechanism of depth perception. In this study two almost identical versions of a curriculum in geometrical optics were created. One used a mechanistic, interpretive eye model, and in the other the eye was modeled as a passive,…

  2. Improving Predictive Modeling in Pediatric Drug Development: Pharmacokinetics, Pharmacodynamics, and Mechanistic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Slikker, William; Young, John F.; Corley, Rick A.; Dorman, David C.; Conolly, Rory B.; Knudsen, Thomas; Erstad, Brian L.; Luecke, Richard H.; Faustman, Elaine M.; Timchalk, Chuck; Mattison, Donald R.

    2005-07-26

    A workshop was conducted on November 18–19, 2004, to address the issue of improving predictive models for drug delivery to developing humans. Although considerable progress has been made for adult humans, large gaps remain for predicting pharmacokinetic/pharmacodynamic (PK/PD) outcome in children because most adult models have not been tested during development. The goals of the meeting included a description of when, during development, infants/children become adultlike in handling drugs. The issue of incorporating the most recent advances into the predictive models was also addressed: both the use of imaging approaches and genomic information were considered. Disease state, as exemplified by obesity, was addressed as a modifier of drug pharmacokinetics and pharmacodynamics during development. Issues addressed in this workshop should be considered in the development of new predictive and mechanistic models of drug kinetics and dynamics in the developing human.

  3. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition

    Science.gov (United States)

    Woodward, Bill

    2016-01-01

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition. PMID:27077845

  5. A mechanistic Eulerian-Lagrangian model for dispersed flow film boiling

    International Nuclear Information System (INIS)

    Andreani, M.; Yadigaroglu, G.

    1991-01-01

    In this paper a new mechanistic model of heat transfer in the dispersed flow regime is presented. The usual assumptions that render most of the available models unsuitable for the analysis of the reflooding phase of the LOCA are discussed, and a two-dimensional time-independent numerical model is developed. The gas temperature field is solved on a fixed-grid (Eulerian) mesh, with the droplets behaving as mass and energy sources. The histories of a large number of computational droplets are followed in a Lagrangian frame, considering evaporation, break-up, and interactions with the vapor and with the wall. Comparisons of calculated wall and vapor temperatures with experimental data are shown for two reflooding tests.

  6. Mechanistic variables can enhance predictive models of endotherm distributions: The American pika under current, past, and future climates

    Science.gov (United States)

    Mathewson, Paul; Moyer-Horner, Lucas; Beever, Erik; Briscoe, Natalie; Kearney, Michael T.; Yahn, Jeremiah; Porter, Warren P.

    2017-01-01

    How climate constrains species’ distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8–19% less habitat loss in response to annual temperature increases of ~3–5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect

  8. Development of mechanistic sorption model and treatment of uncertainties for Ni sorption on montmorillonite/bentonite

    International Nuclear Information System (INIS)

    Ochs, Michael; Ganter, Charlotte; Tachi, Yukio; Suyama, Tadahiro; Yui, Mikazu

    2011-02-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) are key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the detailed, coupled processes of sorption and diffusion in compacted bentonite and to develop mechanistic/predictive models, so that reliable parameters can be set under a variety of geochemical conditions relevant to performance assessment (PA). For this purpose, JAEA has developed the integrated sorption and diffusion (ISD) model/database for montmorillonite/bentonite systems. The main goal of the mechanistic model/database development is to provide a tool for consistent explanation, prediction, and uncertainty assessment of Kd as well as of the diffusion parameters needed for the quantification of radionuclide transport. The present report focuses on developing the thermodynamic sorption model (TSM) and on the quantification and handling of model uncertainties in applications, illustrated using Ni sorption on montmorillonite/bentonite as an example. This includes 1) a summary of the present state of the art of thermodynamic sorption modeling, 2) a discussion of the selection of surface species and model design appropriate for the present purpose, 3) possible sources and representations of TSM uncertainties, and 4) details of modeling, testing, and uncertainty evaluation for Ni sorption. Two fundamentally different approaches are presented and compared for representing TSM uncertainties: 1) TSM parameter uncertainties calculated by FITEQL optimization routines and a statistical procedure; 2) overall error estimated by direct comparison of modeled and experimental Kd values. The overall error in Kd is viewed as the best representation of model uncertainty in ISD model/database development. (author)

  9. Rational and Mechanistic Perspectives on Reinforcement Learning

    Science.gov (United States)

    Chater, Nick

    2009-01-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…

  10. Mechanistic Systems Modeling to Improve Understanding and Prediction of Cardiotoxicity Caused by Targeted Cancer Therapeutics

    Directory of Open Access Journals (Sweden)

    Jaehee V. Shim

    2017-09-01

    Tyrosine kinase inhibitors (TKIs) are highly potent cancer therapeutics that have been linked with serious cardiotoxicity, including left ventricular dysfunction, heart failure, and QT prolongation. TKI-induced cardiotoxicity is thought to result from interference with tyrosine kinase activity in cardiomyocytes, where these signaling pathways help to control critical processes such as survival signaling, energy homeostasis, and excitation–contraction coupling. However, mechanistic understanding is limited at present due to the complexities of tyrosine kinase signaling, and the wide range of targets inhibited by TKIs. Here, we review the use of TKIs in cancer and the cardiotoxicities that have been reported, discuss potential mechanisms underlying cardiotoxicity, and describe recent progress in achieving a more systematic understanding of cardiotoxicity via the use of mechanistic models. In particular, we argue that future advances are likely to be enabled by studies that combine large-scale experimental measurements with Quantitative Systems Pharmacology (QSP) models describing biological mechanisms and dynamics. As such approaches have proven extremely valuable for understanding and predicting other drug toxicities, it is likely that QSP modeling can be successfully applied to cardiotoxicity induced by TKIs. We conclude by discussing a potential strategy for integrating genome-wide expression measurements with models, illustrate initial advances in applying this approach to cardiotoxicity, and describe challenges that must be overcome to truly develop a mechanistic and systematic understanding of cardiotoxicity caused by TKIs.

  11. A tissue-engineered gastric cancer model for mechanistic study of anti-tumor drugs

    International Nuclear Information System (INIS)

    Gao, Ming; Cai, Yiting; Wu, Wei; Shi, Yazhou; Fei, Zhewei

    2013-01-01

    The use of the traditional subcutaneous xenograft tumor model has been contested because of its limitations, such as slow tumorigenesis and inconsistent chemotherapeutic results. In light of these challenges, we aim to revamp the traditional model by employing an electrospun scaffold composed of polydioxanone, gelatin, and elastin to boost tumorigenesis. The scaffold featured a highly porous microstructure and successfully supported the growth of tumor cells in vitro without provoking apoptosis. In vivo studies showed that in the scaffold model the tumor volume increased by 43.27% and the weight by 75.58%, respectively, within a 12-week period. In addition, the scaffold model saw increases of CD24+ and CD44+ cells in the tumor mass by 42% and 313%, respectively. The scaffolding materials did not lead to phenotypic changes during tumorigenesis. Thereafter, in the scaffold model, we found that the chemotherapeutic regimen of docetaxel, cisplatin, and fluorouracil was more capable than the regimen of cisplatin and fluorouracil of depleting the CD44+ subpopulation. This discovery sheds mechanistic light on the role of docetaxel for its future chemotherapeutic applications. This revamped model affords cancer scientists a convenient and reliable platform to mechanistically investigate chemotherapeutic drugs against gastric cancer stem cells. (paper)

  12. Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model

    Science.gov (United States)

    Shuard, Adrian M.; Mahmud, Hisham B.; King, Andrew J.

    2016-03-01

    Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of vast importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver interFoam is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations have been performed at three separate pipe inclinations of 0°, +10°, and -10°. A three-dimensional pipe of 0.052 m diameter and 4 m length is used with the Shear Stress Transport (SST) k-ω turbulence model to resolve the turbulent mixture of air and water. Results show that the flow pattern behaviour and the numerical values of liquid holdup and pressure drop compare reasonably well with the mechanistic model.

  13. A rigorous mechanistic model for predicting gas hydrate formation kinetics: The case of CO2 recovery and sequestration

    International Nuclear Information System (INIS)

    ZareNezhad, Bahman; Mottahedin, Mona

    2012-01-01

    Highlights: ► A mechanistic model for predicting gas hydrate formation kinetics is presented. ► A secondary nucleation rate model is proposed for the first time. ► Crystal–crystal collisions and crystal–impeller collisions are distinguished. ► Simultaneous determination of nucleation and growth kinetics is established. ► Important for design of gas hydrate based energy storage and CO2 recovery systems. - Abstract: A rigorous mechanistic model for predicting gas hydrate formation crystallization kinetics is presented, and the special case of CO2 gas hydrate formation regarding CO2 recovery and sequestration processes has been investigated using the proposed model. A physical model for the prediction of the secondary nucleation rate is proposed for the first time, and the formation rates of secondary nuclei by crystal–crystal collisions and crystal–impeller collisions are formulated. The objective functions for simultaneous determination of nucleation and growth kinetics are presented, and a theoretical framework for predicting the dynamic behavior of gas hydrate formation is established. Predicted time variations of CO2 content and of the total number and surface area of produced hydrate crystals are in good agreement with the available experimental data. The proposed approach can have considerable application in the design of gas hydrate converters for energy storage and CO2 recovery processes.
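    As a hedged illustration of the kind of kinetics such a model couples, the sketch below integrates standard method-of-moments equations for a crystal population with primary nucleation, size-independent growth, and a secondary-nucleation term proportional to crystal surface area (a common stand-in for crystal-crystal and crystal-impeller collision effects). The functional forms and all rate constants are illustrative, not the authors' fitted model.

```python
def simulate_crystallization(b_primary, k_secondary, growth_rate, t_end, dt=0.01):
    """Integrate moment equations for a crystal size distribution:
    mu_k = k-th moment of the distribution.
      d(mu0)/dt = B_primary + B_secondary          (nucleation adds crystals)
      d(mu_k)/dt = k * G * mu_{k-1}                (growth shifts the distribution)
    Secondary nucleation is taken proportional to mu2 (crystal surface area),
    an illustrative simplification of collision-driven nucleation."""
    mu = [0.0, 0.0, 0.0, 0.0]   # moments mu0..mu3, initially no crystals
    t = 0.0
    while t < t_end:
        b_total = b_primary + k_secondary * mu[2]
        dmu = [b_total,
               growth_rate * mu[0],
               2 * growth_rate * mu[1],
               3 * growth_rate * mu[2]]
        mu = [m + dt * d for m, d in zip(mu, dmu)]
        t += dt
    return mu  # mu0: number, mu1: total length, mu2 ~ area, mu3 ~ volume
```

    Comparing runs with and without the secondary term shows how collision-driven nucleation accelerates the growth of the crystal population.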

  14. A mechanistic modeling and data assimilation framework for Mojave Desert ecohydrology

    Science.gov (United States)

    Ng, Gene-Hua Crystal; Bedford, David R.; Miller, David M.

    2014-06-01

    This study demonstrates and addresses challenges in coupled ecohydrological modeling in deserts, which arise due to unique plant adaptations, marginal growing conditions, slow net primary production rates, and highly variable rainfall. We consider model uncertainty from both structural and parameter errors and present a mechanistic model for the shrub Larrea tridentata (creosote bush) under conditions found in the Mojave National Preserve in southeastern California (USA). Desert-specific plant and soil features are incorporated into the CLM-CN model by Oleson et al. (2010). We then develop a data assimilation framework using the ensemble Kalman filter (EnKF) to estimate model parameters based on soil moisture and leaf-area index observations. A new implementation procedure, the "multisite loop EnKF," tackles parameter estimation difficulties found to affect desert ecohydrological applications. Specifically, the procedure iterates through data from various observation sites to alleviate adverse filter impacts from non-Gaussianity in small desert vegetation state values. It also readjusts inconsistent parameters and states through a model spin-up step that accounts for longer dynamical time scales due to infrequent rainfall in deserts. Observation error variance inflation may also be needed to help prevent divergence of estimates from true values. Synthetic test results highlight the importance of adequate observations for reducing model uncertainty, which can be achieved through data quality or quantity.
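    The parameter-update step of the ensemble Kalman filter used above can be sketched as a stochastic ("perturbed observations") EnKF analysis. The function below is a minimal generic implementation, not the authors' multisite loop EnKF; the variable names and the toy test model are assumptions.

```python
import numpy as np

def enkf_parameter_update(params, predictions, obs, obs_var, rng):
    """One EnKF analysis step for parameter estimation.
    params:      (n_ens, n_par) ensemble of parameter vectors
    predictions: (n_ens, n_obs) model-predicted observations per member
    obs:         (n_obs,) observed values (e.g. soil moisture, LAI)
    obs_var:     observation error variance (inflating it can help
                 prevent filter divergence, as noted in the abstract)"""
    n_ens = params.shape[0]
    p_anom = params - params.mean(axis=0)
    y_anom = predictions - predictions.mean(axis=0)
    cov_py = p_anom.T @ y_anom / (n_ens - 1)                 # param-obs covariance
    cov_yy = y_anom.T @ y_anom / (n_ens - 1) + obs_var * np.eye(len(obs))
    gain = cov_py @ np.linalg.inv(cov_yy)                    # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=(n_ens, len(obs)))
    return params + (perturbed - predictions) @ gain.T
```

    On a toy identity model (prediction equals the parameter), the update pulls the ensemble mean toward the observation and shrinks the ensemble spread.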

  16. A mechanistic model for the evolution of multicellularity

    Science.gov (United States)

    Amado, André; Batista, Carlos; Campos, Paulo R. A.

    2018-02-01

    Through a mechanistic approach, we investigate the formation of aggregates of variable sizes, accounting for the mechanisms of aggregation, dissociation, death and reproduction. In our model, cells can produce two metabolites, but the simultaneous production of both metabolites is costly in terms of fitness. Thus, the formation of larger groups can favor the aggregates evolving to a configuration where division of labor arises. It is assumed that the states of the cells in a group are those that maximize organismal fitness. The model considers that groups can grow linearly, forming a chain, or compactly, keeping a roughly spherical shape. Starting from a population consisting of single-celled organisms, we observe the formation of groups of variable sizes, usually much larger than two-cell aggregates. Natural selection can favor the formation of large groups, which allows the system to achieve new and larger fitness maxima.

  17. A mechanistic model for electricity consumption on dairy farms: definition, validation, and demonstration.

    Science.gov (United States)

    Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using 1 yr of empirical data from commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% when moving from a day-and-night tariff to a flat tariff.
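    The validation statistics quoted above, RPE and the share of mean square prediction error due to random variation, can be computed from their standard definitions. This is a generic sketch (not the authors' code), with the MSPE split in the usual mean-bias / slope-bias / random form.

```python
import statistics

def relative_prediction_error(observed, predicted):
    """RPE (%): root mean square prediction error relative to the observed mean."""
    n = len(observed)
    mspe = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    return 100.0 * mspe ** 0.5 / statistics.mean(observed)

def mspe_decomposition(observed, predicted):
    """Split MSPE into mean-bias, slope-bias and random fractions
    (population standard deviations; fractions sum to 1)."""
    n = len(observed)
    mo, mp = statistics.mean(observed), statistics.mean(predicted)
    so, sp = statistics.pstdev(observed), statistics.pstdev(predicted)
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted)) / n
    r = cov / (so * sp)                      # correlation coefficient
    mean_bias = (mo - mp) ** 2               # error in central tendency
    slope_bias = (sp - r * so) ** 2          # error due to regression slope
    random_part = (1 - r * r) * so * so      # random, unexplained variation
    total = mean_bias + slope_bias + random_part
    return mean_bias / total, slope_bias / total, random_part / total
```

    A large random fraction, as reported above for total electricity consumption, indicates the remaining error is mostly noise rather than systematic model bias.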

  18. New web-based applications for mechanistic case diagramming

    Directory of Open Access Journals (Sweden)

    Fred R. Dee

    2014-07-01

    The goal of mechanistic case diagramming (MCD) is to provide students with a more in-depth understanding of cause-and-effect relationships and basic mechanistic pathways in medicine. This will enable them to better explain how observed clinical findings develop from preceding pathogenic and pathophysiological events. The pedagogic function of MCD is in relating risk factors, disease entities and morphology, signs and symptoms, and test and procedure findings in a specific case scenario with etiologic, pathogenic and pathophysiological sequences within a flow diagram. In this paper, we describe the addition of automation and predetermined lists to further develop the original concept of MCD as described by Engelberg in 1992 and Guerrero in 2001. We demonstrate that with these modifications, MCD is effective and efficient in small-group case-based teaching for second-year medical students (ratings of ~3.4 on a 4.0 scale). There was also a significant correlation with other measures of competency, with a ‘true’ score correlation of 0.54. A traditional calculation of reliability showed promising results (α = 0.47) within a low-stakes, ungraded environment. Further, we have demonstrated MCD's potential for use in independent learning and TBL. Future studies are needed to evaluate MCD's potential for use in medium-stakes assessment or self-paced independent learning and assessment. MCD may be especially relevant in returning students to the application of basic medical science mechanisms in the clinical years.

  19. Mechanistic modelling of the drying behaviour of single pharmaceutical granules

    DEFF Research Database (Denmark)

    Thérèse F.C. Mortier, Séverine; Beer, Thomas De; Gernaey, Krist

    2012-01-01

    The trend to move towards continuous production processes in pharmaceutical applications enhances the necessity to develop mechanistic models to understand and control these processes. This work focuses on the drying behaviour of a single wet granule before tabletting, using a six...... phase (submodel 2), the water inside the granule evaporates. The second submodel contains an empirical power coefficient, b. A sensitivity analysis was performed to study the influence of parameters on the moisture content of single pharmaceutical granules, which clearly points towards the importance...

  20. A Mechanistic Beta-Binomial Probability Model for mRNA Sequencing Data.

    Science.gov (United States)

    Smith, Gregory R; Birtwistle, Marc R

    2016-01-01

    A main application for mRNA sequencing (mRNAseq) is determining lists of differentially-expressed genes (DEGs) between two or more conditions. Several software packages exist to produce DEGs from mRNAseq data, but they typically yield different DEGs, sometimes markedly so. The underlying probability model used to describe mRNAseq data is central to deriving DEGs, and not surprisingly most software packages use different models and assumptions to analyze mRNAseq data. Here, we propose a mechanistic justification to model mRNAseq as a binomial process, with data from technical replicates given by a binomial distribution, and data from biological replicates well-described by a beta-binomial distribution. We demonstrate good agreement of this model with two large datasets. We show that an emergent feature of the beta-binomial distribution, given parameter regimes typical for mRNAseq experiments, is the well-known quadratic polynomial scaling of variance with the mean. The so-called dispersion parameter controls this scaling, and our analysis suggests that the dispersion parameter is a continually decreasing function of the mean, as opposed to current approaches that impose an asymptotic value on the dispersion parameter at moderate mean read counts. We show how this leads to current approaches overestimating variance for moderately to highly expressed genes, which inflates false negative rates. Describing mRNAseq data with a beta-binomial distribution thus may be preferred since its parameters are relatable to the mechanistic underpinnings of the technique and may improve the consistency of DEG analysis across software packages, particularly for moderately to highly expressed genes.
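    A minimal sketch of the proposed model: biological-replicate counts follow a beta-binomial, whose variance n·p·(1-p)·(1+(n-1)·ρ) exceeds the binomial variance by a factor controlled by the dispersion ρ, which yields the quadratic mean-variance scaling noted above. Function names and parameter values are illustrative.

```python
import numpy as np

def beta_binomial_sample(n, a, b, size, rng):
    """Draw beta-binomial read counts: p ~ Beta(a, b), k ~ Binomial(n, p)."""
    p = rng.beta(a, b, size)
    return rng.binomial(n, p)

def beta_binomial_moments(n, a, b):
    """Analytical mean and variance of the beta-binomial distribution."""
    p = a / (a + b)                      # mean success probability
    rho = 1.0 / (a + b + 1.0)            # dispersion parameter
    mean = n * p
    var = n * p * (1 - p) * (1 + (n - 1) * rho)  # overdispersed vs binomial
    return mean, var
```

    For fixed p the variance grows roughly like mean + constant·mean², the quadratic polynomial scaling; ρ → 0 recovers the plain binomial appropriate for technical replicates.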

  1. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    International Nuclear Information System (INIS)

    Devau, Nicolas; Cadre, Edith Le; Hinsinger, Philippe; Jaillard, Benoit; Gerard, Frederic

    2009-01-01

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl2 (P-CaCl2) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl2 with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling to understand soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional contribution of soil organic matter.

  2. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Devau, Nicolas [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Cadre, Edith Le [Supagro, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Hinsinger, Philippe; Jaillard, Benoit [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Gerard, Frederic, E-mail: gerard@supagro.inra.fr [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France)

    2009-11-15

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl2 (P-CaCl2) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl2 with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling to understand soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional contribution of soil organic matter.

  3. Development and validation of deterioration models for concrete bridge decks - phase 2 : mechanics-based degradation models.

    Science.gov (United States)

    2013-06-01

    This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...

  4. Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model

    International Nuclear Information System (INIS)

    Shuard, Adrian M; Mahmud, Hisham B; King, Andrew J

    2016-01-01

    Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of vast importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver, interFoam is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations have been performed at three separate pipe inclinations of 0°, +10° and -10° respectively. A three dimensional, 0.052m diameter pipe of 4m length is used with the Shear Stress Transport (SST) k - ω turbulence model to solve the turbulent mixtures of air and water. Results show that the flow pattern behaviour and numerical values of liquid holdup and pressure drop compare reasonably well to the mechanistic model. (paper)

  5. Toward a mechanistic modeling of nitrogen limitation for photosynthesis

    Science.gov (United States)

    Xu, C.; Fisher, R. A.; Travis, B. J.; Wilson, C. J.; McDowell, N. G.

    2011-12-01

    Nitrogen limitation is an important regulator of vegetation growth and the global carbon cycle. Most current ecosystem process models simulate nitrogen effects on photosynthesis based on a prescribed relationship between leaf nitrogen and photosynthesis; however, there is a large amount of variability in this relationship under different light, temperature, nitrogen availability and CO2 conditions, which can affect the reliability of photosynthesis predictions under future climate conditions. To account for the variability in the nitrogen-photosynthesis relationship under different environmental conditions, in this study we developed a mechanistic model of nitrogen limitation for photosynthesis based on nitrogen trade-offs among light absorption, electron transport, carboxylation and carbon sink. Our model shows that the strategy of nitrogen storage allocation, as determined by the trade-off between growth and persistence, is a key factor contributing to the variability in the relationship between leaf nitrogen and photosynthesis. Nitrogen fertilization substantially increases the proportion of nitrogen in storage for coniferous trees but much less so for deciduous trees, suggesting that coniferous trees allocate more nitrogen toward persistence than deciduous trees. CO2 fertilization will cause lower nitrogen allocation to carboxylation but higher nitrogen allocation to storage, which leads to a weaker relationship between leaf nitrogen and maximum photosynthesis rate. Lower radiation will cause higher nitrogen allocation to light absorption and electron transport but less nitrogen allocation to carboxylation and storage, which also leads to a weaker relationship between leaf nitrogen and maximum photosynthesis rate. At the same time, lower growing temperature will cause higher nitrogen allocation to carboxylation but lower allocation to light absorption, electron transport and storage, which leads to a stronger relationship between leaf nitrogen and maximum photosynthesis rate.

  6. Wear-dependent specific coefficients in a mechanistic model for turning of nickel-based superalloy with ceramic tools

    Science.gov (United States)

    López de Lacalle, Luis Norberto; Urbicain Pelayo, Gorka; Fernández-Valdivielso, Asier; Alvarez, Alvaro; González, Haizea

    2017-09-01

    Difficult-to-cut materials such as nickel and titanium alloys are used in the aeronautical industry, the former for their heat-resistant behavior and the latter for their high strength-to-weight ratio. Ceramic tools made of alumina reinforced with SiC whiskers are a choice in turning for roughing and semifinishing workpiece stages. The wear rate is high in the machining of these alloys, and consequently cutting forces tend to increase over the course of an operation. This paper establishes the cutting force relation between workpiece and tool in the turning of such difficult-to-cut alloys by means of a mechanistic cutting force model that considers the tool wear effect. The cutting force model demonstrates the force sensitivity to the cutting engagement parameters (ap, f) when ceramic inserts are used and wear is considered. Wear is introduced through a cutting time factor, which is useful in real conditions given that wear appears quickly in the machining of these alloys. Good accuracy in the cutting force model coefficients is the key issue for an accurate prediction of turning forces, which could be used as a criterion for tool replacement or as input for chatter or other models.
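    A hedged sketch of the kind of wear-dependent mechanistic force law the abstract describes: a specific cutting coefficient that grows with cutting time multiplies a power law in the engagement parameters (ap, f). The functional form and all coefficients below are illustrative assumptions, not the paper's fitted values.

```python
def cutting_force(ap, f, t_cut, k0, p, q, wear_rate):
    """Mechanistic-style tangential cutting force for turning (sketch):
        F = K(t) * ap**p * f**q
    with a time-dependent specific coefficient K(t) = k0 * (1 + wear_rate * t_cut)
    that grows linearly as the tool wears.
    ap: depth of cut (mm), f: feed (mm/rev), t_cut: cutting time (min).
    k0, p, q, wear_rate are placeholder coefficients, not fitted values."""
    return k0 * (1 + wear_rate * t_cut) * ap ** p * f ** q
```

    Monitoring the predicted force growth over cutting time against a threshold is one way such a model could feed a tool-replacement criterion.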

  7. Study of n-Butyl Acrylate Self-Initiation Reaction Experimentally and via Macroscopic Mechanistic Modeling

    Directory of Open Access Journals (Sweden)

    Ahmad Arabi Shamsabadi

    2016-04-01

    This paper presents an experimental study of the self-initiation reaction of n-butyl acrylate (n-BA) in free-radical polymerization. For the first time, the frequency factor and activation energy of the monomer self-initiation reaction are estimated from measurements of n-BA conversion in free-radical homo-polymerization initiated only by the monomer. The estimation was carried out using a macroscopic mechanistic mathematical model of the reactor. In addition to already-known reactions that contribute to the polymerization, the model considers an n-BA self-initiation reaction mechanism that is based on our previous electronic-level first-principles theoretical study of the self-initiation reaction. Reaction rate equations are derived using the method of moments. The reaction-rate parameter estimates obtained from conversion measurements agree well with estimates obtained via our purely theoretical quantum chemical calculations.
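    The macroscopic balance underlying such a model can be sketched as follows, assuming (purely for illustration) a second-order monomer self-initiation rate and the quasi-steady-state approximation for radicals. All rate constants are placeholders, not the estimated n-BA parameters from the paper.

```python
import math

def monomer_conversion(m0, ki, kp, kt, t_end, dt=0.1):
    """Sketch of thermally self-initiated free-radical polymerization:
      initiation rate R_i = ki * [M]**2   (order assumed for illustration)
      quasi-steady radicals [R] = sqrt(R_i / kt)
      propagation consumption -d[M]/dt = kp * [M] * [R]
    Returns conversion x = 1 - [M]/[M]0 at t_end via explicit Euler steps.
    m0 in mol/L; ki, kp, kt are placeholder rate constants."""
    m = m0
    t = 0.0
    while t < t_end:
        radicals = math.sqrt(ki * m * m / kt)   # quasi-steady radical conc.
        m -= dt * kp * m * radicals             # monomer consumed by propagation
        t += dt
    return 1.0 - m / m0
```

    Fitting ki (or its Arrhenius frequency factor and activation energy) so that predicted conversion matches measured conversion is the essence of the estimation strategy the abstract describes.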

  8. Mechanistic movement models to understand epidemic spread.

    Science.gov (United States)

    Fofana, Abdou Moutalab; Hurford, Amy

    2017-05-05

    An overlooked aspect of disease ecology is considering how and why animals come into contact with one another, resulting in disease transmission. Mathematical models of disease spread frequently assume mass-action transmission, justified by stating that susceptible and infectious hosts mix readily, and forgo any detailed description of host movement. Numerous recent studies have recorded, analysed and modelled animal movement. These movement models describe how animals move with respect to resources, conspecifics and previous movement directions, and have been used to understand the conditions for the occurrence and spread of infectious diseases when hosts perform a given type of movement. Here, we summarize the effect of different types of movement on the threshold conditions for disease spread. We identify gaps in the literature and suggest several promising directions for future research. The mechanistic inclusion of movement in epidemic models may be beneficial for the following two reasons. Firstly, the estimation of the transmission coefficient in an epidemic model is possible because animal movement data can be used to estimate the rate of contacts between conspecifics. Secondly, unsuccessful transmission events, where a susceptible host contacts an infectious host but does not become infected, can be quantified. Following an outbreak, this enables disease ecologists to identify 'near misses' and to explore possible alternative epidemic outcomes given shifts in ecological or immunological parameters. This article is part of the themed issue 'Opening the black box: re-examining the ecology and evolution of parasite transmission'. © 2017 The Author(s).
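    The first benefit noted above, building the transmission coefficient from movement-derived quantities, can be sketched with a mass-action SIR model in which beta is the product of a per-pair contact rate and the probability that a contact transmits (so unsuccessful contacts are represented explicitly by 1 - p_transmit). All parameter values are illustrative.

```python
def simulate_sir(s0, i0, contact_rate, p_transmit, gamma, days, dt=0.01):
    """Mass-action SIR with a movement-informed transmission coefficient:
        beta = contact_rate * p_transmit
    contact_rate: contacts per pair per day (estimable from movement data)
    p_transmit:   probability a contact transmits infection
    gamma:        recovery rate (1/infectious period, per day)
    Returns (S, I, R) after `days` of explicit Euler integration."""
    beta = contact_rate * p_transmit
    n = s0 + i0
    s, i, r = float(s0), float(i0), 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # mass-action incidence
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r
```

    The threshold condition is R0 = beta/gamma > 1: with the same contact rate, lowering p_transmit below gamma/contact_rate keeps the outbreak from taking off.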

  9. Mechanistic modelling of genetic and epigenetic events in radiation carcinogenesis

    International Nuclear Information System (INIS)

    Andreev, S. G.; Eidelman, Y. A.; Salnikov, I. V.; Khvostunov, I. K.

    2006-01-01

    Methodological problems arise on the way of radiation carcinogenesis modelling with the incorporation of radiobiological and cancer biology mechanistic data. The results of biophysical modelling of different endpoints [DNA DSB induction, repair, chromosome aberrations (CA) and cell proliferation] are presented and applied to the analysis of RBE-LET relationships for radiation-induced neoplastic transformation (RINT) of C3H/10T1/2 cells in culture. Predicted values for some endpoints correlate well with the data. It is concluded that slowly repaired DSB clusters, as well as some kind of CA, may be initiating events for RINT. As an alternative interpretation, it is possible that DNA damage can induce RINT indirectly via epigenetic process. A hypothetical epigenetic pathway for RINT is discussed. (authors)

  10. Prediction of net hepatic release of glucose using a “hybrid” mechanistic model in ruminants applied to positive energy balance

    OpenAIRE

    Bahloul, Lahlou; Ortigues, Isabelle; Vernet, Jean; Lapierre, Helène; Noziere, Pierre; Sauvant, Daniel

    2013-01-01

    Ruminants depend on hepatic gluconeogenesis to meet most of their metabolic demand for glucose, which relies on the availability of precursors from the diet and on animal requirements (Loncke et al., 2010). Several mechanistic models of the metabolic fate of nutrients across the liver exist that have been parameterized for dairy cows. They cannot be directly used to predict hepatic gluconeogenesis in all types of ruminants in different physiological states. A hybrid mechanistic model of nutrient f...

  11. A Mechanistically Informed User-Friendly Model to Predict Greenhouse Gas (GHG) Fluxes and Carbon Storage from Coastal Wetlands

    Science.gov (United States)

    Abdul-Aziz, O. I.; Ishtiaq, K. S.

    2015-12-01

    We present a user-friendly modeling tool in MS Excel to predict greenhouse gas (GHG) fluxes and estimate potential carbon sequestration from coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical and ecological drivers were first determined by employing a systematic data-analytics method, including a Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. The mechanistic knowledge and understanding were then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes from a subset of climatic, hydrologic and environmental drivers such as photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from Waquoit Bay, MA. The model estimated annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and the estuary. The Excel spreadsheet model is a simple ecological engineering tool for coastal carbon management and for incorporating coastal wetlands into a potential carbon market under a changing climate, sea level and environment. Specifically, the model can help determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
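    Parsimonious power-law models of the kind described above can be fit by ordinary least squares in log space, since log(flux) is linear in the log-drivers. The sketch below is a generic implementation of that idea, not the authors' spreadsheet tool; driver names and values are placeholders.

```python
import numpy as np

def fit_power_law(drivers, flux):
    """Fit flux = a * x1**b1 * x2**b2 * ... by linear least squares in log space.
    drivers: (n_samples, n_drivers) positive values (e.g. PAR, soil temperature)
    flux:    (n_samples,) positive flux values
    Returns (a, b) with b the array of exponents."""
    x = np.log(np.asarray(drivers, dtype=float))
    y = np.log(np.asarray(flux, dtype=float))
    design = np.column_stack([np.ones(len(y)), x])   # intercept = log(a)
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return np.exp(coef[0]), coef[1:]

def predict_power_law(a, b, drivers):
    """Evaluate the fitted power-law model for new driver values."""
    return a * np.prod(np.asarray(drivers, dtype=float) ** b, axis=1)
```

    Upscaling instantaneous predictions to a seasonal carbon budget then amounts to integrating these predicted fluxes over the growing season.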

  12. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    OpenAIRE

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; de Boer, I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical d...

  13. Mechanistic systems modeling to guide drug discovery and development.

    Science.gov (United States)

    Schmidt, Brian J; Papin, Jason A; Musante, Cynthia J

    2013-02-01

    A crucial question that must be addressed in the drug development process is whether the proposed therapeutic target will yield the desired effect in the clinical population. Pharmaceutical and biotechnology companies place a large investment on research and development, long before confirmatory data are available from human trials. Basic science has greatly expanded the computable knowledge of disease processes, both through the generation of large omics data sets and a compendium of studies assessing cellular and systemic responses to physiologic and pathophysiologic stimuli. Given inherent uncertainties in drug development, mechanistic systems models can better inform target selection and the decision process for advancing compounds through preclinical and clinical research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Mechanistic modeling analysis of micro-evolutive responses from a Caenorhabditis elegans population exposed to a radioactive metallic stress

    International Nuclear Information System (INIS)

    Goussen, Benoit

    2013-01-01

    The evolution of toxic effects at a relevant scale is an important challenge for ecosystem protection. Pollutants may impact populations over the long term and represent a new evolutionary force that can add itself to natural selection forces. It is therefore necessary to acquire knowledge on the phenotypic and genetic changes that may appear in populations subjected to stress over several generations. Such multi-generational studies are usually analysed statistically. A mechanistic mathematical model, in contrast, may provide a way to fully understand the impact of pollutants on population dynamics: it allows biological and toxic processes to be integrated into the analysis of eco-toxicological data and the interactions between these processes to be assessed. The aim of this Ph.D. project was to assess the contribution of mechanistic modelling to the analysis of evolutionary experiments involving long-term exposure. To do so, a three-step strategy was developed. First, a multi-generational study was performed to assess the evolution of two populations of the ubiquitous nematode Caenorhabditis elegans, either in control conditions or exposed to 1.1 mM of uranium. Several generations were selected to assess growth, reproduction, and dose-response relationships through exposure to a range of concentrations (from 0 to 1.2 mM U), with all endpoints measured daily. A first statistical analysis was then performed. In a second step, a bio-energetic model adapted to the assessment of eco-toxicological data (DEBtox) was developed for C. elegans and its numerical behaviour was analysed. Finally, this model was applied to all the selected generations in order to infer parameter values for the two populations and to assess their evolution. Results highlighted an impact of uranium starting from 0.4 mM U on both C. elegans growth and reproduction. 
Results from the mechanistic analysis indicate this effect is due

  15. Multiscale mechanistic modeling in pharmaceutical research and development.

    Science.gov (United States)

    Kuepfer, Lars; Lippert, Jörg; Eissing, Thomas

    2012-01-01

    Discontinuation of drug development projects due to lack of efficacy or adverse events is one of the main cost drivers in pharmaceutical research and development (R&D). Investments have to be written off and contribute to the total costs of a successful drug candidate receiving marketing authorization and allowing a return on investment. A major risk for pharmaceutical innovator companies is late-stage clinical failure, since costs for individual clinical trials may exceed the one-billion-Euro threshold. To guide investment decisions and to safeguard maximum medical benefit and safety for patients recruited in clinical trials, it is therefore essential to understand the clinical consequences of all information and data generated. The complexity of the physiological and pathophysiological processes and the sheer amount of information available exceed the mental capacity of any human being and prevent a prediction of success in clinical development. A rigorous integration of knowledge, assumptions, and experimental data into computational models promises a significant improvement in the rationalization of decision making in the pharmaceutical industry. We here give an overview of the current status of modeling and simulation in pharmaceutical R&D and outline the perspectives of more recent developments in mechanistic modeling. Specific modeling approaches for different biological scales, ranging from intracellular processes to whole-organism physiology, are introduced, and an example of integrative multiscale modeling of therapeutic efficiency in clinical oncology trials is showcased.

  16. A probabilistic model-based soft sensor to monitor lactic acid bacteria fermentations

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2018-01-01

    A probabilistic soft sensor based on a mechanistic model was designed to monitor S. thermophilus fermentations, and validated with experimental lab-scale data. It considered uncertainties in the initial conditions, on-line measurements, and model parameters by performing Monte Carlo simulations...... the model parameters that were then used as input to the mechanistic model. The soft sensor predicted both the current state variables, as well as the future course of the fermentation, e.g. with a relative mean error of the biomass concentration of 8 %. This successful implementation of a process...... within the monitoring system. It predicted, therefore, the probability distributions of the unmeasured states, such as biomass, lactose, and lactic acid concentrations. To this end, a mechanistic model was developed first, and a statistical parameter estimation was performed in order to assess parameter...
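    The Monte Carlo strategy described in this record can be illustrated with a toy example: propagate uncertain parameters and initial conditions of a simple logistic growth model through the simulation and report percentile bands for the unmeasured biomass state. The equations and values here are illustrative stand-ins, not the authors' S. thermophilus model:

```python
import numpy as np

# Toy probabilistic soft sensor: Monte Carlo propagation of uncertain
# parameters through a logistic biomass growth model.
rng = np.random.default_rng(1)
n_mc, t_end, dt = 500, 10.0, 0.1
mu = rng.normal(0.4, 0.04, n_mc)       # uncertain specific growth rate (1/h)
x0 = rng.normal(0.1, 0.01, n_mc)       # uncertain initial biomass (g/L)
x_max = 5.0                            # carrying capacity (g/L)

t = np.arange(0.0, t_end + dt, dt)
x = np.tile(x0, (t.size, 1))           # one trajectory per Monte Carlo sample
for i in range(1, t.size):             # explicit Euler time stepping
    x[i] = x[i - 1] + dt * mu * x[i - 1] * (1 - x[i - 1] / x_max)

# Predicted probability distribution of the unmeasured biomass state at t_end
lo, med, hi = np.percentile(x[-1], [5, 50, 95])
```

    Reporting percentiles rather than a single trajectory is what makes the sensor "probabilistic": the spread reflects the parameter and initial-condition uncertainty fed into the simulation.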

  17. Pathophysiology of white-nose syndrome in bats: a mechanistic model linking wing damage to mortality.

    Science.gov (United States)

    Warnecke, Lisa; Turner, James M; Bollinger, Trent K; Misra, Vikram; Cryan, Paul M; Blehert, David S; Wibbelt, Gudrun; Willis, Craig K R

    2013-08-23

    White-nose syndrome is devastating North American bat populations but we lack basic information on disease mechanisms. Altered blood physiology owing to epidermal invasion by the fungal pathogen Geomyces destructans (Gd) has been hypothesized as a cause of disrupted torpor patterns of affected hibernating bats, leading to mortality. Here, we present data on blood electrolyte concentration, haematology and acid-base balance of hibernating little brown bats, Myotis lucifugus, following experimental inoculation with Gd. Compared with controls, infected bats showed electrolyte depletion (i.e. lower plasma sodium), changes in haematology (i.e. increased haematocrit and decreased glucose) and disrupted acid-base balance (i.e. lower CO2 partial pressure and bicarbonate). These findings indicate hypotonic dehydration, hypovolaemia and metabolic acidosis. We propose a mechanistic model linking tissue damage to altered homeostasis and morbidity/mortality.

  18. Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing

    Science.gov (United States)

    Smith, James A.; Blattner, Tim; Messmer, Peter

    2010-01-01

    The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to modeling migration behaviour to understand the impact of these changes on migratory bird populations: models based on causal processes and their response to environmental stimulation, i.e. "mechanistic models", or models based primarily on observed animal distribution patterns and the correlation of these patterns with environmental variables, i.e. "data-driven" models. Investigators have applied the latter technique to forecast changes in migration patterns under environmental change, for example as might be expected under climate change, by projecting how the underlying environmental data layers upon which the relationships are built will change over time; the learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e. the organism may adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism, as well as the underlying changes in the landscape, offer an alternative that addresses these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of mechanistic models to predict how continental bird migration patterns may change in response to environmental change. In earlier work, we simulated the impact of effects of wetland loss and inter-annual variability on the fitness of

  19. Toward a mechanistic modeling of nitrogen limitation on vegetation dynamics.

    Science.gov (United States)

    Xu, Chonggang; Fisher, Rosie; Wullschleger, Stan D; Wilson, Cathy J; Cai, Michael; McDowell, Nate G

    2012-01-01

    Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO(2) concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO(2) concentration, temperature, and radiation when evaluated against published data of V(c,max) (maximum carboxylation rate) and J(max) (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO(2) concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. 
We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the vegetation

  20. Toward a Mechanistic Modeling of Nitrogen Limitation on Vegetation Dynamics

    Science.gov (United States)

    Xu, Chonggang; Fisher, Rosie; Wullschleger, Stan D.; Wilson, Cathy J.; Cai, Michael; McDowell, Nate G.

    2012-01-01

    Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO2 concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO2 concentration, temperature, and radiation when evaluated against published data of Vc,max (maximum carboxylation rate) and Jmax (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO2 concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. 
We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the vegetation feedbacks

  1. Toward a mechanistic modeling of nitrogen limitation on vegetation dynamics.

    Directory of Open Access Journals (Sweden)

    Chonggang Xu

    Full Text Available Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO(2) concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO(2) concentration, temperature, and radiation when evaluated against published data of V(c,max) (maximum carboxylation rate) and J(max) (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO(2) concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions. 
We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the
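    The allocation trade-off at the heart of this model can be illustrated with a toy co-limitation calculation: split a fixed leaf nitrogen budget between carboxylation and light capture and find the fraction that maximizes the co-limited rate. All coefficients below are made up for illustration and are not the authors' parameterization:

```python
import numpy as np

# Toy co-limitation trade-off: photosynthesis is limited by the smaller of
# the two capacities that the nitrogen budget buys.
n_total = 2.0                              # leaf nitrogen budget (g N/m2)
f = np.linspace(0.01, 0.99, 99)            # fraction allocated to carboxylation
v_cmax = 50.0 * f * n_total                # carboxylation capacity ~ Rubisco N
j_max = 80.0 * (1 - f) * n_total           # electron transport ~ light-capture N
a_net = np.minimum(v_cmax, 0.5 * j_max)    # co-limited assimilation proxy
best_fraction = f[np.argmax(a_net)]        # optimal allocation under trade-off
```

    The optimum sits where the two capacities balance; shifting either coefficient (e.g., a temperature effect on light-capture nitrogen cost) moves the optimal fraction, which is the qualitative behaviour the abstract describes.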

  2. Development of a mechanistic model for release of radionuclides from spent fuel in brines: Salt Repository Project

    International Nuclear Information System (INIS)

    Reimus, P.W.; Windisch, C.F.

    1988-03-01

    At present there are no comprehensive mechanistic models describing the release of radionuclides from spent fuel in brine environments. This report provides a comprehensive review of the various factors that can affect radionuclide release from spent fuel, suggests a modeling approach, and discusses proposed experiments for obtaining a better mechanistic understanding of the radionuclide release processes. Factors affecting radionuclide release include the amount, location, and disposition of radionuclides in the fuel and environmental factors such as redox potential, pH, the presence of complexing anions, temperature, and radiolysis. It is concluded that a model describing the release of radionuclides from spent fuel should contain separate terms for release from the gap, grain boundaries, and grains of the fuel. Possible functional forms for these terms are discussed in the report. Experiments for assessing their validity and obtaining key model parameters are proposed. 71 refs., 4 figs., 6 tabs
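    One functional form consistent with the three-term structure suggested in this report is an instantaneous gap release plus first-order grain-boundary and (slower) grain/matrix terms. The fractions and rate constants below are hypothetical placeholders, not values from the report:

```python
import math

# Hypothetical three-term fractional release model: gap + grain boundaries +
# grains (matrix). All parameter values are illustrative placeholders.
def fractional_release(t, f_gap=0.02, f_gb=0.05, k_gb=0.5, k_grain=0.01):
    gap = f_gap                                         # released on brine contact
    gb = f_gb * (1.0 - math.exp(-k_gb * t))             # grain-boundary inventory
    grain = (1.0 - f_gap - f_gb) * (1.0 - math.exp(-k_grain * t))  # slow matrix term
    return gap + gb + grain
```

    The separate terms let experiments constrain each inventory independently, which is the rationale the report gives for proposing this structure.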

  3. Drug-disease modeling in the pharmaceutical industry - where mechanistic systems pharmacology and statistical pharmacometrics meet.

    Science.gov (United States)

    Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald

    2017-11-15

    Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. MECHANISTIC KINETIC MODELS FOR STEAM REFORMING OF CONCENTRATED CRUDE ETHANOL ON NI/AL2O3 CATALYST

    Directory of Open Access Journals (Sweden)

    O. A. OLAFADEHAN

    2015-05-01

    Full Text Available Mechanistic kinetic models were postulated for the catalytic steam reforming of concentrated crude ethanol on a Ni-based commercial catalyst at atmospheric pressure, in the temperature range 673-863 K and at different ratios of catalyst weight to crude ethanol molar flow rate (in the range 0.9645-9.6451 kg catalyst h/kg mole crude ethanol), in a stainless steel packed-bed tubular microreactor. The models were based on Langmuir-Hinshelwood-Hougen-Watson (LHHW) and Eley-Rideal (ER) mechanisms. The optimization routine of the Nelder-Mead simplex algorithm was used to estimate the inherent kinetic parameters in the proposed models. The selection of the best kinetic model amongst the rival kinetic models was based on physicochemical, statistical and thermodynamic scrutiny. The rate-determining step for the steam reforming of concentrated crude ethanol on Ni/Al2O3 catalyst was found to be the surface reaction between chemisorbed CH3O and O when hydrogen and oxygen were adsorbed as monomolecular species on the catalyst surface. Excellent agreement was obtained between the experimental rate of reaction and conversion of crude ethanol and the simulated results, with ADD% being ±0.46.
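    The parameter-estimation step can be sketched with a generic LHHW-type rate expression fitted by least squares using the Nelder-Mead simplex algorithm; the rate form, data, and constants below are illustrative, not the paper's ethanol-reforming kinetics:

```python
import numpy as np
from scipy.optimize import minimize

# Toy LHHW-type rate r = k*K*p / (1 + K*p)^2 fitted by Nelder-Mead.
rng = np.random.default_rng(2)
p = np.linspace(0.1, 2.0, 25)                        # reactant partial pressure
k_true, K_true = 3.0, 1.5
r_obs = k_true * K_true * p / (1 + K_true * p)**2
r_obs = r_obs * (1 + rng.normal(0.0, 0.01, p.size))  # 1% measurement noise

def sse(theta):
    """Sum of squared errors between model and observed rates."""
    k, K = theta
    if k <= 0 or K <= 0:                             # reject non-physical values
        return 1e9
    r = k * K * p / (1 + K * p)**2
    return float(np.sum((r - r_obs)**2))

fit = minimize(sse, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, K_hat = fit.x
```

    Nelder-Mead is derivative-free, which suits LHHW/ER rate expressions whose gradients with respect to adsorption constants are awkward to supply analytically.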

  5. Growth and lipid production of Umbelopsis isabellina on a solid substrate - Mechanistic modeling and validation

    NARCIS (Netherlands)

    Meeuwse, P.; Klok, A.J.; Haemers, S.; Tramper, J.; Rinzema, A.

    2012-01-01

    Microbial lipids are an interesting feedstock for biodiesel. Their production from agricultural waste streams by fungi cultivated in solid-state fermentation may be attractive, but the yield of this process is still quite low. In this article, a mechanistic model is presented that describes growth,

  6. Phenomenological and mechanistic modeling of melt-structure-water interactions in a light water reactor severe accident

    International Nuclear Information System (INIS)

    Bui, V.A.

    1998-01-01

    The objective of this work is to address the modeling of the thermal hydrodynamic phenomena and interactions occurring during the progression of reactor severe accidents. Integrated phenomenological models are developed to describe the accident scenarios, which consist of many processes, while mechanistic modeling, including direct numerical simulation, is carried out to describe separate effects and selected physical phenomena of particular importance

  7. In silico, experimental, mechanistic model for extended-release felodipine disposition exhibiting complex absorption and a highly variable food interaction.

    Directory of Open Access Journals (Sweden)

    Sean H J Kim

    Full Text Available The objective of this study was to develop and explore new, in silico experimental methods for deciphering complex, highly variable absorption and food interaction pharmacokinetics observed for a modified-release drug product. Toward that aim, we constructed an executable software analog of study participants to whom product was administered orally. The analog is an object- and agent-oriented, discrete event system, which consists of grid spaces and event mechanisms that map abstractly to different physiological features and processes. Analog mechanisms were made sufficiently complicated to achieve prespecified similarity criteria. An equation-based gastrointestinal transit model with nonlinear mixed effects analysis provided a standard for comparison. Subject-specific parameterizations enabled each executed analog's plasma profile to mimic features of the corresponding six individual pairs of subject plasma profiles. All achieved prespecified, quantitative similarity criteria, and outperformed the gastrointestinal transit model estimations. We observed important subject-specific interactions within the simulation and mechanistic differences between the two models. We hypothesize that mechanisms, events, and their causes occurring during simulations had counterparts within the food interaction study: they are working, evolvable, concrete theories of dynamic interactions occurring within individual subjects. The approach presented provides new, experimental strategies for unraveling the mechanistic basis of complex pharmacological interactions and observed variability.

  8. Global scale analysis and evaluation of an improved mechanistic representation of plant nitrogen and carbon dynamics in the Community Land Model (CLM)

    Science.gov (United States)

    Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.

    2014-12-01

    In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, a mechanistic representation of nitrogen uptake linked to root traits, and of functional nitrogen allocation among the leaf enzymes involved in respiration and photosynthesis, is currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is represented simplistically by potential photosynthesis rates, which are subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This type of potential photosynthesis rate calculation is problematic for several reasons. First, plants do not photosynthesize at potential rates and then downregulate. Second, there is considerable subjectivity in the meaning of potential photosynthesis rates. Third, it is poorly understood how such potential rates should be modeled in a changing climate. In addition to these structural issues in representing photosynthesis rates, the role of plant roots in nutrient acquisition has been largely ignored in Earth System models. For example, in CLM4.5, nitrogen uptake is linked to leaf-level processes (e.g., primary productivity) rather than to the root-scale processes involved in nitrogen uptake. We present a new plant model for CLM with an improved mechanistic representation of plant nitrogen uptake based on root-scale Michaelis-Menten kinetics, and stronger linkages between leaf nitrogen and plant productivity inferred from relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate an improved representation of plant nitrogen leaf allocation, especially in tropical regions, where CLM4.5 simulations significantly over-predict plant growth and productivity. We evaluate our improved global model simulations using the International Land Model Benchmarking (ILAMB) framework. We conclude that
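    The root-scale Michaelis-Menten uptake formulation mentioned in this record has the standard saturating form; a minimal sketch with illustrative (non-CLM) parameter values:

```python
# Standard Michaelis-Menten (saturating) uptake form; v_max and k_m values
# are illustrative, not CLM parameters.
def mm_uptake(c_n, v_max=5.0, k_m=0.5):
    """Root nitrogen uptake rate at soil mineral N concentration c_n."""
    return v_max * c_n / (k_m + c_n)

half_saturation_rate = mm_uptake(0.5)   # equals v_max/2 at c_n = k_m
```

    The saturating form caps uptake at high soil nitrogen, unlike a demand-driven downregulation scheme in which uptake tracks leaf-level productivity.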

  9. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
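    The contrast ABM draws with homogeneous population models can be shown with a minimal agent sketch in which individuals carry discrete states and update stochastically; the states, probabilities, and update rule are purely illustrative:

```python
import random

# Minimal agent-based sketch: individuals carry discrete infection states and
# interact stochastically; population dynamics emerge from per-agent rules.
random.seed(3)
agents = ["infected"] * 5 + ["healthy"] * 95

def step(agents, p_transmit=0.1):
    n_inf = agents.count("infected")
    nxt = []
    for state in agents:
        # healthy agents convert with probability scaled by infected fraction
        if state == "healthy" and random.random() < p_transmit * n_inf / len(agents):
            nxt.append("infected")
        else:
            nxt.append(state)
    return nxt

for _ in range(50):
    agents = step(agents)
final_infected = agents.count("infected")
```

    Because each agent is tracked individually, heterogeneity (per-agent parameters, spatial position, history) can be added without changing the model's structure, which is the ABM strength the tutorial highlights.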

  10. Productivity of "collisions generate heat" for reconciling an energy model with mechanistic reasoning: A case study

    Science.gov (United States)

    Scherr, Rachel E.; Robertson, Amy D.

    2015-06-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a byproduct of individual particle collisions, which is represented in science education research literature as an obstacle to learning. We demonstrate that in this instructional context, the idea that individual particle collisions generate thermal energy is not an obstacle to learning, but instead is productive: it initiates intellectual progress. Specifically, this idea initiates the reconciliation of the teachers' energy model with mechanistic reasoning about adiabatic compression, and leads to a canonically correct model of the transformation of kinetic energy into thermal energy. We claim that the idea's productivity is influenced by features of our particular instructional context, including the instructional goals of the course, the culture of collaborative sense making, and the use of certain representations of energy.

  11. Fast Biological Modeling for Voxel-based Heavy Ion Treatment Planning Using the Mechanistic Repair-Misrepair-Fixation Model and Nuclear Fragment Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Kamp, Florian [Department of Therapeutic Radiology, Yale University School of Medicine, New Haven, Connecticut (United States); Department of Radiation Oncology, Technische Universität München, Klinikum Rechts der Isar, München (Germany); Physik-Department, Technische Universität München, Garching (Germany); Cabal, Gonzalo [Experimental Physics–Medical Physics, Ludwig Maximilians University Munich, Garching (Germany); Mairani, Andrea [Medical Physics Unit, Centro Nazionale Adroterapia Oncologica (CNAO), Pavia (Italy); Heidelberg Ion-Beam Therapy Center, Heidelberg (Germany); Parodi, Katia [Experimental Physics–Medical Physics, Ludwig Maximilians University Munich, Garching (Germany); Wilkens, Jan J. [Department of Radiation Oncology, Technische Universität München, Klinikum Rechts der Isar, München (Germany); Physik-Department, Technische Universität München, Garching (Germany); Carlson, David J., E-mail: david.j.carlson@yale.edu [Department of Therapeutic Radiology, Yale University School of Medicine, New Haven, Connecticut (United States)

    2015-11-01

    Purpose: The physical and biological differences between heavy ions and photons have not been fully exploited and could improve treatment outcomes. In carbon ion therapy, treatment planning must account for physical properties, such as the absorbed dose and nuclear fragmentation, and for differences in the relative biological effectiveness (RBE) of ions compared with photons. We combined the mechanistic repair-misrepair-fixation (RMF) model with Monte Carlo-generated fragmentation spectra for biological optimization of carbon ion treatment plans. Methods and Materials: Relative changes in double-strand break yields and radiosensitivity parameters with particle type and energy were determined using the independently benchmarked Monte Carlo damage simulation and the RMF model to estimate the RBE values for primary carbon ions and secondary fragments. Depth-dependent energy spectra were generated with the Monte Carlo code FLUKA for clinically relevant initial carbon ion energies. The predicted trends in RBE were compared with the published experimental data. Biological optimization for carbon ions was implemented in a 3-dimensional research treatment planning tool. Results: We compared the RBE and RBE-weighted dose (RWD) distributions of different carbon ion treatment scenarios with and without nuclear fragments. The inclusion of fragments in the simulations led to smaller RBE predictions. A validation of RMF against measured cell survival data reported in published studies showed reasonable agreement. We calculated and optimized the RWD distributions on patient data and compared the RMF predictions with those from other biological models. The RBE values in an astrocytoma tumor ranged from 2.2 to 4.9 (mean 2.8) for a RWD of 3 Gy(RBE) assuming (α/β)_X = 2 Gy. Conclusions: These studies provide new information to quantify and assess uncertainties in the clinically relevant RBE values for carbon ion therapy based on biophysical mechanisms. We present results from
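    The RBE-weighted dose concept used in this record follows from the linear-quadratic (LQ) framework: find the photon dose producing the same biological effect as the ion dose, and take their ratio as the RBE. A minimal sketch with illustrative LQ parameters (only the photon α/β ratio of 2 Gy is taken from the abstract; the ion parameters are made up):

```python
import math

# Illustrative LQ-based RBE: photon isoeffect dose divided by the ion dose.
# alpha_x/beta_x = 0.1/0.05 gives the alpha/beta ratio of 2 Gy quoted in the
# abstract; the ion-beam parameters are invented for this sketch.
def rbe(d_ion, alpha_ion, beta_ion, alpha_x=0.1, beta_x=0.05):
    effect = alpha_ion * d_ion + beta_ion * d_ion**2          # ion LQ effect
    # positive root of alpha_x*D + beta_x*D^2 = effect
    d_x = (-alpha_x + math.sqrt(alpha_x**2 + 4 * beta_x * effect)) / (2 * beta_x)
    return d_x / d_ion

rwd = rbe(2.0, alpha_ion=0.3, beta_ion=0.05) * 2.0   # RBE-weighted dose, Gy(RBE)
```

    In a mechanistic model such as RMF, the ion α and β would themselves be computed from particle type and energy rather than assumed, which is what couples the fragmentation spectra to the RBE prediction.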

  12. Mechanistic modeling of insecticide risks to breeding birds in ...

    Science.gov (United States)

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At present, USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model that allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides assessed in this study are all used to treat major pests of corn (corn root worm borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (

  13. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  14. Mechanistic modeling of biocorrosion caused by biofilms of sulfate reducing bacteria and acid producing bacteria.

    Science.gov (United States)

    Xu, Dake; Li, Yingchao; Gu, Tingyue

    2016-08-01

    Biocorrosion is also known as microbiologically influenced corrosion (MIC). Most anaerobic MIC cases can be classified into two major types. Type I MIC involves non-oxygen oxidants such as sulfate and nitrate that require biocatalysis for their reduction in the cytoplasm of microbes such as sulfate reducing bacteria (SRB) and nitrate reducing bacteria (NRB). This means that the extracellular electrons from the oxidation of metal such as iron must be transported across cell walls into the cytoplasm. Type II MIC involves oxidants such as protons that are secreted by microbes such as acid producing bacteria (APB). The biofilms in this case supply the locally high concentrations of oxidants that are corrosive without biocatalysis. This work describes a mechanistic model that is based on the biocatalytic cathodic sulfate reduction (BCSR) theory. The model utilizes charge transfer and mass transfer concepts to describe the SRB biocorrosion process. The model also includes a mechanism to describe APB attack based on the local acidic pH at a pit bottom. A pitting prediction software package has been created based on the mechanisms. It predicts long-term pitting rates and worst-case scenarios after calibration using SRB short-term pit depth data. Various parameters can be investigated through computer simulation. Copyright © 2016 Elsevier B.V. All rights reserved.
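
    The charge-transfer/mass-transfer coupling described above can be illustrated with a toy pit-growth estimate: combine the two limiting current densities as series resistances and convert the mixed-control current to a metal penetration rate with Faraday's law. This is an illustrative sketch, not the BCSR pitting software.

```python
F = 96485.0  # Faraday constant, C/mol

def pit_growth_rate(i_ct, i_lim, n_electrons, molar_mass, density):
    """Illustrative MIC pit growth rate (m/s) under mixed control.

    i_ct  -- charge-transfer-limited current density, A/m^2
    i_lim -- mass-transfer-limited current density, A/m^2
    """
    i = 1.0 / (1.0 / i_ct + 1.0 / i_lim)       # series combination of the two limits
    mol_flux = i / (n_electrons * F)            # mol of metal oxidized per m^2 per s
    return mol_flux * molar_mass / density      # penetration rate, m/s
```

    For iron (n = 2, M = 0.055845 kg/mol, ρ = 7870 kg/m³) the mixed-control rate is always below whichever single mechanism would allow on its own, which is the qualitative behaviour a worst-case pitting prediction has to bound.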

  15. Mutual Dependence Between Sedimentary Organic Carbon and Infaunal Macrobenthos Resolved by Mechanistic Modeling

    Science.gov (United States)

    Zhang, Wenyan; Wirtz, Kai

    2017-10-01

    The mutual dependence between sedimentary total organic carbon (TOC) and infaunal macrobenthos is here quantified by a mechanistic model. The model describes (i) the vertical distribution of infaunal macrobenthic biomass resulting from a trade-off between nutritional benefit (quantity and quality of TOC) and the costs of burial (respiration) and mortality, and (ii) the variable vertical distribution of TOC being in turn shaped by bioturbation of local macrobenthos. In contrast to conventional approaches, our model emphasizes variations of bioturbation both spatially and temporally depending on local food resources and macrobenthic biomass. Our implementation of the dynamic interaction between TOC and infaunal macrobenthos is able to capture a temporal benthic response to both depositional and erosional environments and provides improved estimates of the material exchange flux at the sediment-water interface. Applications to literature data for the North Sea demonstrate the robustness and accuracy of the model and its potential as an analysis tool for the status of TOC and macrobenthos in marine sediments. Results indicate that the vertical distribution of infaunal biomass is shaped by both the quantity and the quality of OC, while the community structure is determined only by the quality of OC. Bioturbation intensity may differ by 1 order of magnitude over different seasons owing to variations in the OC input, resulting in a significant modulation on the distribution of OC. Our relatively simple implementation may further improve models of early diagenesis and marine food web dynamics by mechanistically connecting the vertical distribution of both TOC and macrobenthic biomass.

  16. Proceedings of the international workshop on mechanistic understanding of radionuclide migration in compacted/intact systems

    International Nuclear Information System (INIS)

    Tachi, Yukio; Yui, Mikazu

    2010-03-01

    The international workshop on mechanistic understanding of radionuclide migration in compacted / intact systems was held at ENTRY, JAEA, Tokai on 21st - 23rd January, 2009. The workshop was hosted by the Japan Atomic Energy Agency (JAEA) as part of the project on mechanistic model/database development for radionuclide sorption and diffusion behavior in compacted / intact systems. The overall goal of the project is to develop a mechanistic model / database for a consistent understanding and prediction of migration parameters and their uncertainties for performance assessment of geological disposal of radioactive waste. The objective of the workshop was to integrate the state of the art of mechanistic sorption and diffusion models in compacted / intact systems, especially bentonite / clay systems, and to discuss JAEA's mechanistic approaches and future challenges, in particular the following discussion points: 1) What is the status of, and what are the difficulties in, mechanistic model/database development? 2) What is the status of, and what are the difficulties in, applying the mechanistic model to compacted/intact systems? 3) What is the status of, and what are the difficulties in, obtaining evidence for the mechanistic model? 4) What is the status of, and what are the difficulties in, standardizing experimental methodology for batch sorption and diffusion? 5) What are the uncertainties of transport parameters in radionuclide migration analyses arising from gaps in understanding and experimental methodologies, and how do we derive them? This report includes the workshop program, an overview and the materials of each presentation, and a summary of the discussions. (author)

  17. Melanie Klein's metapsychology: phenomenological and mechanistic perspective.

    Science.gov (United States)

    Mackay, N

    1981-01-01

    Freud's metapsychology is the subject of an important debate. This is over whether psychoanalysis is best construed as a science of the natural science type or as a special human science. The same debate applies to Melanie Klein's work. In Klein's metapsychology are two different and incompatible models of explanation. One is taken over from Freud's structural theory and appears to be similarly mechanistic. The other is clinically based and phenomenological. These two are discussed with special reference to the concepts of "phantasy" and "internal object".

  18. Phenomenological and mechanistic modeling of melt-structure-water interactions in a light water reactor severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Bui, V.A

    1998-10-01

    The objective of this work is to address the modeling of the thermal hydrodynamic phenomena and interactions occurring during the progression of reactor severe accidents. Integrated phenomenological models are developed to describe the accident scenarios, which consist of many processes, while mechanistic modeling, including direct numerical simulation, is carried out to describe separate effects and selected physical phenomena of particular importance. 88 refs, 54 figs, 7 tabs

  19. Linking spring phenology with mechanistic models of host movement to predict disease transmission risk

    Science.gov (United States)

    Merkle, Jerod A.; Cross, Paul C.; Scurlock, Brandon M.; Cole, Eric K.; Courtemanch, Alyson B.; Dewey, Sarah R.; Kauffman, Matthew J.

    2018-01-01

    Disease models typically focus on temporal dynamics of infection, while often neglecting environmental processes that determine host movement. In many systems, however, temporal disease dynamics may be slow compared to the scale at which environmental conditions alter host space use and accelerate disease transmission. Using a mechanistic movement modelling approach, we made space-use predictions for a mobile host (elk [Cervus canadensis] carrying the bacterial disease brucellosis) under environmental conditions that change daily and annually (e.g., plant phenology, snow depth), and we used these predictions to infer how spring phenology influences the risk of brucellosis transmission from elk (through aborted foetuses) to livestock in the Greater Yellowstone Ecosystem. Using data from 288 female elk monitored with GPS collars, we fit step selection functions (SSFs) during the spring abortion season and then implemented a master equation approach to translate SSFs into predictions of daily elk distribution for five plausible winter weather scenarios (from a heavy snow year to an extreme winter drought year). We predicted abortion events by combining elk distributions with empirical estimates of daily abortion rates, spatially varying elk seroprevalence and elk population counts. Our results reveal strong spatial variation in disease transmission risk at daily and annual scales that is strongly governed by variation in host movement in response to spring phenology. For example, in comparison with an average snow year, years with early snowmelt are predicted to have 64% of the abortions that would otherwise occur on feedgrounds shift to mainly public lands, and to a lesser extent private lands. Synthesis and applications. Linking mechanistic models of host movement with disease dynamics leads to a novel bridge between movement and disease ecology. Our analysis framework offers new avenues for predicting disease spread, while providing managers tools to proactively mitigate
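
    A step selection function translates habitat covariates into movement probabilities; the redistribution kernel at the heart of such a master-equation approach can be sketched as below. The covariates and coefficients are hypothetical, not the fitted elk SSFs.

```python
import math
import random

def choose_step(candidates, betas, rng):
    """Toy SSF step: each candidate location carries a covariate vector
    (e.g. plant green-up, snow depth); the animal picks one with probability
    proportional to exp(beta . x)."""
    scores = [math.exp(sum(b * x for b, x in zip(betas, c))) for c in candidates]
    total = sum(scores)
    r = rng.random() * total
    acc = 0.0
    for c, s in zip(candidates, scores):
        acc += s
        if r <= acc:
            return c
    return candidates[-1]
```

    Iterating this choice over all animals and days, and weighting occupied locations by abortion rate and seroprevalence, yields a spatial risk surface of the kind the study reports.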

  20. The mechanistic bases of the power-time relationship

    DEFF Research Database (Denmark)

    Vanhatalo, Anni; Black, Matthew I; DiMenna, Fred J

    2016-01-01

    .025) and inversely correlated with muscle type IIx fibre proportion (r = -0.76, P = 0.01). There was no relationship between W' (19.4 ± 6.3 kJ) and muscle fibre type. These data indicate a mechanistic link between the bioenergetic characteristics of different muscle fibre types and the power-duration relationship...

  1. A mechanistic model for long-term nuclear waste glass dissolution integrating chemical affinity and interfacial diffusion barrier

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Teqi [Northwest Institute of Nuclear Technology, No.28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China); Mechanics and Physics of Solids Research Group, Modelling and Simulation Centre, The University of Manchester, Oxford Road, Manchester, M13 9PL (United Kingdom); Jivkov, Andrey P., E-mail: andrey.jivkov@manchester.ac.uk [Mechanics and Physics of Solids Research Group, Modelling and Simulation Centre, The University of Manchester, Oxford Road, Manchester, M13 9PL (United Kingdom); Li, Weiping; Liang, Wei; Wang, Yu; Xu, Hui [Northwest Institute of Nuclear Technology, No.28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China); Han, Xiaoyuan, E-mail: xyhan_nint@sina.cn [Northwest Institute of Nuclear Technology, No.28 Pingyu Road, Baqiao District, Xi'an, Shaanxi, 710024 (China)]

    2017-04-01

    Understanding the alteration of nuclear waste glass under geological repository conditions is a critical element of the analysis of the repository retention function. Experimental observations of glass alteration provide general agreement on the following regimes: inter-diffusion, hydrolysis, rate drop, residual rate and, under very particular conditions, resumption of alteration. Of these, the mechanisms controlling the rate drop and the residual rate remain a subject of dispute. This paper offers a critical review of the two most competitive models for these regimes: affinity-limited dissolution and the diffusion barrier. The limitations of these models are highlighted by comparing their predictions with available experimental evidence. Based on this comprehensive discussion of the existing models, a new mechanistic model is proposed as a combination of the chemical-affinity and diffusion-barrier concepts. It is demonstrated how the model can explain experimental phenomena and data for which the existing models are shown to be not fully adequate.
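
    The combination of chemical affinity and a diffusion barrier can be illustrated by a steady-state sketch in which affinity-limited dissolution at the glass surface feeds diffusion across an interfacial layer; at steady state the two fluxes balance. Symbols and values are hypothetical, not the paper's parameterization.

```python
def dissolution_rate(k, c_sat, diff, layer, c_bulk):
    """Illustrative coupled rate: affinity term k*(1 - Ci/c_sat) at the glass
    surface equals the diffusive flux diff*(Ci - c_bulk)/layer across the
    interfacial barrier; solve for the interface concentration Ci."""
    h = diff / layer                           # barrier mass-transfer coefficient
    c_i = (k + h * c_bulk) / (k / c_sat + h)   # steady-state interface concentration
    return h * (c_i - c_bulk)                  # net dissolution rate
```

    The two limiting regimes drop out directly: a vanishingly thin layer recovers the pure affinity law k*(1 - c_bulk/c_sat), while a saturated bulk solution (c_bulk = c_sat) drives the rate to zero, consistent with the rate-drop behaviour both classical models try to capture.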

  2. Unification and mechanistic detail as drivers of model construction: models of networks in economics and sociology.

    Science.gov (United States)

    Kuorikoski, Jaakko; Marchionni, Caterina

    2014-12-01

    We examine the diversity of strategies of modelling networks in (micro) economics and (analytical) sociology. Field-specific conceptions of what explaining (with) networks amounts to or systematic preference for certain kinds of explanatory factors are not sufficient to account for differences in modelling methodologies. We argue that network models in both sociology and economics are abstract models of network mechanisms and that differences in their modelling strategies derive to a large extent from field-specific conceptions of the way in which a good model should be a general one. Whereas the economics models aim at unification, the sociological models aim at a set of mechanism schemas that are extrapolatable to the extent that the underlying psychological mechanisms are general. These conceptions of generality induce specific biases in mechanistic explanation and are related to different views of when knowledge from different fields should be seen as relevant.

  3. Higher plant modelling for life support applications: first results of a simple mechanistic model

    Science.gov (United States)

    Hezard, Pauline; Dussap, Claude-Gilles; Sasidharan L, Swathy

    2012-07-01

    In closed ecological life support systems, air and water regeneration and food production are performed using microorganisms and higher plants. Wheat, rice, soybean, lettuce, tomato and other edible annual plants produce fresh food while recycling CO2 into breathable oxygen. Additionally, they evaporate a large quantity of water, which can be condensed and used as potable water. This shows that the recycling functions of air revitalization and food production are completely linked. Consequently, the control of a growth chamber for higher plant production has to be performed with efficient mechanistic models, in order to ensure a realistic prediction of plant behaviour and of water and gas recycling whatever the environmental conditions. Purely mechanistic models of plant production in controlled environments are not yet available, which is why new models must be developed and validated. This work concerns the design and test of a simplified version of a mathematical model coupling plant architecture with mass balances, in order to compare its results with available data for lettuce grown in closed and controlled chambers. The carbon exchange rate, water absorption and evaporation rate, and biomass fresh weight as well as leaf surface are modelled and compared with available data. The model consists of four modules. The first evaluates plant architecture, such as total leaf surface, leaf area index and stem length. The second calculates the rates of matter and energy exchange depending on architectural and environmental data: light absorption in the canopy, CO2 uptake or release, water uptake and evapotranspiration. The third module evaluates which of the previous rates limits overall biomass growth; and the last calculates the biomass growth rate from the matter exchange rates, using a global stoichiometric equation. All these rates form a set of differential equations, which are integrated with time in order to provide
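
    The four-module structure maps naturally onto a simulation loop. A deliberately crude sketch (all coefficients hypothetical, not the paper's model) illustrates the architecture, exchange-rate, limiting-rate and stoichiometric-growth chain:

```python
import math

def simulate_growth(days, dt=0.1):
    """Toy four-module plant growth loop; biomass in arbitrary units."""
    biomass = 1.0
    for _ in range(int(round(days / dt))):
        # Module 1: architecture (leaf area scales sub-linearly with biomass)
        leaf_area = 0.05 * biomass ** 0.8
        # Module 2: candidate exchange rates
        light_rate = 2.0 * (1.0 - math.exp(-0.9 * leaf_area))  # Beer-Lambert-like
        water_rate = 1.5 * leaf_area
        # Module 3: the slowest process limits growth
        limiting = min(light_rate, water_rate)
        # Module 4: stoichiometric conversion of the limiting rate into biomass
        biomass += 0.6 * limiting * dt
    return biomass
```

    Swapping module 2 for mechanistic gas- and water-exchange laws, and module 4 for a global stoichiometric equation, gives the structure described in the abstract.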

  4. A dynamic, mechanistic model of metabolism in adipose tissue of lactating dairy cattle.

    Science.gov (United States)

    McNamara, J P; Huber, K; Kenéz, A

    2016-07-01

    Research in dairy cattle biology has resulted in a large body of knowledge on nutrition and metabolism in support of milk production and efficiency. This quantitative knowledge has been compiled in several model systems to balance and evaluate rations and predict requirements. There are also systems models for metabolism and reproduction in the cow that can be used to support research programs. Adipose tissue plays a significant role in the success and efficiency of lactation, and recent research has resulted in several data sets on genomic differences and changes in gene transcription of adipose tissue in dairy cattle. To fully use this knowledge, we need to build and expand mechanistic, dynamic models that integrate control of metabolism and production. Therefore, we constructed a second-generation dynamic, mechanistic model of adipose tissue metabolism of dairy cattle. The model describes the biochemical interconversions of glucose, acetate, β-hydroxybutyrate (BHB), glycerol, C16 fatty acids, and triacylglycerols. Data gathered from our own research and published references were used to set equation forms and parameter values. Acetate, glucose, BHB, and fatty acids are taken up from blood. The fatty acids are activated to the acyl coenzyme A moieties. Enzymatically catalyzed reactions are explicitly described with parameters including maximal velocity and substrate sensitivity. The control of enzyme activity is partially carried out by insulin and norepinephrine, portraying control in the cow. Model behavior was adequate, with sensitive responses to changing substrates and hormones. Increased nutrient uptake and increased insulin stimulate triacylglycerol synthesis, whereas a reduction in nutrient availability or increase in norepinephrine increases triacylglycerol hydrolysis and free fatty acid release to blood. This model can form a basis for more sophisticated integration of existing knowledge and future studies on metabolic efficiency of dairy cattle
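
    The hormone-modulated enzyme kinetics can be sketched with Michaelis-Menten rate laws in which insulin scales triacylglycerol synthesis and norepinephrine scales hydrolysis. This is a toy net-flux function with hypothetical constants, not the published parameter set.

```python
def tag_flux(glucose, fatty_acids, tag, insulin, norepinephrine):
    """Net triacylglycerol accumulation rate (arbitrary units):
    insulin-stimulated synthesis minus norepinephrine-stimulated hydrolysis."""
    v_syn_max, km_glc, km_fa = 1.0, 0.5, 0.3   # hypothetical constants
    v_hyd_max, km_tag = 0.8, 2.0
    synthesis = (v_syn_max * insulin
                 * (glucose / (km_glc + glucose))          # substrate saturation
                 * (fatty_acids / (km_fa + fatty_acids)))
    hydrolysis = v_hyd_max * norepinephrine * tag / (km_tag + tag)
    return synthesis - hydrolysis
```

    The sign of the net flux reproduces the qualitative behaviour reported for the model: nutrient and insulin surpluses drive storage, while norepinephrine (or nutrient scarcity) drives free fatty acid release.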

  5. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The Boiling Transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of Boiling Water Reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of the drift-flux model, the film flow model, the cross-flow model, the thermal conductivity model and the heat transfer correlations. These models were validated systematically against experimental data. The accuracy of the predictions of the steady-state Critical Heat Flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT was evaluated in the validations. Calculations of single-tube and bundle experiments were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the gradient of the fuel rod surface temperature agreed well with the experimental results, but rewetting was predicted later than observed, so the modeling of post-BT heat transfer phenomena is being revised. (author)
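
    Of the models listed, the drift-flux relation is the most compact: the standard Zuber-Findlay form gives the void fraction from the superficial velocities, a distribution parameter C0 and a drift velocity Vgj. The values below are illustrative, not those used in TCAPE-INS/B.

```python
def void_fraction(j_g, j_l, c0=1.13, v_gj=0.2):
    """Zuber-Findlay drift-flux relation: alpha = j_g / (C0*(j_g + j_l) + Vgj).

    j_g, j_l -- superficial gas and liquid velocities, m/s
    c0, v_gj -- distribution parameter and drift velocity (illustrative)
    """
    return j_g / (c0 * (j_g + j_l) + v_gj)
```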

  6. Toxic neuropathies: Mechanistic insights based on a chemical perspective.

    Science.gov (United States)

    LoPachin, Richard M; Gavin, Terrence

    2015-06-02

    2,5-Hexanedione (HD) and acrylamide (ACR) are considered to be prototypical among chemical toxicants that cause central-peripheral axonopathies characterized by distal axon swelling and degeneration. Because the demise of distal regions was assumed to be causally related to the onset of neurotoxicity, substantial effort was devoted to deciphering the respective mechanisms. Continued research, however, revealed that expression of the presumed hallmark morphological features was dependent upon the daily rate of toxicant exposure. Indeed, many studies reported that the corresponding axonopathic changes were late-developing effects that occurred independent of behavioral and/or functional neurotoxicity. This suggested that the toxic axonopathy classification might be based on epiphenomena related to dose-rate. Therefore, the goal of this mini-review is to discuss how quantitative morphometric analyses and the establishment of dose-dependent relationships helped distinguish primary, mechanistically relevant toxicant effects from non-specific consequences. Perhaps more importantly, we will discuss how knowledge of a neurotoxicant's chemical nature can guide molecular-level research toward a better, more rational understanding of mechanism. Our discussion will focus on HD, the neurotoxic γ-diketone metabolite of the industrial solvents n-hexane and methyl-n-butyl ketone. Early investigations suggested that HD caused giant neurofilamentous axonal swellings and eventual degeneration in the CNS and PNS. However, as our review will point out, this interpretation underwent several iterations as the understanding of γ-diketone chemistry improved and more quantitative experimental approaches were implemented. The chemical concepts and design strategies discussed in this mini-review are broadly applicable to the mechanistic studies of other chemicals (e.g., n-propyl bromide, methyl methacrylate) that cause toxic neuropathies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  8. Mechanistic kinetic models of enzymatic cellulose hydrolysis-A review.

    Science.gov (United States)

    Jeoh, Tina; Cardona, Maria J; Karuna, Nardrapee; Mudinoor, Akshata R; Nill, Jennifer

    2017-07-01

    Bioconversion of lignocellulose forms the basis for renewable, advanced biofuels, and bioproducts. Mechanisms of hydrolysis of cellulose by cellulases have been actively studied for nearly 70 years with significant gains in understanding of the cellulolytic enzymes. Yet, a full mechanistic understanding of the hydrolysis reaction has been elusive. We present a review to highlight new insights gained since the most recent comprehensive review of cellulose hydrolysis kinetic models by Bansal et al. (2009) Biotechnol Adv 27:833-848. Recent models have taken a two-pronged approach to tackle the challenge of modeling the complex heterogeneous reaction-an enzyme-centric modeling approach centered on the molecularity of the cellulase-cellulose interactions to examine rate limiting elementary steps and a substrate-centric modeling approach aimed at capturing the limiting property of the insoluble cellulose substrate. Collectively, modeling results suggest that at the molecular-scale, how rapidly cellulases can bind productively (complexation) and release from cellulose (decomplexation) is limiting, while the overall hydrolysis rate is largely insensitive to the catalytic rate constant. The surface area of the insoluble substrate and the degrees of polymerization of the cellulose molecules in the reaction both limit initial hydrolysis rates only. Neither enzyme-centric models nor substrate-centric models can consistently capture hydrolysis time course at extended reaction times. Thus, questions of the true reaction limiting factors at extended reaction times and the role of complexation and decomplexation in rate limitation remain unresolved. Biotechnol. Bioeng. 2017;114: 1369-1385. © 2017 Wiley Periodicals, Inc.
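
    The insensitivity of the overall rate to the catalytic rate constant falls out of a Briggs-Haldane treatment: when k_cat dominates k_off, the effective Km grows in step with k_cat and the rate saturates at the binding-limited value k_on * E * S. This is a schematic calculation, not any specific published cellulase model.

```python
def hydrolysis_rate(e_total, sites, k_on, k_off, k_cat):
    """Briggs-Haldane rate for an enzyme acting on accessible surface sites.

    Km = (k_off + k_cat) / k_on, so for k_cat >> k_off and sites << Km the
    rate approaches k_on * e_total * sites: productive binding (complexation),
    not catalysis, limits throughput."""
    km = (k_off + k_cat) / k_on
    return k_cat * e_total * sites / (km + sites)
```

    Doubling or even hundred-folding k_cat in this regime barely moves the rate, which mirrors the modeling consensus summarized in the review.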

  9. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    Science.gov (United States)

    Rest, J.

    1989-12-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism.

  10. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    International Nuclear Information System (INIS)

    Rest, J.

    1989-01-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism. (orig.)

  11. A Three-Stage Mechanistic Model for Solidification Cracking During Welding of Steel

    Science.gov (United States)

    Aucott, L.; Huang, D.; Dong, H. B.; Wen, S. W.; Marsden, J.; Rack, A.; Cocks, A. C. F.

    2018-03-01

    A three-stage mechanistic model for solidification cracking during TIG welding of steel is proposed from in situ synchrotron X-ray imaging of solidification cracking and subsequent analysis of fracture surfaces. Stage 1—Nucleation of inter-granular hot cracks: cracks nucleate inter-granularly in sub-surface where maximum volumetric strain is localized and volume fraction of liquid is less than 0.1; the crack nuclei occur at solute-enriched liquid pockets which remain trapped in increasingly impermeable semi-solid skeleton. Stage 2—Coalescence of cracks via inter-granular fracture: as the applied strain increases, cracks coalesce through inter-granular fracture; the coalescence path is preferential to the direction of the heat source and propagates through the grain boundaries to solidifying dendrites. Stage 3—Propagation through inter-dendritic hot tearing: inter-dendritic hot tearing occurs along the boundaries between solidifying columnar dendrites with higher liquid fraction. It is recommended that future solidification cracking criterion shall be based on the application of multiphase mechanics and fracture mechanics to the failure of semi-solid materials.

  12. Disruption of steroidogenesis: Cell models for mechanistic investigations and as screening tools.

    Science.gov (United States)

    Odermatt, Alex; Strajhar, Petra; Engeli, Roger T

    2016-04-01

    In the modern world, humans are exposed during their whole life to a large number of synthetic chemicals. Some of these chemicals have the potential to disrupt endocrine functions and contribute to the development and/or progression of major diseases. Every year approximately 1000 novel chemicals, used in industrial production, agriculture, consumer products or as pharmaceuticals, are reaching the market, often with limited safety assessment regarding potential endocrine activities. Steroids are essential endocrine hormones, and the importance of the steroidogenesis pathway as a target for endocrine disrupting chemicals (EDCs) has been recognized by leading scientists and authorities. Cell lines have a prominent role in the initial stages of toxicity assessment, i.e. for mechanistic investigations and for the medium to high throughput analysis of chemicals for potential steroidogenesis disrupting activities. Nevertheless, the users have to be aware of the limitations of the existing cell models in order to apply them properly, and there is a great demand for improved cell-based testing systems and protocols. This review intends to provide an overview of the available cell lines for studying effects of chemicals on gonadal and adrenal steroidogenesis, their use and limitations, as well as the need for future improvements of cell-based testing systems and protocols.

  13. Productivity of "Collisions Generate Heat" for Reconciling an Energy Model with Mechanistic Reasoning: A Case Study

    Science.gov (United States)

    Scherr, Rachel E.; Robertson, Amy D.

    2015-01-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a…

  14. Computational Modeling of Cobalt-based Water Oxidation: Current Status and Future Challenges

    Science.gov (United States)

    Schilling, Mauro; Luber, Sandra

    2018-04-01

    A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to establish new design paradigms and strategies on how to improve the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability towards real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  15. Computational Modeling of Cobalt-Based Water Oxidation: Current Status and Future Challenges

    Directory of Open Access Journals (Sweden)

    Mauro Schilling

    2018-04-01

    Full Text Available A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to establish new design paradigms and strategies on how to improve the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability toward real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  16. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and a structure to accommodate facility-specific source terms. Example applications are presented here.

  17. Mechanistic Drifting Forecast Model for A Small Semi-Submersible Drifter Under Tide-Wind-Wave Conditions

    Science.gov (United States)

    Zhang, Wei-Na; Huang, Hui-ming; Wang, Yi-gang; Chen, Da-ke; Zhang, Lin

    2018-03-01

    Understanding the drifting motion of a small semi-submersible drifter is of vital importance for monitoring surface currents and floating pollutants in coastal regions. This work addresses the issue by establishing a mechanistic drifting forecast model based on kinetic analysis. Taking tide, wind and wave forcing into consideration, the forecast model is validated against an in situ drifting experiment in the Radial Sand Ridges. Model results show good agreement with the measured drifting features, characterized by migration back and forth twice a day with daily downwind displacements. Trajectory models are used to evaluate the influence of the individual hydrodynamic forcings. The tidal current is the fundamental dynamic condition in the Radial Sand Ridges and has the greatest impact on the drifting distance; however, it is not the dominant contribution to the daily net displacement of the drifter. The simulations reveal that different hydrodynamic forces dominate the daily displacement at different wind scales: the wave-induced mass transport has the greatest influence at Beaufort wind scale 5-6, while wind drag contributes most at wind scales 2-4.
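
    The force-balance idea behind such a drifter model can be illustrated with a toy trajectory integration that superposes an idealized tidal current, a wind-drag term and a wave-induced (Stokes) drift. All forcings and coefficients below are illustrative placeholders, not values from the study:

```python
import math

def simulate_drifter(hours=24.0, dt=60.0):
    """Euler integration of a surface drifter driven by a superposition of
    an idealized tidal current, wind drag and wave-induced (Stokes) drift.
    All forcings and coefficients are illustrative, not field values."""
    x = y = 0.0                      # position (m)
    t = 0.0
    tidal_period = 12.42 * 3600.0    # M2 tidal period (s)
    u_tide_amp = 0.8                 # tidal current amplitude (m/s)
    wind_u, wind_v = 5.0, 0.0        # steady wind (m/s)
    wind_drag = 0.03                 # ~3% of wind speed acts on the drifter
    stokes_u, stokes_v = 0.05, 0.0   # wave-induced mass transport (m/s)
    while t < hours * 3600.0:
        u = (u_tide_amp * math.sin(2.0 * math.pi * t / tidal_period)
             + wind_drag * wind_u + stokes_u)
        v = wind_drag * wind_v + stokes_v
        x += u * dt
        y += v * dt
        t += dt
    return x, y

x, y = simulate_drifter()
print(f"24 h displacement: x = {x/1000:.1f} km, y = {y/1000:.1f} km")
```

    The oscillating tidal term dominates the instantaneous speed but largely cancels over a day, while the small steady wind and wave terms accumulate into the net daily displacement, which is the qualitative behaviour the abstract describes.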

  18. Fetal programming of CVD and renal disease: animal models and mechanistic considerations.

    Science.gov (United States)

    Langley-Evans, Simon C

    2013-08-01

    The developmental origins of health and disease hypothesis postulates that exposure to a less than optimal maternal environment during fetal development programmes physiological function, and determines risk of disease in adult life. Much evidence of such programming comes from retrospective epidemiological cohorts, which demonstrate associations between birth anthropometry and non-communicable diseases of adulthood. The assertion that variation in maternal nutrition drives these associations is supported by studies using animal models, which demonstrate that maternal under- or over-nutrition during pregnancy can programme offspring development. Typically, the offspring of animals that are undernourished in pregnancy exhibit a relatively narrow range of physiological phenotypes that includes higher blood pressure, glucose intolerance, renal insufficiency and increased adiposity. The observation that common phenotypes arise from very diverse maternal nutritional insults has led to the proposal that programming is driven by a small number of mechanistic processes. The remodelling of tissues during development as a consequence of maternal nutritional status being signalled by endocrine imbalance or key nutrients limiting processes in the fetus may lead to organs having irreversibly altered structures that may limit their function with ageing. It has been proposed that the maternal diet may impact upon epigenetic marks that determine gene expression in fetal tissues, and this may be an important mechanism connecting maternal nutrient intakes to long-term programming of offspring phenotype. The objective for this review is to provide an overview of the mechanistic basis of fetal programming, demonstrating the critical role of animal models as tools for the investigation of programming phenomena.

  19. Mechanistic modelling of a cathode-supported tubular solid oxide fuel cell

    Science.gov (United States)

    Suwanwarangkul, R.; Croiset, E.; Pritzker, M. D.; Fowler, M. W.; Douglas, P. L.; Entchev, E.

    A two-dimensional mechanistic model of a tubular solid oxide fuel cell (SOFC) considering momentum, energy, mass and charge transport is developed. The model geometry of a single cell comprises an air-preheating tube, air channel, fuel channel, and anode, cathode and electrolyte layers. Heat radiation between the cell and the air-preheating tube is also incorporated into the model, which allows heat transfer between the cell and the air-preheating tube to be predicted accurately. The model is validated and shows good agreement with literature data. It is anticipated that this model can be used to help develop efficient fuel cell designs and set operating variables under practical conditions. The transport phenomena inside the cell, including gas flow behaviour, temperature, overpotential, current density and species concentration, are analysed and discussed in detail. Fuel and air velocities are found to vary along flow passages depending on the local temperature and species concentrations. The model demonstrates the importance of incorporating heat radiation into a tubular SOFC model. Furthermore, it shows that the overall cell performance is limited by O2 diffusion through the thick porous cathode, pointing to new cathode materials and designs as important avenues to enhance cell performance.

  20. Mechanistic Mathematical Modeling Tests Hypotheses of the Neurovascular Coupling in fMRI.

    Directory of Open Access Journals (Sweden)

    Karin Lundengård

    2016-06-01

    Full Text Available Functional magnetic resonance imaging (fMRI) measures brain activity by detecting the blood-oxygen-level dependent (BOLD) response to neural activity. The BOLD response depends on the neurovascular coupling, which connects cerebral blood flow, cerebral blood volume and deoxyhemoglobin level to neuronal activity. The exact mechanisms behind this neurovascular coupling are not yet fully understood, and they are being discussed in at least three different ways. Firstly, mathematical models involving the so-called Balloon model describe the relation between oxygen metabolism, cerebral blood volume and cerebral blood flow; however, the Balloon model does not describe cellular and biochemical mechanisms. Secondly, the metabolic feedback hypothesis is based on experimental findings on metabolism associated with brain activation. Thirdly, the neurotransmitter feed-forward hypothesis describes intracellular pathways leading to the release of vasoactive substances. Both the metabolic feedback and the neurotransmitter feed-forward hypotheses have been extensively studied, but only experimentally; neither has previously been implemented as a mathematical model. Here we investigate these two hypotheses by mechanistic mathematical modeling using a systems biology approach; these methods have been used in biological research for many years but never been applied to the BOLD response in fMRI. In the current work, model structures describing the metabolic feedback and the neurotransmitter feed-forward hypotheses were applied to measured BOLD responses in the visual cortex of 12 healthy volunteers. Evaluating each hypothesis separately shows that neither hypothesis alone can describe the data in a biologically plausible way. However, by adding metabolism to the neurotransmitter feed-forward model structure, we obtained a new model structure which is able to fit the estimation data and successfully predict new
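
    The Balloon model referred to above can be sketched as a small ODE system. The forward-Euler implementation below uses textbook-style parameter values (tau, alpha, E0 and the BOLD coefficients are illustrative, not the values fitted to the 12-volunteer data set), with a step increase in blood flow standing in for a stimulus:

```python
def balloon_model(t_end=30.0, dt=0.01, stim=(5.0, 10.0)):
    """Forward-Euler integration of the classic Balloon model: normalized
    blood volume v and deoxyhemoglobin q respond to a cerebral blood flow
    input f_in. Parameters are illustrative textbook-style choices."""
    tau, alpha, e0 = 2.0, 0.38, 0.4
    k1, k2, k3 = 7.0 * e0, 2.0, 2.0 * e0 - 0.2   # BOLD signal coefficients
    v = q = 1.0                                   # baseline (normalized)
    bold = []
    for i in range(int(t_end / dt)):
        t = i * dt
        f_in = 1.5 if stim[0] <= t < stim[1] else 1.0  # step flow increase
        f_out = v ** (1.0 / alpha)                 # outflow from balloon
        e = 1.0 - (1.0 - e0) ** (1.0 / f_in)       # oxygen extraction
        dv = dt * (f_in - f_out) / tau
        dq = dt * (f_in * e / e0 - f_out * q / v) / tau
        v += dv
        q += dq
        bold.append(k1 * (1 - q) + k2 * (1 - q / v) + k3 * (1 - v))
    return bold

signal = balloon_model()
print(f"peak BOLD response: {max(signal):.3f}")
```

    The point of the abstract is that this hemodynamic stage alone says nothing about the cellular feedback or feed-forward pathways; those would enter as additional ODEs driving f_in.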

  1. Improving the prediction of methane production and representation of rumen fermentation for finishing beef cattle within a mechanistic model

    NARCIS (Netherlands)

    Ellis, J.L.; Dijkstra, J.; Bannink, A.; Kebreab, E.; Archibeque, S.; Benchaar, C.; Beauchemin, K.; Nkrumah, D.J.; France, J.

    2014-01-01

    The purpose of this study was to evaluate prediction of methane emissions from finishing beef cattle using an extant mechanistic model with pH-independent or pH-dependent VFA stoichiometries, a recent stoichiometry adjustment for the use of monensin, and adaptation of the underlying model structure,

  2. Development of a mechanistic model for prediction of CO2 capture from gas mixtures by amine solutions in porous membranes.

    Science.gov (United States)

    Ghadiri, Mehdi; Marjani, Azam; Shirazian, Saeed

    2017-06-01

    A mechanistic model was developed in order to predict capture and removal of CO2 from air using membrane technology. The considered membrane was a hollow-fiber contactor module in which a gas mixture containing CO2 was used as feed while 2-amino-2-methyl-1-propanol (AMP) was used as the absorbent. The mechanistic model was developed from transport phenomena, taking into account mass transfer and the chemical reaction between CO2 and the amine in the contactor module. The main aim of the modeling was to track the composition and flux of CO2 and AMP in the membrane module for process optimization. The governing equations were solved using a finite element approach in which the whole model domain was discretized into small cells. To confirm the simulation findings, model outcomes were compared with experimental data and good consistency was revealed. The results showed that increasing the temperature of the AMP solution increases CO2 removal in the hollow-fiber membrane contactor.
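
    The core of such a membrane-contactor model is a diffusion-reaction balance in the liquid film. A minimal 1-D sketch, assuming a pseudo-first-order reaction and illustrative transport constants rather than the AMP kinetics of the study, discretized with central differences and solved by the Thomas algorithm:

```python
def co2_film_profile(n=200, length=1e-4, diff=1.5e-9, k=50.0, c0=1.0):
    """Steady 1-D diffusion with pseudo-first-order reaction in a liquid film:
        D c'' = k c,  c(0) = c0 (gas-liquid interface),  c(L) = 0 (bulk).
    Central differences + Thomas algorithm. Film thickness, diffusivity and
    rate constant are illustrative, not fitted membrane/AMP data."""
    h = length / (n + 1)
    sub = [1.0] * n                     # sub-diagonal
    main = [-2.0 - k * h * h / diff] * n  # main diagonal
    sup = [1.0] * n                     # super-diagonal
    rhs = [0.0] * n
    rhs[0] -= c0                        # boundary condition c(0) = c0
    # forward elimination
    for i in range(1, n):
        m = sub[i] / main[i - 1]
        main[i] -= m * sup[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # back substitution
    x = [0.0] * n
    x[-1] = rhs[-1] / main[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (rhs[i] - sup[i] * x[i + 1]) / main[i]
    return [c0] + x + [0.0]

profile = co2_film_profile()
print(f"interface: {profile[0]:.2f}, mid-film: {profile[len(profile)//2]:.4f}")
```

    With a fast reaction the CO2 concentration collapses close to the interface, which is why chemical absorption into an amine enhances the flux compared with physical absorption alone.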

  3. Flow regimes and mechanistic modeling of critical heat flux under subcooled flow boiling conditions

    Science.gov (United States)

    Le Corre, Jean-Marie

    Thermal performance of heat-flux-controlled boiling heat exchangers is usually limited by the Critical Heat Flux (CHF), above which the heat transfer degrades quickly, possibly leading to heater overheating and destruction. In an effort to better understand the phenomena, a literature review of CHF experimental visualizations under subcooled flow boiling conditions was performed and systematically analyzed. Three major types of CHF flow regimes were identified (bubbly, vapor clot and slug flow regimes) and a CHF flow regime map was developed, based on a dimensional analysis of the phenomena and available data. It was found that for similar geometric characteristics and pressure, a Weber number (We)/thermodynamic quality (x) map can be used to predict the CHF flow regime. Based on the experimental observations and the review of available CHF mechanistic models under subcooled flow boiling conditions, hypothetical CHF mechanisms were selected for each CHF flow regime, all based on a concept of wall dry spot overheating, rewetting prevention and subsequent dry spot spreading. It is postulated that a high wall superheat occurs locally in a dry area of the heated wall, due to a cyclical event inherent to the considered CHF two-phase flow regime, preventing rewetting (Leidenfrost effect). The selected modeling concept has the potential to span CHF conditions from highly subcooled bubbly flow to the early stage of annular flow. A numerical model using a two-dimensional transient thermal analysis of the heater undergoing nucleation was developed to mechanistically predict CHF in the case of a bubbly flow regime. In this type of CHF two-phase flow regime, the high local wall superheat occurs underneath a nucleating bubble at the time of bubble departure. The model simulates the spatial and temporal heater temperature variations during nucleation at the wall, accounting for the stochastic nature of the boiling phenomena.
The model also has the potential to evaluate
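
    The dry-spot overheating concept can be illustrated with a toy 2-D explicit conduction calculation in which boiling cooling is suppressed over a patch of the heated wall. Geometry, material properties and heat transfer coefficients below are illustrative stand-ins, not the closure relations of the actual model:

```python
def heater_temperature(nx=20, ny=10, steps=500):
    """Explicit 2-D transient conduction in a heater slab: uniform heat flux
    on the bottom face, boiling cooling on the top face except over a
    central 'dry spot' where cooling is suppressed. A toy version of the
    dry-spot-overheating concept; properties are illustrative (copper-like)."""
    alpha, k = 1.1e-4, 380.0          # thermal diffusivity (m2/s), conductivity
    dx = 1.0e-4                       # grid spacing (m)
    dt = 0.2 * dx * dx / alpha        # stable explicit time step (< dx^2/4a)
    q_wall = 2.0e6                    # applied heat flux (W/m2)
    h_boil, t_sat = 5.0e4, 373.0      # boiling HTC (W/m2K), coolant temp (K)
    temp = [[400.0] * nx for _ in range(ny)]
    dry = set(range(nx // 2 - 2, nx // 2 + 3))  # dry-spot columns, top row
    for _ in range(steps):
        new = [row[:] for row in temp]
        for j in range(ny):
            for i in range(nx):
                jm, jp = max(j - 1, 0), min(j + 1, ny - 1)
                im, ip = max(i - 1, 0), min(i + 1, nx - 1)
                lap = (temp[j][im] + temp[j][ip] + temp[jm][i]
                       + temp[jp][i] - 4.0 * temp[j][i]) / dx ** 2
                new[j][i] = temp[j][i] + dt * alpha * lap
                if j == ny - 1:                   # heated bottom face
                    new[j][i] += dt * alpha * q_wall / (k * dx)
                if j == 0 and i not in dry:       # boiling-cooled top face
                    new[j][i] -= dt * alpha * h_boil * (temp[j][i] - t_sat) / (k * dx)
        temp = new
    top = temp[0]
    return max(top), min(top)

hot, cold = heater_temperature()
print(f"top-face temperature: dry spot {hot:.0f} K vs wetted {cold:.0f} K")
```

    Even in this crude sketch the uncooled patch runs hotter than the wetted wall around it, which is the local superheat excursion the mechanistic CHF criterion keys on.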

  4. Modeling and validation of a mechanistic tool (MEFISTO) for the prediction of critical power in BWR fuel assemblies

    International Nuclear Information System (INIS)

    Adamsson, Carl; Le Corre, Jean-Marie

    2011-01-01

    Highlights: → The MEFISTO code efficiently and accurately predicts the dryout event in a BWR fuel bundle, using a mechanistic model. → A hybrid approach between a fast and robust sub-channel analysis and a three-field two-phase analysis is adopted. → MEFISTO modeling approach, calibration, CPU usage, sensitivity, trend analysis and performance evaluation are presented. → The calibration parameters and process were carefully selected to preserve the mechanistic nature of the code. → The code dryout prediction performance is near the level of fuel-specific empirical dryout correlations. - Abstract: Westinghouse is currently developing the MEFISTO code with the main goal to achieve fast, robust, practical and reliable prediction of steady-state dryout Critical Power in Boiling Water Reactor (BWR) fuel bundle based on a mechanistic approach. A computationally efficient simulation scheme was used to achieve this goal, where the code resolves all relevant field (drop, steam and multi-film) mass balance equations, within the annular flow region, at the sub-channel level while relying on a fast and robust two-phase (liquid/steam) sub-channel solution to provide the cross-flow information. The MEFISTO code can hence provide highly detailed solution of the multi-film flow in BWR fuel bundle while enhancing flexibility and reducing the computer time by an order of magnitude as compared to a standard three-field sub-channel analysis approach. Models for the numerical computation of the one-dimensional field flowrate distributions in an open channel (e.g. a sub-channel), including the numerical treatment of field cross-flows, part-length rods, spacers grids and post-dryout conditions are presented in this paper. The MEFISTO code is then applied to dryout prediction in BWR fuel bundle using VIPRE-W as a fast and robust two-phase sub-channel driver code. The dryout power is numerically predicted by iterating on the bundle power so that the minimum film flowrate in the

  5. Applicability of one-dimensional mechanistic post-dryout prediction model

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; No Hee Cheon

    1996-01-01

    Through the analysis of many experimental post-dryout data, it is shown that the most probable flow regime near the dryout or quench front is not annular flow but churn-turbulent flow when the mass flux is low. A correlation describing the initial droplet size just after the CHF position at low mass flux is suggested through regression analysis. In the post-dryout region at low pressure and low flow, it is found that the suggested one-dimensional mechanistic model is not applicable when the vapor superficial velocity is very low, i.e., when the flow is in the bubbly or slug flow regime. This is explained by the change of the main entrainment mechanism with the change of flow regime. Therefore, the suggested correlation is valid only in the churn-turbulent flow regime (j*g = 0.5 ∼ 4.5)

  6. A mechanistic model for spread of livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) within a pig herd

    DEFF Research Database (Denmark)

    Sørensen, Anna Irene Vedel; Toft, Nils; Boklund, Anette

    2017-01-01

    Before an efficient control strategy for livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) in pigs can be decided upon, it is necessary to obtain a better understanding of how LA-MRSA spreads and persists within a pig herd once it is introduced. We here present a mechanistic stochastic discrete-event simulation model for spread of LA-MRSA within a farrow-to-finish sow herd to aid in this. The model was individual-based and included three different disease compartments: susceptible, intermittent or persistent shedder of MRSA. The model was used for studying transmission dynamics and within-farm prevalence after different introductions of LA-MRSA into a farm. The spread of LA-MRSA throughout the farm mainly followed the movement of pigs. After spread of LA-MRSA had reached equilibrium, the prevalence of LA-MRSA shedders was predicted to be highest in the farrowing unit, independent
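
    A minimal version of such an individual-based model can be sketched as a daily-step stochastic simulation with the three compartments named in the abstract. The sketch below collapses the herd to a single closed pen (no farrowing/weaning sections or pig movement), and all transmission and transition probabilities are invented for illustration:

```python
import math
import random

def simulate_mrsa(n_pigs=200, days=365, seed=1):
    """Daily-step stochastic simulation of LA-MRSA spread in one closed pen
    with three compartments: susceptible (S), intermittent (I) and
    persistent (P) shedder. All rates are illustrative guesses, not
    estimates from the study."""
    random.seed(seed)
    beta_i, beta_p = 0.2, 0.5        # transmission weight per I / P shedder
    p_clear, p_persist = 0.02, 0.01  # daily I -> S and I -> P probabilities
    state = ['S'] * (n_pigs - 5) + ['I'] * 5   # introduce 5 shedders
    history = []
    for _ in range(days):
        n_i, n_p = state.count('I'), state.count('P')
        lam = (beta_i * n_i + beta_p * n_p) / n_pigs  # force of infection
        p_inf = 1.0 - math.exp(-lam)
        nxt = []
        for s in state:
            r = random.random()
            if s == 'S':
                nxt.append('I' if r < p_inf else 'S')
            elif s == 'I':
                if r < p_clear:
                    nxt.append('S')
                elif r < p_clear + p_persist:
                    nxt.append('P')
                else:
                    nxt.append('I')
            else:                    # persistent shedders stay colonized
                nxt.append('P')
        state = nxt
        history.append(1.0 - state.count('S') / n_pigs)
    return history

prev = simulate_mrsa()
print(f"shedder prevalence after 1 year: {prev[-1]:.2f}")
```

    The full model additionally moves individuals between herd sections according to the production cycle, which is what produces the section-specific prevalences reported in the abstract.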

  7. Mechanistic modelling of gaseous fission product behaviour in UO2 fuel by Rtop code

    International Nuclear Information System (INIS)

    Kanukova, V.D.; Khoruzhii, O.V.; Kourtchatov, S.Y.; Likhanskii, V.V.; Matveew, L.V.

    2002-01-01

    The current status of mechanistic modelling by the RTOP code of fission product behaviour in polycrystalline UO2 fuel is described. An outline of the code and the implemented physical models is presented. The general approach to code validation is discussed and exemplified by the results of validation of the models of fuel oxidation and grain growth. The different models of intragranular and intergranular gas bubble behaviour have been tested and the sensitivity of the code within the framework of these models has been analysed. An analysis of available models of the resolution of grain-face bubbles is also presented. The possibilities of the RTOP code are illustrated by modelling the behaviour of WWER fuel over the course of a comparative WWER-PWR experiment performed at Halden and by comparison with Yanagisawa's experiments. (author)

  8. Chemical kinetic mechanistic models to investigate cancer biology and impact cancer medicine

    International Nuclear Information System (INIS)

    Stites, Edward C

    2013-01-01

    Traditional experimental biology has provided a mechanistic understanding of cancer in which the malignancy develops through the acquisition of mutations that disrupt cellular processes. Several drugs developed to target such mutations have now demonstrated clinical value. These advances are unequivocal testaments to the value of traditional cellular and molecular biology. However, several features of cancer may limit the pace of progress that can be made with established experimental approaches alone. The mutated genes (and resultant mutant proteins) function within large biochemical networks. Biochemical networks typically have a large number of component molecules and are characterized by a large number of quantitative properties. Responses to a stimulus or perturbation are typically nonlinear and can display qualitative changes that depend upon the specific values of variable system properties. Features such as these can complicate the interpretation of experimental data and the formulation of logical hypotheses that drive further research. Mathematical models based upon the molecular reactions that define these networks combined with computational studies have the potential to deal with these obstacles and to enable currently available information to be more completely utilized. Many of the pressing problems in cancer biology and cancer medicine may benefit from a mathematical treatment. As work in this area advances, one can envision a future where such models may meaningfully contribute to the clinical management of cancer patients. (paper)
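
    As a minimal illustration of the chemical-kinetic modeling advocated here, the elementary enzyme network E + S ⇌ ES → E + P can be integrated by mass action. Rate constants and concentrations below are arbitrary illustrative units, not parameters of any specific cancer pathway:

```python
def enzyme_kinetics(t_end=50.0, dt=1e-3):
    """Forward-Euler mass-action integration of the elementary network
        E + S <-> ES -> E + P,
    the simplest building block of the biochemical-network models discussed
    above. Rate constants and concentrations are illustrative."""
    k1, km1, k2 = 1.0, 0.5, 0.3      # binding, unbinding, catalytic rates
    e, s, es, p = 1.0, 5.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        v_bind = k1 * e * s
        v_unbind = km1 * es
        v_cat = k2 * es
        e  += dt * (-v_bind + v_unbind + v_cat)   # enzyme released by both paths
        s  += dt * (-v_bind + v_unbind)
        es += dt * (v_bind - v_unbind - v_cat)
        p  += dt * v_cat
    return s, p

s, p = enzyme_kinetics()
print(f"substrate left: {s:.3f}, product formed: {p:.3f}")
```

    Realistic signaling models chain hundreds of such elementary steps, which is exactly why the nonlinear, parameter-dependent behaviour described in the abstract is hard to predict without computation.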

  9. The physicochemical process of bacterial attachment to abiotic surfaces: Challenges for mechanistic studies, predictability and the development of control strategies.

    Science.gov (United States)

    Wang, Yi; Lee, Sui Mae; Dykes, Gary

    2015-01-01

    Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.
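
    Classical DLVO theory is one of the mechanistic models referred to above. A sketch for a sphere (bacterium) near a flat substratum, combining non-retarded van der Waals attraction with weak-overlap electrostatic double-layer repulsion; the Hamaker constant, surface potentials and ionic strength are illustrative, not measured values for any organism:

```python
import math

def dlvo_barrier(a=0.5e-6, psi1=-0.030, psi2=-0.020, hamaker=5e-21,
                 ionic=0.01, temp=298.15):
    """Sphere-plate DLVO energy profile for a ~1 um bacterium near a flat
    abiotic surface. Inputs (Hamaker constant in J, surface potentials in V,
    ionic strength in mol/L) are illustrative. Returns the height of the
    repulsive energy barrier (in kT units) and the Debye length (m)."""
    e, kb, na = 1.602e-19, 1.381e-23, 6.022e23
    eps = 78.5 * 8.854e-12           # permittivity of water
    kt = kb * temp
    kappa = math.sqrt(2.0 * na * 1000.0 * ionic * e * e / (eps * kt))
    g1 = math.tanh(e * psi1 / (4.0 * kt))
    g2 = math.tanh(e * psi2 / (4.0 * kt))
    pref = 64.0 * math.pi * eps * a * (kt / e) ** 2 * g1 * g2
    barrier = 0.0
    h = 0.3e-9
    while h < 50e-9:                 # scan separations 0.3-50 nm
        v_edl = pref * math.exp(-kappa * h)       # double-layer repulsion
        v_vdw = -hamaker * a / (6.0 * h)          # van der Waals attraction
        barrier = max(barrier, (v_edl + v_vdw) / kt)
        h += 0.1e-9
    return barrier, 1.0 / kappa

barrier_kt, debye = dlvo_barrier()
print(f"energy barrier ~ {barrier_kt:.0f} kT, Debye length ~ {debye*1e9:.1f} nm")
```

    A barrier of many kT implies that attachment into the primary minimum is rare; the well-known limitations discussed in the review (hydrophobic forces, surface roughness, appendages) are precisely what this idealized picture omits.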

  10. Mechanistic study of aerosol dry deposition on vegetated canopies

    International Nuclear Information System (INIS)

    Petroff, A.

    2005-04-01

    The dry deposition of aerosols onto vegetated canopies is modelled through a mechanistic approach. The interaction between aerosols and vegetation is first formulated using a set of parameters defined at the local scale of one surface. The overall deposition is then deduced at the canopy scale through an up-scaling procedure based on the statistical distribution of these parameters. The model takes into account the structural and morphological properties of the canopy and the main characteristics of the turbulent flow. Deposition mechanisms considered are Brownian diffusion, interception, and inertial and turbulent impaction; the model is applied first to coniferous branches and then to entire canopies of different roughness, such as grass, crop fields and forest. (author)

  11. Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution.

    Science.gov (United States)

    Warnock, Rachel C M; Yang, Ziheng; Donoghue, Philip C J

    2017-06-28

    Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and of improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions.
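
    The central asymmetry exploited by such simulations, that the oldest fossil can only underestimate the true divergence time, is easy to reproduce with a toy preservation model in which fossil finds occur as a Poisson process along a lineage. The true age and recovery rate below are illustrative:

```python
import random

def oldest_fossil_ages(true_age=100.0, rate=0.05, n_reps=1000, seed=7):
    """Monte Carlo sketch of fossil-based calibration: fossils of a lineage
    that originated `true_age` Ma ago are recovered as a Poisson process
    with `rate` finds per Myr. The oldest find always *underestimates* the
    true divergence time. Rate and age are illustrative."""
    random.seed(seed)
    oldest = []
    for _ in range(n_reps):
        t, finds = 0.0, []
        while True:
            t += random.expovariate(rate)   # waiting time to next find (Myr)
            if t > true_age:
                break
            finds.append(true_age - t)      # age of the find (Ma)
        if finds:                           # some replicates yield no fossils
            oldest.append(max(finds))
    return oldest

ages = oldest_fossil_ages()
print(f"mean oldest-fossil age: {sum(ages)/len(ages):.1f} Ma "
      f"(true divergence: 100 Ma)")
```

    With a recovery rate of 0.05 per Myr the oldest fossil is on average some 20 Myr younger than the true node age, which is why calibration densities need to place probability mass older than the oldest find.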

  12. In Vitro–In Vivo Correlation for Gliclazide Immediate-Release Tablets Based on Mechanistic Absorption Simulation

    OpenAIRE

    Grbic, Sandra; Parojcic, Jelena; Ibric, Svetlana; Djuric, Zorica

    2010-01-01

    The aim of this study was to develop a drug-specific absorption model for gliclazide (GLK) using mechanistic gastrointestinal simulation technology (GIST) implemented in the GastroPlus™ software package. A range of experimentally determined, in silico predicted or literature data were used as input parameters. Experimentally determined pH-solubility profile was used for all simulations. The human jejunum effective permeability (Peff) value was estimated on the basis of in vitro measured Caco-2 p...
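
    GIST-type simulators are built around compartmental absorption and transit (CAT) ideas: drug moves through a chain of small-intestine transit compartments while being absorbed first-order. A minimal sketch of the analytic CAT result for fraction absorbed; the absorption rate constant here is an invented illustrative value, not a fitted gliclazide parameter:

```python
def cat_fraction_absorbed(ka=1.2, n=7, transit_time=3.32):
    """Analytic fraction absorbed in the compartmental absorption and
    transit (CAT) model: drug passes through n small-intestine transit
    compartments (total residence ~3.32 h) while being absorbed with
    first-order constant ka (1/h). ka is illustrative."""
    kt = n / transit_time            # per-compartment transit rate (1/h)
    # probability of transiting all n compartments unabsorbed is
    # (kt/(kt+ka))**n, so the absorbed fraction is the complement
    return 1.0 - (kt / (kt + ka)) ** n

fa = cat_fraction_absorbed()
print(f"fraction absorbed: {fa:.2f}")
```

    Full simulators layer pH-dependent solubility, dissolution and regional permeability on top of this transit backbone, which is what the experimentally determined inputs in the abstract feed into.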

  13. Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI

    Science.gov (United States)

    2015-10-01

    Award number: W81XWH-13-2-0091. Title: Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI. Reporting period: 25 Sep 2014 - 24 Sep 2015. The project assesses the efficacy of veliparib and NAD as agents for suppressing inflammation and improving outcomes after traumatic brain injury. The animal models include

  14. Numerical simulation in steam injection process by a mechanistic approach

    Energy Technology Data Exchange (ETDEWEB)

    De Souza, J.C.Jr.; Campos, W.; Lopes, D.; Moura, L.S.S. [Petrobras, Rio de Janeiro (Brazil)

    2008-10-15

    Steam injection is a common thermal recovery method used in very viscous oil reservoirs. The method involves the injection of heat to reduce viscosity and mobilize oil. A steam generation and injection system consists primarily of a steam source, distribution lines, injection wells and a discarding tank. In order to optimize injection and improve the oil recovery factor, one must determine the parameters of steam flow such as pressure, temperature and steam quality. This study focused on developing a unified mathematical model by means of a mechanistic approach for two-phase steam flow in pipelines and wells. The hydrodynamic and heat transfer mechanistic model was implemented in a computer simulator to model the parameters of steam injection while trying to avoid the use of empirical correlations. A marching algorithm was used to determine the distribution of pressure and temperature along the pipelines and wellbores. The mathematical model for steam flow in injection systems, developed by a mechanistic approach (VapMec) performed well when the simulated values of pressures and temperatures were compared with the values measured during field tests. The newly developed VapMec model was incorporated in the LinVap-3 simulator that constitutes an engineering supporting tool for steam injection wells operated by Petrobras. 23 refs., 7 tabs., 6 figs.
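
    The marching algorithm mentioned above steps along the line, updating pressure from a friction-loss closure and tracking the corresponding saturation temperature. The sketch below is a crude single-phase stand-in for the VapMec closures: constant Darcy friction factor, ideal-gas steam density, and a simple power-law saturation-temperature fit, all illustrative:

```python
import math

def march_steam_line(length=500.0, dx=5.0, p_in=2.0e6, m_dot=2.0, d=0.1):
    """Marching scheme along a steam line: at each step, evaluate density
    and velocity at local conditions, apply a Darcy-Weisbach friction loss,
    and track saturation temperature. Closures are crude illustrative
    stand-ins (constant f, ideal-gas steam, power-law Tsat fit)."""
    f = 0.02                          # Darcy friction factor (assumed)
    area = math.pi * d * d / 4.0
    gas_r, molar_m = 8.314, 0.018     # gas constant, molar mass of water
    p, x = p_in, 0.0
    profile = []
    while x < length:
        t_sat = 100.0 * (p / 1.0e5) ** 0.25 + 273.15  # crude Tsat(p) fit (K)
        rho = p * molar_m / (gas_r * t_sat)           # ideal-gas steam density
        v = m_dot / (rho * area)                      # bulk velocity
        dp = f * (dx / d) * 0.5 * rho * v * v         # friction pressure loss
        p -= dp
        x += dx
        profile.append((x, p, t_sat))
    return profile

prof = march_steam_line()
x, p, t = prof[-1]
print(f"outlet: p = {p/1e5:.1f} bar, Tsat = {t-273.15:.0f} C")
```

    Because pressure and saturation temperature fall together along the line, a marching solver naturally reproduces the coupled pressure/temperature/quality profiles that the paper validates against field measurements.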

  15. DNB Mechanistic model assessment based on experimental data in narrow rectangular channel

    International Nuclear Information System (INIS)

    Zhou Lei; Yan Xiao; Huang Yanping; Xiao Zejun; Huang Shanfang

    2011-01-01

    The departure from nucleate boiling (DNB) is important for the safety of a PWR. Without assessment against experimental data, it is doubtful whether existing models can be applied to narrow rectangular channels. Based on experimental data points obtained in narrow rectangular channels, two kinds of classical DNB models, the liquid sublayer dryout model (LSDM) and the bubble crowding model (BCM), were assessed. The results show that the BCM has a much wider application range than the LSDM. Several thermal parameters show systematic influences on the results calculated by the models. The performance of both models deteriorates as the void fraction increases. The reason may be attributed to the geometrical differences between a circular tube and a narrow rectangular channel. (authors)

  16. Development and Implementation of Mechanistic Terry Turbine Models in RELAP-7 to Simulate RCIC Normal Operation Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Haihua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Hongbin [Idaho National Lab. (INL), Idaho Falls, ID (United States); O'Brien, James Edward [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    As part of the efforts to understand the unexpected “self-regulating” mode of the RCIC (Reactor Core Isolation Cooling) systems in Fukushima accidents and extend BWR RCIC and PWR AFW (Auxiliary Feed Water) operational range and flexibility, mechanistic models for the Terry turbine, based on Sandia’s original work [1], have been developed and implemented in the RELAP-7 code to simulate the RCIC system. In 2016, our effort has been focused on normal working conditions of the RCIC system. More complex off-design conditions will be pursued in later years when more data are available. In the Sandia model, the turbine stator inlet velocity is provided according to a reduced-order model which was obtained from a large number of CFD (computational fluid dynamics) simulations. In this work, we propose an alternative method, using an under-expanded jet model to obtain the velocity and thermodynamic conditions for the turbine stator inlet. The models include both an adiabatic expansion process inside the nozzle and a free expansion process outside of the nozzle to ambient pressure. The combined models are able to predict the steam mass flow rate and supersonic velocity to the Terry turbine bucket entrance, which are the necessary input information for the Terry turbine rotor model. The analytical models for the nozzle were validated with experimental data and benchmarked with CFD simulations. The analytical models generally agree well with the experimental data and CFD simulations. The analytical models are suitable for implementation into a reactor system analysis code or severe accident code as part of mechanistic and dynamical models to understand the RCIC behaviors. The newly developed nozzle models and modified turbine rotor model according to the Sandia’s original work have been implemented into RELAP-7, along with the original Sandia Terry turbine model. A new pump model has also been developed and implemented to couple with the Terry turbine model. An input
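    The adiabatic expansion step inside the nozzle can be illustrated with the textbook isentropic ideal-gas relation for exit velocity. Steam deviates from ideal-gas behavior, so this is only a hedged sketch of the kind of calculation such a nozzle model performs; gamma, the gas constant, and the conditions below are rough assumed values, not those of the RELAP-7 implementation.

```python
import math

# Hedged sketch: isentropic, ideal-gas exit velocity from stagnation
# temperature t0 (K) and the exit/stagnation pressure ratio. A real
# steam-nozzle model would use steam properties instead.

def exit_velocity(t0, p_ratio, gamma=1.3, r_gas=461.5):
    """Isentropic exit velocity (m/s) for p_exit/p_stagnation = p_ratio."""
    return math.sqrt(2.0 * gamma / (gamma - 1.0) * r_gas * t0
                     * (1.0 - p_ratio ** ((gamma - 1.0) / gamma)))

v = exit_velocity(t0=560.0, p_ratio=0.2)  # hypothetical conditions
```

    Lower pressure ratios give higher exit velocities, consistent with the supersonic velocities at the turbine bucket entrance mentioned above.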

  17. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    International Nuclear Information System (INIS)

    Kutnjak, Josip

    2013-01-01

    Thermodynamics does not describe the geometric shape of the vapour during evaporation. In bubbly flows the bubble shape is considered spherical at small diameters, changing into various shapes as the bubbles grow. Heat and mass transfer take place at the interfacial area, and the forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat input area; thus the boiling effects occurring inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built for the investigation. It consists of an instrumented, partly transparent, tall and slender boiling container for visual observation. Direct visual observation of the boiling phenomena is necessary for the identification of the basic mechanisms to be incorporated in the simulation model. The boiling process was recorded on video and subsequently evaluated by digital image processing methods, thereby generating data on the characteristics of the boiling process for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases of the bubble number density. The GSM was implemented into the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure from the Young-Laplace equation. In this way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases of the bubble number density are explainable by liquid super
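    The Young-Laplace calculation underlying the GSM can be sketched in a few lines, showing why smaller bubbles carry a higher internal pressure. The surface tension value is an approximate figure for water near saturation, not taken from the thesis.

```python
# Young-Laplace sketch for a spherical bubble: p_in = p_out + 2*sigma/r.

def internal_pressure(p_ambient, sigma, radius):
    """Internal pressure (Pa) of a spherical bubble of the given radius (m)."""
    return p_ambient + 2.0 * sigma / radius

sigma = 0.059  # N/m, approximate surface tension of water near saturation
p_small = internal_pressure(101325.0, sigma, 1e-5)  # 10 um bubble
p_large = internal_pressure(101325.0, sigma, 1e-3)  # 1 mm bubble
```

    The 10 um bubble carries roughly 11.8 kPa of excess internal pressure versus about 0.12 kPa for the 1 mm bubble, which is the size-dependent hysteresis exploited by the GSM.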

  18. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kutnjak, Josip

    2013-06-27

    Thermodynamics does not describe the geometric shape of the vapour during evaporation. In bubbly flows the bubble shape is considered spherical at small diameters, changing into various shapes as the bubbles grow. Heat and mass transfer take place at the interfacial area, and the forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat input area; thus the boiling effects occurring inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built for the investigation. It consists of an instrumented, partly transparent, tall and slender boiling container for visual observation. Direct visual observation of the boiling phenomena is necessary for the identification of the basic mechanisms to be incorporated in the simulation model. The boiling process was recorded on video and subsequently evaluated by digital image processing methods, thereby generating data on the characteristics of the boiling process for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases of the bubble number density. The GSM was implemented into the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure from the Young-Laplace equation. In this way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases of the bubble number density are explainable by liquid super

  19. A mechanistic compartmental model for total antibody uptake in tumors.

    Science.gov (United States)

    Thurber, Greg M; Dane Wittrup, K

    2012-12-07

    Antibodies are under development to treat a variety of cancers, such as lymphomas, colon, and breast cancer. A major limitation to greater efficacy for this class of drugs is poor distribution in vivo. Antibody localization occurs slowly, often in insufficient therapeutic amounts, and is distributed heterogeneously throughout the tumor. While the microdistribution around individual vessels is important for many therapies, the total amount of antibody localized in the tumor is paramount for many applications such as imaging, determining the therapeutic index with antibody-drug conjugates, and dosing in radioimmunotherapy. With imaging and pretargeted therapeutic strategies, the time course of uptake is critical in determining when to take an image or deliver a secondary reagent. We present here a simple mechanistic model of antibody uptake and retention that captures the major rates determining the time course of antibody concentration within a tumor, including dose, affinity, plasma clearance, target expression, internalization, permeability, and vascularization. Since many of the parameters are known or can be estimated in vitro, this model can approximate the time course of antibody concentration in tumors to aid in experimental design, data interpretation, and strategies to improve localization. Copyright © 2012 Elsevier Ltd. All rights reserved.
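    As a hedged illustration of the compartmental idea only (this is not the published model, and every rate constant below is invented for the example), plasma clearance and permeability-limited tumor uptake can be integrated with a simple Euler scheme:

```python
# Hypothetical two-compartment sketch: plasma antibody clears
# exponentially while the tumor compartment fills via vascular
# permeability and empties via internalization/loss.

def simulate(dose, k_clear, k_in, k_out, t_end, dt=0.01):
    """Euler integration of plasma (cp) and tumor (ct) concentrations."""
    cp, ct = dose, 0.0
    for _ in range(int(t_end / dt)):
        dcp = -k_clear * cp               # plasma clearance
        dct = k_in * cp - k_out * ct      # uptake minus internalization
        cp += dcp * dt
        ct += dct * dt
    return cp, ct

cp, ct = simulate(dose=1.0, k_clear=0.1, k_in=0.05, k_out=0.02, t_end=24.0)
```

    Even this toy version reproduces the qualitative behavior described above: tumor concentration rises while plasma levels are high, then decays as clearance and internalization dominate.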

  20. Mechanistic Indicators of Childhood Asthma (MICA) Study

    Science.gov (United States)

    The Mechanistic Indicators of Childhood Asthma (MICA) Study has been designed to incorporate state-of-the-art technologies to examine the physiological and environmental factors that interact to increase the risk of asthmatic responses. MICA is primarily a clinically-based obser...

  1. A mechanistic model of heat transfer for gas-liquid flow in vertical wellbore annuli.

    Science.gov (United States)

    Yin, Bang-Tang; Li, Xiang-Fang; Liu, Gang

    2018-01-01

    The most prominent aspect of multiphase flow is the variation in the physical distribution of the phases in the flow conduit, known as the flow pattern. Several different flow patterns can exist under different flow conditions, with significant effects on liquid holdup, pressure gradient and heat transfer. Gas-liquid two-phase flow in an annulus can be found in a variety of practical situations. In high-rate oil and gas production, it may be beneficial to flow fluids vertically through the annulus configuration between well tubing and casing. The flow patterns in annuli differ from those in pipe flow: in the annulus there are both casing and tubing liquid films in slug flow and annular flow. Multiphase heat transfer depends on the hydrodynamic behavior of the flow, yet very limited research results can be found in the open literature for multiphase heat transfer in wellbore annuli. A mechanistic model of multiphase heat transfer is developed for different flow patterns of upward gas-liquid flow in vertical annuli. The required local flow parameters are predicted by use of the hydraulic model of steady-state multiphase flow in wellbore annuli recently developed by Yin et al. The modified heat-transfer model for single gas or liquid flow is verified by comparison with Manabe's experimental results. For different flow patterns, it is compared with the modified unified Zhang et al. model based on representative diameters.

  2. Mechanistic applicability domain classification of a local lymph node assay dataset for skin sensitization.

    Science.gov (United States)

    Roberts, David W; Patlewicz, Grace; Kern, Petra S; Gerberick, Frank; Kimber, Ian; Dearman, Rebecca J; Ryan, Cindy A; Basketter, David A; Aptula, Aynur O

    2007-07-01

    The goal of eliminating animal testing in the predictive identification of chemicals with the intrinsic ability to cause skin sensitization is an important target, the attainment of which has recently been brought into even sharper relief by the EU Cosmetics Directive and the requirements of the REACH legislation. Development of alternative methods requires that the chemicals used to evaluate and validate novel approaches comprise not only confirmed skin sensitizers and non-sensitizers but also substances that span the full chemical mechanistic spectrum associated with skin sensitization. To this end, a recently published database of more than 200 chemicals tested in the mouse local lymph node assay (LLNA) has been examined in relation to various chemical reaction mechanistic domains known to be associated with sensitization. It is demonstrated here that the dataset does cover the main reaction mechanistic domains. In addition, it is shown that assignment to a reaction mechanistic domain is a critical first step in a strategic approach to understanding, ultimately on a quantitative basis, how chemical properties influence the potency of skin-sensitizing chemicals. This understanding is necessary if reliable non-animal approaches, including (quantitative) structure-activity relationships ((Q)SARs), read-across, and experimental chemistry-based models, are to be developed.

  3. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation

    Energy Technology Data Exchange (ETDEWEB)

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M.; Young, Vincent B.; Jansson, Janet K.; Fredricks, David N.; Borenstein, Elhanan; Sanchez, Laura M.

    2015-12-22

    Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in

  4. Mechanistic site-based emulation of a global ocean biogeochemical model (MEDUSA 1.0) for parametric analysis and calibration: an application of the Marine Model Optimization Testbed (MarMOT 1.1)

    Directory of Open Access Journals (Sweden)

    J. C. P. Hemmings

    2015-03-01

    Biogeochemical ocean circulation models used to investigate the role of plankton ecosystems in global change rely on adjustable parameters to capture the dominant biogeochemical dynamics of a complex biological system. In principle, optimal parameter values can be estimated by fitting models to observational data, including satellite ocean colour products such as chlorophyll that achieve good spatial and temporal coverage of the surface ocean. However, comprehensive parametric analyses require large ensemble experiments that are computationally infeasible with global 3-D simulations. Site-based simulations provide an efficient alternative but can only be used to make reliable inferences about global model performance if robust quantitative descriptions of their relationships with the corresponding 3-D simulations can be established. The feasibility of establishing such a relationship is investigated for an intermediate complexity biogeochemistry model (MEDUSA) coupled with a widely used global ocean model (NEMO). A site-based mechanistic emulator is constructed for surface chlorophyll output from this target model as a function of model parameters. The emulator comprises an array of 1-D simulators and a statistical quantification of the uncertainty in their predictions. The unknown parameter-dependent biogeochemical environment, in terms of initial tracer concentrations and lateral flux information required by the simulators, is a significant source of uncertainty. It is approximated by a mean environment derived from a small ensemble of 3-D simulations representing variability of the target model behaviour over the parameter space of interest. The performance of two alternative uncertainty quantification schemes is examined: a direct method based on comparisons between simulator output and a sample of known target model "truths" and an indirect method that is only partially reliant on knowledge of the target model output. In general, chlorophyll

  5. Towards a CFD-based mechanistic deposit formation model for straw-fired boilers

    DEFF Research Database (Denmark)

    Kær, Søren Knudsen; Rosendahl, Lasse Aistrup; Baxter, L.L.

    2006-01-01

    is configured entirely through a graphical user interface integrated in the standard FLUENT interface. The model considers fine and coarse mode ash deposition and sticking mechanisms for the complete deposit growth, as well as an influence on the local boundary conditions for heat transfer due to thermal...... in the remainder of the paper. The growth of deposits on furnace walls and superheater tubes is treated, including the impact on heat transfer rates determined by the CFD code. Based on the commercial CFD code FLUENT, the overall model is fully implemented through the User Defined Functions. The model...

  6. Four Mechanistic Models of Peer Influence on Adolescent Cannabis Use.

    Science.gov (United States)

    Caouette, Justin D; Feldstein Ewing, Sarah W

    2017-06-01

    Most adolescents begin exploring cannabis in peer contexts, but the neural mechanisms that underlie peer influence on adolescent cannabis use are still unknown. This theoretical overview elucidates the intersecting roles of neural function and peer factors in cannabis use in adolescents. Novel paradigms using functional magnetic resonance imaging (fMRI) in adolescents have identified distinct neural mechanisms of risk decision-making and incentive processing in peer contexts, centered on reward-motivation and affect regulatory neural networks; these findings inform a theoretical model of peer-driven cannabis use decisions in adolescents. We propose four "mechanistic profiles" of social facilitation of cannabis use in adolescents: (1) peer influence as the primary driver of use; (2) cannabis exploration as the primary driver, which may be enhanced in peer contexts; (3) social anxiety; and (4) negative peer experiences. Identification of "neural targets" involved in motivating cannabis use may inform clinicians about which treatment strategies work best in adolescents with cannabis use problems, and via which social and neurocognitive processes.

  7. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today are based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity based on statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
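    The statistical-thermodynamic core of such models can be sketched as a two-state Boltzmann occupancy. This is the generic textbook form, not the authors' implementation, and the effective dissociation constant Kd is a stand-in for the sequence-derived binding affinity:

```python
# Generic two-state occupancy sketch: the probability that a TF binding
# site is occupied follows a Boltzmann-weighted partition function,
# parameterized here by an effective dissociation constant Kd.

def occupancy(tf_conc, kd):
    """Equilibrium site occupancy: (c/Kd) / (1 + c/Kd)."""
    w = tf_conc / kd
    return w / (1.0 + w)

half = occupancy(1.0, 1.0)   # at c == Kd the site is half occupied
```

    In a sequence-driven model, Kd would be computed from the promoter's sequence features, and the occupancy would feed the differential-equation description of transcription as an input.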

  8. Supporting Mechanistic Reasoning in Domain-Specific Contexts

    Science.gov (United States)

    Weinberg, Paul J.

    2017-01-01

    Mechanistic reasoning is an epistemic practice central within science, technology, engineering, and mathematics disciplines. Although there has been some work on mechanistic reasoning in the research literature and standards documents, much of this work targets domain-general characterizations of mechanistic reasoning; this study provides…

  9. Precision and accuracy of mechanistic-empirical pavement design

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-09-01

    are discussed in general. The effects of variability and error on the design accuracy and design risk are lastly illustrated by means of a simple mechanistic-empirical design problem, showing that the engineering models alone determine the accuracy...

  10. Application of a mechanistic model as a tool for on-line monitoring of pilot scale filamentous fungal fermentation processes-The importance of evaporation effects.

    Science.gov (United States)

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-03-01

    A mechanistic model-based soft sensor is developed and validated for 550 L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor is comprised of a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and the partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, which is a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550 L) at Novozymes A/S. The model is then implemented on-line in 550 L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data. Parameter estimation uncertainty analysis is also carried out. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters which are not able to be monitored on-line. With successful application of a soft sensor at this scale, this allows for improved process monitoring, as well as opening up further possibilities for on-line control algorithms utilizing these on-line model outputs.
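    One plausible reading of the quoted error metric is a root mean of squared relative errors expressed in percent; the paper's exact RMSSE definition may differ, so the sketch below is an assumption, not the authors' formula:

```python
import math

# Assumed definition: root mean of squared relative errors, in percent.

def rmsse_percent(predicted, measured):
    """100 * sqrt(mean(((pred - meas) / meas) ** 2))."""
    sq = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
    return 100.0 * math.sqrt(sum(sq) / len(sq))

err = rmsse_percent([1.1, 0.9], [1.0, 1.0])  # 10% relative error on both points
```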

  11. Semi-Mechanistic Population Pharmacokinetic Modeling of L-Histidine Disposition and Brain Uptake in Wildtype and Pht1 Null Mice.

    Science.gov (United States)

    Wang, Xiao-Xing; Li, Yang-Bing; Feng, Meihua R; Smith, David E

    2018-01-05

    To develop a semi-mechanistic population pharmacokinetic (PK) model to quantitate the disposition kinetics of L-histidine, a peptide-histidine transporter 1 (PHT1) substrate, in the plasma, cerebrospinal fluid and brain parenchyma of wildtype (WT) and Pht1 knockout (KO) mice. L-[14C]Histidine (L-His) was administered to WT and KO mice via tail vein injection, after which plasma, cerebrospinal fluid (CSF) and brain parenchyma samples were collected. A PK model was developed using non-linear mixed effects modeling (NONMEM). The disposition of L-His between the plasma, brain, and CSF was described by a combination of PHT1-mediated uptake, CSF bulk flow and first-order micro-rate constants. The PK profile of L-His was best described by a four-compartment model. A more rapid uptake of L-His in brain parenchyma was observed in WT mice due to PHT1-mediated uptake, a process characterized by a Michaelis-Menten component (Vmax = 0.051 nmol/min and Km = 34.94 μM). A semi-mechanistic population PK model was successfully developed, for the first time, to quantitatively characterize the disposition kinetics of L-His in brain under in vivo conditions. This model may prove a useful tool in predicting the uptake of L-His, and possibly other PHT1 peptide/mimetic substrates, for drug delivery to the brain.
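    The Michaelis-Menten component reported above can be written directly, using the quoted parameter estimates (how this term is embedded in the full four-compartment NONMEM model is not shown here):

```python
# Michaelis-Menten uptake with the parameter estimates quoted above
# (Vmax = 0.051 nmol/min, Km = 34.94 uM).

def uptake_rate(conc_um, vmax=0.051, km=34.94):
    """Saturable uptake rate (nmol/min) at substrate concentration conc_um (uM)."""
    return vmax * conc_um / (km + conc_um)

half_max = uptake_rate(34.94)   # at c == Km the rate is Vmax / 2
```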

  12. Integrity: A semi-mechanistic model for stress corrosion cracking of fuel

    Energy Technology Data Exchange (ETDEWEB)

    Tayal, M; Hallgrimson, K; Macquarrie, J; Alavi, P [Atomic Energy of Canada Ltd., Mississauga, ON (Canada); Sato, S; Kinoshita, Y; Nishimura, T [Electric Power Development Co. Ltd., Tokyo (Japan)

    1997-08-01

    In this paper we describe the features, validation, and illustrative applications of a semi-mechanistic model, INTEGRITY, which calculates the probability of fuel defects due to stress corrosion cracking. The model expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The assessments of defect probability continue to reflect the influence of conventional parameters like ramped power, power-ramp, burnup and Canlub coating. In addition, the INTEGRITY model provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation. Some examples of the latter include pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, coolant temperature and pressure, etc. The model has been fitted to a database of 554 power-ramp irradiations of CANDU fuel with and without Canlub. For this database the INTEGRITY model calculates 75 defects vs 75 actual defects. Similarly good agreement was noted in the different sub-groups of the data involving non-Canlub, thin-Canlub, and thick-Canlub fuel. Moreover, the shapes and the locations of the defect thresholds were consistent with all the above defects as well as with an additional 14 ripple defects that were not in the above database. Two illustrative examples demonstrate how the defect thresholds are influenced by changes in the internal design of the fuel element and by extended burnup. (author). 19 refs, 7 figs.

  13. Integrity: A semi-mechanistic model for stress corrosion cracking of fuel

    International Nuclear Information System (INIS)

    Tayal, M.; Hallgrimson, K.; Macquarrie, J.; Alavi, P.; Sato, S.; Kinoshita, Y.; Nishimura, T.

    1997-01-01

    In this paper we describe the features, validation, and illustrative applications of a semi-mechanistic model, INTEGRITY, which calculates the probability of fuel defects due to stress corrosion cracking. The model expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The assessments of defect probability continue to reflect the influence of conventional parameters like ramped power, power-ramp, burnup and Canlub coating. In addition, the INTEGRITY model provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation. Some examples of the latter include pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, coolant temperature and pressure, etc. The model has been fitted to a database of 554 power-ramp irradiations of CANDU fuel with and without Canlub. For this database the INTEGRITY model calculates 75 defects vs 75 actual defects. Similarly good agreement was noted in the different sub-groups of the data involving non-Canlub, thin-Canlub, and thick-Canlub fuel. Moreover, the shapes and the locations of the defect thresholds were consistent with all the above defects as well as with an additional 14 ripple defects that were not in the above database. Two illustrative examples demonstrate how the defect thresholds are influenced by changes in the internal design of the fuel element and by extended burnup. (author). 19 refs, 7 figs.

  14. Data-based mechanistic modeling of dissolved organic carbon load through storms using continuous 15-minute resolution observations within UK upland watersheds

    Science.gov (United States)

    Jones, T.; Chappell, N. A.

    2013-12-01

    Furthermore, this allows a data-based mechanistic (DBM) modelling philosophy to be followed where no assumptions about processes are defined a priori (given that dominant processes are often not known before analysis) & where the information contained in the time-series is used to identify multiple structures of models that are statistically robust. Within the final stage of DBM, biogeochemical & hydrological processes are interpreted from those models that are observable from the available stream time-series. We show that this approach can simulate the key features of DOC dynamics within & between storms & that some of the resultant response characteristics change with varying DOC processes in different seasons. Through the use of MISO (multiple-input single-output) models we demonstrate the relative importance of different variables (e.g., rainfall, temperature) in controlling DOC responses. The contrasting behaviour of the six experimental catchments is also reflected in differing response characteristics. These characteristics are shown to contribute to understanding of basin-integrated DOC export processes & to the ecosystem service impacts of DOC & color on commercial water treatment within the surrounding water supply basins.
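    The MISO transfer-function structure mentioned above can be sketched as a first-order discrete-time model; the coefficients here are arbitrary placeholders for illustration, not values identified from the catchment data:

```python
# First-order discrete-time MISO sketch: DOC load depends on its previous
# value plus lagged rainfall and temperature inputs.

def simulate_miso(rain, temp, a=0.8, b_rain=0.3, b_temp=0.05):
    """y_t = a*y_{t-1} + b_rain*rain_{t-1} + b_temp*temp_{t-1}."""
    y = [0.0]
    for k in range(1, len(rain)):
        y.append(a * y[-1] + b_rain * rain[k - 1] + b_temp * temp[k - 1])
    return y

doc = simulate_miso(rain=[0, 5, 10, 2, 0, 0], temp=[10] * 6)  # storm pulse then recession
```

    In DBM practice, the model order and the coefficients would be identified statistically from the 15-minute observations, and only structures that are both statistically robust and physically interpretable would be retained.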

  15. Comparison of volumetric modulated arc therapy and intensity modulated radiation therapy for whole brain hippocampal sparing treatment plans based on radiobiological modeling

    Directory of Open Access Journals (Sweden)

    Ethan Kendall

    2018-01-01

    Introduction: In this article, we report the results of our investigation comparing radiobiological aspects of treatment plans with linear accelerator-based intensity-modulated radiation therapy and volumetric-modulated arc therapy for patients having hippocampal-avoidance whole-brain radiation therapy. Materials and Methods: In this retrospective study, using the dose-volume histogram we calculated and compared the biophysical indices of equivalent uniform dose, tumor control probability, and normal tissue complication probability (NTCP) for 15 whole-brain radiotherapy patients. Results and Discussion: Dose-response models for tumors and critical structures fall into two groups: mechanistic and empirical. Mechanistic models are formulated mathematically from describable relationships, while empirical models fit data through empirical observations, determining parameters that give results agreeing with those of mechanistic models. Conclusions: The techniques applied in this manuscript could be applied to any other organ or type of cancer to evaluate treatment plans based on radiobiological modeling.
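    One of the biophysical indices compared above, the equivalent uniform dose, has a standard generalized closed form (gEUD); this generic sketch is not taken from the article itself:

```python
# Generic gEUD sketch: EUD = (sum_i v_i * d_i**a) ** (1/a) over a
# differential dose-volume histogram with fractional volumes v_i.
# The parameter a is tissue-specific (a = 1 reduces to the mean dose).

def geud(doses, volumes, a):
    """Generalized equivalent uniform dose from a differential DVH."""
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

mean_dose = geud([20.0, 40.0], [0.5, 0.5], a=1)    # a = 1: mean dose
uniform = geud([30.0, 30.0], [0.5, 0.5], a=-10)    # uniform dose returns itself
```

    Large negative a emphasizes cold spots (appropriate for tumors), while large positive a emphasizes hot spots (appropriate for serial organs at risk such as the hippocampus).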

  16. Semi-mechanistic partial buffer approach to modeling pH, the buffer properties, and the distribution of ionic species in complex solutions.

    Science.gov (United States)

    Dougherty, Daniel P; Da Conceicao Neta, Edith Ramos; McFeeters, Roger F; Lubkin, Sharon R; Breidt, Frederick

    2006-08-09

    In many biological science and food processing applications, it is very important to control or modify pH. However, the complex, unknown composition of biological media and foods often limits the utility of purely theoretical approaches to modeling pH and calculating the distributions of ionizable species. This paper provides general formulas and efficient algorithms for predicting the pH, titration, ionic species concentrations, buffer capacity, and ionic strength of buffer solutions containing both defined and undefined components. A flexible, semi-mechanistic, partial buffering (SMPB) approach is presented that uses local polynomial regression to model the buffering influence of complex or undefined components in a solution, while identified components of known concentration are modeled using expressions based on extensions of the standard acid-base theory. The SMPB method is implemented in a freeware package, (pH)Tools, for use with Matlab. We validated the predictive accuracy of these methods by using strong acid titrations of cucumber slurries to predict the amount of a weak acid required to adjust pH to selected target values.
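    For the defined components, standard acid-base theory reduces to relations like Henderson-Hasselbalch. This minimal sketch shows only that textbook piece; the SMPB method layers local polynomial regression for the undefined, complex components on top of such relations:

```python
import math

# Textbook Henderson-Hasselbalch sketch for a single weak acid /
# conjugate base pair (concentrations in mol/L).

def buffer_ph(pka, base_conc, acid_conc):
    """pH = pKa + log10([A-]/[HA]) for one buffer pair."""
    return pka + math.log10(base_conc / acid_conc)

ph = buffer_ph(pka=4.76, base_conc=0.1, acid_conc=0.1)  # equimolar acetate buffer
```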

  17. Predicting soil-to-plant transfer of radionuclides with a mechanistic model (BioRUR)

    Energy Technology Data Exchange (ETDEWEB)

    Casadesus, J. [Servei de Camps Experimentals, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain); Sauras-Yera, T. [Departament de Biologia Vegetal, Facultat de Biologia, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain)], E-mail: msauras@ub.edu; Vallejo, V.R. [Departament de Biologia Vegetal, Facultat de Biologia, Universitat de Barcelona, Avda Diagonal 645, 08028 Barcelona (Spain); Centro de Estudios Ambientales del Mediterraneo, Charles Darwin 14, Parc Tecnologic, 46980 Paterna, Valencia (Spain)

    2008-05-15

    The BioRUR model has been developed to simulate radionuclide (RN) transfer through physical and biological compartments, based on the available information on the transfer of their nutrient analogues. The model assumes that radionuclides are transferred from soil to plant through the same pathways as their nutrient analogues, where K and Ca are the analogues of Cs and Sr, respectively. Basically, the transfer of radionuclide between two compartments is calculated as the transfer of nutrient multiplied by the ratio of concentrations of RN to nutrient, corrected by a selectivity coefficient. Hydroponic experiments showed the validity of this assumption for root uptake of Cs and Sr and reported a selectivity coefficient around 1.0 for both. However, the application of this approach to soil-to-plant transfer raises the question of which effective concentrations of RN and nutrient are detected by the plant uptake mechanism. This paper describes the evaluation of two configurations of BioRUR, one which simplifies the soil to a homogeneous pool, and the other which considers that concentration gradients develop around roots, so that ion concentrations at the root surface differ from those of the bulk soil. The results show a good fit between the observed Sr transfer and the mechanistic simulations, even when a homogeneous soil is considered. On the other hand, Cs transfer is overestimated by two orders of magnitude if the development of a decreasing K profile around roots is not taken into account.
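The core transfer rule described above (nutrient flux scaled by the RN:nutrient concentration ratio and a selectivity coefficient) is a one-liner. The numbers below are hypothetical, chosen only to show how sensitive the prediction is to which concentrations the root actually "sees":

```python
def rn_uptake(nutrient_uptake, c_rn, c_nutrient, selectivity=1.0):
    """Radionuclide uptake as the analogue-nutrient uptake scaled by the
    RN:nutrient concentration ratio times a selectivity coefficient
    (reported near 1.0 for Cs/K and Sr/Ca in hydroponic experiments)."""
    return nutrient_uptake * selectivity * (c_rn / c_nutrient)

# Hypothetical numbers: K uptake 1.0 mmol/day; Cs at 1e-6 of the K level
bulk = rn_uptake(1.0, c_rn=1e-6, c_nutrient=1.0)
# The prediction scales directly with the Cs:K ratio at the uptake site, so
# a 100-fold difference between bulk-soil and root-surface ratios shifts the
# predicted transfer by the same two orders of magnitude
surface = rn_uptake(1.0, c_rn=1e-8, c_nutrient=1.0)
print(bulk, surface)
```

This is why the homogeneous-pool and root-surface-gradient configurations of the model can disagree by orders of magnitude for Cs while agreeing for Sr.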

  18. Predicting soil-to-plant transfer of radionuclides with a mechanistic model (BioRUR)

    International Nuclear Information System (INIS)

    Casadesus, J.; Sauras-Yera, T.; Vallejo, V.R.

    2008-01-01

    The BioRUR model has been developed to simulate radionuclide (RN) transfer through physical and biological compartments, based on the available information on the transfer of their nutrient analogues. The model assumes that radionuclides are transferred from soil to plant through the same pathways as their nutrient analogues, where K and Ca are the analogues of Cs and Sr, respectively. Basically, the transfer of radionuclide between two compartments is calculated as the transfer of nutrient multiplied by the ratio of concentrations of RN to nutrient, corrected by a selectivity coefficient. Hydroponic experiments showed the validity of this assumption for root uptake of Cs and Sr and reported a selectivity coefficient around 1.0 for both. However, the application of this approach to soil-to-plant transfer raises the question of which effective concentrations of RN and nutrient are detected by the plant uptake mechanism. This paper describes the evaluation of two configurations of BioRUR, one which simplifies the soil to a homogeneous pool, and the other which considers that concentration gradients develop around roots, so that ion concentrations at the root surface differ from those of the bulk soil. The results show a good fit between the observed Sr transfer and the mechanistic simulations, even when a homogeneous soil is considered. On the other hand, Cs transfer is overestimated by two orders of magnitude if the development of a decreasing K profile around roots is not taken into account.

  19. Mechanistic study of manganese-substituted glycerol dehydrogenase using a kinetic and thermodynamic analysis.

    Science.gov (United States)

    Fang, Baishan; Niu, Jin; Ren, Hong; Guo, Yingxia; Wang, Shizhen

    2014-01-01

    Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.
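The reduced-parameter estimation step described above can be illustrated with a toy version: fix the product-inhibition constants from preliminary experiments, then recover only Vmax and Km by least squares. The rate law here is a simplified single-substrate stand-in for the full ordered Bi-Bi expression, and all numbers are synthetic, not the paper's data:

```python
def rate(s, nadh, dha, vmax, km, kic, kin):
    """Initial-rate law with competitive NADH and noncompetitive DHA product
    inhibition -- a reduced single-substrate stand-in for ordered Bi-Bi."""
    return vmax * s / ((km * (1 + nadh / kic) + s) * (1 + dha / kin))

KIC, KIN = 0.05, 2.0                 # inhibition constants fixed beforehand
# Synthetic noise-free "measurements" generated from known Vmax=3.0, Km=0.8
data = [(s, n, d, rate(s, n, d, 3.0, 0.8, KIC, KIN))
        for s in (0.2, 0.5, 1.0, 2.0, 5.0)
        for n, d in ((0.0, 0.0), (0.02, 0.5))]

# With kic/kin fixed, only two parameters remain; a coarse grid search
# stands in for the paper's nonlinear regression
best = min(((v / 10, k / 10) for v in range(1, 60) for k in range(1, 30)),
           key=lambda p: sum((obs - rate(s, n, d, p[0], p[1], KIC, KIN)) ** 2
                             for s, n, d, obs in data))
print(best)   # recovers (3.0, 0.8)
```

Fixing the inhibition constants up front shrinks the search space exactly as the abstract describes (10 parameters down to 4 in the real model, 4 down to 2 in this toy), which is what makes the regression well-conditioned.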

  20. Mechanistic study of manganese-substituted glycerol dehydrogenase using a kinetic and thermodynamic analysis.

    Directory of Open Access Journals (Sweden)

    Baishan Fang

    Full Text Available Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.

  1. Simulating soil C stability with mechanistic systems models: a multisite comparison of measured fractions and modelled pools

    Science.gov (United States)

    Robertson, Andy; Schipanski, Meagan; Sherrod, Lucretia; Ma, Liwang; Ahuja, Lajpat; McNamara, Niall; Smith, Pete; Davies, Christian

    2016-04-01

    Agriculture, covering more than 30% of global land area, has an exciting opportunity to help combat climate change by effectively managing its soil to promote increased C sequestration. Further, newly sequestered soil carbon (C) through agriculture needs to be stored in more stable forms in order to have a lasting impact on reducing atmospheric CO2 concentrations. While land uses in different climates and soils require different management strategies, the fundamental mechanisms that regulate C sequestration and stabilisation remain the same. These mechanisms are used by a number of different systems models to simulate C dynamics, and thus assess the impacts of change in management or climate. To evaluate the accuracy of these model simulations, our research uses a multidirectional approach to compare C stocks of physicochemical soil fractions collected at two long-term agricultural sites. Carbon stocks for a number of soil fractions were measured at two sites (Lincoln, UK; Colorado, USA) over 8 and 12 years, respectively. Both sites represent managed agricultural land but have notably different climates and levels of disturbance. The measured soil fractions act as proxies for varying degrees of stability, with C contained within these fractions relatable to the C simulated within the soil pools of mechanistic systems models. Using stable isotope techniques at the UK site, specific turnover times of C within the different fractions were determined and compared with those simulated in the pools of 3 different models of varying complexity (RothC, DayCent and RZWQM2). Further, C dynamics and N-mineralisation rates of the measured fractions at the US site were assessed and compared to results of the same three models. The UK site saw a significant increase in C stocks within the most stable fractions, with topsoil (0-30 cm) sequestration rates of just over 0.3 t C ha⁻¹ yr⁻¹ after only 8 years. Further, the sum of all fractions reported C sequestration rates of nearly 1

  2. In Silico Oncology: Quantification of the In Vivo Antitumor Efficacy of Cisplatin-Based Doublet Therapy in Non-Small Cell Lung Cancer (NSCLC) through a Multiscale Mechanistic Model

    Science.gov (United States)

    Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios

    2016-01-01

    The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentations can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant for reliable predictions is the a priori estimation of the drugs’ cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied for the estimation of a plausible value range of the cell killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer related mechanism of uncontrolled proliferation, population heterogeneity, hypoxia and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied on the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has also been employed to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with
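The Latin hypercube sampling and rank-correlation screening used for the sensitivity analysis above can be sketched with the standard library alone. The two-parameter toy model and its coefficients below are assumptions for illustration, not the paper's tumor model:

```python
import random

def latin_hypercube(n, dims, rng):
    """n samples in [0,1)^dims; each dimension uses each of its n strata once."""
    cols = []
    for _ in range(dims):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

def spearman(xs, ys):
    """Rank correlation (no-ties case), the building block of PRCC screening."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rng = random.Random(1)
samples = latin_hypercube(50, 2, rng)
# Toy output dominated by the first parameter (a stand-in for growth rate)
out = [10 * g + 0.5 * f for g, f in samples]
print(round(spearman([s[0] for s in samples], out), 2))
```

Stratifying each parameter guarantees even marginal coverage with far fewer runs than plain Monte Carlo, and the rank correlation flags which input most strongly drives the output (here, by construction, the first one).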

  3. Mechanistic electronic model to simulate and predict the effect of heat stress on the functional genomics of HO-1 system: Vasodilation.

    Science.gov (United States)

    Aggarwal, Yogender; Karan, Bhuwan Mohan; Das, Barda Nand; Sinha, Rakesh Kumar

    2010-05-01

    The present work models the molecular signalling pathway for vasodilation and predicts resting forearm blood flow in young humans under heat stress. The mechanistic electronic modelling technique was designed and implemented in MULTISIM 8.0, with an assumed scaling of 1 V/°C for the prediction of forearm blood flow, and digital logic was used to design the molecular signalling pathway for vasodilation. The minimum forearm blood flow was observed at an environmental temperature of 35°C (0 ml 100 ml⁻¹ min⁻¹) and the maximum at 42°C (18.7 ml 100 ml⁻¹ min⁻¹), relative to a base value of 2 ml 100 ml⁻¹ min⁻¹. This model may also help identify therapeutic targets for treating inflammation and disorders due to heat-related illnesses. 2010 Elsevier Ltd. All rights reserved.

  4. Inferring the Impact of Regulatory Mechanisms that Underpin CD8+ T Cell Control of B16 Tumor Growth In vivo Using Mechanistic Models and Simulation.

    Science.gov (United States)

    Klinke, David J; Wang, Qing

    2016-01-01

    A major barrier for broadening the efficacy of immunotherapies for cancer is identifying key mechanisms that limit the efficacy of tumor infiltrating lymphocytes. Yet, identifying these mechanisms using human samples and mouse models for cancer remains a challenge. While interactions between cancer and the immune system are dynamic and non-linear, identifying the relative roles that biological components play in regulating anti-tumor immunity commonly relies on human intuition alone, which can be limited by cognitive biases. To assist natural intuition, modeling and simulation play an emerging role in identifying therapeutic mechanisms. To illustrate the approach, we developed a multi-scale mechanistic model to describe the control of tumor growth by a primary response of CD8+ T cells against defined tumor antigens using the B16 C57Bl/6 mouse model for malignant melanoma. The mechanistic model was calibrated to data obtained following adenovirus-based immunization and validated to data obtained following adoptive transfer of transgenic CD8+ T cells. More importantly, we use simulation to test whether the postulated network topology, that is the modeled biological components and their associated interactions, is sufficient to capture the observed anti-tumor immune response. Given the available data, the simulation results also provided a statistical basis for quantifying the relative importance of different mechanisms that underpin CD8+ T cell control of B16F10 growth. By identifying conditions where the postulated network topology is incomplete, we illustrate how this approach can be used as part of an iterative design-build-test cycle to expand the predictive power of the model.

  5. BIOMAP A Daily Time Step, Mechanistic Model for the Study of Ecosystem Dynamics

    Science.gov (United States)

    Wells, J. R.; Neilson, R. P.; Drapek, R. J.; Pitts, B. S.

    2010-12-01

    BIOMAP simulates competition between two Plant Functional Types (PFT) at any given point in the conterminous U.S. using a time series of daily temperature (mean, minimum, maximum), precipitation, humidity, light and nutrients, with PFT-specific rooting within a multi-layer soil. The model employs a 2-layer canopy biophysics, Farquhar photosynthesis, the Beer-Lambert Law for light attenuation and a mechanistic soil hydrology. In essence, BIOMAP is a re-built version of the biogeochemistry model, BIOME-BGC, into the form of the MAPSS biogeography model. Specific enhancements are: 1) the 2-layer canopy biophysics of Dolman (1993); 2) the unique MAPSS-based hydrology, which incorporates canopy evaporation, snow dynamics, infiltration and saturated and unsaturated percolation with ‘fast’ flow and base flow and a ‘tunable aquifer’ capacity, a metaphor of Darcy’s Law; and 3) a unique MAPSS-based stomatal conductance algorithm, which simultaneously incorporates vapor pressure and soil water potential constraints based on physiological information; along with many other improvements. Over small domains the PFTs can be parameterized as individual species to investigate fundamental vs. potential niche theory; while, at more coarse scales, the PFTs can be rendered as more general functional groups. Since all of the model processes are intrinsically leaf to plot scale (physiology to PFT competition), it essentially has no ‘intrinsic’ scale and can be implemented on a grid of any size, taking on the characteristics defined by the homogeneous climate of each grid cell. Currently, the model is implemented on the VEMAP 1/2 degree, daily grid over the conterminous U.S. Although both the thermal and water-limited ecotones are dynamic, following climate variability, the PFT distributions remain fixed. Thus, the model is currently being fitted with a ‘reproduction niche’ to allow full dynamic operation as a Dynamic General Vegetation Model (DGVM). While global simulations
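The Beer-Lambert attenuation that BIOMAP uses for canopy light takes only a couple of lines. The extinction coefficient and PAR values below are illustrative assumptions, not the model's calibrated parameters:

```python
import math

def light_below(par_top, lai, k=0.5):
    """Beer-Lambert canopy attenuation: PAR remaining under cumulative leaf
    area index `lai`, with extinction coefficient k (k ~ 0.5 assumed here)."""
    return par_top * math.exp(-k * lai)

# Illustration: light reaching a lower canopy layer under an upper layer
# of LAI 2, given 1000 umol m-2 s-1 incident PAR
top = 1000.0
mid = light_below(top, 2.0)
print(round(mid, 1), round(1 - mid / top, 3))   # remaining PAR, fraction absorbed
```

In a 2-layer canopy scheme like BIOMAP's, the same expression partitions absorbed light between the upper and lower canopy, which in turn drives the Farquhar photosynthesis calculation for each layer.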

  6. Mechanistic model for Sr and Ba release from severely damaged fuel

    International Nuclear Information System (INIS)

    Rest, J.; Cronenberg, A.W.

    1985-11-01

    Among radionuclides associated with fission product release during severe accidents, the primary ones with health consequences are the volatile species of I, Te, and Cs, and the next most important are Sr, Ba, and Ru. Considerable progress has been made in the mechanistic understanding of I, Cs, Te, and noble gas release; however, no capability presently exists for estimating the release of Sr, Ba, and Ru. This paper presents a description of the primary physical/chemical models recently incorporated into the FASTGRASS-VFP (volatile fission product) code for the estimation of Sr and Ba release. FASTGRASS-VFP release predictions are compared with two data sets: (1) data from out-of-reactor induction-heating experiments on declad low-burnup (1000 and 4000 MWd/t) pellets, and (2) data from the more recent in-reactor PBF Severe Fuel Damage Tests, in which one-meter-long, trace-irradiated (89 MWd/t) and normally irradiated (approx. 35,000 MWd/t) fuel rods were tested under accident conditions. 10 refs

  7. RNA-Seq-based toxicogenomic assessment of fresh frozen and formalin-fixed tissues yields similar mechanistic insights.

    Science.gov (United States)

    Auerbach, Scott S; Phadke, Dhiral P; Mav, Deepak; Holmgren, Stephanie; Gao, Yuan; Xie, Bin; Shin, Joo Heon; Shah, Ruchir R; Merrick, B Alex; Tice, Raymond R

    2015-07-01

    Formalin-fixed, paraffin-embedded (FFPE) pathology specimens represent a potentially vast resource for transcriptomic-based biomarker discovery. We present here a comparison of results from a whole transcriptome RNA-Seq analysis of RNA extracted from fresh frozen and FFPE livers. The samples were derived from rats exposed to aflatoxin B1 (AFB1) and a corresponding set of control animals. Principal components analysis indicated that samples were separated in the two groups representing presence or absence of chemical exposure, both in fresh frozen and FFPE sample types. Sixty-five percent of the differentially expressed transcripts (AFB1 vs. controls) in fresh frozen samples were also differentially expressed in FFPE samples (overlap significance: P < 0.0001). Genomic signature and gene set analysis of AFB1 differentially expressed transcript lists indicated highly similar results between fresh frozen and FFPE at the level of chemogenomic signatures (i.e., single chemical/dose/duration elicited transcriptomic signatures), mechanistic and pathology signatures, biological processes, canonical pathways and transcription factor networks. Overall, our results suggest that similar hypotheses about the biological mechanism of toxicity would be formulated from fresh frozen and FFPE samples. These results indicate that phenotypically anchored archival specimens represent a potentially informative resource for signature-based biomarker discovery and mechanistic characterization of toxicity. Copyright © 2014 John Wiley & Sons, Ltd.
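Overlap significance between two differentially expressed gene lists (the "P < 0.0001" above) is commonly assessed with a hypergeometric upper-tail test. A stdlib sketch with made-up list sizes, not the study's counts:

```python
from math import comb

def overlap_pvalue(n_universe, n_list1, n_list2, n_overlap):
    """Hypergeometric upper tail P(X >= n_overlap): probability of seeing at
    least this much overlap between two gene lists drawn by chance from a
    universe of n_universe genes."""
    total = comb(n_universe, n_list2)
    return sum(comb(n_list1, k) * comb(n_universe - n_list1, n_list2 - k)
               for k in range(n_overlap, min(n_list1, n_list2) + 1)) / total

# Toy numbers (hypothetical): 60 of 100 fresh-frozen DE genes also appear
# among 120 FFPE DE genes, out of a 10000-gene universe
print(overlap_pvalue(10000, 100, 120, 60) < 1e-4)
```

With a chance expectation of only ~1.2 overlapping genes under these toy numbers, an observed overlap of 60 is astronomically unlikely, which is the kind of evidence behind the reported P < 0.0001.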

  8. Development of a soft-sensor based on multi-wavelength fluorescence spectroscopy and a dynamic metabolic model for monitoring mammalian cell cultures.

    Science.gov (United States)

    Ohadi, Kaveh; Legge, Raymond L; Budman, Hector M

    2015-01-01

    A soft-sensor based on an Extended Kalman Filter (EKF) that combines data obtained using a fluorescence-based soft-sensor with a dynamic mechanistic model, was investigated as a tool for continuous monitoring of a Chinese hamster ovary (CHO) cell cultivation process. A standalone fluorescence based soft-sensor, which uses a combination of an empirical multivariate statistical model and measured spectra, was designed for predicting key culture variables including viable and dead cells, recombinant protein, glucose, and ammonia concentrations. The standalone fluorescence sensor was then combined with a dynamic mechanistic model within an EKF framework, for improving the prediction accuracy and generating predictions in-between sampling instances. The dynamic model used for the EKF framework was based on a structured metabolic flux analysis and mass balances. In order to calibrate the fluorescence-based empirical model and the dynamic mechanistic model, cells were grown in batch mode with different initial glucose and glutamine concentrations. To mitigate the uncertainty associated with the model structure and parameters, non-stationary disturbances were accounted for in the EKF by parameter-adaptation. It was demonstrated that the implementation of the EKF along with the dynamic model could improve the accuracy of the fluorescence-based predictions at the sampling instances. Additionally, it was shown that the major advantage of the EKF-based soft-sensor, compared to the standalone fluorescence-based counterpart, was its capability to track the temporal evolution of key process variables between measurement instances obtained by the fluorescence-based soft-sensor. This is crucial for designing control strategies of CHO cell cultures with the aim of guaranteeing product quality. © 2014 Wiley Periodicals, Inc.
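The EKF idea above (model-based prediction between samples, corrected whenever a fluorescence-derived measurement arrives) can be reduced to a scalar sketch. The growth rate, noise variances, and readings below are invented for illustration; the actual soft-sensor couples a multivariate metabolic-flux model with spectral measurements:

```python
import math

def kalman_step(x, p, z, growth, dt, q, r):
    """One predict/update cycle of a scalar Kalman filter: an exponential
    growth model x' = x*exp(growth*dt) corrected by a noisy soft-sensor
    reading z; q and r are process and measurement noise variances."""
    f = math.exp(growth * dt)        # state transition (also its Jacobian)
    x_pred, p_pred = x * f, f * p * f + q
    k = p_pred / (p_pred + r)        # Kalman gain
    return x_pred + k * (z - x_pred), (1.0 - k) * p_pred

# Hypothetical CHO culture: viable cell density (1e6 cells/mL) tracked
# across 8 h intervals using assumed soft-sensor readings
x, p = 1.0, 0.5
for z in (1.3, 1.6, 2.1, 2.6):
    x, p = kalman_step(x, p, z, growth=0.03, dt=8.0, q=0.01, r=0.2)
print(round(x, 2), round(p, 3))
```

Between measurement instances only the prediction half runs, which is exactly how the EKF-based soft-sensor tracks the temporal evolution of process variables in-between the fluorescence sampling points.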

  9. Exploring BSEP Inhibition-Mediated Toxicity with a Mechanistic Model of Drug-Induced Liver Injury

    Directory of Open Access Journals (Sweden)

    Jeffrey L Woodhead

    2014-11-01

    Full Text Available Inhibition of the bile salt export pump (BSEP) has been linked to incidence of drug-induced liver injury (DILI), presumably by the accumulation of toxic bile acids in the liver. We have previously constructed and validated a model of bile acid disposition within DILIsym®, a mechanistic model of DILI. In this paper, we use DILIsym® to simulate the DILI response of the hepatotoxic BSEP inhibitors bosentan and CP-724,714 and the non-hepatotoxic BSEP inhibitor telmisartan in humans in order to explore whether we can predict that hepatotoxic BSEP inhibitors can cause bile acid accumulation to reach toxic levels. We also simulate bosentan in rats in order to illuminate potential reasons behind the lack of toxicity in rats compared to the toxicity observed in humans. DILIsym® predicts that bosentan, but not telmisartan, will cause mild hepatocellular ATP decline and serum ALT elevation in a simulated population of humans. The difference in hepatotoxic potential between bosentan and telmisartan is consistent with clinical observations. However, DILIsym® underpredicts the incidence of bosentan toxicity. DILIsym® also predicts that bosentan will not cause toxicity in a simulated population of rats, and that the difference between the response to bosentan in rats and in humans is primarily due to the less toxic bile acid pool in rats. Our simulations also suggest a potential synergistic role for bile acid accumulation and mitochondrial electron transport chain inhibition in producing the observed toxicity in CP-724,714, and suggest that CP-724,714 metabolites may also play a role in the observed toxicity. Our work also compares the impact of competitive and noncompetitive BSEP inhibition for CP-724,714 and demonstrates that noncompetitive inhibition leads to much greater bile acid accumulation and potential toxicity. Our research demonstrates the potential for mechanistic modeling to contribute to the understanding of how bile acid transport inhibitors

  10. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    International Nuclear Information System (INIS)

    Goodman, Julie; Lynch, Heather

    2017-01-01

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  11. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Goodman, Julie, E-mail: jgoodman@gradientcorp.com; Lynch, Heather

    2017-03-15

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  12. A mechanistic ecohydrological model to investigate complex interactions in cold and warm water-controlled environments. 2. Spatiotemporal analyses

    Directory of Open Access Journals (Sweden)

    Simone Fatichi

    2012-05-01

    Full Text Available The ecohydrological model Tethys-Chloris (T&C), described in the companion paper, is applied to two semiarid systems characterized by different climate and vegetation cover conditions. The Lucky Hills watershed in Arizona represents a typical small, "unit-source" catchment of a desert shrub system of the U.S. southwest. Two nested basins of the Reynolds Creek Experimental watershed (Idaho, U.S.A.), the Reynolds Creek Mountain East and Tollgate catchments, are representative of a semiarid cold climate with seasonal snow cover. Both exhibit a highly non-uniform vegetation cover. A range of ecohydrological metrics of the long-term model performance is presented to highlight the model capabilities in reproducing hydrological and vegetation dynamics both at the plot and the watershed scales. A diverse set of observations is used to confirm the simulated dynamics. Highly satisfactory results are obtained without significant (or any) calibration effort despite the large phase-space dimensionality of the model, the uncertainty of imposed boundary conditions, and limited data availability. It is argued that a significant investment in model design based on the description of physical, biophysical, and ecological processes leads to such consistent simulation skill. The simulated patterns mimic the outcome of hydrological and vegetation dynamics with high realism, as confirmed by spatially distributed remote sensing data. Further community efforts are warranted to address the issue of thorough quantitative assessment. The current lack of appropriate data hampers the development and testing of process-based ecohydrological models. It is further argued that the mechanistic nature of the T&C model can be valuable for designing virtual experiments and developing questions of scientific inquiry at a range of spatiotemporal scales.

  13. Radiation-induced carcinogenesis: mechanistically based differences between gamma-rays and neutrons, and interactions with DMBA.

    Directory of Open Access Journals (Sweden)

    Igor Shuryak

    Full Text Available Different types of ionizing radiation produce different dependences of cancer risk on radiation dose/dose rate. Sparsely ionizing radiation (e.g. γ-rays) generally produces linear or upwardly curving dose responses at low doses, and the risk decreases when the dose rate is reduced (direct dose rate effect). Densely ionizing radiation (e.g. neutrons) often produces downwardly curving dose responses, where the risk initially grows with dose, but eventually stabilizes or decreases. When the dose rate is reduced, the risk increases (inverse dose rate effect). These qualitative differences suggest qualitative differences in carcinogenesis mechanisms. We hypothesize that the dominant mechanism for induction of many solid cancers by sparsely ionizing radiation is initiation of stem cells to a pre-malignant state, but for densely ionizing radiation the dominant mechanism is radiation-bystander-effect mediated promotion of already pre-malignant cell clone growth. Here we present a mathematical model based on these assumptions and test it using data on the incidence of dysplastic growths and tumors in the mammary glands of mice exposed to high or low dose rates of γ-rays and neutrons, either with or without pre-treatment with the chemical carcinogen 7,12-dimethylbenz-alpha-anthracene (DMBA). The model provides a mechanistic and quantitative explanation which is consistent with the data and may provide useful insight into human carcinogenesis.
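The qualitative contrast described above (upwardly curving initiation-driven responses for sparsely ionizing radiation versus rise-then-turnover promotion-driven responses for densely ionizing radiation) can be illustrated with toy dose-response shapes. The functional forms and coefficients are illustrative only, not the paper's fitted model:

```python
import math

def risk_sparse(dose, a=0.05, b=0.02):
    """Initiation-driven response (sparsely ionizing, e.g. gamma-rays):
    linear-quadratic, upwardly curving in dose. Coefficients are invented."""
    return a * dose + b * dose ** 2

def risk_dense(dose, c=0.6, kill=0.8):
    """Promotion/bystander-driven response (densely ionizing, e.g. neutrons):
    rises with dose but turns over as cell killing depletes target clones."""
    return c * dose * math.exp(-kill * dose)

doses = [0.25 * i for i in range(1, 9)]          # 0.25 .. 2.0 Gy grid
sparse = [risk_sparse(d) for d in doses]
dense = [risk_dense(d) for d in doses]
print(sparse == sorted(sparse))                  # monotonically increasing
print(doses[dense.index(max(dense))])            # peak at an interior dose
```

The saturating form peaks at dose 1/kill (here 1.25 Gy) and declines beyond it, reproducing the downwardly curving shape the abstract attributes to densely ionizing radiation, while the linear-quadratic form keeps rising.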

  14. Exploring the pros and cons of mechanistic case diagrams for problem-based learning

    Directory of Open Access Journals (Sweden)

    Minjeong Kim

    2017-09-01

    Full Text Available Purpose The mechanistic case diagram (MCD) has been recommended for increasing the depth of understanding of disease, but few articles describe specific methods for its use. We report our experience of constructing MCDs in the fullest possible depth to identify the pros and cons of using MCDs in this way. Methods During problem-based learning, we gave guidelines for MCDs covering mechanistic exploration from subcellular processes to clinical features, laid out in as much detail as possible. To understand the students' attitudes and depth of study using MCDs, we analyzed the responses to an open-format questionnaire about the experience of making MCDs and examined the resulting products. Results From the questionnaire responses, we identified several favorable outcomes, the most important being the deeper insight and comprehensive understanding of disease facilitated by the process of making a well-organized diagram. The main disadvantages of these guidelines were a feeling of excessive workload and difficulty in finding mechanisms. Students suggested ways to overcome these problems: careful reading of comprehensive texts, additional guidance from staff about the depth and focus of mechanisms, and cooperative group work. From the analysis of the diagrams, we recognized that diversity in their appearance should be allowed, as should many hypothetical connections, which may reflect mechanisms that are incompletely understood in nature. Conclusion The more detailed an MCD task is, the deeper the knowledge students can acquire. However, this advantage must be balanced against the many ensuing difficulties, and deliberate support plans should be prepared.

  15. [Mechanistic modelling allows assessment of pathways of DNA lesion interactions underlying chromosome aberration formation].

    Science.gov (United States)

    Eĭdel'man, Iu A; Slanina, S V; Sal'nikov, I V; Andreev, S G

    2012-12-01

    Knowledge of radiation-induced chromosomal aberration (CA) mechanisms is required in many fields of radiation genetics, radiation biology, biodosimetry, etc. However, these mechanisms are yet to be quantitatively characterised. One reason is that the relationships between primary lesions of DNA/chromatin/chromosomes and dose-response curves for CA are unknown, because the pathways of lesion interaction in an interphase nucleus are currently inaccessible to direct experimental observation. This article presents a comparative analysis of two principally different scenarios for the formation of simple and complex interchromosomal exchange aberrations: lesion interactions at the surface of chromosome territories vs. in the whole space of the nucleus. The analysis was based on quantitative mechanistic modelling of the different levels of structure and process involved in CA formation: chromosome structure in an interphase nucleus, and the induction, repair and interactions of DNA lesions. It was shown that the restricted diffusion of chromosomal loci, predicted by computational modelling of chromosome organization, rules out lesion interactions across the whole space of the nucleus. At the same time, the predicted features of subchromosomal dynamics agree well with in vivo observations and do not contradict the mechanism of CA formation at the surface of chromosome territories. On the other hand, the "surface mechanism" of CA formation, despite certain merits, proved insufficient to explain the high frequency of complex exchange aberrations observed by the mFISH technique. An alternative mechanism, CA formation at nuclear centres, is expected to be sufficient to explain frequent complex exchanges.

  16. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    Science.gov (United States)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

    A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
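The hybrid idea can be illustrated on a toy system. Here the logistic map plays the "real" dynamics, the knowledge-based model carries a deliberately wrong parameter, and a one-parameter least-squares fit stands in for the reservoir computer used in the paper; all values are invented:

```python
R_TRUE, R_MODEL = 3.6, 3.5  # true dynamics vs. imperfect knowledge-based model

def true_step(x):
    return R_TRUE * x * (1.0 - x)   # logistic map: the "real" system

def model_step(x):
    return R_MODEL * x * (1.0 - x)  # knowledge-based model with a wrong parameter

# Training data: a trajectory of the true system.
xs = [0.4]
for _ in range(500):
    xs.append(true_step(xs[-1]))

# "Machine learning" component: fit the model's systematic error from data.
# The residual is (R_TRUE - R_MODEL) * x * (1 - x), so least squares on the
# feature x * (1 - x) recovers the missing physics.
num = sum((true_step(x) - model_step(x)) * x * (1.0 - x) for x in xs[:-1])
den = sum((x * (1.0 - x)) ** 2 for x in xs[:-1])
w = num / den

def hybrid_step(x):
    """Knowledge-based prediction plus the learned data-driven correction."""
    return model_step(x) + w * x * (1.0 - x)
```

The learned weight `w` recovers the missing 0.1 in the growth parameter, so the hybrid one-step forecast matches the true dynamics where the uncorrected model does not; the paper's reservoir-computing component plays this corrective role for systems where the model error has no such simple form.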

  17. Generative embedding for model-based classification of fMRI data.

    Directory of Open Access Journals (Sweden)

    Kay H Brodersen

    2011-06-01

    Full Text Available Decoding models, such as those underlying multivariate classification algorithms, have been increasingly used to infer cognitive or clinical brain states from measures of brain activity obtained by functional magnetic resonance imaging (fMRI). The practicality of current classifiers, however, is restricted by two major challenges. First, due to the high data dimensionality and low sample size, algorithms struggle to separate informative from uninformative features, resulting in poor generalization performance. Second, popular discriminative methods such as support vector machines (SVMs) rarely afford mechanistic interpretability. In this paper, we address these issues by proposing a novel generative-embedding approach that incorporates neurobiologically interpretable generative models into discriminative classifiers. Our approach extends previous work on trial-by-trial classification for electrophysiological recordings to subject-by-subject classification for fMRI and offers two key advantages over conventional methods: it may provide more accurate predictions by exploiting discriminative information encoded in 'hidden' physiological quantities such as synaptic connection strengths; and it affords mechanistic interpretability of clinical classifications. Here, we introduce generative embedding for fMRI using a combination of dynamic causal models (DCMs) and SVMs. We propose a general procedure of DCM-based generative embedding for subject-wise classification, provide a concrete implementation, and suggest good-practice guidelines for unbiased application of generative embedding in the context of fMRI. We illustrate the utility of our approach with a clinical example in which we classify moderately aphasic patients and healthy controls using a DCM of thalamo-temporal regions during speech processing. Generative embedding achieves a near-perfect balanced classification accuracy of 98% and significantly outperforms conventional activation-based and correlation-based methods.
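A minimal sketch of the generative-embedding idea, assuming a toy one-dimensional "generative model" (mean/variance fitting in place of subject-wise DCM inversion) and a nearest-centroid rule in place of the SVM; every number and label below is hypothetical:

```python
def embed(series):
    """Summarize a subject's time series by fitted model parameters.
    Here the 'generative model' is just mean/variance fitting, a stand-in
    for inverting a dynamic causal model per subject."""
    n = len(series)
    mu = sum(series) / n
    var = sum((x - mu) ** 2 for x in series) / n
    return (mu, var)

def classify(series, centroids):
    """Classify in the generative parameter space (nearest centroid)."""
    mu, var = embed(series)
    return min(centroids, key=lambda label:
               (centroids[label][0] - mu) ** 2 + (centroids[label][1] - var) ** 2)
```

The point of the construction is that the classifier operates on neurobiologically meaningful parameters rather than raw voxel features, which is what gives the approach its mechanistic interpretability.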

  18. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Directory of Open Access Journals (Sweden)

    Paul D Mathewson

    Full Text Available In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.

  19. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    Science.gov (United States)

    Mathewson, Paul D; Porter, Warren P

    2013-01-01

    In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.
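The energetic bookkeeping behind such fasting predictions can be illustrated with a back-of-the-envelope calculation. The metabolic rate and fat energy density below are hypothetical placeholders, not Niche Mapper values (which come out of a full steady-state heat balance in the animal's microclimate):

```python
def fast_weight_loss(met_rate_w, fast_days, fat_energy_mj_per_kg=39.5):
    """Mass (kg) of fat needed to fuel a constant metabolic rate (W) over a
    fast, assuming the fast is fueled entirely by fat stores."""
    energy_mj = met_rate_w * 86400.0 * fast_days / 1e6  # W x s/day x days -> MJ
    return energy_mj / fat_energy_mj_per_kg

# Hypothetical 100 W metabolic rate over the two fast lengths in the abstract.
loss_120 = fast_weight_loss(100.0, 120)
loss_180 = fast_weight_loss(100.0, 180)
```

Because energy demand scales linearly with fast duration at a fixed metabolic rate, extending the fast from 120 to 180 days raises the required reserves by 50%, which is why the longer fast pushes poorer-condition bears past their stores in the model, even before any extra thermoregulatory cost is added.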

  20. A metabonomic approach for mechanistic exploration of pre-clinical toxicology.

    Science.gov (United States)

    Coen, Muireann

    2010-12-30

    Metabonomics involves the application of advanced analytical tools to profile the diverse metabolic complement of a given biofluid or tissue. Subsequent statistical modelling of the complex multivariate spectral profiles enables discrimination between phenotypes of interest and identifies panels of discriminatory metabolites that represent candidate biomarkers. This review article presents an overview of recent developments in the field of metabonomics with a focus on application to pre-clinical toxicology studies. Recent research investigations carried out as part of the international COMET 2 consortium project on the hepatotoxic action of the aminosugar galactosamine (galN) are presented. The application of advanced, high-field NMR spectroscopy is demonstrated, together with the complementary application of a targeted mass spectrometry platform coupled with ultra-performance liquid chromatography. Much novel mechanistic information has been gleaned both on the mechanism of galN hepatotoxicity in multiple biofluids and tissues and on the protection afforded by co-administration of glycine and uridine. The simultaneous identification in spectral profiles of both the metabolic fate of galN and its associated endogenous consequences is demonstrated. Furthermore, metabonomic assessment of inter-animal variability in response to galN provides enhanced mechanistic insight into variable response phenotypes and is relevant to understanding wider aspects of individual variability in drug response. This exemplar highlights the analytical and statistical tools commonly applied in metabonomic studies and, notably, the approach is applicable to the study of any toxin, drug or intervention of interest. The metabonomic approach holds considerable promise and potential to significantly advance our understanding of the mechanistic bases for adverse drug reactions. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  1. Biomeasures and mechanistic modeling highlight PK/PD risks for a monoclonal antibody targeting Fn14 in kidney disease.

    Science.gov (United States)

    Chen, Xiaoying; Farrokhi, Vahid; Singh, Pratap; Ocana, Mireia Fernandez; Patel, Jenil; Lin, Lih-Ling; Neubert, Hendrik; Brodfuehrer, Joanne

    2018-01-01

    Discovery of the upregulation of fibroblast growth factor-inducible-14 (Fn14) receptor following tissue injury has prompted investigation into biotherapeutic targeting of the Fn14 receptor for the treatment of conditions such as chronic kidney diseases. In the development of monoclonal antibody (mAb) therapeutics, there is an increasing trend to use biomeasures combined with mechanistic pharmacokinetic/pharmacodynamic (PK/PD) modeling to enable decision making in early discovery. With the aim of guiding preclinical efforts on designing an antibody with optimized properties, we developed a mechanistic site-of-action (SoA) PK/PD model for human application. This model incorporates experimental biomeasures, including the concentration of soluble Fn14 (sFn14) in human plasma and of membrane Fn14 (mFn14) in human kidney tissue, and the turnover rate of human sFn14. Pulse-chase studies using stable isotope-labeled amino acids and mass spectrometry indicated the sFn14 half-life to be approximately 5 hours in healthy volunteers. The biomeasures (concentration, turnover) of sFn14 in plasma reveal a significant hurdle in designing an antibody against Fn14 with the desired characteristics. The projected dose (>1 mg/kg/wk for 90% target coverage) derived from the human PK/PD model revealed potentially high and frequent dosing requirements under certain conditions. The PK/PD model suggested a unique bell-shaped relationship between target coverage and antibody affinity for anti-Fn14 mAb, which could be applied to direct the antibody engineering towards an optimized affinity. This investigation highlighted potential applications, including assessment of PK/PD risks during early target validation, human dose prediction and drug candidate optimization.

  2. DOUBLE-SHELL TANK (DST) HYDROXIDE DEPLETION MODEL FOR CARBON DIOXIDE ABSORPTION

    International Nuclear Information System (INIS)

    OGDEN DM; KIRCH NW

    2007-01-01

    This report develops a supernatant hydroxide ion depletion model for carbon dioxide absorption based on mechanistic principles. It benchmarks the model against historical tank supernatant hydroxide data and vapor-space carbon dioxide data, and compares the newly developed mechanistic model with previously applied empirical hydroxide depletion equations.
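The flavor of such a mechanistic depletion balance can be sketched as a simple rate equation. The stoichiometry (two moles of OH⁻ consumed per mole of CO2 absorbed, CO2 + 2 OH⁻ → CO3²⁻ + H2O) is standard carbonation chemistry, but the rate law, units, and every numerical value below are hypothetical and not taken from the report:

```python
def deplete_oh(oh0, k_abs, p_co2, area, volume, years, dt=0.01):
    """Euler integration of a zero-order depletion balance (assumed form):
    d[OH-]/dt = -2 * k_abs * p_co2 * area / volume, clipped at zero."""
    rate = 2.0 * k_abs * p_co2 * area / volume  # mol/L per year (assumed units)
    oh = oh0
    for _ in range(int(round(years / dt))):
        oh = max(0.0, oh - rate * dt)
    return oh
```

A real tank model would let the absorption rate respond to the falling hydroxide concentration and the vapor-space CO2 partial pressure, but even this constant-rate sketch shows how surface area, waste volume, and CO2 availability combine into a depletion timescale.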

  3. Assessing the ability of mechanistic volatilization models to simulate soil surface conditions: a study with the Volt'Air model.

    Science.gov (United States)

    Garcia, L; Bedos, C; Génermont, S; Braud, I; Cellier, P

    2011-09-01

    Ammonia and pesticide volatilization in the field is a surface phenomenon involving physical and chemical processes that depend on the soil surface temperature and water content. The water transfer, heat transfer and energy budget sub-models of volatilization models are adapted from the most commonly accepted formalisms and parameterizations. They are less detailed than dedicated models describing water and heat transfers and surface status. The aim of this work was to assess the ability of one of the available mechanistic volatilization models, Volt'Air, to accurately describe the pedo-climatic conditions of a soil surface at the required time and space resolution. The assessment involves: (i) a sensitivity analysis, (ii) an evaluation of Volt'Air outputs in the light of outputs from a reference Soil-Vegetation-Atmosphere Transfer model (SiSPAT) and three experimental datasets, and (iii) the study of three tests based on modifications of SiSPAT to establish the potential impact of the simplifying assumptions used in Volt'Air. The analysis confirmed that a 5 mm surface layer was well suited, and that Volt'Air surface temperature correlated well with the experimental measurements as well as with SiSPAT outputs. In terms of liquid water transfers, Volt'Air was overall consistent with SiSPAT, with discrepancies only during major rainfall events and dry weather conditions. The tests enabled us to identify the main source of the discrepancies between Volt'Air and SiSPAT: the lack of a description of gaseous water transfer in Volt'Air. They also helped to explain why neither Volt'Air nor SiSPAT was able to represent lower values of surface water content: current classical water retention and hydraulic conductivity models are not yet adapted to very dry conditions. Given the outcomes of this study, we discuss to what extent volatilization models can be improved and the questions they pose for current research in water transfer modeling and parameterization.

  4. In Silico Oncology: Quantification of the In Vivo Antitumor Efficacy of Cisplatin-Based Doublet Therapy in Non-Small Cell Lung Cancer (NSCLC) through a Multiscale Mechanistic Model.

    Directory of Open Access Journals (Sweden)

    Eleni Kolokotroni

    2016-09-01

    Full Text Available The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant of reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied to the estimation of a plausible value range for the cell-killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of the tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has been recruited to compensate for patient-specific uncertainties. Concluding, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of

  5. Optimal chemotherapy for leukemia: a model-based strategy for individualized treatment.

    Directory of Open Access Journals (Sweden)

    Devaraj Jayachandran

    Full Text Available Acute lymphoblastic leukemia, commonly known as ALL, is a predominant form of cancer during childhood. With the advent of modern healthcare support, the 5-year survival rate has been impressive in the recent past. However, long-term ALL survivors battle several treatment-related medical and socio-economic complications due to the excessive and inordinate chemotherapy doses received during treatment. In this work, we present a model-based approach to personalize 6-mercaptopurine (6-MP) treatment for childhood ALL, with a provision for incorporating pharmacogenomic variations among patients. Semi-mechanistic mathematical models were developed and validated for (i) 6-MP metabolism, (ii) red blood cell mean corpuscular volume (MCV) dynamics, a surrogate marker for treatment efficacy, and (iii) leukopenia, a major side effect. Given the constraint of limited data from clinics, a global-sensitivity-analysis-based model reduction technique was employed to reduce the parameter space arising from the semi-mechanistic models. The reduced, sensitive parameters were used to individualize the average patient model to a specific patient so as to minimize model uncertainty. The models fit the data well and mimic the diverse behavior observed among patients with a minimum number of parameters. The model was validated with real patient data obtained from the literature and from Riley Hospital for Children in Indianapolis. Patient models were used to optimize the dose for an individual patient through nonlinear model predictive control. The implementation of our approach in clinical practice is realizable with routinely measured complete blood counts (CBC) and a few additional metabolite measurements. The proposed approach promises to achieve model-based individualized treatment for a specific patient, as opposed to a standard dose for all, and to prescribe an optimal dose for a desired outcome with minimum side effects.
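The receding-horizon idea behind nonlinear model predictive control can be reduced to a one-step sketch: at each step, the controller picks the dose whose model-predicted marker lands closest to the target. The response model below is an invented linear surrogate, not the paper's semi-mechanistic 6-MP model:

```python
def nmpc_step(state, target, doses, predict):
    """Pick the dose minimizing the predicted deviation from target.
    `predict(state, dose)` is a one-step response model."""
    return min(doses, key=lambda d: abs(predict(state, d) - target))

# Hypothetical one-step response: the marker decays by 10% per step and
# rises by 0.5 units per dose unit.
predict = lambda s, d: 0.9 * s + 0.5 * d
```

In a full NMPC loop this selection would be repeated at every measurement time with the patient-individualized model as `predict`, re-optimizing over a prediction horizon rather than a single step.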

  6. Respiratory cancer risks associated with low-level nickel exposure: an integrated assessment based on animal, epidemiological, and mechanistic data.

    Science.gov (United States)

    Seilkop, Steven K; Oller, Adriana R

    2003-04-01

    Increased lung and nasal cancer risks have been reported in several cohorts of nickel refinery workers, but in more than 90% of the nickel-exposed workers who have been studied there is little, if any, evidence of excess risk. This investigation utilizes human exposure measurements, animal data from cancer bioassays of three nickel compounds, and a mechanistic theory of nickel carcinogenesis to reconcile the disparities in lung cancer risk among nickel-exposed workers. Animal data and mechanistic theory suggest that the apparent absence of risk in workers with low nickel exposures is due to threshold-like responses in lung tumor incidence (oxidic nickel), tumor promotion (soluble nickel), and genetic damage (sulfidic nickel). When animal-based lung cancer dose-response functions for these compounds are extrapolated to humans, taking into account interspecies differences in deposition and clearance, differences in particle size distributions, and human work activity patterns, the predicted risks at occupational exposures are remarkably similar to those observed in nickel-exposed workers. This provides support for using the animal-based dose-response functions to estimate occupational exposure limits, which are found to be comparable to those in current use.

  7. Application of response surface methodology and semi-mechanistic model to optimize fluoride removal using crushed concrete in a fixed-bed column.

    Science.gov (United States)

    Gu, Bon-Wun; Lee, Chang-Gu; Park, Seong-Jik

    2018-03-01

    The aim of this study was to investigate the removal of fluoride from aqueous solutions using crushed concrete fines as a filter medium under varying conditions of pH 3-7, flow rate of 0.3-0.7 mL/min, and filter depth of 10-20 cm. The performance of the fixed-bed columns was evaluated on the basis of the removal ratio (Re), uptake capacity (qe), degree of sorbent used (DoSU), and sorbent usage rate (SUR) obtained from breakthrough curves (BTCs). Three widely used semi-mechanistic models, namely the Bohart-Adams, Thomas, and Yoon-Nelson models, were applied to simulate the BTCs and to derive the design parameters. The Box-Behnken design of response surface methodology (RSM) was used to elucidate the individual and interactive effects of the three operational parameters on column performance and to optimize these parameters. The results demonstrated that pH is the most important factor in the performance of fluoride removal by a fixed-bed column. The flow rate had a significant negative influence on Re and DoSU, and the effect of filter depth was observed only in the regression model for DoSU. Statistical analysis indicated that the model obtained from the RSM study is suitable for describing the semi-mechanistic model parameters.
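Of the three semi-mechanistic models named in the abstract, the Thomas model has a compact closed form for the breakthrough curve, C_t/C_0 = 1 / (1 + exp(k_T·q_0·m/Q − k_T·C_0·t)). The implementation below uses that standard form; the parameter values in the usage note are illustrative, not the study's fitted ones:

```python
import math

def thomas_ratio(t, k_t, q0, m, c0, q_flow):
    """Thomas-model effluent/influent concentration ratio at service time t.
    k_t: rate constant, q0: uptake capacity, m: sorbent mass,
    c0: influent concentration, q_flow: volumetric flow rate."""
    return 1.0 / (1.0 + math.exp(k_t * q0 * m / q_flow - k_t * c0 * t))
```

The sigmoid crosses 50% breakthrough at t = q0·m/(Q·C0), so for illustrative values k_t = 0.01, q0 = 2, m = 50, C0 = 0.01, Q = 0.5 the half-breakthrough time is 20000 time units, with the ratio rising monotonically from near 0 toward 1.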

  8. New Methods for Modeling Transport of Water and Solutes in Soils

    DEFF Research Database (Denmark)

    Møldrup, Per

    Recent models for water and solute transport in unsaturated soils have been mechanistically based but numerically very involved. This dissertation concerns the development of mechanistically based but numerically simple models for calculating and analyzing the transport of water and solutes in soil.

  9. A subchannel based annular flow dryout model

    International Nuclear Information System (INIS)

    Hammouda, Najmeddine; Cheng, Zhong; Rao, Yanfei F.

    2016-01-01

    Highlights: • A modified annular flow dryout model for subchannel thermalhydraulic analysis. • Implementation of the model in the Canadian subchannel code ASSERT-PV. • Assessment of the model against tube CHF experiments. • Assessment of the model against CANDU-bundle CHF experiments. - Abstract: This paper assesses a popular tube-based mechanistic critical heat flux model, Hewitt and Govan's annular flow model (based on the model of Whalley et al.), and modifies and implements it for bundle geometries. It describes the results of ASSERT subchannel code predictions using the modified model, as applied to a single tube and to the 28-element, 37-element and 43-element (CANFLEX) CANDU bundles. A quantitative comparison between the model predictions and experimental data indicates good agreement over a wide range of flow conditions. The comparison resulted in an overall average error of −0.15% and an overall root-mean-square error of 5.46% against tube data representing annular film dryout type critical heat flux, and in an overall average error of −0.9% and an overall root-mean-square error of 9.9% against Stern Laboratories' CANDU-bundle data.

  10. Comparative approaches from empirical to mechanistic simulation modelling in Land Evaluation studies

    Science.gov (United States)

    Manna, P.; Basile, A.; Bonfante, A.; Terribile, F.

    2009-04-01

    Land Evaluation (LE) comprises the procedures used to assess the aptitude of land for a generic or specific use (e.g. biomass production). From local to regional and national scales, land use planning requires a deep knowledge of the processes that drive the functioning of the soil-plant-atmosphere system. In the classical approaches, the assessment of aptitude is the result of a qualitative comparison between the land/soil physical properties and the land use requirements. These approaches are quick and inexpensive to apply; however, they are based on empirical and qualitative models whose knowledge structure is built for a specific landscape and a specific object of evaluation (e.g. a crop). The outcome is great difficulty in spatially extrapolating LE results and rigidity of the system. Modern techniques instead rely on mechanistic and quantitative simulation modelling, which allows a dynamic characterisation of the interrelated physical and chemical processes taking place in the soil landscape. Moreover, inserting physically based rules into the LE procedure may make it easier both to extend the results spatially and to change the object of the evaluation (e.g. crop species, nitrate dynamics, etc.). On the other hand, these modern approaches require input data of high quality and quantity, which causes a significant increase in costs. In this scenario, the LE expert must choose the best LE methodology considering costs, complexity of the procedure, and the benefits in handling a specific land evaluation. In this work we performed a forage maize land suitability study comparing 9 different methods of increasing complexity and cost. The study area, of about 2000 ha, is located in northern Italy in the Lodi plain (Po valley). The 9 methods employed ranged from standard LE approaches to

  11. ReactionPredictor: prediction of complex chemical reactions at the mechanistic level using machine learning.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-10-22

    Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields, verified by inclusion in the literature or textbooks, is derived from an existing rule-based system and expanded upon by manual curation from graduate-level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor, using constrained tree search to discover a set of reasonable mechanistic steps from given reactants to given products. Webserver implementations of both the single-step and pathway versions of ReactionPredictor are available.
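The two-stage filter-then-rank architecture described above can be sketched generically; the scoring functions here are toy placeholders for the paper's MO-level filtering and ranking models:

```python
def two_stage_predict(candidates, filter_score, rank_score, threshold):
    """Stage 1: cheaply prune candidates below a plausibility threshold.
    Stage 2: rank the survivors so productive candidates come out on top."""
    survivors = [c for c in candidates if filter_score(c) >= threshold]
    return sorted(survivors, key=rank_score, reverse=True)
```

The design rationale is cost: the filter model is inexpensive and applied to every candidate reaction, while the more expensive ranking model only sees the small filtered subset.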

  12. Assessing the Role of Climate Variability on Liver Fluke Risk in the UK Through Mechanistic Hydro-Epidemiological Modelling

    Science.gov (United States)

    Beltrame, L.; Dunne, T.; Rose, H.; Walker, J.; Morgan, E.; Vickerman, P.; Wagener, T.

    2016-12-01

    Liver fluke is a flatworm parasite infecting grazing animals worldwide. In the UK, it causes considerable production losses to the cattle and sheep industries and costs farmers millions of pounds each year due to reduced growth rates and lower milk yields. A large part of the parasite's life-cycle takes place outside the host, with its survival and development strongly controlled by climatic and hydrologic conditions. Evidence of climate-driven changes in the distribution and seasonality of fluke disease already exists, as the infection is increasingly expanding to new areas and becoming a year-round problem. It is therefore crucial to assess current and potential future impacts of climate variability on the disease, to guide interventions at the farm scale and mitigate risk. Climate-based fluke risk models have been available since the 1950s; however, they are based on empirical relationships derived from historical climate and incidence data, and thus are unlikely to be robust for simulating risk under changing conditions. Moreover, they are not dynamic, but estimate risk over large regions in the UK based on monthly average climate conditions, so they cannot be used to investigate the effects of climate variability in support of farmers' decisions. In this study, we introduce a mechanistic model for fluke, which represents habitat suitability for disease development at 25 m resolution with a daily time step, explicitly linking the parasite life-cycle to key hydro-climate conditions. The model is applied to a case study in the UK, and sensitivity analysis is performed to better understand the role of climate variability on the space-time dynamics of the disease, while explicitly accounting for uncertainties. Comparisons are presented with expert knowledge and a widely used empirical model.
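As a minimal illustration of the kind of temperature control such a mechanistic model encodes, the development of a parasite's free-living stages is often tracked with a degree-day scheme; the base temperature and degree-day requirement below are arbitrary illustrative values, not the model's calibrated parameters.

```python
def days_to_complete(daily_temps, base_temp=10.0, required_degree_days=250.0):
    """Return the day on which cumulative degree-days above `base_temp`
    reach the development requirement, or None if it is never reached."""
    total = 0.0
    for day, temp in enumerate(daily_temps, start=1):
        total += max(0.0, temp - base_temp)  # only warm days contribute
        if total >= required_degree_days:
            return day
    return None

warm = [18.0] * 120   # 8 degree-days accumulated per day
cool = [14.0] * 120   # 4 degree-days accumulated per day
print(days_to_complete(warm), days_to_complete(cool))  # warm site completes sooner
```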

  13. Polymerization kinetics of wheat gluten upon thermosetting. A mechanistic model.

    Science.gov (United States)

    Domenek, Sandra; Morel, Marie-Hélène; Bonicel, Joëlle; Guilbert, Stéphane

    2002-10-09

    Size exclusion high-performance liquid chromatography analysis was carried out on wheat gluten-glycerol blends subjected to different heat treatments. The elution profiles were analyzed in order to follow the solubility loss of protein fractions of specific molecular size. Drawing on the known biochemical changes involved in the heat denaturation of gluten, a mechanistic mathematical model was developed, which divided the protein denaturation into two distinct reaction steps: (i) reversible change in protein conformation and (ii) protein precipitation through disulfide bonding between initially SDS-soluble and SDS-insoluble reaction partners. Activation energies of gluten unfolding, refolding, and precipitation were calculated with the Arrhenius law as 53.9 kJ·mol-1, 29.5 kJ·mol-1, and 172 kJ·mol-1, respectively. The rate of protein solubility loss decreased as the cross-linking reaction proceeded, which may be attributed to the formation of a three-dimensional network progressively hindering the reaction. The enhanced susceptibility of large molecules to aggregation was attributed to an increased reaction probability due to their higher number of cysteine residues, and to the increased percentage of unfolded, and thereby activated, proteins, as complete protein refolding seemed to be an anticooperative process.
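The reported activation energies set the temperature sensitivity of each step through the Arrhenius law, k = A exp(-Ea/RT). A short sketch (with an arbitrary pre-exponential factor, which cancels in the ratio) shows how much more strongly the 172 kJ/mol precipitation step responds to a 10 °C change than the 53.9 kJ/mol unfolding step:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius(ea_kj_per_mol, temp_k, prefactor=1.0):
    """Rate constant from the Arrhenius law; the prefactor is arbitrary here."""
    return prefactor * math.exp(-ea_kj_per_mol * 1e3 / (R * temp_k))

# Ratio of rate constants between 90 degC and 80 degC for two of the
# reported activation energies (unfolding 53.9, precipitation 172 kJ/mol).
for ea in (53.9, 172.0):
    ratio = arrhenius(ea, 363.15) / arrhenius(ea, 353.15)
    print(f"Ea = {ea} kJ/mol: k(90C)/k(80C) = {ratio:.2f}")
```

The high-Ea precipitation step roughly quintuples over that interval, while the low-Ea unfolding step changes by only about 65%.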

  14. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    Science.gov (United States)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas in particular are highly uncertain, because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks roughly twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.

  15. Thermal tides and studies to tune the mechanistic tidal model using UARS observations

    Directory of Open Access Journals (Sweden)

    V. A. Yudin

    1997-09-01

    Full Text Available Monthly simulations of the thermal diurnal and semidiurnal tides are compared to High-Resolution Doppler Imager (HRDI) and Wind Imaging Interferometer (WINDII) wind and temperature measurements on the Upper Atmosphere Research Satellite (UARS). There is encouraging agreement between the observations and the linear global mechanistic tidal model results for both the diurnal and semidiurnal components in the equatorial and mid-latitude regions. This gives us the confidence to outline the first steps of an assimilative analysis/interpretation for tides, dissipation, and mean flow using a combination of model results and the global measurements from HRDI and WINDII. The sensitivity of the proposed technique to the initial guess employed to obtain a best fit to the data by tuning model parameters is discussed for the January and March 1993 cases, when the WINDII day and night measurements of the meridional winds between 90 and 110 km are used along with the daytime HRDI measurements. Several examples of the derivation of the tidal variables and the decomposition of the measured winds into tidal and mean flow components using this approach are compared with previous tidal estimates and modeling results for the migrating tides. The seasonal cycle of the derived diurnal tidal amplitudes is discussed and compared with radar observations between 80 and 100 km and 40°S and 40°N.
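Decomposing measured winds into mean flow plus diurnal and semidiurnal components, as described above, amounts to a harmonic least-squares fit; for evenly sampled data covering whole cycles this reduces to simple projections onto cosine and sine. The sketch below is a generic illustration with synthetic data, not the paper's assimilation scheme.

```python
import math

def harmonic_amplitude(series, period, dt):
    """Least-squares amplitude of a sinusoid of the given period in an
    evenly sampled series covering whole cycles (projection onto the
    cosine/sine basis)."""
    n = len(series)
    a = sum(x * math.cos(2 * math.pi * i * dt / period) for i, x in enumerate(series))
    b = sum(x * math.sin(2 * math.pi * i * dt / period) for i, x in enumerate(series))
    return math.hypot(2 * a / n, 2 * b / n)

dt = 1.0                          # hourly samples
t = [i * dt for i in range(240)]  # ten days of data
# Synthetic wind: mean flow + 20 m/s diurnal + 8 m/s semidiurnal tide.
wind = [5.0 + 20.0 * math.cos(2 * math.pi * ti / 24.0 + 0.4)
        + 8.0 * math.cos(2 * math.pi * ti / 12.0 - 1.1) for ti in t]

print(harmonic_amplitude(wind, 24.0, dt))  # ~20 (diurnal)
print(harmonic_amplitude(wind, 12.0, dt))  # ~8 (semidiurnal)
```

Because the 24 h and 12 h harmonics are orthogonal to each other and to the mean over whole cycles, the projections recover the amplitudes exactly in this idealized case.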

  17. Opening the black box—Development, testing and documentation of a mechanistically rich agent-based model

    DEFF Research Database (Denmark)

    Topping, Chris J.; Høye, Toke; Olesen, Carsten Riis

    2010-01-01

    Although increasingly widely used in biology, complex adaptive simulation models such as agent-based models have been criticised for being difficult to communicate and test. This study demonstrates the application of pattern-oriented model testing, and a novel documentation procedure to present...... accessible description of the processes included in the model. Application of the model to a comprehensive historical data set supported the hypothesis that interference competition is the primary population regulating factor in the absence of mammal predators in the brown hare, and that the effect works...

  18. Hierarchical modeling of activation mechanisms in the ABL and EGFR kinase domains: thermodynamic and mechanistic catalysts of kinase activation by cancer mutations.

    Directory of Open Access Journals (Sweden)

    Anshuman Dixit

    2009-08-01

    Full Text Available Structural and functional studies of the ABL and EGFR kinase domains have recently suggested a common mechanism of activation by cancer-causing mutations. However, dynamic and mechanistic aspects of kinase activation by cancer mutations that stimulate conformational transitions and thermodynamic stabilization of the constitutively active kinase form remain elusive. We present a large-scale computational investigation of activation mechanisms in the ABL and EGFR kinase domains by a panel of clinically important cancer mutants ABL-T315I, ABL-L387M, EGFR-T790M, and EGFR-L858R. We have also simulated the activating effect of the gatekeeper mutation on conformational dynamics and allosteric interactions in functional states of the ABL-SH2-SH3 regulatory complexes. A comprehensive analysis was conducted using a hierarchy of computational approaches that included homology modeling, molecular dynamics simulations, protein stability analysis, targeted molecular dynamics, and molecular docking. Collectively, the results of this study have revealed thermodynamic and mechanistic catalysts of kinase activation by major cancer-causing mutations in the ABL and EGFR kinase domains. By using multiple crystallographic states of ABL and EGFR, computer simulations have allowed us to map the dynamics of conformational fluctuations and transitions in the normal (wild-type) and oncogenic kinase forms. A proposed multi-stage mechanistic model of activation involves a series of cooperative transitions between different conformational states, including assembly of the hydrophobic spine, formation of the Src-like intermediate structure, and cooperative breakage and formation of characteristic salt bridges, which signify the transition to the active kinase form. We suggest that molecular mechanisms of activation by cancer mutations could mimic the activation process of the normal kinase, yet exploit conserved structural catalysts to accelerate a conformational transition

  19. A Mechanistic Model of Onset of Flow Instability Due to Mergence of Bubble Layers in a Vertical Narrow Rectangular Channel

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Juh Yung; Chang, Soon Heung; Jeong, Yong [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    The onset of flow instability (OFI) is one of the important boiling phenomena, since it may induce premature critical heat flux (CHF) at the lowest heat flux level due to a sudden flow excursion in a single channel of a multichannel configuration. Prediction of OFI for narrow rectangular channels is especially crucial for the thermal-hydraulic design and safety analysis of open pool-type research reactors (RRs) using plate-type fuels. Based on a high-speed video (HSV) technique, the authors observed and determined that OFI and the minimum premature CHF in a narrow rectangular channel are induced by abrupt pressure-drop fluctuations caused by the merging of facing bubble boundary layers (BLs) on opposite boiling surfaces. In this study, a new mechanistic OFI model for narrow rectangular channels heated on both sides has been derived that is consistent with the actual triggering phenomena. A force-balance approach was used to model the maximum bubble layer thickness (BLT), since this quantity is comparable to the bubble departure diameter. Validation against an OFI database showed that the new model predicts the OFI heat flux fairly well over a wide range of conditions.
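The force-balance idea, equating buoyancy against surface tension on a growing bubble, is the same one behind the classical Fritz correlation for bubble departure diameter. The sketch below uses that well-known correlation as a stand-in for the paper's BLT model; the contact angle and fluid properties are illustrative values for saturated water at atmospheric pressure.

```python
import math

def fritz_departure_diameter(contact_angle_deg, sigma, rho_liquid, rho_vapor, g=9.81):
    """Classical Fritz correlation from a buoyancy/surface-tension balance:
    d_d = 0.0208 * theta * sqrt(sigma / (g * (rho_l - rho_v))), theta in degrees."""
    return 0.0208 * contact_angle_deg * math.sqrt(sigma / (g * (rho_liquid - rho_vapor)))

# Saturated water at atmospheric pressure (approximate properties).
d = fritz_departure_diameter(contact_angle_deg=45.0, sigma=0.0589,
                             rho_liquid=958.0, rho_vapor=0.60)
print(f"bubble departure diameter = {d * 1000:.2f} mm")
```

The result, a few millimetres, is the length scale against which a bubble-layer thickness of half the channel gap would be compared in a narrow rectangular channel.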

  20. Malaria's Missing Number: Calculating the Human Component of R0 by a Within-Host Mechanistic Model of Plasmodium falciparum Infection and Transmission

    OpenAIRE

    Johnston, Geoffrey L.; Smith, David L.; Fidock, David A.

    2013-01-01

    Human infection by malarial parasites of the genus Plasmodium begins with the bite of an infected Anopheles mosquito. Current estimates place malaria mortality at over 650,000 individuals each year, mostly in African children. Efforts to reduce disease burden can benefit from the development of mathematical models of disease transmission. To date, however, comprehensive modeling of the parameters defining human infectivity to mosquitoes has remained elusive. Here, we describe a mechanistic wi...
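For context, the classical Ross-Macdonald expression for malaria's R0 makes the "human component" explicit; the human-to-mosquito infectivity c and the human recovery rate r are exactly the quantities a within-host mechanistic model can supply (standard textbook notation, not necessarily this paper's):

```latex
R_0 = \frac{m\,a^{2}\,b\,c\,e^{-\mu n}}{r\,\mu}
```

where m is the mosquito density per human, a the mosquito biting rate, b and c the mosquito-to-human and human-to-mosquito transmission probabilities, μ the mosquito mortality rate, n the extrinsic incubation period, and r the human recovery rate.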

  1. Fractal growth of tumors and other cellular populations: Linking the mechanistic to the phenomenological modeling and vice versa

    International Nuclear Information System (INIS)

    D'Onofrio, Alberto

    2009-01-01

    In this paper we study and extend the mechanistic mean-field theory of growth of cellular populations proposed by Mombach et al. [Mombach JCM, Lemke N, Bodmann BEJ, Idiart MAP. A mean-field theory of cellular growth. Europhys Lett 2002;59:923-928] (MLBI model), and we demonstrate that the original model and our generalizations lead to inferences of biological interest. In the first part of this paper, we show that the model under study is widely general, since it admits, as particular cases, the main phenomenological models of cellular growth. In the second part of this work, we generalize the MLBI model to a wider family of models by allowing the cells to have a generic, unspecified, biologically plausible interaction. We then derive a relationship between this generic microscopic interaction function and the growth rate of the corresponding macroscopic model. Finally, we propose using this relationship to help assess the biological plausibility of phenomenological models of cancer growth.
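Two of the phenomenological models recovered as particular cases, logistic and Gompertz growth, can be compared with a minimal forward-Euler integration; the parameter values are arbitrary and the integrator is a deliberately naive sketch:

```python
import math

def integrate(growth_rate, n0, t_end, dt=0.01):
    """Forward-Euler integration of dN/dt = growth_rate(N)."""
    n = n0
    for _ in range(int(t_end / dt)):
        n += growth_rate(n) * dt
    return n

K, r = 1e9, 1.0  # carrying capacity and rate constant (arbitrary)

def logistic(n):
    return r * n * (1.0 - n / K)          # dN/dt = r N (1 - N/K)

def gompertz(n):
    return r * n * math.log(K / n)        # dN/dt = r N ln(K/N)

for name, f in (("logistic", logistic), ("gompertz", gompertz)):
    print(name, integrate(f, n0=1e6, t_end=50.0))  # both saturate near K
```

Both macroscopic laws share the sigmoidal shape; in the MLBI framework they correspond to different choices of the underlying cell-cell interaction function.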

  2. Semi-mechanistic Model Applied to the Search for Economically Optimal Conditions and Blending of Gasoline Feedstock for Steam-cracking Process

    Directory of Open Access Journals (Sweden)

    Karaba Adam

    2016-01-01

    Full Text Available Steam-cracking is an energy-intensive, large-scale process that transforms a wide range of hydrocarbon feedstocks into petrochemical products. The dependence of product yields on feedstock composition and reaction conditions has been successfully described by mathematical models, which are very useful tools for optimizing cracker operation. A remaining problem is to formulate an objective function for such an optimization. A quantitative criterion based on process economics is proposed in this paper. A previously developed and verified semi-mechanistic model of industrial steam-cracking is used as a supporting tool for the economic evaluation of selected gasoline feedstocks. The economic criterion is defined as the difference between the value of products obtained by cracking the studied feedstock under given conditions and the value of products obtained by cracking a reference feedstock under reference conditions. As an example of the method's use, optimal reaction conditions were sought for each selected feedstock. The potential benefit of cracking feedstocks individually, or in groups, in contrast to cracking at the middle of the optima, is evaluated and also compared to cracking under usual conditions.
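The proposed economic criterion reduces to a small function: the value of the product slate from the studied feedstock under given conditions minus the value of the slate from the reference feedstock under reference conditions. In the sketch below the yields and prices are invented placeholders; in the actual method the yields would come from the semi-mechanistic cracking model.

```python
def slate_value(yields, prices):
    """Value of one tonne of feed: sum of product yields (mass fractions)
    times product prices (per tonne)."""
    return sum(yields[p] * prices[p] for p in yields)

def economic_criterion(yields_feed, yields_reference, prices):
    """Value difference between the studied feedstock at given conditions
    and the reference feedstock at reference conditions."""
    return slate_value(yields_feed, prices) - slate_value(yields_reference, prices)

prices = {"ethylene": 1000.0, "propylene": 900.0, "fuel_gas": 150.0}  # EUR/t, invented
ref    = {"ethylene": 0.28, "propylene": 0.15, "fuel_gas": 0.57}      # invented yields
feed_a = {"ethylene": 0.31, "propylene": 0.16, "fuel_gas": 0.53}

print(economic_criterion(feed_a, ref, prices))  # positive => feed A beats reference
```

Optimization then amounts to searching the reaction conditions (which change the model-predicted yields) to maximize this difference for each feedstock.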

  3. A cell-based model system links chromothripsis with hyperploidy

    DEFF Research Database (Denmark)

    Mardin, Balca R; Drainas, Alexandros P; Waszak, Sebastian M

    2015-01-01

    A remarkable observation emerging from recent cancer genome analyses is the identification of chromothripsis as a one-off genomic catastrophe, resulting in massive somatic DNA structural rearrangements (SRs). Largely due to lack of suitable model systems, the mechanistic basis of chromothripsis h...... in hyperploid cells. Analysis of primary medulloblastoma cancer genomes verified the link between hyperploidy and chromothripsis in vivo. CAST provides the foundation for mechanistic dissection of complex DNA rearrangement processes....

  4. New Mechanistic Models of Long Term Evolution of Microstructure and Mechanical Properties of Nickel Based Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Kruzic, Jamie J. [Oregon State Univ., Corvallis, OR (United States); Evans, T. Matthew [Oregon State Univ., Corvallis, OR (United States); Greaney, P. Alex [Univ. of California, Riverside, CA (United States)

    2018-05-15

    The report describes the development of a discrete element method (DEM) based modeling approach to quantitatively predict deformation and failure of typical nickel based superalloys. A series of experimental data, including microstructure and mechanical property characterization at 600°C, was collected for a relatively simple, model solid solution Ni-20Cr alloy (Nimonic 75) to determine inputs for the model and provide data for model validation. Nimonic 75 was considered ideal for this study because it is a certified tensile and creep reference material. A series of new DEM modeling approaches were developed to capture the complexity of metal deformation, including cubic elastic anisotropy and plastic deformation both with and without strain hardening. Our model approaches were implemented into a commercially available DEM code, PFC3D, that is commonly used by engineers. It is envisioned that once further developed, this new DEM modeling approach can be adapted to a wide range of engineering applications.

  5. Verification of a mechanistic model for the strain rate of zircaloy-4 fuel sheaths during transient heating

    International Nuclear Information System (INIS)

    Hunt, C.E.L.

    1980-10-01

    A mechanistic strain rate model for Zircaloy-4, named NIRVANA, was tested against experiments where pressurized fuel sheaths were strained during complex temperature-stress-time histories. The same histories were then examined to determine the spread in calculated strain which may be expected because of variations in dimensions, chemical content and mechanical properties which are allowed in the fuel sheath specifications. It was found that the variations allowed by the specifications could result in a probable spread in the predicted strain of plus or minus a factor of two from the mean value. The experimental results were well within this range. (auth)

  6. Mechanistic insights into nanotoxicity determined by synchrotron radiation-based Fourier-transform infrared imaging and multivariate analysis.

    Science.gov (United States)

    Riding, Matthew J; Trevisan, Júlio; Hirschmugl, Carol J; Jones, Kevin C; Semple, Kirk T; Martin, Francis L

    2012-12-01

    Our ability to identify the mechanisms by which carbon-based nanomaterials (CBNs) exert toxicity in cells is constrained by the lack of standardized methodologies to assay endpoint effects. Herein we describe a method of mechanistically identifying the effects of various CBN types in both prokaryotic and eukaryotic cells using multi-beam synchrotron radiation-based Fourier-transform infrared imaging (SR-FTIRI) at diffraction-limited resolution. This technique overcomes many of the inherent difficulties of assaying nanotoxicity and demonstrates exceptional sensitivity in identifying the effects of CBNs in cells at environmentally relevant concentrations. We identify key mechanisms of nanotoxicity as the alteration of amide and lipid biomolecules, but propose that more specific bioactivity of CBNs occurs as a result of interactions between CBN structural conformation and cellular characteristics.

  7. Understanding the influence of biofilm accumulation on the hydraulic properties of soils: a mechanistic approach based on experimental data

    Science.gov (United States)

    Carles Brangarí, Albert; Sanchez-Vila, Xavier; Freixa, Anna; Romaní, Anna M.; Fernàndez-Garcia, Daniel

    2017-04-01

    The distribution, amount, and characteristics of biofilms and their components govern the capacity of soils to let water through, to transport solutes, and the reactions that occur within them. Unraveling the relationship between microbial dynamics and the hydraulic properties of soils is therefore of concern for the management of natural systems and for many technological applications. However, the complexity of both the microbial communities and the geochemical processes they entail means that the phenomenon of bioclogging remains poorly understood. This highlights the need for a better understanding of microbial components such as live and dead bacteria and extracellular polymeric substances (EPS), as well as of their spatial distribution. This work tries to shed some light on these issues, providing experimental data and a new mechanistic model that predicts the variably saturated hydraulic properties of bio-amended soils based on these data. We first present a long-term laboratory infiltration experiment that aims at studying the temporal variation of selected biogeochemical parameters along the infiltration path. The setup consists of a 120-cm-high soil tank instrumented with an array of sensors plus soil and liquid samplers. The sensors continuously measured a wide range of parameters, such as volumetric water content, electrical conductivity, temperature, water pressure, soil suction, dissolved oxygen, and pH. Samples were kept for chemical and biological analyses. Results indicate that: i) biofilm is present at all depths, indicating the potential for deep bioclogging; ii) the redox conditions profile shows different stages, indicating that the community was adapted to changing redox conditions; iii) bacterial activity, richness and diversity also exhibit zonation with depth; and iv) the hydraulic properties of the soil experienced significant changes as biofilm proliferated. Based on experimental evidence, we propose a tool to predict changes in the
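A common way such mechanistic models link biofilm accumulation to hydraulic properties is a power-law reduction of permeability with the pore volume occupied by biomass. The sketch below uses a generic Clement-type exponent of 19/6; treat the relation and all parameter values as illustrative, not the authors' calibrated model.

```python
def relative_permeability(biofilm_fraction, porosity, exponent=19.0 / 6.0):
    """Permeability relative to the clean soil when a volume fraction
    `biofilm_fraction` of the pore space `porosity` is filled by biomass."""
    if not 0.0 <= biofilm_fraction < porosity:
        raise ValueError("biofilm fraction must lie within the pore space")
    return ((porosity - biofilm_fraction) / porosity) ** exponent

porosity = 0.35
for b in (0.0, 0.05, 0.15, 0.25):
    print(f"biofilm volume fraction {b:.2f}: K/K0 = "
          f"{relative_permeability(b, porosity):.3f}")
```

Even modest biomass fractions cut the conductivity sharply, which is why deep biofilm growth, as observed in the experiment, matters for infiltration.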

  8. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    Directory of Open Access Journals (Sweden)

    Max S Y Lau

    2017-10-01

    Full Text Available In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission to a more refined level than was previously possible. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data describing a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging.
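The core mechanistic ingredient of such individual-level transmission models is a spatial kernel: each infectious individual exerts infectious pressure on each susceptible that decays with distance. A minimal sketch with an exponential kernel follows; the rate β and kernel scale φ are invented values, not the paper's estimates.

```python
import math

def infectious_pressure(susceptible, infectious, beta=0.8, phi=5.0):
    """Total rate at which a susceptible at (x, y) is challenged by the
    current infectious individuals, using an exponential distance kernel."""
    return sum(beta * math.exp(-math.dist(susceptible, inf) / phi)
               for inf in infectious)

infectious = [(0.0, 0.0), (10.0, 0.0)]        # locations of infectious cases
near = infectious_pressure((1.0, 0.0), infectious)
far = infectious_pressure((30.0, 0.0), infectious)
print(near, far)  # pressure decays with distance from the infectious cases
```

Inference in such frameworks then amounts to estimating β, φ, and related parameters from the observed infection times and locations.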

  9. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    Science.gov (United States)

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging. PMID:29084216

  10. Comparison of Two Mechanistic Microbial Growth Models to Estimate Shelf Life of Perishable Food Package under Dynamic Temperature Conditions

    Directory of Open Access Journals (Sweden)

    Dong Sun Lee

    2014-01-01

    Full Text Available Two mechanistic microbial growth models (Huang's model and the model of Baranyi and Roberts), given in differential and integrated equation forms, were compared in predicting microbial growth and shelf life under dynamic temperature storage and distribution conditions. Published studies consistently reporting microbial growth data under constant and changing temperature conditions were selected to obtain the primary model parameters, set up the secondary models, and apply them to predict microbial growth and shelf life under fluctuating temperatures. When evaluated by general estimation behavior, bias factor, accuracy factor, and root-mean-square error, Huang's model was comparable to Baranyi and Roberts' model in its capability to estimate microbial growth under dynamic temperature conditions. Its simple form, a single differential equation directly incorporating the growth rate and lag time, may be an advantage for online shelf-life estimation using electronic devices.
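The differential form of the Baranyi and Roberts model mentioned above couples the log cell count y = ln N to an adjustment variable Q that produces the lag phase. A forward-Euler sketch with invented parameter values:

```python
import math

def baranyi_roberts(mu_max, lag, y0, y_max, t_end, dt=0.01):
    """Integrate the Baranyi-Roberts model in differential form:
        dQ/dt = mu_max * Q,    alpha = Q / (1 + Q)
        dy/dt = mu_max * alpha * (1 - exp(y - y_max))
    with Q(0) chosen so the lag time equals `lag` (h0 = mu_max * lag)."""
    q = 1.0 / (math.exp(mu_max * lag) - 1.0)
    y = y0
    t, history = 0.0, []
    while t < t_end:
        alpha = q / (1.0 + q)
        y += mu_max * alpha * (1.0 - math.exp(y - y_max)) * dt
        q += mu_max * q * dt
        t += dt
        history.append((t, y))
    return history

curve = baranyi_roberts(mu_max=0.5, lag=3.0,
                        y0=math.log(1e3), y_max=math.log(1e9), t_end=60.0)
print(curve[-1][1], math.log(1e9))  # final log-count approaches y_max
```

Under dynamic temperatures, mu_max would be updated at each step from a secondary model (e.g., a square-root temperature dependence), which is exactly why the differential form is convenient for online shelf-life estimation.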

  11. Numerical simulation in steam injection wellbores by mechanistic approach; Simulacao numerica do escoamento de vapor em pocos por uma abordagem mecanicista

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, J.C. de; Campos, W.; Lopes, D.; Moura, L.S.S. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Thomas, A. Clecio F. [Universidade Estadual do Ceara (UECE), CE (Brazil)

    2008-07-01

    This work addresses the development of a hydrodynamic and heat transfer mechanistic model for steam flow in injection wellbores. The problem of two-phase steam flow in wellbores has previously been solved using empirical correlations available from the petroleum industry (Lopes, 1986) and the nuclear industry (Moura, 1991). The good performance achieved by the mechanistic models developed by Ansari (1994), Hasan (1995), Gomez (2000) and Kaya (2001) supports the importance of the mechanistic approach for the steam flow problem in injection wellbores. In this study, the methodology used to solve the problem consists of applying a numerical method to the governing equations of steam flow and a marching algorithm to determine the pressure and temperature distributions along the wellbore. A computer code was developed to obtain numerical results, providing a comparative study against the main models found in the literature. Finally, when compared to available field data, the mechanistic model for downward vertical steam flow in wellbores gave better results than the empirical correlations. (author)
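The marching algorithm referred to above steps down the wellbore segment by segment, updating pressure and temperature from the local gradients. A single-phase toy version (invented coefficients, no two-phase steam physics) illustrates the control flow:

```python
def march_wellbore(depth, n_steps, p_wellhead, t_wellhead, t_formation,
                   rho=800.0, g=9.81, friction_grad=200.0, heat_loss_coeff=0.002):
    """Toy marching scheme for downward injection flow: at each segment the
    pressure gains hydrostatic head minus friction, and the fluid loses
    heat toward the formation temperature. All coefficients are invented."""
    dz = depth / n_steps
    p, t = p_wellhead, t_wellhead
    profile = [(0.0, p, t)]
    for i in range(1, n_steps + 1):
        p += (rho * g - friction_grad) * dz           # Pa gained over the segment
        t += -heat_loss_coeff * (t - t_formation) * dz  # cooling toward formation
        profile.append((i * dz, p, t))
    return profile

profile = march_wellbore(depth=1000.0, n_steps=100,
                         p_wellhead=5e6, t_wellhead=300.0, t_formation=60.0)
z, p_bottom, t_bottom = profile[-1]
print(p_bottom, t_bottom)  # pressure rises with depth; fluid cools toward formation
```

The real model replaces the constant gradients with flow-pattern-dependent two-phase closures evaluated at each step, but the marching structure is the same.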

  12. Mechanistic phenotypes: an aggregative phenotyping strategy to identify disease mechanisms using GWAS data.

    Directory of Open Access Journals (Sweden)

    Jonathan D Mosley

    Full Text Available A single mutation can alter cellular and global homeostatic mechanisms and give rise to multiple clinical diseases. We hypothesized that these disease mechanisms could be identified using low minor allele frequency (MAF < 0.1) non-synonymous SNPs (nsSNPs) associated with "mechanistic phenotypes", comprised of collections of related diagnoses. We studied two mechanistic phenotypes: (1) thrombosis, evaluated in a population of 1,655 African Americans; and (2) four groupings of cancer diagnoses, evaluated in 3,009 white European Americans. We tested associations between nsSNPs represented on GWAS platforms and mechanistic phenotypes ascertained from electronic medical records (EMRs), and sought enrichment in functional ontologies across the top-ranked associations. We used a two-step analytic approach whereby nsSNPs were first sorted by the strength of their association with a phenotype. We tested associations using two reverse genetic models and standard additive and recessive models. In the second step, we employed a hypothesis-free ontological enrichment analysis using the sorted nsSNPs to identify functional mechanisms underlying the diagnoses comprising the mechanistic phenotypes. The thrombosis phenotype was solely associated with ontologies related to blood coagulation (Fisher's p = 0.0001, FDR p = 0.03), driven by the F5, P2RY12 and F2RL2 genes. For the cancer phenotypes, the reverse genetic models were enriched in DNA repair functions (p = 2×10-5, FDR p = 0.03) (POLG/FANCI, SLX4/FANCP, XRCC1, BRCA1, FANCA, CHD1L) while the additive model showed enrichment related to chromatid segregation (p = 4×10-6, FDR p = 0.005) (KIF25, PINX1). We were able to replicate nsSNP associations for POLG/FANCI, BRCA1, FANCA and CHD1L in independent data sets. Mechanism-oriented phenotyping using collections of EMR-derived diagnoses can elucidate fundamental disease mechanisms.
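The ontological enrichment step is essentially a one-sided Fisher's exact test on a 2x2 table (annotated vs. not annotated, among top-ranked vs. remaining nsSNPs); the tail probability can be computed directly from the hypergeometric distribution. This is a generic illustration, not the authors' pipeline:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability of observing `a` or more annotated SNPs among the top-ranked
    set, with all margins fixed (hypergeometric upper tail)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# 3 of 4 top-ranked nsSNPs carry the ontology term, vs. 1 of 4 others.
print(fisher_exact_one_sided(3, 1, 1, 3))  # 17/70, about 0.243
```

Repeating such a test over many ontology terms is what makes the subsequent FDR correction (as quoted in the abstract) necessary.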

  13. A Mechanistic Reliability Assessment of RVACS and Metal Fuel Inherent Reactivity Feedbacks

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano; Grelle, Austin

    2017-09-24

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two-year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the failure modes and effects analyses (FMEAs), the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies used to perform the analysis of uncertainty propagation and the determination of the likelihood of violating FDC limits. Additionally, important lessons learned are reviewed, such as optimal sampling methodologies for the discovery of low-likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
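
    The uncertainty-propagation step described above can be sketched in miniature: sample the uncertain inputs, run a best-estimate model, and count how often a damage-category limit is violated. The toy temperature model, the input distributions and the 1000 K limit below are all invented stand-ins, far simpler than the actual RVACS analysis:

```python
import random

random.seed(42)

# Hypothetical stand-in for a best-estimate model: peak temperature reached
# under passive cooling as a function of decay power q [W] and an overall
# heat-transfer coefficient h [W/K]. NOT the actual PRISM/RVACS model.
def peak_temperature(q_decay, h_overall, t_sink=400.0):
    return t_sink + q_decay / h_overall

T_LIMIT = 1000.0   # stand-in for a Fuel Damage Category limit [K]

n_samples = 100_000
failures = 0
for _ in range(n_samples):
    q = random.gauss(2.0e6, 0.3e6)   # uncertain decay power [W]
    h = random.gauss(4.0e3, 0.6e3)   # uncertain heat-transfer coeff. [W/K]
    if h <= 0.0 or peak_temperature(q, h) > T_LIMIT:
        failures += 1                # count limit violations (degenerate h too)

p_fail = failures / n_samples
print(f"estimated P(violate limit) ~ {p_fail:.3f}")
```

    With an analytic stand-in like this, the sampled estimate can be cross-checked against the normal approximation for P(q − 600·h > 0), which is roughly 0.2 here; discovering genuinely low-likelihood failures would require the importance- or adaptive-sampling strategies the paper alludes to.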

  14. Development and evaluation of a dimensionless mechanistic pan coating model for the prediction of coated tablet appearance.

    Science.gov (United States)

    Niblett, Daniel; Porter, Stuart; Reynolds, Gavin; Morgan, Tomos; Greenamoyer, Jennifer; Hach, Ronald; Sido, Stephanie; Karan, Kapish; Gabbott, Ian

    2017-08-07

    A mathematical, mechanistic tablet film-coating model has been developed for pharmaceutical pan coating systems based on the mechanisms of atomisation, tablet bed movement and droplet drying with the main purpose of predicting tablet appearance quality. Two dimensionless quantities were used to characterise the product properties and operating parameters: the dimensionless Spray Flux (relating to area coverage of the spray droplets) and the Niblett Number (relating to the time available for drying of coating droplets). The Niblett Number is the ratio between the time a droplet needs to dry under given thermodynamic conditions and the time available for the droplet while on the surface of the tablet bed. The time available for drying on the tablet bed surface is critical for appearance quality. These two dimensionless quantities were used to select process parameters for a set of 22 coating experiments, performed over a wide range of multivariate process parameters. The dimensionless Regime Map created can be used to visualise the effect of interacting process parameters on overall tablet appearance quality and defects such as picking and logo bridging. Copyright © 2017 Elsevier B.V. All rights reserved.
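
    As defined above, the Niblett Number is a simple ratio of time scales. A minimal sketch, in which the drying-time and residence-time expressions are invented placeholders rather than the published correlations:

```python
# Illustrative time-scale expressions (assumed, not the published
# correlations): a droplet dries when its solvent load has evaporated, and it
# has until the tablet carrying it leaves the spray/drying zone.
def droplet_drying_time(solvent_mass, evaporation_rate):
    """Time [s] needed to evaporate one droplet's solvent load."""
    return solvent_mass / evaporation_rate

def surface_residence_time(zone_length, bed_surface_velocity):
    """Time [s] available on the tablet-bed surface."""
    return zone_length / bed_surface_velocity

def niblett_number(t_dry, t_available):
    """Ratio of drying time needed to drying time available."""
    return t_dry / t_available

t_dry = droplet_drying_time(2.0e-9, 1.0e-8)    # 2 ng of solvent at 10 ng/s
t_avail = surface_residence_time(0.15, 0.5)    # 0.15 m zone at 0.5 m/s
Ni = niblett_number(t_dry, t_avail)
print(f"Niblett Number = {Ni:.2f}")
# Ni > 1: droplets leave the surface still wet, favouring defects such as
# picking; Ni << 1: droplets may dry too early, favouring a rough coat.
```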

  15. A mechanistic model of an upper bound on oceanic carbon export as a function of mixed layer depth and temperature

    Directory of Open Access Journals (Sweden)

    Z. Li

    2017-11-01

    Full Text Available Export production reflects the amount of organic matter transferred from the ocean surface to depth through biological processes. This export is in large part controlled by nutrient and light availability, which are conditioned by mixed layer depth (MLD. In this study, building on Sverdrup's critical depth hypothesis, we derive a mechanistic model of an upper bound on carbon export based on the metabolic balance between photosynthesis and respiration as a function of MLD and temperature. We find that the upper bound is a positively skewed bell-shaped function of MLD. Specifically, the upper bound increases with deepening mixed layers down to a critical depth, beyond which a long tail of decreasing carbon export is associated with increasing heterotrophic activity and decreasing light availability. We also show that in cold regions the upper bound on carbon export decreases with increasing temperature when mixed layers are deep, but increases with temperature when mixed layers are shallow. A meta-analysis shows that our model envelopes field estimates of carbon export from the mixed layer. When compared to satellite export production estimates, our model indicates that export production in some regions of the Southern Ocean, particularly the subantarctic zone, is likely limited by light for a significant portion of the growing season.
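
    A toy realisation of the metabolic-balance argument shows where the bell shape comes from: if photosynthesis decays with depth following light while respiration is uniform over the mixed layer, the column-integrated balance first rises with MLD and then falls. The functional forms and constants below are assumptions for illustration, not the paper's equations, and this minimal version reproduces only the deep-mixed-layer temperature effect (respiration rising with temperature), not the shallow-layer reversal:

```python
import math

# Assumed forms: photosynthesis decays with depth following light;
# respiration is uniform over the mixed layer and scales with temperature.
P0 = 1.0     # surface photosynthesis [mmol C m-3 d-1]
Z_C = 30.0   # light e-folding depth [m]
R0 = 0.1     # respiration at the reference temperature [mmol C m-3 d-1]
Q10 = 2.0    # temperature sensitivity of respiration
T0 = 10.0    # reference temperature [degC]

def export_upper_bound(mld, temp):
    """Column-integrated (photosynthesis - respiration) over the mixed layer."""
    resp = R0 * Q10 ** ((temp - T0) / 10.0)
    production = P0 * Z_C * (1.0 - math.exp(-mld / Z_C))
    return max(production - resp * mld, 0.0)

# The bound rises with MLD, peaks near a critical depth, then decays:
shallow, mid, deep = (export_upper_bound(z, 10.0) for z in (10.0, 100.0, 400.0))
print(round(shallow, 2), round(mid, 2), round(deep, 2))
```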

  16. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    Science.gov (United States)

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec
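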

  17. Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.

    Science.gov (United States)

    Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick

    2013-04-01

    Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models not only to estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising; however, some limitations need to be addressed to realise its application and utility more broadly.
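
    The "differential equations parameterised by physiology" idea can be illustrated with a deliberately tiny flow-limited model (two tissues plus blood, explicit Euler integration). All parameter values are invented and not representative of any real compound:

```python
# Two-tissue flow-limited PBPK sketch; all values are invented.
Q_LIV, Q_MUS = 90.0, 45.0              # blood flows [L/h]
V_BLD, V_LIV, V_MUS = 5.0, 1.8, 29.0   # compartment volumes [L]
KP_LIV, KP_MUS = 2.0, 0.8              # tissue:blood partition coefficients
CL_INT = 30.0                          # intrinsic hepatic clearance [L/h]

def simulate(dose_mg=100.0, t_end=24.0, dt=1.0e-3):
    """Euler integration of the amount [mg] in blood, liver and muscle."""
    a_bld, a_liv, a_mus = dose_mg, 0.0, 0.0   # IV bolus into blood
    for _ in range(round(t_end / dt)):
        c_bld, c_liv, c_mus = a_bld / V_BLD, a_liv / V_LIV, a_mus / V_MUS
        # Flow-limited exchange: tissue venous concentration is C_tissue/Kp;
        # elimination happens in the liver only.
        d_liv = Q_LIV * (c_bld - c_liv / KP_LIV) - CL_INT * c_liv / KP_LIV
        d_mus = Q_MUS * (c_bld - c_mus / KP_MUS)
        d_bld = (Q_LIV * (c_liv / KP_LIV - c_bld)
                 + Q_MUS * (c_mus / KP_MUS - c_bld))
        a_bld += dt * d_bld
        a_liv += dt * d_liv
        a_mus += dt * d_mus
    return a_bld, a_liv, a_mus

remaining = sum(simulate())
print(f"mass remaining after 24 h: {remaining:.2g} mg of 100 mg")
```

    Real PBPK models differ mainly in scale, not in kind: more tissues, absorption and protein-binding terms, and physiological parameters drawn from species- or population-specific databases, which is what enables the cross-species and cross-population extrapolation described above.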

  18. State-of-the-art report on the theoretical modeling of interfacial area concentration

    International Nuclear Information System (INIS)

    Lee, Won Jae; Euh, Dong Jin

    1998-03-01

    Classical approaches based on experimental correlations and mechanistic approaches based on the interfacial area concentration were reviewed. The study focuses on state-of-the-art research based on the mechanistic modeling of the interfacial area concentration. The investigation is performed by classifying the mechanistic modeling approaches into those using number density transport equations supported by a simple algebraic relation for obtaining the interfacial area concentration and those using direct interfacial area transport equations. The modeling approaches are subdivided into one-group and multi-group models. The state-of-the-art source terms of the transport equations are also investigated for their applicability and limitations. (author). 62 refs., 6 tabs., 49 figs.

  19. Mechanistic origin of dragon-kings in a population of competing agents

    Science.gov (United States)

    Johnson, N.; Tivnan, B.

    2012-05-01

    We analyze the mechanistic origins of the extreme behaviors that arise in an idealized model of a population of competing agents, such as traders in a market. These extreme behaviors exhibit the defining characteristics of 'dragon-kings'. Our model comprises heterogeneous agents who repeatedly compete for some limited resource, making binary choices based on the strategies that they have in their possession. It generalizes the well-known Minority Game by allowing agents whose strategies have not made accurate recent predictions to step out of the competition until their strategies improve. This generates a complex dynamical interplay between the number V of active agents (mimicking market volume) and the imbalance D between the decisions made (mimicking excess demand). The wide spectrum of extreme behaviors which emerges helps to explain why no unique relationship has been identified between the price and volume during real market crashes and rallies.
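
    The generalized game can be sketched as follows; the details (two random strategy tables per agent, and an agent sitting out whenever its best strategy's running score is not positive) are one plausible reading of the description above, not the authors' exact rules:

```python
import random

random.seed(1)

# Minority Game with a grand-canonical twist: agents whose strategies have
# not predicted the minority well recently step out of the market.
N, M, S, T = 101, 3, 2, 200   # agents, memory, strategies per agent, rounds
HIST = 2 ** M                 # number of distinct histories

def new_strategy():
    """Random lookup table: history index -> action (+1 buy / -1 sell)."""
    return [random.choice((-1, +1)) for _ in range(HIST)]

strategies = [[new_strategy() for _ in range(S)] for _ in range(N)]
scores = [[0] * S for _ in range(N)]
history = random.randrange(HIST)

volumes, imbalances = [], []
for _ in range(T):
    decisions = []
    for i in range(N):
        best = max(range(S), key=lambda s: scores[i][s])
        if scores[i][best] > 0:          # confident agents play...
            decisions.append(strategies[i][best][history])
        # ...the rest step out until their strategies improve
    V, D = len(decisions), sum(decisions)   # volume and excess demand
    minority = -1 if D > 0 else +1
    for i in range(N):                      # virtual scoring for everyone
        for s in range(S):
            scores[i][s] += 1 if strategies[i][s][history] == minority else -1
    history = ((history << 1) | (1 if minority == +1 else 0)) % HIST
    volumes.append(V)
    imbalances.append(D)

print("mean volume:", round(sum(volumes) / T, 1),
      "| max |D|:", max(map(abs, imbalances)))
```

    Tracking V and D jointly over many runs is what exposes the dragon-king-like excursions the abstract describes; a single short run like this only shows the coupled fluctuations.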

  20. Modelling the active site of NiFe hydrogenases: new catalysts for the electro-production of H2 and mechanistic studies

    International Nuclear Information System (INIS)

    Canaguier, S.

    2009-01-01

    NiFe hydrogenases are unique metalloenzymes that catalyze H+/H2 interconversion with remarkable efficiency, close to the thermodynamic potential. Their active site consists of a hetero-bimetallic complex containing a nickel ion in a sulphur-rich environment connected by two thiolate bridges to an organometallic cyano-carbonyl iron moiety. In order to improve the understanding of the enzymatic mechanism and to obtain new base-metal electrocatalysts for H2 production, we synthesized a series of bio-inspired low-molecular-weight model complexes with the butterfly structure Ni(μ-S2)M (M = Ru, Mn and Fe). All these compounds displayed catalytic activity for hydrogen production. Modulating the electronic and steric properties of the ruthenium center allowed us to optimize the catalytic performance of these compounds in terms of stability, catalytic rate and overpotential. Mechanistic studies of the catalytic cycle of the Ni-Ru complexes have also been carried out. They allowed us to suggest a bio-relevant bridging hydride as the catalytic intermediate. Finally, we synthesized one of the first Ni-Fe complexes that is both a structural and a functional model of NiFe hydrogenase. (author)

  1. Development and Analysis of Patient-Based Complete Conducting Airways Models.

    Directory of Open Access Journals (Sweden)

    Rafel Bordas

    Full Text Available The analysis of high-resolution computed tomography (CT) images of the lung is dependent on inter-subject differences in airway geometry. The application of computational models in understanding the significance of these differences has previously been shown to be a useful tool in biomedical research. Studies using image-based geometries alone are limited to the analysis of the central airways, down to generation 6-10, as other airways are not visible on high-resolution CT. However, airways distal to this, often termed the small airways, are known to play a crucial role in common airway diseases such as asthma and chronic obstructive pulmonary disease (COPD). Other studies have incorporated an algorithmic approach to extrapolate CT segmented airways in order to obtain a complete conducting airway tree down to the level of the acinus. These models have typically been used for mechanistic studies, but also have the potential to be used in a patient-specific setting. In the current study, an image analysis and modelling pipeline was developed and applied to a number of healthy (n = 11) and asthmatic (n = 24) CT patient scans to produce complete patient-based airway models to the acinar level (mean terminal generation 15.8 ± 0.47). The resulting models are analysed in terms of morphometric properties and seen to be consistent with previous work. A number of global clinical lung function measures are compared to resistance predictions in the models to assess their suitability for use in a patient-specific setting. We show a significant difference (p < 0.01) in airways resistance at all tested flow rates in complete airway trees built using CT data from severe asthmatics (GINA 3-5) versus healthy subjects. Further, model predictions of airways resistance at all flow rates are shown to correlate with patient forced expiratory volume in one second (FEV1) (Spearman ρ = -0.65, p < 0.001) and, at low flow rates (0.00017 L/s), FEV1 over forced vital capacity (FEV1/FVC).
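
    The kind of whole-tree resistance prediction compared against FEV1 above can be illustrated with a symmetric (Weibel-type) tree and Poiseuille flow: per-branch resistance grows as r⁻⁴ down the tree, but the number of parallel branches doubles each generation. The geometry and homothety ratio below are textbook-style assumptions, not the patient-based geometries of the study:

```python
import math

# Symmetric dyadic airway tree with Poiseuille flow (assumed geometry, not a
# patient-based one): every branch in generation g has diameter D0*H**g and
# length L0*H**g, and there are 2**g branches in parallel.
MU = 1.8e-5            # dynamic viscosity of air [Pa s]
D0, L0 = 0.018, 0.12   # trachea diameter and length [m]
H = 0.85               # assumed per-generation homothety ratio

def tree_resistance(n_generations):
    """Total resistance [Pa s m-3] from generation 0 to n_generations."""
    total = 0.0
    for g in range(n_generations + 1):
        d, length = D0 * H ** g, L0 * H ** g
        r_branch = 8.0 * MU * length / (math.pi * (d / 2.0) ** 4)
        total += r_branch / 2 ** g   # 2**g identical branches in parallel
    return total

r8, r16 = tree_resistance(8), tree_resistance(16)
print(f"resistance to gen 8: {r8:.0f}, to gen 16: {r16:.0f} Pa s m-3")
```

    With this scaling each deeper generation adds less resistance than the last, which is why asymmetry, constriction patterns and patient-specific geometry (rather than the sheer narrowness of the small airways) drive the differences the study measures.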

  2. Mechanistic model coupling gas exchange dynamics and Listeria monocytogenes growth in modified atmosphere packaging of non respiring food.

    Science.gov (United States)

    Chaix, E; Broyart, B; Couvert, O; Guillaume, C; Gontard, N; Guillard, V

    2015-10-01

    A mechanistic model coupling O2 and CO2 mass transfer (namely diffusion and solubilisation in the food itself and permeation through the packaging material) to microbial growth models was developed with the aim of predicting the shelf life of modified atmosphere packaging (MAP) systems. It was experimentally validated on a non-respiring food by investigating concomitantly the O2/CO2 partial pressure in the packaging headspace and the growth of Listeria monocytogenes (average microbial count) within the food sample. A sensitivity analysis revealed that the reliability of the prediction by this "super-parametrized" model (no less than 47 parameters were required for running one simulation) was strongly dependent on the accuracy of the microbial input parameters. Once validated, this model was used to decipher the role of O2/CO2 mass transfer on microbial growth and as a MAP design tool: an example of MAP dimensioning is provided in this paper as a proof of concept. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Reducing Uncertainty in the Daycent Model of Heterotrophic Respiration with a More Mechanistic Representation of Microbial Processes.

    Science.gov (United States)

    Berardi, D.; Gomez-Casanovas, N.; Hudiburg, T. W.

    2017-12-01

    Improving the certainty of ecosystem models is essential to ensuring their legitimacy, value, and ability to inform management and policy decisions. With more than a century of research exploring the variables controlling soil respiration, a high level of uncertainty remains in the ability of ecosystem models to accurately estimate respiration with changing climatic conditions. Refining model estimates of soil carbon fluxes is a high priority for climate change scientists to determine whether soils will be carbon sources or sinks in the future. We found that DayCent underestimates heterotrophic respiration by several orders of magnitude for our temperate mixed conifer forest site. While traditional ecosystem models simulate decomposition through first-order kinetics, recent research has found that including microbial mechanisms explains 20 percent more spatial heterogeneity. We modified the DayCent heterotrophic respiration model to include a more mechanistic representation of microbial dynamics and compared the new model with continuous and survey observations from our experimental forest site in the Northern Rockies ecoregion. We also calibrated the model's sensitivity to soil moisture and temperature against our experimental data. We expect to improve the accuracy of the model by 20-30 percent. By using a more representative and calibrated model of soil carbon dynamics, we can better predict feedbacks between climate and soil carbon pools.
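
    The contrast between the two formulations can be sketched directly: first-order kinetics make the flux proportional to the substrate pool alone, while a microbial model routes carbon through an explicit biomass pool with Michaelis-Menten uptake and a carbon-use efficiency. Parameters are illustrative, not calibrated DayCent values:

```python
# First-order decomposition versus an explicit microbial pool with
# Michaelis-Menten uptake; daily time step, illustrative parameters.
K_FIRST = 0.002            # first-order decay rate [1/day]
VMAX, KM = 0.02, 50.0      # microbial uptake: max rate [1/day], half-sat [g C]
CUE, K_DEATH = 0.4, 0.005  # carbon-use efficiency, biomass turnover [1/day]

def rh_first_order(c0=1000.0, days=365):
    """Cumulative heterotrophic respiration [g C] from dC/dt = -k*C."""
    c, resp = c0, 0.0
    for _ in range(days):
        flux = K_FIRST * c
        c -= flux
        resp += flux
    return resp

def rh_microbial(c0=1000.0, b0=20.0, days=365):
    """Cumulative Rh with an explicit microbial biomass pool b [g C]."""
    c, b, resp = c0, b0, 0.0
    for _ in range(days):
        uptake = VMAX * b * c / (KM + c)
        death = K_DEATH * b
        c += death - uptake           # dead microbes return to the substrate
        b += CUE * uptake - death
        resp += (1.0 - CUE) * uptake  # the remainder is respired
    return resp

print("1-yr Rh, first-order:", round(rh_first_order(), 1),
      "| microbial:", round(rh_microbial(), 1))
```

    The key structural difference is that in the microbial version the flux tracks biomass as well as substrate, so moisture and temperature effects can act on the organisms rather than on a single lumped rate constant.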

  4. A Mechanistic Model of Human Recall of Social Network Structure and Relationship Affect.

    Science.gov (United States)

    Omodei, Elisa; Brashears, Matthew E; Arenas, Alex

    2017-12-07

    The social brain hypothesis argues that the need to deal with social challenges was key to our evolution of high intelligence. Research with non-human primates as well as experimental and fMRI studies in humans produce results consistent with this claim, leading to an estimate that human primary groups should consist of roughly 150 individuals. Gaps between this prediction and empirical observations can be partially accounted for using "compression heuristics", or schemata that simplify the encoding and recall of social information. However, little is known about the specific algorithmic processes used by humans to store and recall social information. We describe a mechanistic model of human network recall and demonstrate its sufficiency for capturing human recall behavior observed in experimental contexts. We find that human recall is predicated on accurate recall of a small number of high degree network nodes and the application of heuristics for both structural and affective information. This provides new insight into human memory, social network evolution, and demonstrates a novel approach to uncovering human cognitive operations.

  5. A 3-D CFD approach to the mechanistic prediction of forced convective critical heat flux at low quality

    International Nuclear Information System (INIS)

    Jean-Marie Le Corre; Cristina H Amon; Shi-Chune Yao

    2005-01-01

    Full text of publication follows: The prediction of the Critical Heat Flux (CHF) in a heat-flux-controlled boiling heat exchanger is important for assessing the maximal thermal capability of the system. In the case of a nuclear reactor, CHF margin gains (using an improved mixing-vane grid design, for instance) can allow power up-rates and enhanced operating flexibility. In general, current nuclear core design procedures use a quasi-1D approach to model the coolant thermal-hydraulic conditions within the fuel bundles, coupled with fully empirical CHF prediction methods. In addition, several CHF mechanistic models have been developed in the past and coupled with 1D and quasi-1D thermal-hydraulic codes. These mechanistic models have demonstrated reasonable CHF prediction characteristics and, more remarkably, correct parametric trends over a wide range of fluid conditions. However, since the phenomena leading to CHF are localized near the heater, models are needed to relate local quantities of interest to area-averaged quantities. As a consequence, large CHF prediction uncertainties may be introduced and 3D fluid characteristics (such as swirling flow) cannot be accounted for properly. Therefore, a fully mechanistic approach to CHF prediction is, in general, not possible using the current approach. The development of CHF-enhanced fuel assembly designs requires the use of more advanced 3D coolant property computations coupled with CHF mechanistic modeling. In the present work, the commercial CFD code CFX-5 is used to compute 3D coolant conditions in a vertical heated tube with upward flow. Several CHF mechanistic models at low quality available in the literature are coupled with the CFD code by developing adequate models relating local coolant properties to the local parameters of interest to predict CHF. The prediction performance of these models is assessed using CHF databases available in the open literature and the 1995 CHF look-up table. Since CFD can reasonably capture 3D fluid

  6. Model-based experimental design for assessing effects of mixtures of chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Baas, Jan, E-mail: jan.baas@falw.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands); Stefanowicz, Anna M., E-mail: anna.stefanowicz@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Klimek, Beata, E-mail: beata.klimek@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Laskowski, Ryszard, E-mail: ryszard.laskowski@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Kooijman, Sebastiaan A.L.M., E-mail: bas@bio.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands)

    2010-01-15

    We exposed flour beetles (Tribolium castaneum) to a mixture of four polycyclic aromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures at low concentrations.
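
    The shared-threshold idea is easy to state quantitatively: potency-scaled concentrations add, and mortality accrues only once the summed load exceeds the single threshold. The potencies, threshold and killing rate below are invented, but the sketch reproduces the key observation that a mixture can kill at concentrations where each component alone is below threshold:

```python
import math

# Shared-threshold survival sketch for same-mode-of-action toxicants.
# Potencies, threshold and killing rate are invented for illustration;
# concentration units are arbitrary.
POTENCY = {"fluoranthene": 1.0, "pyrene": 0.8,
           "phenanthrene": 0.3, "fluorene": 0.2}
C0 = 10.0   # shared no-effect threshold [scaled concentration units]
KK = 0.02   # killing rate [1/(scaled unit * day)]

def survival(mixture, days):
    """Survival probability under constant exposure to the given mixture."""
    load = sum(POTENCY[name] * conc for name, conc in mixture.items())
    hazard = KK * max(0.0, load - C0)
    return math.exp(-hazard * days)

# Each component alone sits below the threshold -> no predicted mortality...
alone = {"pyrene": 9.0}
# ...but a mixture of individually "safe" levels crosses it:
mix = {"pyrene": 9.0, "fluoranthene": 6.0, "phenanthrene": 8.0}
print(survival(alone, 10), round(survival(mix, 10), 3))
```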

  7. Model-based experimental design for assessing effects of mixtures of chemicals

    International Nuclear Information System (INIS)

    Baas, Jan; Stefanowicz, Anna M.; Klimek, Beata; Laskowski, Ryszard; Kooijman, Sebastiaan A.L.M.

    2010-01-01

    We exposed flour beetles (Tribolium castaneum) to a mixture of four polycyclic aromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures at low concentrations.

  8. A model for life predictions of nickel-base superalloys in high-temperature low cycle fatigue

    Science.gov (United States)

    Romanoski, Glenn R.; Pelloux, Regis M.; Antolovich, Stephen D.

    1988-01-01

    Extensive characterization of low-cycle fatigue damage mechanisms was performed on polycrystalline Rene 80 and IN100 tested in the temperature range from 871 to 1000 °C. Low-cycle fatigue life was found to be dominated by propagation of microcracks to a critical size governed by the maximum tensile stress. A model was developed which incorporates a threshold stress for crack extension, a stress-based crack growth expression, and a failure criterion. The mathematical equivalence between this mechanistically based model and the strain-life low-cycle fatigue law was demonstrated using cyclic stress-strain relationships. The model was shown to correlate the high-temperature low-cycle fatigue data of the different nickel-base superalloys considered in this study.
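
    The three ingredients of the model (threshold stress, stress-based growth law, critical-size failure criterion) combine into a closed-form life estimate. The growth law da/dN = C·(s_max − s_th)ⁿ·a and every constant below are illustrative assumptions, not the fitted Rene 80/IN100 values:

```python
import math

# Assumed microcrack growth law da/dN = C * (s_max - s_th)**n * a, with an
# assumed threshold stress and a critical crack size set by the maximum
# tensile stress. None of the constants are fitted alloy values.
C_GROWTH, N_EXP = 2.0e-10, 2.0   # growth-law constants [assumed]
S_TH = 200.0                     # threshold stress for crack extension [MPa]
K_C, Y = 30.0, 1.12              # toughness [MPa m^0.5], geometry factor

def cycles_to_failure(s_max, a0=20.0e-6):
    """Cycles to grow a microcrack of size a0 [m] to the critical size."""
    if s_max <= S_TH:
        return math.inf          # below threshold: no crack extension
    a_crit = (K_C / (Y * s_max)) ** 2 / math.pi   # failure criterion
    rate = C_GROWTH * (s_max - S_TH) ** N_EXP
    return math.log(a_crit / a0) / rate           # da/dN ~ a gives a log law

for s in (250.0, 400.0, 600.0):
    print(f"s_max = {s:4.0f} MPa -> Nf = {cycles_to_failure(s):,.0f} cycles")
```

    The logarithmic dependence on crack size is what makes life insensitive to the initial defect and dominated by the stress terms, consistent with the equivalence to a strain-life law noted in the abstract.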

  9. Causation at Different Levels: Tracking the Commitments of Mechanistic Explanations

    DEFF Research Database (Denmark)

    Fazekas, Peter; Kertész, Gergely

    2011-01-01

    …connections transparent. These general commitments get confronted with two claims made by certain proponents of the mechanistic approach: William Bechtel often argues that within the mechanistic framework it is possible to balance between reducing higher levels and maintaining their autonomy at the same time… …their autonomy at the same time than standard reductive accounts are, and that what mechanistic explanations are able to do at best is to show that downward causation does not exist…

  10. On the closed form mechanistic modeling of milling: Specific cutting energy, torque, and power

    Science.gov (United States)

    Bayoumi, A. E.; Yücesan, G.; Hutton, D. V.

    1994-02-01

    Specific energy in metal cutting, defined as the energy expended in removing a unit volume of workpiece material, is formulated and determined using a previously developed closed form mechanistic force model for milling operations. Cutting power is computed from the cutting torque, cutting force, kinematics of the cutter, and the volumetric material removal rate. Closed form expressions for specific cutting energy were formulated and found to be functions of the process parameters: pressure and friction for both rake and flank surfaces and chip flow angle at the rake face of the tool. Friction is found to play a very important role in cutting torque and power. Experiments were carried out to determine the effects of feedrate, cutting speed, workpiece material, and flank wear land width on specific cutting energy. It was found that the specific cutting energy increases with a decrease in the chip thickness and with an increase in flank wear land.
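
    The basic bookkeeping is compact: cutting power follows from torque and spindle speed, and specific cutting energy is power divided by the volumetric material-removal rate. A minimal sketch with assumed process values:

```python
import math

# Power from spindle torque and speed, specific energy as power over the
# volumetric material-removal rate; the numbers are assumed process values.
def cutting_power(torque_nm, rpm):
    """Cutting power [W] = torque x angular speed."""
    return torque_nm * 2.0 * math.pi * rpm / 60.0

def specific_cutting_energy(power_w, mrr_mm3_per_s):
    """Energy expended per unit volume removed [J/mm^3]."""
    return power_w / mrr_mm3_per_s

# Slot-milling example: 40 mm width x 4 mm depth at 300 mm/min feed.
mrr = 40.0 * 4.0 * 300.0 / 60.0   # [mm^3/s]
u = specific_cutting_energy(cutting_power(20.0, 1200.0), mrr)
print(f"specific cutting energy u = {u:.2f} J/mm^3")
```

    The size effect reported above (u rising as chip thickness falls) and the flank-wear contribution would enter through the measured torque, since thinner chips and worn tools expend disproportionately more energy per unit volume removed.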

  11. Population PK modelling and simulation based on fluoxetine and norfluoxetine concentrations in milk: a milk concentration-based prediction model.

    Science.gov (United States)

    Tanoshima, Reo; Bournissen, Facundo Garcia; Tanigawara, Yusuke; Kristensen, Judith H; Taddio, Anna; Ilett, Kenneth F; Begg, Evan J; Wallach, Izhar; Ito, Shinya

    2014-10-01

    Population pharmacokinetic (pop PK) modelling can be used for PK assessment of drugs in breast milk. However, complex mechanistic modelling of a parent and an active metabolite using both blood and milk samples is challenging. We aimed to develop a simple predictive pop PK model for milk concentration-time profiles of a parent and a metabolite, using data on fluoxetine (FX) and its active metabolite, norfluoxetine (NFX), in milk. Using a previously published data set of drug concentrations in milk from 25 women treated with FX, a pop PK model predictive of milk concentration-time profiles of FX and NFX was developed. Simulation was performed with the model to generate FX and NFX concentration-time profiles in milk of 1000 mothers. This milk concentration-based pop PK model was compared with the previously validated plasma/milk concentration-based pop PK model of FX. Milk FX and NFX concentration-time profiles were described reasonably well by a one compartment model with a FX-to-NFX conversion coefficient. Median values of the simulated relative infant dose on a weight basis (sRID: weight-adjusted daily doses of FX and NFX through breastmilk to the infant, expressed as a fraction of therapeutic FX daily dose per body weight) were 0.028 for FX and 0.029 for NFX. The FX sRID estimates were consistent with those of the plasma/milk-based pop PK model. A predictive pop PK model based on only milk concentrations can be developed for simultaneous estimation of milk concentration-time profiles of a parent (FX) and an active metabolite (NFX). © 2014 The British Pharmacological Society.
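
    The structure of such a milk-only model can be sketched with a one-compartment decline for FX and a conversion coefficient feeding NFX, from which an sRID-style ratio follows. All rate constants, the conversion fraction, the milk intake and the maternal dose below are invented, not the fitted population values:

```python
# One-compartment decline for FX in milk with a conversion coefficient
# feeding NFX; every value here is invented for illustration.
K_FX, K_NFX, F_CONV = 0.10, 0.05, 0.6   # elimination [1/h], conversion coeff.
MILK_INTAKE = 0.15                      # infant milk intake [L/kg/day]
MATERNAL_DOSE_PER_KG = 20.0 / 70.0      # maternal FX dose [mg/kg/day]

def average_milk_levels(c_fx0=0.2, hours=24.0, dt=0.01):
    """Time-averaged FX and NFX milk concentrations [mg/L] (Euler)."""
    c_fx, c_nfx = c_fx0, 0.0
    fx_sum = nfx_sum = 0.0
    for _ in range(round(hours / dt)):
        fx_sum += c_fx * dt
        nfx_sum += c_nfx * dt
        d_fx = -K_FX * c_fx
        d_nfx = F_CONV * K_FX * c_fx - K_NFX * c_nfx
        c_fx += dt * d_fx
        c_nfx += dt * d_nfx
    return fx_sum / hours, nfx_sum / hours

avg_fx, avg_nfx = average_milk_levels()
srid_fx = avg_fx * MILK_INTAKE / MATERNAL_DOSE_PER_KG
srid_nfx = avg_nfx * MILK_INTAKE / MATERNAL_DOSE_PER_KG
print(f"sRID(FX) ~ {srid_fx:.3f}, sRID(NFX) ~ {srid_nfx:.3f}")
```

    The point of the milk-only formulation is that a ratio like this can be estimated without maternal plasma sampling, which is what the paper validates against its plasma/milk-based model.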

  12. Network-based discovery through mechanistic systems biology. Implications for applications--SMEs and drug discovery: where the action is.

    Science.gov (United States)

    Benson, Neil

    2015-08-01

    Phase II attrition remains the most important challenge for drug discovery. Tackling the problem requires improved understanding of the complexity of disease biology. Systems biology approaches to this problem can, in principle, deliver this. This article reviews the reports of the application of mechanistic systems models to drug discovery questions and discusses the added value. Although we are on the journey to the virtual human, the length, path and rate of learning from this remain an open question. Success will be dependent on the will to invest and make the most of the insight generated along the way. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Developing a framework to model the primary drying step of a continuous freeze-drying process based on infrared radiation

    DEFF Research Database (Denmark)

    Van Bockstal, Pieter-Jan; Corver, Jos; Mortier, Séverine Thérèse F.C.

    2018-01-01

    . These results assist in the selection of proper materials which could serve as IR window in the continuous freeze-drying prototype. The modelling framework presented in this paper fits the model-based design approach used for the development of this prototype and shows the potential benefits of this design...... requires the fundamental mechanistic modelling of each individual process step. Therefore, a framework is presented for the modelling and control of the continuous primary drying step based on non-contact IR radiation. The IR radiation emitted by the radiator filaments passes through various materials...

  14. Mechanistic and "natural" body metaphors and their effects on attitudes to hormonal contraception.

    Science.gov (United States)

    Walker, Susan

    2012-01-01

    A small, self-selected convenience sample of male and female contraceptive users in the United Kingdom (n = 34) were interviewed between 2006 and 2008 concerning their feelings about the body and their contraceptive attitudes and experiences. The interviewees were a sub-sample of respondents (n = 188) who completed a paper-based questionnaire on similar topics, who were recruited through a poster placed in a family planning clinic, web-based advertisements on workplace and university websites, and through direct approaches to social groups. The bodily metaphors used when discussing contraception were analyzed using an interpretative phenomenological analytical approach facilitated by Atlas.ti software. The dominant bodily metaphor was mechanistic (i.e., "body as machine"). A subordinate but influential bodily metaphor was the "natural" body, which had connotations of connection to nature and a quasi-sacred bodily order. Interviewees drew upon this "natural" metaphorical image in the context of discussing their anxieties about hormonal contraception. Drawing upon a "natural," non-mechanistic body image in the context of contraceptive decision-making contributed to reluctance to use a hormonal form of contraception. This research suggests that clinicians could improve communication and advice about contraception by recognizing that some users may draw upon non-mechanistic body imagery.

  15. FOAM3D: A numerical simulator for mechanistic prediction of foam displacement in multidimensions

    Energy Technology Data Exchange (ETDEWEB)

    Kovscek, A.R.; Patzek, T.W. [Lawrence Berkeley Laboratory, Berkeley, CA (United States); Radke, C.J. [Univ. of California, Berkeley, CA (United States)

    1995-03-01

    Field application of foam is a technically viable enhanced oil recovery process (EOR) as demonstrated by recent steam-foam field studies. Traditional gas-displacement processes, such as steam drive, are improved substantially by controlling gas mobility and thereby improving volumetric displacement efficiency. For instance, Patzek and Koinis showed major oil-recovery response after about two years of foam injection in two different pilot studies at the Kern River field. They report increased production of 5.5 to 14% of the original oil in place over a five year period. Because reservoir-scale simulation is a vital component of the engineering and economic evaluation of any EOR project, efficient application of foam as a displacement fluid requires a predictive numerical model of foam displacement. A mechanistic model would also expedite scale-up of the process from the laboratory to the field scale. No general, mechanistic, field-scale model for foam displacement is currently in use.

  16. Mechanistic considerations used in the development of the probability of failure in transient increases in power (PROFIT) pellet-Zircaloy cladding (thermo-mechanical-chemical) interactions (PCI) fuel failure model

    International Nuclear Information System (INIS)

    Pankaskie, P.J.

    1980-05-01

    A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) interactions (PCI) failure model for estimating the Probability of Failure in Transient Increases in Power (PROFIT) was developed. PROFIT is based on (1) standard statistical methods applied to available PCI fuel failure data and (2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate dependent Strain Energy Absorption to Failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding.

  17. Can ligand addition to soil enhance Cd phytoextraction? A mechanistic model study.

    Science.gov (United States)

    Lin, Zhongbing; Schneider, André; Nguyen, Christophe; Sterckeman, Thibault

    2014-11-01

    Phytoextraction is a potential method for cleaning Cd-polluted soils. Ligand addition to soil is expected to enhance Cd phytoextraction. However, experimental results show that this addition has contradictory effects on plant Cd uptake. A mechanistic model simulating the reaction kinetics (adsorption on solid phase, complexation in solution), transport (convection, diffusion) and root absorption (symplastic, apoplastic) of Cd and its complexes in soil was developed. This was used to calculate plant Cd uptake with and without ligand addition in a great number of combinations of soil, ligand and plant characteristics, varying the parameters within defined domains. Ligand addition generally strongly reduced hydrated Cd (Cd(2+)) concentration in soil solution through Cd complexation. Dissociation of Cd complex ([Formula: see text]) could not compensate for this reduction, which greatly lowered Cd(2+) symplastic uptake by roots. The apoplastic uptake of [Formula: see text] was not sufficient to compensate for the decrease in symplastic uptake. This explained why in the majority of the cases, ligand addition resulted in the reduction of the simulated Cd phytoextraction. A few results showed an enhanced phytoextraction in very particular conditions (strong plant transpiration with high apoplastic Cd uptake capacity), but this enhancement was very limited, making chelant-enhanced phytoextraction poorly efficient for Cd.
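
    The central effect in these simulations, complexation collapsing the free Cd(2+) pool, can be illustrated with a minimal equilibrium speciation sketch. This is not the paper's full kinetic-transport model: the stability constant and concentrations below are illustrative, and `free_cd` is a hypothetical helper that solves the Cd + L ⇌ CdL mass balance by bisection.

    ```python
    def free_cd(cd_total, l_total, k):
        """Free [Cd] (mol/L) at equilibrium for Cd + L <-> CdL with
        stability constant k, solved from mass balance by bisection.
        All parameter values used below are illustrative only."""
        lo, hi = 0.0, cd_total
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            # [L] = l_total / (1 + k*[Cd]);  [CdL] = k*[Cd]*[L]
            cdl = k * mid * l_total / (1.0 + k * mid)
            if mid + cdl > cd_total:   # total Cd overshot: [Cd] too high
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    print(free_cd(1e-6, 0.0, 1e8))    # no ligand: free Cd ~ total Cd
    print(free_cd(1e-6, 1e-4, 1e8))   # ligand in excess: free Cd collapses
    ```

    With the ligand in hundredfold excess and a strong (hypothetical) stability constant, free Cd(2+) drops by roughly four orders of magnitude, which is the starting point for the reduced symplastic uptake discussed above.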

  18. Mechanistic-Empirical (M-E) Design Implementation & Monitoring for Flexible Pavements : 2018 PROJECT SUMMARY

    Science.gov (United States)

    2018-06-01

    This document is a summary of the tasks performed for Project ICT-R27-149-1. Mechanistic-empirical (M-E)-based flexible pavement design concepts and procedures were previously developed in Illinois Cooperative Highway Research Program projects IHR-...

  19. Mechanistic Prediction of the Effect of Microstructural Coarsening on Creep Response of SnAgCu Solder Joints

    Science.gov (United States)

    Mukherjee, S.; Chauhan, P.; Osterman, M.; Dasgupta, A.; Pecht, M.

    2016-07-01

    Mechanistic microstructural models have been developed to capture the effect of isothermal aging on the time-dependent viscoplastic response of Sn3.0Ag0.5Cu (SAC305) solders. SnAgCu (SAC) solders undergo continuous microstructural coarsening during both storage and service because of their high homologous temperature. The microstructures of these low melting point alloys continuously evolve during service. This results in evolution of the creep properties of the joint over time, thereby influencing the long-term reliability of microelectronic packages. It is well documented that isothermal aging degrades the creep resistance of SAC solder. SAC305 alloy was aged for 24-1000 h at 25-100 °C (~0.6-0.8 × T_melt). Cross-sectioning and image processing techniques were used to periodically quantify the effect of isothermal aging on phase coarsening and evolution. The parameters monitored during isothermal aging include size, area fraction, and inter-particle spacing of nanoscale Ag3Sn intermetallic compounds (IMCs) and the volume fraction of micron-scale Cu6Sn5 IMCs, as well as the area fraction of pure tin dendrites. Effects of microstructural evolution on the secondary creep constitutive response of SAC305 solder joints were then modeled using a mechanistic multiscale creep model. The mechanistic phenomena modeled include: (1) dispersion strengthening by coarsened nanoscale Ag3Sn IMCs in the eutectic phase; and (2) load sharing between pro-eutectic Sn dendrites and the surrounding coarsened eutectic Sn-Ag phase and microscale Cu6Sn5 IMCs. The coarse-grained polycrystalline Sn microstructure in SAC305 solder was not captured in the above model because isothermal aging does not cause any significant change in the initial grain size and orientation of SAC305 solder joints. The above mechanistic model can successfully capture the drop in creep resistance due to the influence of isothermal aging on SAC305 single crystals. Contribution of grain boundary sliding to the creep strain of

  20. Malaria's missing number: calculating the human component of R0 by a within-host mechanistic model of Plasmodium falciparum infection and transmission.

    Directory of Open Access Journals (Sweden)

    Geoffrey L Johnston

    2013-04-01

    Full Text Available Human infection by malarial parasites of the genus Plasmodium begins with the bite of an infected Anopheles mosquito. Current estimates place malaria mortality at over 650,000 individuals each year, mostly in African children. Efforts to reduce disease burden can benefit from the development of mathematical models of disease transmission. To date, however, comprehensive modeling of the parameters defining human infectivity to mosquitoes has remained elusive. Here, we describe a mechanistic within-host model of Plasmodium falciparum infection in humans and pathogen transmission to the mosquito vector. Our model incorporates the entire parasite lifecycle, including the intra-erythrocytic asexual forms responsible for disease, the onset of symptoms, the development and maturation of intra-erythrocytic gametocytes that are transmissible to Anopheles mosquitoes, and human-to-mosquito infectivity. These model components were parameterized from malaria therapy data and other studies to simulate individual infections, and the ensemble of outputs was found to reproduce the full range of patient responses to infection. Using this model, we assessed human infectivity over the course of untreated infections and examined the effects in relation to transmission intensity, expressed by the basic reproduction number R0 (defined as the number of secondary cases produced by a single typical infection in a completely susceptible population). Our studies predict that net human-to-mosquito infectivity from a single non-immune individual is on average equal to 32 fully infectious days. This estimate of mean infectivity is equivalent to calculating the human component of malarial R0. We also predict that mean daily infectivity exceeds five percent for approximately 138 days.
The mechanistic framework described herein, made available as stand-alone software, will enable investigators to conduct detailed studies into theories of malaria control, including the effects of
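
    The "fully infectious days" summary used above can be sketched numerically: integrate a daily human-to-mosquito infectivity curve over the course of an untreated infection. The curve below is a synthetic stand-in, not the paper's fitted within-host model; the 5% threshold mirrors the abstract's second statistic.

    ```python
    import numpy as np

    # Synthetic daily infectivity c(t): probability that a mosquito
    # feeding on day t becomes infected. Illustrative shape and scale.
    days = np.arange(300)
    infectivity = 0.3 * np.exp(-(((days - 40) / 60.0) ** 2))

    # A day-by-day sum approximates the integral of c(t): the human
    # component of R0, in "fully infectious day" equivalents.
    fully_infectious_days = float(infectivity.sum())
    days_above_5pct = int((infectivity > 0.05).sum())
    print(fully_infectious_days, days_above_5pct)
    ```

    Multiplying this human factor by the mosquito-side terms (biting rate, vector survival, sporogony) would recover the full R0, which is why the abstract calls it malaria's "missing number".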

  1. Mechanistic and Economical Characteristics of Asphalt Rubber Mixtures

    Directory of Open Access Journals (Sweden)

    Mena I. Souliman

    2016-01-01

    Full Text Available Load-associated fatigue cracking is one of the major distress types occurring in flexible pavement systems. The flexural bending beam fatigue laboratory test has been used for several decades and is considered to be an integral part of the Superpave advanced characterization procedure. One of the most significant solutions to prolong the fatigue life of an asphaltic mixture is to utilize flexible materials such as rubber. A laboratory testing program was performed on conventional and Asphalt Rubber (AR) gap-graded mixtures to investigate the impact of added rubber on the mechanical, mechanistic, and economical attributes of asphaltic mixtures. Strain-controlled fatigue tests were conducted according to American Association of State Highway and Transportation Officials (AASHTO) procedures. The results from the beam fatigue tests indicated that the AR gap-graded mixtures would have much longer fatigue life compared with the reference (conventional) mixtures. In addition, a mechanistic analysis using 3D-Move software coupled with a cost analysis study based on the fatigue performance of the two mixtures was performed. Overall, the analysis showed that AR-modified asphalt mixtures exhibited significantly lower cost of pavement per 1000 cycles of fatigue life per mile compared to the conventional HMA mixture.

  2. Modeling of iodine radiation chemistry in the presence of organic compounds

    International Nuclear Information System (INIS)

    Taghipour, Fariborz; Evans, Greg J.

    2002-01-01

    A kinetic-based model was developed that simulates the radiation chemistry of iodine in the presence of organic compounds. The model's mechanistic description of iodine chemistry, together with generic semi-mechanistic reactions for various classes of organics, provided a reasonable representation of experimental results. Modeled and experimental iodine volatilization rates mostly agreed to within an order of magnitude.

  3. Comparing and Contrasting Traditional Membrane Bioreactor Models with Novel Ones Based on Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Parneet Paul

    2013-02-01

    Full Text Available The computer modelling and simulation of wastewater treatment plants and their specific technologies, such as membrane bioreactors (MBRs), are becoming increasingly useful to consultant engineers when designing, upgrading, retrofitting, operating and controlling these plants. This research uses traditional phenomenological mechanistic models based on MBR filtration and biochemical processes to measure the effectiveness of alternative and novel time series models based upon input–output system identification methods. Both model types are calibrated and validated using similar plant layouts and data sets derived for this purpose. Results prove that although both approaches have their advantages, they also have specific disadvantages. In conclusion, the MBR plant designer and/or operator who wishes to use good quality, calibrated models to gain a better understanding of their process should carefully consider which model type to select based upon their initial modelling objectives. Each situation usually proves unique.
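
    A minimal sketch of the "input–output system identification" alternative mentioned above: fitting a first-order ARX model to data by least squares. The data, model order, and coefficients below are illustrative assumptions, not the plant models from the study.

    ```python
    import numpy as np

    # Synthetic single-input single-output plant: y[t] = a*y[t-1] + b*u[t-1] + noise,
    # with "true" a = 0.8, b = 0.5. u could represent, e.g., influent load.
    rng = np.random.default_rng(0)
    u = rng.random(200)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + 0.01 * rng.standard_normal()

    # ARX identification: stack regressors [y[t-1], u[t-1]] and solve
    # the least-squares problem for (a, b).
    X = np.column_stack([y[:-1], u[:-1]])
    a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    print(a_hat, b_hat)   # estimates close to 0.8 and 0.5
    ```

    The appeal of such black-box models is exactly what the abstract weighs against mechanistic ones: they are cheap to calibrate from plant data, but the coefficients carry no filtration or biochemical meaning.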

  4. Mechanistic link between uptake of sulfonamides and bacteriostatic effect: model development and application to experimental data from two soil microorganisms.

    Science.gov (United States)

    Focks, Andreas; Klasmeier, Jörg; Matthies, Michael

    2010-07-01

    Sulfonamides (SA) are antibiotic compounds that are widely used as human and veterinary pharmaceuticals. They are not rapidly biodegradable and have been detected in various environmental compartments. Effects of sulfonamides on microbial endpoints in soil have been reported from laboratory incubation studies. Sulfonamides inhibit the growth of sensitive microorganisms by competitive binding to the dihydropteroate-synthase (DHPS) enzyme of folic acid production. A mathematical model was developed that relates the extracellular SA concentration to the inhibition of the relative bacterial growth rate. Two factors, the anionic accumulation factor (AAF) and the cellular affinity factor (CAF), determine the effective concentration of an SA. The AAF describes the SA uptake into bacterial cells and varies with both the extra- and intracellular pH values and with the acidic pKa value of an SA. The CAF subsumes relevant cellular and enzyme properties, and is directly proportional to the DHPS affinity constant for an SA. Based on the model, a mechanistic dose-response relationship is developed and evaluated against previously published data, where differences in the responses of Pseudomonas aeruginosa and Pantoea agglomerans toward changing medium pH values were found, most likely as a result of their differing pH regulation. The derived dose-response relationship explains the pH and pKa dependency of mean effective concentration values (EC50) of eight SA and two soil bacteria based on AAF and CAF values. The mathematical model can be used to extrapolate sulfonamide effects to other pH values and to calculate the CAF as a pH-independent measure for the SA effects on microbial growth. Copyright (c) 2010 SETAC.
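
    The pH/pKa dependence of the anionic accumulation factor described above follows from a standard ion-trapping argument, sketched below. The formula assumes only the neutral sulfonamide species equilibrates across the membrane; the function name and the pKa/pH values are illustrative, not fitted values from the study.

    ```python
    def anionic_accumulation_factor(ph_out, ph_in, pka):
        """Ion-trapping sketch for a weak acid: only the neutral species
        crosses the membrane, so total intra-/extracellular concentrations
        scale with the degree of dissociation on each side."""
        return (1 + 10 ** (ph_in - pka)) / (1 + 10 ** (ph_out - pka))

    # Hypothetical sulfonamide pKa of 5.6, cytoplasm buffered at pH 7.5:
    # lowering the medium pH strongly increases intracellular accumulation.
    for ph_out in (5.5, 6.5, 7.5):
        aaf = anionic_accumulation_factor(ph_out, ph_in=7.5, pka=5.6)
        print(f"pH_out = {ph_out}: AAF = {aaf:.1f}")
    ```

    This qualitative behaviour, higher accumulation (and hence toxicity) at lower medium pH, is the mechanism the model uses to explain the pH dependency of the EC50 values.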

  5. Towards the development of mechanistically based design rules for corrosion fatigue in ductile steels

    International Nuclear Information System (INIS)

    Johnson, R.; McMinn, A.; Tomkins, B.

    1980-08-01

    Design curves for nuclear pressure vessels and off-shore structures are based on air endurance curves with a safety factor applied to account for effects such as corrosive environments, frequency and mean stress. These are supported by a limited number of endurance tests on actual pressure vessels, and on welded joints under service conditions. These data-based rules are limited in their ability to cope with environmental effects, and because the time dependencies of the fatigue and corrosion processes are so different, no sound basis exists for the extrapolation of data to long component lifetimes. The crack-growth behaviour of materials used in nuclear pressure vessels and off-shore structures is examined with a view to determining how it may be used to re-assess the design curves. Even simple integration of crack-growth laws yields reasonable agreement with present design curves; with improved methods of stress analysis, etc., this approach could potentially improve these curves. Mechanistic studies also offer a means of examining and assessing time-dependent process interactions and so, potentially, could form the basis of new guidelines. Finally, the areas where further work would be needed to substantiate any changes in design curves are indicated. (author)
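
    The "simple integration of crack-growth laws" mentioned above can be sketched with a Paris-law calculation: cycles to grow a crack from an initial to a final size under a given stress range. The constants, geometry factor, and units below are illustrative placeholders, not values from the report.

    ```python
    import math

    def cycles_to_failure(a0, af, d_stress, C=1e-11, m=3.0, Y=1.0):
        """Numerically integrate the Paris law da/dN = C*(dK)^m,
        with dK = Y * d_stress * sqrt(pi * a).
        a0, af in metres; d_stress in MPa; C, m, Y illustrative."""
        n, a = 0.0, a0
        da = a0 / 1000.0                 # small fixed crack increment
        while a < af:
            dk = Y * d_stress * math.sqrt(math.pi * a)
            n += da / (C * dk ** m)      # dN = da / (da/dN)
            a += da
        return n

    # Crack growing from 1 mm to 10 mm under a 100 MPa stress range:
    print(cycles_to_failure(a0=1e-3, af=1e-2, d_stress=100.0))
    ```

    Endurance-style design curves (stress range versus allowable cycles) can be generated by sweeping `d_stress`, which is the sense in which integrated crack-growth laws can be compared against existing design curves.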

  6. Generative mechanistic explanation building in undergraduate molecular and cellular biology

    Science.gov (United States)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-09-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among scientists, we created and applied a theoretical framework to explore the strategies students use to construct explanations for 'novel' biological phenomena. Specifically, we explored how students navigated the multi-level nature of complex biological systems using generative mechanistic reasoning. Interviews were conducted with introductory and upper-division biology students at a large public university in the United States. Results of qualitative coding revealed key features of students' explanation building. Students used modular thinking to consider the functional subdivisions of the system, which they 'filled in' to varying degrees with mechanistic elements. They also hypothesised the involvement of mechanistic entities and instantiated abstract schema to adapt their explanations to unfamiliar biological contexts. Finally, we explored the flexible thinking that students used to hypothesise the impact of mutations on multi-leveled biological systems. Results revealed a number of ways that students drew mechanistic connections between molecules, functional modules (sets of molecules with an emergent function), cells, tissues, organisms and populations.

  7. "Ratio via Machina": Three Standards of Mechanistic Explanation in Sociology

    Science.gov (United States)

    Aviles, Natalie B.; Reed, Isaac Ariail

    2017-01-01

    Recently, sociologists have expended much effort in attempts to define social mechanisms. We intervene in these debates by proposing that sociologists in fact have a choice to make between three standards of what constitutes a good mechanistic explanation: substantial, formal, and metaphorical mechanistic explanation. All three standards are…

  8. Mechanistic evidence for a ring-opening pathway in the Pd-catalyzed direct arylation of benzoxazoles

    DEFF Research Database (Denmark)

    Sanchez, R.S.; Zhuravlev, Fedor

    2007-01-01

    The direct Pd-catalyzed arylation of 5-substituted benzoxazoles, used as a mechanistic model for 1,3-azoles, was investigated experimentally and computationally. The results of the primary deuterium kinetic isotope effect, Hammett studies, and H/D exchange were shown to be inconsistent with rate-limiting electrophilic or concerted palladation. A mechanism, proposed on the basis of kinetic and computational studies, includes generation of isocyanophenolate as the key step. The DFT calculations suggest that the overall catalytic cycle is facile and is largely controlled by the C-H acidity...

  9. Toward mechanistic classification of enzyme functions.

    Science.gov (United States)

    Almonacid, Daniel E; Babbitt, Patricia C

    2011-06-01

    Classification of enzyme function should be quantitative, computationally accessible, and informed by sequences and structures to enable use of genomic information for functional inference and other applications. Large-scale studies have established that divergently evolved enzymes share conserved elements of structure and common mechanistic steps and that convergently evolved enzymes often converge to similar mechanisms too, suggesting that reaction mechanisms could be used to develop finer-grained functional descriptions than provided by the Enzyme Commission (EC) system currently in use. Here we describe how evolution informs these structure-function mappings and review the databases that store mechanisms of enzyme reactions along with recent developments to measure ligand and mechanistic similarities. Together, these provide a foundation for new classifications of enzyme function. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Calibration and analysis of genome-based models for microbial ecology.

    Science.gov (United States)

    Louca, Stilianos; Doebeli, Michael

    2015-10-16

    Microbial ecosystem modeling is complicated by the large number of unknown parameters and the lack of appropriate calibration tools. Here we present a novel computational framework for modeling microbial ecosystems, which combines genome-based model construction with statistical analysis and calibration to experimental data. Using this framework, we examined the dynamics of a community of Escherichia coli strains that emerged in laboratory evolution experiments, during which an ancestral strain diversified into two coexisting ecotypes. We constructed a microbial community model comprising the ancestral and the evolved strains, which we calibrated using separate monoculture experiments. Simulations reproduced the successional dynamics in the evolution experiments, and pathway activation patterns observed in microarray transcript profiles. Our approach yielded detailed insights into the metabolic processes that drove bacterial diversification, involving acetate cross-feeding and competition for organic carbon and oxygen. Our framework provides a missing link towards a data-driven mechanistic microbial ecology.

  11. Predicting interactions from mechanistic information: Can omic data validate theories?

    International Nuclear Information System (INIS)

    Borgert, Christopher J.

    2007-01-01

    To address the most pressing and relevant issues for improving mixture risk assessment, researchers must first recognize that risk assessment is driven by both regulatory requirements and scientific research, and that regulatory concerns may expand beyond the purely scientific interests of researchers. Concepts of 'mode of action' and 'mechanism of action' are used in particular ways within the regulatory arena, depending on the specific assessment goals. The data requirements for delineating a mode of action and predicting interactive toxicity in mixtures are not well defined from a scientific standpoint due largely to inherent difficulties in testing certain underlying assumptions. Understanding the regulatory perspective on mechanistic concepts will be important for designing experiments that can be interpreted clearly and applied in risk assessments without undue reliance on extrapolation and assumption. In like fashion, regulators and risk assessors can be better equipped to apply mechanistic data if the concepts underlying mechanistic research and the limitations that must be placed on interpretation of mechanistic data are understood. This will be critically important for applying new technologies to risk assessment, such as functional genomics, proteomics, and metabolomics. It will be essential not only for risk assessors to become conversant with the language and concepts of mechanistic research, including new omic technologies, but also, for researchers to become more intimately familiar with the challenges and needs of risk assessment

  12. Explanation and inference: Mechanistic and functional explanations guide property generalization

    Directory of Open Access Journals (Sweden)

    Tania eLombrozo

    2014-09-01

    Full Text Available The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  13. Explanation and inference: mechanistic and functional explanations guide property generalization.

    Science.gov (United States)

    Lombrozo, Tania; Gwynne, Nicholas Z

    2014-01-01

    The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  14. Mechanistic model of the inverted annular film boiling

    International Nuclear Information System (INIS)

    Seok, Ho; Chang, Soon Heung

    1989-01-01

    An analytical model is developed to predict the heat transfer coefficient and the friction factor in inverted annular film boiling. The developed model is based on two-fluid mass, momentum and energy balance equations and a theoretical velocity profile. The predictions of the proposed model are compared with experimental data and well-established correlations. For the heat transfer coefficient, they agree with the experimental data and are more promising than those of the Bromley and Berenson correlations. The present model also accounts for the effects of mass flux and subcooling on the heat transfer. The friction factor predictions agree qualitatively with the experimental measurements, while some cases show behavior similar to that of post-CHF dispersed flow obtained from Beattie's correlation.

  15. Mechanistic Basis of Cocrystal Dissolution Advantage.

    Science.gov (United States)

    Cao, Fengjuan; Amidon, Gordon L; Rodríguez-Hornedo, Naír; Amidon, Gregory E

    2018-01-01

    Current interest in cocrystal development resides in the advantages that the cocrystal may have in solubility and dissolution compared with the parent drug. This work provides a mechanistic analysis and comparison of the dissolution behavior of carbamazepine (CBZ) and its 2 cocrystals, carbamazepine-saccharin (CBZ-SAC) and carbamazepine-salicylic acid (CBZ-SLC), under the influence of pH and micellar solubilization. A simple mathematical equation is derived based on mass transport analyses to describe the dissolution advantage of cocrystals. The dissolution advantage is the ratio of the cocrystal flux to the drug flux and is defined as the solubility advantage (cocrystal to drug solubility ratio) times the diffusivity advantage (cocrystal to drug diffusivity ratio). In this work, the effective diffusivity of CBZ in the presence of surfactant was determined to be lower than those of the cocrystals. The higher effective diffusivity of drug from the dissolved cocrystals, the diffusivity advantage, can impart a dissolution advantage to cocrystals with lower solubility than the parent drug while still maintaining thermodynamic stability. Dissolution conditions where cocrystals can display both thermodynamic stability and a dissolution advantage can be obtained from the mass transport models, and this information is useful for both cocrystal selection and formulation development. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
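
    The flux-ratio definition stated above can be written compactly. This is a sketch of the stated relation only; the subscripts cc and d (cocrystal and parent drug) are our notation, not the paper's:

    ```latex
    \frac{J_{\mathrm{cc}}}{J_{\mathrm{d}}}
      = \underbrace{\frac{S_{\mathrm{cc}}}{S_{\mathrm{d}}}}_{\text{solubility advantage}}
        \times
        \underbrace{\frac{D_{\mathrm{cc}}}{D_{\mathrm{d}}}}_{\text{diffusivity advantage}}
    ```

    Written this way, it is clear how a cocrystal with a solubility ratio slightly below 1 can still show a flux (dissolution) advantage when its diffusivity ratio exceeds the solubility penalty.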

  16. Fast charging technique for high power LiFePO4 batteries: A mechanistic analysis of aging

    Science.gov (United States)

    Anseán, D.; Dubarry, M.; Devie, A.; Liaw, B. Y.; García, V. M.; Viera, J. C.; González, M.

    2016-07-01

    One of the major issues hampering the acceptance of electric vehicles (EVs) is the anxiety associated with long charging times. Hence, the ability to fast-charge lithium-ion battery (LIB) systems is gaining notable interest. However, fast charging is not tolerated by all LIB chemistries because it affects battery functionality and accelerates aging processes. Here, we investigate the long-term effects of multistage fast charging on a commercial high power LiFePO4-based cell and compare it to another cell tested under standard charging. Coupling incremental capacity (IC) and IC peak area analysis with mechanistic model simulations ('Alawa' toolbox with harvested half-cell data), we quantify the degradation modes that cause aging of the tested cells. The results show that the proposed fast charging technique caused aging effects similar to those of standard charging. The degradation is caused by a linear loss of lithium inventory, coupled with a lesser degree of linear loss of active material on the negative electrode. This study validates fast charging as a feasible means of operation for this particular LIB chemistry and cell architecture. It also illustrates the benefits of a mechanistic approach to understanding degradation in commercial cells.
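
    The incremental capacity (IC) signal used in this kind of aging analysis is simply dQ/dV computed from a charge curve. The sketch below uses a synthetic voltage profile (a flat plateau near 3.4 V, loosely evoking LiFePO4's plateau), not measured data; the derivative is taken with numpy's gradient.

    ```python
    import numpy as np

    # Synthetic charge curve V(Q): nearly flat around 3.4 V so that the
    # IC curve shows the characteristic sharp plateau peak. Illustrative
    # only, not measured LiFePO4 data.
    q = np.linspace(0.0, 1.0, 501)                       # normalized capacity
    v = 3.4 + 0.2 * (q - 0.5) ** 3 + 0.01 * (q - 0.5)    # monotonic V(Q)
    ic = np.gradient(q, v)                               # IC = dQ/dV

    peak_v = float(v[np.argmax(ic)])                     # plateau voltage
    print(f"IC peak near {peak_v:.3f} V")
    ```

    In studies like the one above, shifts and shrinkage of such IC peaks over cycling are matched against half-cell simulations to attribute capacity fade to loss of lithium inventory versus loss of active material.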

  17. Physiologically-based PK/PD modelling of therapeutic macromolecules.

    Science.gov (United States)

    Thygesen, Peter; Macheras, Panos; Van Peer, Achiel

    2009-12-01

    Therapeutic proteins are a diverse class of drugs consisting of naturally occurring or modified proteins, and due to their size and physico-chemical properties, they can pose challenges for pharmacokinetic and pharmacodynamic studies. Physiologically-based pharmacokinetic (PBPK) modelling has been effective for early in silico prediction of the pharmacokinetic properties of new drugs. The aim of the present workshop was to discuss the feasibility of PBPK modelling of macromolecules. The classical PBPK approach was discussed with a presentation of the successful example of PBPK modelling of cyclosporine A. The PBPK model included transport of cyclosporine across cell membranes, affinity to plasma proteins and active membrane transporters to describe drug transport between physiological compartments. For macromolecules, complex PBPK modelling with permeability-limited and/or target-mediated distribution was discussed. It was generally agreed that PBPK modelling was feasible and desirable. The role of the lymphatic system should be considered when absorption after extravascular administration is modelled. Target-mediated drug disposition was regarded as an important feature for generation of PK models. Complex PK models may not be necessary when a limited number of organs are affected. More mechanistic PK/PD models will be relevant when adverse events/toxicity are included in the PK/PD modelling.

  18. Mechanistic modelling of Middle Eocene atmospheric carbon dioxide using fossil plant material

    Science.gov (United States)

    Grein, Michaela; Roth-Nebelsick, Anita; Wilde, Volker; Konrad, Wilfried; Utescher, Torsten

    2010-05-01

    Various proxies (such as pedogenic carbonates, boron isotopes or phytoplankton) and geochemical models were applied in order to reconstruct palaeoatmospheric carbon dioxide, partially providing conflicting results. Another promising proxy is the frequency of stomata (pores on the leaf surface used for gaseous exchange). In this project, fossil plant material from the Messel Pit (Hesse, Germany) is used to reconstruct atmospheric carbon dioxide concentration in the Middle Eocene by analyzing stomatal density. We applied the novel mechanistic-theoretical approach of Konrad et al. (2008) which provides a quantitative derivation of the stomatal density response (number of stomata per leaf area) to varying atmospheric carbon dioxide concentration. The model couples 1) C3-photosynthesis, 2) the process of diffusion and 3) an optimisation principle providing maximum photosynthesis (via carbon dioxide uptake) and minimum water loss (via stomatal transpiration). These three sub-models also include data of the palaeoenvironment (temperature, water availability, wind velocity, atmospheric humidity, precipitation) and anatomy of leaf and stoma (depth, length and width of stomatal porus, thickness of assimilation tissue, leaf length). In order to calculate curves of stomatal density as a function of atmospheric carbon dioxide concentration, various biochemical parameters have to be borrowed from extant representatives. The necessary palaeoclimate data are reconstructed from the whole Messel flora using Leaf Margin Analysis (LMA) and the Coexistence Approach (CA). In order to obtain a significant result, we selected three species from which a large number of well-preserved leaves is available (at least 20 leaves per species). Palaeoclimate calculations for the Middle Eocene Messel Pit indicate a warm and humid climate with mean annual temperature of approximately 22°C, up to 2540 mm mean annual precipitation and the absence of extended periods of drought. Mean relative air

  19. Mechanistic formulation of a linear-quadratic-linear (LQL) model: Split-dose experiments and exponentially decaying sources

    International Nuclear Information System (INIS)

    Guerrero, Mariana; Carlone, Marco

    2010-01-01

    Purpose: In recent years, several models were proposed that modify the standard linear-quadratic (LQ) model to make the predicted survival curve linear at high doses. Most of these models are purely phenomenological and can only be applied in the particular case of acute doses per fraction. The authors consider a mechanistic formulation of a linear-quadratic-linear (LQL) model in the case of split-dose experiments and exponentially decaying sources. This model provides a comprehensive description of radiation response for arbitrary dose rate and fractionation with only one additional parameter. Methods: The authors use a compartmental formulation of the LQL model from the literature. They analytically solve the model's differential equations for the case of a split-dose experiment and for an exponentially decaying source. They compare the solutions of the survival fraction with the standard LQ equations and with the lethal-potentially lethal (LPL) model. Results: In the case of the split-dose experiment, the LQL model predicts a recovery ratio as a function of dose per fraction that deviates from the square law of the standard LQ. The survival fraction as a function of time between fractions follows a similar exponential law as the LQ but adds a multiplicative factor to the LQ parameter β. The LQL solution for the split-dose experiment is very close to the LPL prediction. For the decaying source, the differences between the LQL and the LQ solutions are negligible when the half-life of the source is much larger than the characteristic repair time, which is the clinically relevant case. Conclusions: The compartmental formulation of the LQL model can be used for arbitrary dose rates and provides a comprehensive description of dose response. When the survival fraction for acute doses is linear for high dose, a deviation of the square law formula of the recovery ratio for split doses is also predicted.
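    For reference, the textbook LQ relations that the LQL solutions are compared against can be written out explicitly. This is a sketch of standard LQ behavior assuming complete repair between the two fractions; it is not the authors' LQL formulation:

```latex
% Acute-dose LQ survival fraction:
S(d) = e^{-\alpha d - \beta d^{2}}

% Two fractions of dose d with complete repair in between:
S_{\mathrm{split}} = S(d)^{2} = e^{-2\alpha d - 2\beta d^{2}}

% Recovery ratio relative to a single acute dose 2d -- the "square law"
% from which the LQL recovery ratio is reported to deviate:
\mathrm{RR} = \frac{S_{\mathrm{split}}}{S(2d)} = e^{2\beta d^{2}}
```

    With incomplete repair, standard LQ multiplies the cross-fraction β term by a repair factor; the abstract's LQL result adds a further multiplicative factor to β.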

  20. WE-H-BRA-07: Mechanistic Modelling of the Relative Biological Effectiveness of Heavy Charged Particles

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, S [Massachusetts General Hospital, Boston, MA (United States); Queen’s University, Belfast, Belfast (United Kingdom); McNamara, A; Schuemann, J; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States); Prise, K [Queen’s University, Belfast, Belfast (United Kingdom)

    2016-06-15

    Purpose: Uncertainty in the Relative Biological Effectiveness (RBE) of heavy charged particles compared to photons remains one of the major uncertainties in particle therapy. As RBEs depend strongly on clinical variables such as tissue type, dose, and radiation quality, more accurate individualised models are needed to fully optimise treatments. Methods: We have developed a model of DNA damage and repair following X-ray irradiation in a number of settings, incorporating mechanistic descriptions of DNA repair pathways, geometric effects on DNA repair, cell cycle effects and cell death. Our model has previously been shown to accurately predict a range of biological endpoints including chromosome aberrations, mutations, and cell death. This model was combined with nanodosimetric models of individual ion tracks to calculate the additional probability of lethal damage forming within a single track. These lethal damage probabilities can be used to predict survival and RBE for cells irradiated with ions of different Linear Energy Transfer (LET). Results: By combining the X-ray response model with nanodosimetry information, predictions of RBE can be made without cell-line specific fitting. The model’s RBE predictions were found to agree well with empirical proton RBE models (mean absolute difference between models of 1.9% and 1.8% for cells with α/β ratios of 9 and 1.4, respectively, for LETs between 0 and 15 keV/µm). The model also accurately recovers the impact of high-LET carbon ion exposures, showing both the reduced efficacy of ions at extremely high LET, as well as the impact of defects in non-homologous end joining on RBE values in Chinese Hamster Ovary cells. Conclusion: Our model predicts RBE without the inclusion of empirical LET fitting parameters for a range of experimental conditions. This approach has the potential to deliver improved personalisation of particle therapy, with future developments allowing for the calculation of individualised RBEs. SJM is

  1. Mechanistic modeling of sulfur-deprived photosynthesis and hydrogen production in suspensions of Chlamydomonas reinhardtii.

    Science.gov (United States)

    Williams, C R; Bees, M A

    2014-02-01

    The ability of unicellular green algal species such as Chlamydomonas reinhardtii to produce hydrogen gas via iron-hydrogenase is well known. However, the oxygen-sensitive hydrogenase is closely linked to the photosynthetic chain in such a way that hydrogen and oxygen production need to be separated temporally for sustained photo-production. Under illumination, sulfur-deprivation has been shown to accommodate the production of hydrogen gas by partially-deactivating O2 evolution activity, leading to anaerobiosis in a sealed culture. As these facets are coupled, and the system complex, mathematical approaches potentially are of significant value since they may reveal improved or even optimal schemes for maximizing hydrogen production. Here, a mechanistic model of the system is constructed from consideration of the essential pathways and processes. The role of sulfur in photosynthesis (via PSII) and the storage and catabolism of endogenous substrate, and thus growth and decay of culture density, are explicitly modeled in order to describe and explore the complex interactions that lead to H2 production during sulfur-deprivation. As far as possible, functional forms and parameter values are determined or estimated from experimental data. The model is compared with published experimental studies and, encouragingly, qualitative agreement for trends in hydrogen yield and initiation time are found. It is then employed to probe optimal external sulfur and illumination conditions for hydrogen production, which are found to differ depending on whether a maximum yield of gas or initial production rate is required. The model constitutes a powerful theoretical tool for investigating novel sulfur cycling regimes that may ultimately be used to improve the commercial viability of hydrogen gas production from microorganisms. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
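    The temporal separation described above can be caricatured in a few lines: PSII activity decays under sulfur deprivation, net O2 evolution falls below respiratory consumption, the sealed culture turns anaerobic, and only then does H2 production begin. All rates and initial values below are illustrative assumptions, not the paper's fitted parameters:

```python
import math

K_PSII, RESP, K_H2, DT = 0.05, 0.3, 0.2, 0.1   # 1/h, relative units

o2, h2, t_anaerobic = 2.0, 0.0, None
t = 0.0
while t < 100.0:
    psii = math.exp(-K_PSII * t)                 # PSII activity decays without sulfur
    o2 = max(o2 + (psii - RESP) * DT, 0.0)       # net O2: evolution minus respiration
    if o2 == 0.0:
        if t_anaerobic is None:
            t_anaerobic = t                      # hydrogenase becomes active here
        h2 += K_H2 * psii * DT                   # H2 limited by residual PSII electron flow
    t += DT

print(t_anaerobic, h2)
```

    Even this toy version reproduces the qualitative coupling the abstract exploits: an initiation delay before anaerobiosis, followed by a H2 yield limited by the decaying PSII activity.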

  2. Mechanistically-Based Field-Scale Models of Uranium Biogeochemistry from Upscaling Pore-Scale Experiments and Models

    International Nuclear Information System (INIS)

    Tim Scheibe; Alexandre Tartakovsky; Brian Wood; Joe Seymour

    2007-01-01

    Effective environmental management of DOE sites requires reliable prediction of reactive transport phenomena. A central issue in prediction of subsurface reactive transport is the impact of multiscale physical, chemical, and biological heterogeneity. Heterogeneity manifests itself through incomplete mixing of reactants at scales below those at which concentrations are explicitly defined (i.e., the numerical grid scale). This results in a mismatch between simulated reaction processes (formulated in terms of average concentrations) and actual processes (controlled by local concentrations). At the field scale, this results in apparent scale-dependence of model parameters and inability to utilize laboratory parameters in field models. Accordingly, most field modeling efforts are restricted to empirical estimation of model parameters by fitting to field observations, which renders extrapolation of model predictions beyond fitted conditions unreliable. The objective of this project is to develop a theoretical and computational framework for (1) connecting models of coupled reactive transport from pore-scale processes to field-scale bioremediation through a hierarchy of models that maintain crucial information from the smaller scales at the larger scales; and (2) quantifying the uncertainty that is introduced by both the upscaling process and uncertainty in physical parameters. One of the challenges of addressing scale-dependent effects of coupled processes in heterogeneous porous media is the problem-specificity of solutions. Much effort has been aimed at developing generalized scaling laws or theories, but these require restrictive assumptions that render them ineffective in many real problems. We propose instead an approach that applies physical and numerical experiments at small scales (specifically the pore scale) to a selected model system in order to identify the scaling approach appropriate to that type of problem. Although the results of such studies will

  3. Mechanistically-Based Field-Scale Models of Uranium Biogeochemistry from Upscaling Pore-Scale Experiments and Models

    Energy Technology Data Exchange (ETDEWEB)

    Tim Scheibe; Alexandre Tartakovsky; Brian Wood; Joe Seymour

    2007-04-19

    Effective environmental management of DOE sites requires reliable prediction of reactive transport phenomena. A central issue in prediction of subsurface reactive transport is the impact of multiscale physical, chemical, and biological heterogeneity. Heterogeneity manifests itself through incomplete mixing of reactants at scales below those at which concentrations are explicitly defined (i.e., the numerical grid scale). This results in a mismatch between simulated reaction processes (formulated in terms of average concentrations) and actual processes (controlled by local concentrations). At the field scale, this results in apparent scale-dependence of model parameters and inability to utilize laboratory parameters in field models. Accordingly, most field modeling efforts are restricted to empirical estimation of model parameters by fitting to field observations, which renders extrapolation of model predictions beyond fitted conditions unreliable. The objective of this project is to develop a theoretical and computational framework for (1) connecting models of coupled reactive transport from pore-scale processes to field-scale bioremediation through a hierarchy of models that maintain crucial information from the smaller scales at the larger scales; and (2) quantifying the uncertainty that is introduced by both the upscaling process and uncertainty in physical parameters. One of the challenges of addressing scale-dependent effects of coupled processes in heterogeneous porous media is the problem-specificity of solutions. Much effort has been aimed at developing generalized scaling laws or theories, but these require restrictive assumptions that render them ineffective in many real problems. We propose instead an approach that applies physical and numerical experiments at small scales (specifically the pore scale) to a selected model system in order to identify the scaling approach appropriate to that type of problem. Although the results of such studies will

  4. Mechanistic model for dispersion coefficients in bubble column

    CSIR Research Space (South Africa)

    Skosana, PJ

    2015-05-01

    Full Text Available predicts axial and radial dispersion coefficients that are of the same order of magnitude as the reported data. Whereas the model is based on a description of the underlying physical phenomena, its validity and extrapolation are expected to be more reliable...

  5. Managing mechanistic and organic structure in health care organizations.

    Science.gov (United States)

    Olden, Peter C

    2012-01-01

    Managers at all levels in a health care organization must organize work to achieve the organization's mission and goals. This requires managers to decide the organization structure, which involves dividing the work among jobs and departments and then coordinating them all toward the common purpose. Organization structure, which is reflected in an organization chart, may range on a continuum from very mechanistic to very organic. Managers must decide how mechanistic versus how organic to make the entire organization and each of its departments. To do this, managers should carefully consider 5 factors for the organization and for each individual department: external environment, goals, work production, size, and culture. Some factors may push toward more mechanistic structure, whereas others may push in the opposite direction toward more organic structure. Practical advice can help managers at all levels design appropriate structure for their departments and organization.

  6. Cognitive science as an interface between rational and mechanistic explanation.

    Science.gov (United States)

    Chater, Nick

    2014-04-01

    Cognitive science views thought as computation; and computation, by its very nature, can be understood in both rational and mechanistic terms. In rational terms, a computation solves some information processing problem (e.g., mapping sensory information into a description of the external world; parsing a sentence; selecting among a set of possible actions). In mechanistic terms, a computation corresponds to a causal chain of events in a physical device (in an engineering context, a silicon chip; in a biological context, the nervous system). The discipline is thus at the interface between two very different styles of explanation; as the papers in the current special issue well illustrate, it explores the interplay of rational and mechanistic forces. Copyright © 2014 Cognitive Science Society, Inc.

  7. Mechanistic Understanding of Microbial Plugging for Improved Sweep Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Larry Britton

    2008-09-30

    Microbial plugging has been proposed as an effective low cost method of permeability reduction. Yet there is a dearth of information on the fundamental processes of microbial growth in porous media, and there are no suitable data to model the process of microbial plugging as it relates to sweep efficiency. To optimize the field implementation, better mechanistic and volumetric understanding of biofilm growth within a porous medium is needed. In particular, the engineering design hinges upon a quantitative relationship between amount of nutrient consumption, amount of growth, and degree of permeability reduction. In this project experiments were conducted to obtain new data to elucidate this relationship. Experiments in heterogeneous (layered) beadpacks showed that microbes could grow preferentially in the high permeability layer. Ultimately this caused flow to be equally divided between high and low permeability layers, precisely the behavior needed for MEOR. Remarkably, classical models of microbial nutrient uptake in batch experiments do not explain the nutrient consumption by the same microbes in flow experiments. We propose a simple extension of classical kinetics to account for the self-limiting consumption of nutrient observed in our experiments, and we outline a modeling approach based on architecture and behavior of biofilms. Such a model would account for the changing trend of nutrient consumption by bacteria with the increasing biomass and the onset of biofilm formation. However no existing model can explain the microbial preference for growth in high permeability regions, nor is there any obvious extension of the model for this observation. An attractive conjecture is that quorum sensing is involved in the heterogeneous bead packs.
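    The kinetic discrepancy described above can be illustrated by contrasting classical Monod uptake with a hypothetical self-limiting variant in which consumption shuts down as biomass approaches a carrying capacity (e.g. biofilm saturation). The parameter values and the carrying-capacity mechanism are illustrative assumptions, not the project's proposed extension:

```python
def simulate(uptake, s0=10.0, x0=0.1, dt=0.01, t_end=50.0, yield_coeff=0.5):
    """Euler integration of substrate s and biomass x under a given uptake law."""
    s, x = s0, x0
    for _ in range(int(t_end / dt)):
        q = uptake(s, x)                   # specific uptake rate
        s, x = max(s - q * x * dt, 0.0), x + yield_coeff * q * x * dt
    return s, x

def monod(s, x, q_max=0.5, k_s=1.0):
    # Classical batch kinetics: uptake depends on substrate alone.
    return q_max * s / (k_s + s)

def self_limiting(s, x, q_max=0.5, k_s=1.0, x_max=2.0):
    # Hypothetical extension: uptake tapers off as biomass approaches x_max.
    return q_max * s / (k_s + s) * max(1.0 - x / x_max, 0.0)

s_monod, _ = simulate(monod)
s_lim, _ = simulate(self_limiting)
print(s_monod, s_lim)   # the self-limiting law leaves residual nutrient unconsumed
```

    Classical Monod kinetics drives the substrate essentially to zero, whereas the self-limiting law plateaus and leaves nutrient behind, the qualitative signature the flow experiments exhibited.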

  8. Mechanistic kinetic modeling generates system-independent P-glycoprotein mediated transport elementary rate constants for inhibition and, in combination with 3D SIM microscopy, elucidates the importance of microvilli morphology on P-glycoprotein mediated efflux activity.

    Science.gov (United States)

    Ellens, Harma; Meng, Zhou; Le Marchand, Sylvain J; Bentz, Joe

    2018-06-01

    In vitro transporter kinetics are typically analyzed by steady-state Michaelis-Menten approximations. However, no clear evidence exists that these approximations, applied to multiple transporters in biological membranes, yield system-independent mechanistic parameters needed for reliable in vivo hypothesis generation and testing. Areas covered: The classical mass action model has been developed for P-glycoprotein (P-gp) mediated transport across confluent polarized cell monolayers. Numerical integration of the mass action equations for transport using a stable global optimization program yields fitted elementary rate constants that are system-independent. The efflux active P-gp was defined by the rate at which P-gp delivers drugs to the apical chamber, since as much as 90% of drugs effluxed by P-gp partition back into nearby microvilli prior to reaching the apical chamber. The efflux active P-gp concentration was 10-fold smaller than the total expressed P-gp for Caco-2 cells, due to their microvilli membrane morphology. The mechanistic insights from this analysis are readily extrapolated to P-gp mediated transport in vivo. Expert opinion: In vitro system-independent elementary rate constants for transporters are essential for the generation and validation of robust mechanistic PBPK models. Our modeling approach and programs have broad application potential. They can be used for any drug transporter with minor adaptations.

  9. A Mechanistic Model of Intermittent Gastric Emptying and Glucose-Insulin Dynamics following a Meal Containing Milk Components.

    Directory of Open Access Journals (Sweden)

    Priska Stahel

    Full Text Available To support decision-making around diet selection choices to manage glycemia following a meal, a novel mechanistic model of intermittent gastric emptying and plasma glucose-insulin dynamics was developed. Model development was guided by postprandial timecourses of plasma glucose, insulin and the gastric emptying marker acetaminophen in infant calves fed meals of 2 or 4 L milk replacer. Assigning a fast, slow or zero first-order gastric emptying rate to each interval between plasma samples fit acetaminophen curves with prediction errors equal to 9% of the mean observed acetaminophen concentration. Those gastric emptying parameters were applied to glucose appearance in conjunction with minimal models of glucose disposal and insulin dynamics to describe postprandial glycemia and insulinemia. The final model contains 20 parameters, 8 of which can be obtained by direct measurement and 12 by fitting to observations. The minimal model of intestinal glucose delivery contains 2 gastric emptying parameters and a third parameter describing the time lag between emptying and appearance of glucose in plasma. Sensitivity analysis of the aggregate model revealed that gastric emptying rate influences area under the plasma insulin curve but has little effect on area under the plasma glucose curve. This result indicates that pancreatic responsiveness is influenced by gastric emptying rate as a consequence of the quasi-exponential relationship between plasma glucose concentration and pancreatic insulin release. The fitted aggregate model was able to reproduce the multiple postprandial rises and falls in plasma glucose concentration observed in calves consuming a normal-sized meal containing milk components.
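    The intermittent-emptying submodel described above assigns each sampling interval a fast, slow, or zero first-order emptying rate. A minimal sketch of that piecewise scheme follows; the interval boundaries and rate values are illustrative assumptions, not the fitted ones:

```python
import math

INTERVALS = [(0.0, 30.0, 0.05),    # fast emptying (1/min)
             (30.0, 60.0, 0.0),    # pause: zero emptying rate
             (60.0, 120.0, 0.01)]  # slow emptying

def gastric_contents(t, v0=2.0):
    """Piecewise first-order decay of gastric volume (L) up to time t (min)."""
    v = v0
    for start, end, k in INTERVALS:
        if t <= start:
            break
        v *= math.exp(-k * (min(t, end) - start))
    return v

print(gastric_contents(0.0), gastric_contents(30.0), gastric_contents(120.0))
```

    During a zero-rate interval the gastric volume holds constant, which is what lets this representation reproduce the multiple postprandial glucose rises and falls from a single meal.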

  10. A mechanistic diagnosis of the simulation of soil CO2 efflux of the ACME Land Model

    Science.gov (United States)

    Liang, J.; Ricciuto, D. M.; Wang, G.; Gu, L.; Hanson, P. J.; Mayes, M. A.

    2017-12-01

    Accurate simulation of the CO2 efflux from soils (i.e., soil respiration) to the atmosphere is critical to project global biogeochemical cycles and the magnitude of climate change in Earth system models (ESMs). Currently, soil respiration simulated by ESMs still has large uncertainty. In this study, a mechanistic diagnosis of soil respiration in the Accelerated Climate Model for Energy (ACME) Land Model (ALM) was conducted using long-term observations at the Missouri Ozark AmeriFlux (MOFLUX) forest site in the central U.S. The results showed that the ALM default run significantly underestimated annual soil respiration and gross primary production (GPP), while incorrectly estimating soil water potential. Improved simulations of soil water potential with site-specific data significantly improved the modeled annual soil respiration, primarily because annual GPP was simultaneously improved. Therefore, simulations of soil water potential must be carefully calibrated in ESMs. Despite improved annual soil respiration, the ALM continued to underestimate soil respiration during peak growing seasons, and to overestimate soil respiration during non-peak growing seasons. Simulations involving increased GPP during peak growing seasons increased soil respiration, while neither improved plant phenology nor increased temperature sensitivity affected the simulation of soil respiration during non-peak growing seasons. One potential reason for the overestimation of the soil respiration during non-peak growing seasons may be that the current model structure is substrate-limited, while microbial dormancy under stress may cause the system to become decomposer-limited. Further studies with more microbial data are required to provide adequate representation of soil respiration and to understand the underlying reasons for inaccurate model simulations.

  11. Prediction of the Pharmacokinetics, Pharmacodynamics, and Efficacy of a Monoclonal Antibody, Using a Physiologically Based Pharmacokinetic FcRn Model

    Science.gov (United States)

    Chetty, Manoranjenni; Li, Linzhong; Rose, Rachel; Machavaram, Krishna; Jamei, Masoud; Rostami-Hodjegan, Amin; Gardner, Iain

    2015-01-01

    Although advantages of physiologically based pharmacokinetic models (PBPK) are now well established, PBPK models that are linked to pharmacodynamic (PD) models to predict pharmacokinetics (PK), PD, and efficacy of monoclonal antibodies (mAbs) in humans are uncommon. The aim of this study was to develop a PD model that could be linked to a physiologically based mechanistic FcRn model to predict PK, PD, and efficacy of efalizumab. The mechanistic FcRn model for mAbs with target-mediated drug disposition within the Simcyp population-based simulator was used to simulate the pharmacokinetic profiles for three different single doses and two multiple doses of efalizumab administered to virtual Caucasian healthy volunteers. The elimination of efalizumab was modeled with both a target-mediated component (specific) and catabolism in the endosome (non-specific). This model accounted for the binding between neonatal Fc receptor (FcRn) and efalizumab (protective against elimination) and for changes in CD11a target concentration. An integrated response model was then developed to predict the changes in mean Psoriasis Area and Severity Index (PASI) scores that were measured in a clinical study as an efficacy marker for efalizumab treatment. PASI scores were approximated as continuous and following a first-order asymptotic progression model. The reported steady state asymptote (Yss) and baseline score [Y(0)] were applied and parameter estimation was used to determine the half-life of progression (Tp) of psoriasis. Results suggested that simulations using this model were able to recover the changes in PASI scores (indicating efficacy) observed during clinical studies. Simulations of both single dose and multiple doses of efalizumab concentration-time profiles as well as suppression of CD11a concentrations recovered clinical data reasonably well. It can be concluded that the developed PBPK FcRn model linked to a PD model adequately predicted PK, PD, and efficacy of efalizumab.
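    One plausible closed form for the first-order asymptotic progression described above relaxes the score from baseline Y(0) toward the steady-state asymptote Yss with half-life of progression Tp. The numeric values below are illustrative, not the reported clinical estimates:

```python
import math

def pasi(t, y0=20.0, y_ss=5.0, t_half=30.0):
    """PASI score relaxing from baseline y0 toward asymptote y_ss,
    with half-life of progression t_half (Tp, in days)."""
    return y_ss + (y0 - y_ss) * math.exp(-math.log(2.0) * t / t_half)

print(pasi(0.0), pasi(30.0))  # baseline; halfway to the asymptote after one Tp
```

    Fitting Tp to observed PASI timecourses, as the study did by parameter estimation, then fully determines the efficacy trajectory given Y(0) and Yss.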

  12. Insight into the hydraulics and resilience of Ponderosa pine seedlings using a mechanistic ecohydrologic model

    Science.gov (United States)

    Maneta, M. P.; Simeone, C.; Dobrowski, S.; Holden, Z.; Sapes, G.; Sala, A.; Begueria, S.

    2017-12-01

    In semiarid regions, drought-induced seedling mortality is considered to be caused by failure in the tree hydraulic column. Understanding the mechanisms that cause hydraulic failure and death in seedlings is important, among other things, to diagnose where some tree species may fail to regenerate, triggering demographic imbalances in the forest that could result in climate-driven shifts of tree species. Ponderosa pine is a common lower tree line species in the western US. Seedlings of ponderosa pine are often subject to low soil water potentials, which require lower water potentials in the xylem and leaves to maintain the negative pressure gradient that drives water upward. The resilience of the hydraulic column to hydraulic tension is species dependent, but from greenhouse experiments we have identified general tension thresholds beyond which loss of xylem conductivity becomes critical and mortality in Ponderosa pine seedlings starts to occur. We describe this hydraulic behavior of plants using a mechanistic soil-vegetation-atmosphere transfer model. Before we use this model to understand water-stress-induced seedling mortality at the landscape scale, we perform a modeling analysis of the dynamics of soil moisture, transpiration, leaf water potential and loss of plant water conductivity using detailed data from our greenhouse experiments. The analysis is done using a spatially distributed model that simulates water fluxes, energy exchanges and water potentials in the soil-vegetation-atmosphere continuum. Plant hydraulic and physiological parameters of this model were calibrated using Monte Carlo methods against information on soil moisture, soil hydraulic potential, transpiration, leaf water potential and percent loss of conductivity in the xylem. This analysis permits us to construct a full portrait of the parameter space for Ponderosa pine seedlings and generate posterior predictive distributions of tree response to understand the sensitivity of transpiration

  13. Cell physiology based pharmacodynamic modeling of antimicrobial drug combinations

    OpenAIRE

    Hethey, Christoph Philipp

    2017-01-01

    Mathematical models of bacterial growth have been successfully applied to study the relationship between antibiotic drug exposure and the antibacterial effect. Since these models typically lack a representation of cellular processes and cell physiology, the mechanistic integration of drug action is not possible on the cellular level. The cellular mechanisms of drug action, however, are particularly relevant for the prediction, analysis and understanding of interactions between antibiotics. In...

  14. In vitro solubility, dissolution and permeability studies combined with semi-mechanistic modeling to investigate the intestinal absorption of desvenlafaxine from an immediate- and extended release formulation.

    Science.gov (United States)

    Franek, F; Jarlfors, A; Larsen, F; Holm, P; Steffansen, B

    2015-09-18

    Desvenlafaxine is a biopharmaceutics classification system (BCS) class 1 (high solubility, high permeability) and biopharmaceutical drug disposition classification system (BDDCS) class 3 (high solubility, poor metabolism; implying low permeability) compound. Thus, the rate-limiting step for desvenlafaxine absorption (i.e. intestinal dissolution or permeation) is not fully clarified. The aim of this study was to investigate whether dissolution and/or intestinal permeability rate-limit desvenlafaxine absorption from an immediate-release formulation (IRF) and Pristiq(®), an extended release formulation (ERF). Semi-mechanistic models of desvenlafaxine were built (using SimCyp(®)) by combining in vitro data on dissolution and permeation (mechanistic part of model) with clinical data (obtained from literature) on distribution and clearance (non-mechanistic part of model). The model predictions of desvenlafaxine pharmacokinetics after IRF and ERF administration were compared with published clinical data from 14 trials. Desvenlafaxine in vivo dissolution from the IRF and ERF was predicted from in vitro solubility studies and biorelevant dissolution studies (using the USP3 dissolution apparatus), respectively. Desvenlafaxine apparent permeability (Papp) at varying apical pH was investigated using the Caco-2 cell line and extrapolated to effective intestinal permeability (Peff) in human duodenum, jejunum, ileum and colon. Desvenlafaxine pKa values and octanol-water partition coefficients (Do:w) were determined experimentally. Due to predicted rapid dissolution after IRF administration, desvenlafaxine was predicted to be available for permeation in the duodenum. Desvenlafaxine Do:w and Papp increased approximately 13-fold when increasing apical pH from 5.5 to 7.4. Desvenlafaxine Peff thus increased with pH down the small intestine. Consequently, desvenlafaxine absorption from an IRF appears rate-limited by low Peff in the upper small intestine, which "delays" the predicted
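    The reported rise in permeability with apical pH can be rationalized qualitatively via the Henderson-Hasselbalch relation: for a basic drug, only the un-ionized fraction readily crosses membranes. The pKa below is an arbitrary illustration for an idealized monoprotic base; desvenlafaxine's measured pKa values and ampholytic behavior give a different magnitude (the study observed roughly 13-fold):

```python
def fraction_unionized(ph, pka=9.0):
    """Henderson-Hasselbalch: un-ionized fraction of a monoprotic base."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

ratio = fraction_unionized(7.4) / fraction_unionized(5.5)
print(ratio)  # the permeant fraction grows steeply from pH 5.5 to 7.4
```

    Whatever the exact pKa, the monotone increase of the un-ionized fraction with pH explains why Peff increases down the small intestine as luminal pH rises.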

  15. A dynamic data based model describing nephropathia epidemica in Belgium

    NARCIS (Netherlands)

    Amirpour Haredasht, S.; Barrios, J.M.; Maes, P.; Verstraeten, W.W.; Clement, J.; Ducoffre, G.; Lagrou, K.; Van Ranst, M.; Coppin, P.; Berckmans, D.; Aerts, J.M.F.G.

    2011-01-01

    Nephropathia epidemica (NE) is a human infection caused by Puumala virus (PUUV), which is naturally carried and shed by bank voles (Myodes glareolus). Population dynamics and infectious diseases in general, such as NE, have often been modelled with mechanistic SIR (Susceptible, Infective and Remove with
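    The mechanistic SIR framework the record refers to can be sketched minimally with Euler integration of the classic compartmental equations; the parameter values are illustrative, not fitted to NE/PUUV bank vole data:

```python
def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, t_end=200.0):
    """Euler integration of the classic SIR ordinary differential equations."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(t_end / dt)):
        new_inf = beta * s * i * dt   # mass-action transmission
        new_rec = gamma * i * dt      # removal of infectives
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

s_end, i_end, r_end = sir()
print(s_end, i_end, r_end)  # epidemic has burned out; most hosts removed
```

    Population totals are conserved by construction, and with beta/gamma = 3 the outbreak infects most of the population before dying out.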

  16. ['Anatomia actuosa et apta'. The mechanist 'proto'-physiology of B.S. Albinus].

    Science.gov (United States)

    van der Korst, J K

    1993-01-01

    Already during his tenure as professor of anatomy and surgery (1721-1746), and before he became professor of physiology and medicine at the University of Leiden, Bernard Siegfried Albinus held private lecture courses on physiology. In these lectures he pleaded for a separation of physiology from theoretical medicine, where it still customarily resided in the medical curriculum of the first half of the eighteenth century. According to Albinus, physiology was a science in its own right and should be based solely on the careful observation of forms and structures of the human body. From the 'fabrica', the function ('aptitudo') could be derived by careful reasoning. As shown by a set of lecture notes which recently came to light, Albinus initially adhered to a strictly mechanistic explanatory model, almost completely based on the physiological concepts of Herman Boerhaave. However, in contrast to the latter, he even rejected the involvement of chemical processes in digestion. Although his lectures were highly acclaimed as demonstrations of minute anatomy, Albinus met with little or no direct response in regard to his concept of physiology.

  17. The challenge of making ozone risk assessment for forest trees more mechanistic

    International Nuclear Information System (INIS)

    Matyssek, R.; Sandermann, H.; Wieser, G.; Booker, F.; Cieslik, S.; Musselman, R.; Ernst, D.

    2008-01-01

    Upcoming decades will experience increasing atmospheric CO2 and likely enhanced O3 exposure which represents a risk for the carbon sink strength of forests, so that the need for cause-effect related O3 risk assessment increases. Although assessment will gain in reliability on an O3 uptake basis, risk is co-determined by the effective dose, i.e. the plant's sensitivity per O3 uptake. Recent progress in research on the molecular and metabolic control of the effective O3 dose is reported along with advances in empirically assessing O3 uptake at the whole-tree and stand level. Knowledge on both O3 uptake and effective dose (measures of stress avoidance and tolerance, respectively) needs to be understood mechanistically and linked as a pre-requisite before practical use of process-based O3 risk assessment can be implemented. To this end, perspectives are derived for validating and promoting new O3 flux-based modelling tools. - Clarifying and linking mechanisms of O3 uptake and effective dose are research challenges highlighted in view of recent progress and perspectives towards cause-effect based risk assessment

  18. Model-based analysis of a twin-screw wet granulation system for continuous solid dosage manufacturing

    DEFF Research Database (Denmark)

    Kumar, Ashish; Vercruysse, Jurgen; Mortier, Severine T. F. C.

    2016-01-01

    Implementation of twin-screw granulation in a continuous from-powder-to-tablet manufacturing line requires process knowledge development. This is often pursued by application of mechanistic models incorporating the underlying mechanisms. In this study, granulation mechanisms considered to be dominant in the kneading element regions of the granulator, i.e., aggregation and breakage, were included in a one-dimensional population balance model. The model was calibrated using the experimentally determined inflow granule size distribution, and the mean residence time was used as additional input...
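
    The aggregation-breakage structure of such a one-dimensional population balance can be sketched numerically. The following is a minimal, illustrative discretization (explicit Euler, integer size classes, invented rate constants), not the calibrated model from the study:

```python
import numpy as np

def pbm_step(n, dt, k_agg=0.01, k_break=0.1):
    """One explicit-Euler step of a discrete 1-D population balance.
    n[i] is the number density of granules of size (i + 1) arbitrary units.
    Aggregation follows the discrete Smoluchowski form; breakage splits a
    granule into two near-equal fragments. Rate constants are illustrative."""
    N = len(n)
    dndt = np.zeros(N)
    # aggregation: (i, j) -> i + j, skipping events beyond the largest class
    for i in range(N):
        for j in range(N):
            k = (i + 1) + (j + 1)          # size of the aggregate
            if k > N:
                continue
            rate = k_agg * n[i] * n[j]
            dndt[i] -= 0.5 * rate          # ordered pairs count each event twice
            dndt[j] -= 0.5 * rate
            dndt[k - 1] += 0.5 * rate
    # binary breakage: size s -> floor(s/2) + ceil(s/2)
    for i in range(1, N):
        s = i + 1
        rate = k_break * n[i]
        dndt[i] -= rate
        dndt[s // 2 - 1] += rate
        dndt[(s - s // 2) - 1] += rate
    return n + dt * dndt

sizes = np.arange(1, 11)        # size (mass) of each class, arbitrary units
n = np.zeros(10)
n[0] = 100.0                    # start from primary particles only
mass0 = float(sizes @ n)
for _ in range(200):
    n = pbm_step(n, dt=1e-3)
```

    Because aggregation and breakage only redistribute material between classes, total mass (sum of size times density) is conserved, which is a convenient sanity check on any such scheme.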

  19. Functionally relevant climate variables for arid lands: A climatic water deficit approach for modelling desert shrub distributions

    Science.gov (United States)

    Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers

    2015-01-01

    We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...

  20. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    Science.gov (United States)

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints, validation for each model, and a comparison of model performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model); the second, on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation-oriented framework with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water for a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent the growth requirements of each species within the group. Each model's performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for the seasonal dynamics of the phytoplankton community and the main biogeochemical variables over a one-year time horizon are presented and compared for both models, showing the enhanced performance of the functional group model. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Mechanistic models for cancer development after short time radiation exposure

    International Nuclear Information System (INIS)

    Kottbauer, M. M.

    1997-12-01

    In this work two biologically based models were developed: first, the single-hit model for solid tumors (SHM-S), and second, the single-hit model for leukemia (SHM-L). These models are a further development of the Armitage-Doll model for the special case of a short-time radiation exposure. The basis of the models is the multistage process of carcinogenesis. The single-hit models simultaneously provide the age-dependent cancer rate of spontaneous and radiation-induced tumors as well as the dose-effect relationships at any age after exposure. The SHM-S leads to a biologically based dose-effect relationship similar to the relative risk model suggested by ICRP 60, and it describes the increased mortality rate of the bomb survivors more accurately than the relative risk model. The SHM-L results in an additive dose-effect relationship. It is shown that only small differences in the derivation of the two models lead to the two dose-effect relationships. Besides the radiation exposure, the new models consider the decrease of the cancer mortality rate at higher ages (age > 75), which can be traced back mainly to three causes: competing causes of death, reduction of cell proliferation and reduction of risk groups. The single-hit models also consider childhood cancer, the different rates of incidence and mortality, the influence of the immune system and the cell-killing effect. (author)
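
    The multistage structure behind such models can be illustrated with a small sketch. Assuming, purely for illustration, that each of k stages occurs sequentially at the same rate mu, the waiting time to cancer is Erlang-distributed and the age-specific rate rises as a power of age (the Armitage-Doll signature); a short exposure that completes one stage leaves only k - 1 stages and raises the subsequent hazard. This is a hedged caricature, not the SHM-S/SHM-L formulation itself:

```python
from math import exp, factorial

def stage_pdf(t, k, mu):
    """Density of the age at which the k-th (final) stage is completed,
    when each stage occurs sequentially at rate mu (Erlang distribution)."""
    return mu ** k * t ** (k - 1) * exp(-mu * t) / factorial(k - 1)

def stage_sf(t, k, mu):
    """Probability that fewer than k stages are complete by age t."""
    return exp(-mu * t) * sum((mu * t) ** i / factorial(i) for i in range(k))

def hazard(t, k, mu):
    """Age-specific cancer rate; for mu*t << 1 it approaches
    mu**k * t**(k-1) / (k-1)!, i.e. a power of age."""
    return stage_pdf(t, k, mu) / stage_sf(t, k, mu)

# a brief exposure at some earlier age that completes one stage leaves
# k - 1 stages, so the post-exposure hazard is that of a (k-1)-stage process
spontaneous = hazard(50.0, 5, 1e-3)
after_hit = hazard(50.0, 4, 1e-3)
```

    With these toy numbers the post-exposure rate lies well above the spontaneous rate at the same age, qualitatively mirroring an elevated risk after a short-time exposure.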

  2. Mechanistic and kinetic aspects of pentose dehydration towards furfural in aqueous media employing homogeneous catalysis

    NARCIS (Netherlands)

    Danon, B.; Marcotullio, G.; De Jong, W.

    2013-01-01

    In this paper both the mechanistic and kinetic aspects of furfural formation from pentoses in aqueous acidic media have been reviewed. Based on the reviewed literature, a comprehensive reaction mechanism has been proposed consisting of more than one route, all starting from acyclic xylose, and

  3. Mechanistic study of aerosol dry deposition on vegetated canopies; Etude mecaniste du depot sec d'aerosols sur les couverts vegetaux

    Energy Technology Data Exchange (ETDEWEB)

    Petroff, A

    2005-04-15

    The dry deposition of aerosols onto vegetated canopies is modelled through a mechanistic approach. The interaction between aerosols and vegetation is first formulated using a set of parameters defined at the local scale of a single surface. The overall deposition is then deduced at the canopy scale through an up-scaling procedure based on the statistical distribution of these parameters. The model takes into account the structural and morphological properties of the canopy and the main characteristics of the turbulent flow. The deposition mechanisms considered are Brownian diffusion, interception, and inertial and turbulent impaction. The model is applied first to coniferous branches and then to entire canopies of different roughness, such as grass, crop fields and forest. (author)

  4. Mechanistic characterization and molecular modeling of hepatitis B virus polymerase resistance to entecavir.

    Science.gov (United States)

    Walsh, Ann W; Langley, David R; Colonno, Richard J; Tenney, Daniel J

    2010-02-12

    Entecavir (ETV) is a deoxyguanosine analog competitive inhibitor of hepatitis B virus (HBV) polymerase that exhibits delayed chain termination of HBV DNA. A high barrier to entecavir-resistance (ETVr) is observed clinically, likely due to its potency and a requirement for multiple resistance changes to overcome suppression. Changes in the HBV polymerase reverse-transcriptase (RT) domain involve lamivudine-resistance (LVDr) substitutions in the conserved YMDD motif (M204V/I +/- L180M), plus an additional ETV-specific change at residues T184, S202 or M250. These substitutions surround the putative dNTP binding site or primer grip regions of the HBV RT. To determine the mechanistic basis for ETVr, wildtype, lamivudine-resistant (M204V, L180M) and ETVr HBVs were studied using in vitro RT enzyme and cell culture assays, as well as molecular modeling. Resistance substitutions significantly reduced ETV incorporation and chain termination in HBV DNA and increased the ETV-TP inhibition constant (K(i)) for HBV RT. Resistant HBVs exhibited impaired replication in culture and reduced enzyme activity (k(cat)) in vitro. Molecular modeling of the HBV RT suggested that ETVr residue T184 was adjacent to and stabilized S202 within the LVDr YMDD loop. ETVr arose through steric changes at T184 or S202 or by disruption of hydrogen-bonding between the two, both of which repositioned the loop and reduced the ETV-triphosphate (ETV-TP) binding pocket. In contrast to T184 and S202 changes, ETVr at primer grip residue M250 was observed during RNA-directed DNA synthesis only. Experimentally, M250 changes also impacted the dNTP-binding site. Modeling suggested a novel mechanism for M250 resistance, whereby repositioning of the primer-template component of the dNTP-binding site shifted the ETV-TP binding pocket. No structural data are available to confirm the HBV RT modeling, however, results were consistent with phenotypic analysis of comprehensive substitutions of each ETVr position

  5. Mechanistic characterization and molecular modeling of hepatitis B virus polymerase resistance to entecavir.

    Directory of Open Access Journals (Sweden)

    Ann W Walsh

    Full Text Available BACKGROUND: Entecavir (ETV) is a deoxyguanosine analog competitive inhibitor of hepatitis B virus (HBV) polymerase that exhibits delayed chain termination of HBV DNA. A high barrier to entecavir-resistance (ETVr) is observed clinically, likely due to its potency and a requirement for multiple resistance changes to overcome suppression. Changes in the HBV polymerase reverse-transcriptase (RT) domain involve lamivudine-resistance (LVDr) substitutions in the conserved YMDD motif (M204V/I +/- L180M), plus an additional ETV-specific change at residues T184, S202 or M250. These substitutions surround the putative dNTP binding site or primer grip regions of the HBV RT. METHODS/PRINCIPAL FINDINGS: To determine the mechanistic basis for ETVr, wildtype, lamivudine-resistant (M204V, L180M) and ETVr HBVs were studied using in vitro RT enzyme and cell culture assays, as well as molecular modeling. Resistance substitutions significantly reduced ETV incorporation and chain termination in HBV DNA and increased the ETV-TP inhibition constant (K(i)) for HBV RT. Resistant HBVs exhibited impaired replication in culture and reduced enzyme activity (k(cat)) in vitro. Molecular modeling of the HBV RT suggested that ETVr residue T184 was adjacent to and stabilized S202 within the LVDr YMDD loop. ETVr arose through steric changes at T184 or S202 or by disruption of hydrogen-bonding between the two, both of which repositioned the loop and reduced the ETV-triphosphate (ETV-TP) binding pocket. In contrast to T184 and S202 changes, ETVr at primer grip residue M250 was observed during RNA-directed DNA synthesis only. Experimentally, M250 changes also impacted the dNTP-binding site. Modeling suggested a novel mechanism for M250 resistance, whereby repositioning of the primer-template component of the dNTP-binding site shifted the ETV-TP binding pocket. No structural data are available to confirm the HBV RT modeling, however, results were consistent with phenotypic analysis of

  6. Implementation of a phenomenological DNB prediction model based on macroscale boiling flow processes in PWR fuel bundles

    International Nuclear Information System (INIS)

    Mohitpour, Maryam; Jahanfarnia, Gholamreza; Shams, Mehrzad

    2014-01-01

    Highlights: • A numerical framework was developed to mechanistically predict DNB in PWR bundles. • The DNB evaluation module was incorporated into the two-phase flow solver module. • Three-dimensional two-fluid model was the basis of two-phase flow solver module. • Liquid sublayer dryout model was adapted as CHF-triggering mechanism in DNB module. • Ability of DNB modeling approach was studied based on PSBT DNB tests in rod bundle. - Abstract: In this study, a numerical framework, comprising a two-phase flow subchannel solver module and a Departure from Nucleate Boiling (DNB) evaluation module, was developed to mechanistically predict DNB in rod bundles of a Pressurized Water Reactor (PWR). In this regard, the liquid sublayer dryout model was adapted as the Critical Heat Flux (CHF) triggering mechanism to reduce the dependency of the model on empirical correlations in the DNB evaluation module. To predict local flow boiling processes, a three-dimensional two-fluid formalism coupled with heat conduction was selected as the basic tool for the development of the two-phase flow subchannel analysis solver. Evaluation of the DNB modeling approach was performed against the OECD/NRC NUPEC PWR Bundle tests (PSBT Benchmark), which supplied an extensive database for the development of truly mechanistic and consistent models for boiling transition and CHF. The results of the analyses demonstrated the need for additional assessment of the subcooled boiling model and the bulk condensation model implemented in the two-phase flow solver module. The proposed model slightly under-predicts the DNB power in comparison with the values obtained from steady-state benchmark measurements; however, this prediction is acceptable compared with other codes. The DNB prediction model also behaves conservatively. Examination of the axial and radial position of the first detected DNB using code-to-code comparisons on the basis of PSBT data indicated that the our

  7. Why did Jacques Monod make the choice of mechanistic determinism?

    Science.gov (United States)

    Loison, Laurent

    2015-06-01

    The development of molecular biology placed in the foreground a mechanistic and deterministic conception of the functioning of macromolecules. In this article, I show that this conception was neither obvious, nor necessary. Taking Jacques Monod as a case study, I detail the way he gradually came loose from a statistical understanding of determinism to finally support a mechanistic understanding. The reasons of the choice made by Monod at the beginning of the 1950s can be understood only in the light of the general theoretical schema supported by the concept of mechanistic determinism. This schema articulates three fundamental notions for Monod, namely that of the rigidity of the sequence of the genetic program, that of the intrinsic stability of macromolecules (DNA and proteins), and that of the specificity of molecular interactions. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  8. Modeling size effects on fatigue life of a zirconium-based bulk metallic glass under bending

    International Nuclear Information System (INIS)

    Yuan Tao; Wang Gongyao; Feng Qingming; Liaw, Peter K.; Yokoyama, Yoshihiko; Inoue, Akihisa

    2013-01-01

    A size effect on the fatigue-life cycles of a Zr50Cu30Al10Ni10 (at.%) bulk metallic glass has been observed in four-point-bending fatigue experiments. Under the same bending-stress condition, large-sized samples tend to exhibit longer fatigue lives than small-sized samples. This size effect on fatigue life cannot be satisfactorily explained by the flaw-based Weibull theories. Based on the experimental results, this study explores possible approaches to modeling the size effects on the bending-fatigue life of bulk metallic glasses, and proposes two fatigue-life models based on the Weibull distribution. The first model assumes, empirically, log-linear effects of the sample thickness on the Weibull parameters. The second model incorporates mechanistic knowledge of the fatigue behavior of metallic glasses, and assumes that the shear-band density, rather than the flaw density, has a significant influence on the bending fatigue-life cycles. Promising predictive results provide evidence of the potential validity of the models and their assumptions.
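
    The first model's assumption of log-linear thickness effects on the Weibull parameters can be sketched as follows. Only the functional form follows the text; the coefficient values are invented for illustration:

```python
from math import exp, log

def weibull_survival(n_cycles, scale, shape):
    """Two-parameter Weibull survival: probability a specimen outlives n_cycles."""
    return exp(-(n_cycles / scale) ** shape)

def scale_from_thickness(thickness_mm, a=10.0, b=0.4):
    """Log-linear size effect on the Weibull scale parameter:
    ln(scale) = a + b * ln(thickness). Coefficients a and b are illustrative;
    b > 0 encodes 'larger samples live longer', as observed experimentally."""
    return exp(a + b * log(thickness_mm))

def median_life(thickness_mm, shape=1.5):
    """Median fatigue life implied by the thickness-dependent Weibull model."""
    return scale_from_thickness(thickness_mm) * log(2.0) ** (1.0 / shape)

thin, thick = median_life(2.0), median_life(4.0)
```

    In a real application a and b (and the shape parameter, if it too varies with size) would be fitted to the observed lives by maximum likelihood.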

  9. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices

    Science.gov (United States)

    Gering, Kevin L

    2013-08-27

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell, and analyzes the mechanistic level model to estimate performance fade characteristics over aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model also is based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing the second exchange current density.
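
    One way constant-current pulses bracketing the exchange current density can yield that quantity is through the low-overpotential limit of the Butler-Volmer equation. The sketch below synthesizes pulse responses and recovers i0 from the slope of current versus overpotential; the symmetric Butler-Volmer form and all parameter values are assumptions for illustration, not details of the patented method:

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.15     # Faraday constant, gas constant, 25 C
f = F / (R * T)                      # ~38.9 1/V

def overpotential(i, i0, alpha=0.5):
    """Steady overpotential for a constant-current pulse, from the symmetric
    Butler-Volmer relation i = 2 * i0 * sinh(alpha * f * eta)."""
    return np.arcsinh(i / (2.0 * i0)) / (alpha * f)

def estimate_i0(currents, etas):
    """For small eta, i is approximately i0 * f * eta, so a zero-intercept
    least-squares slope of i versus eta estimates i0 * f."""
    slope = np.sum(currents * etas) / np.sum(etas * etas)
    return slope / f

i0_true = 2.0                                 # A/m^2, assumed
pulses = np.array([-2.0, -1.0, 1.0, 2.0])     # currents bracketing i0
etas = overpotential(pulses, i0_true)
i0_est = estimate_i0(pulses, etas)
```

    With pulse magnitudes comparable to i0 the linearization recovers i0 to within a few percent; much larger pulses bias the estimate as sinh departs from linearity.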

  10. Arsenic Exposure and Type 2 Diabetes: MicroRNAs as Mechanistic Links?

    OpenAIRE

    Beck, Rowan; Styblo, Miroslav; Sethupathy, Praveen

    2017-01-01

    Purpose of Review The goal of this review is to delineate the following: (1) the primary means of inorganic arsenic (iAs) exposure for human populations, (2) the adverse public health outcomes associated with chronic iAs exposure, (3) the pathophysiological connection between arsenic and type 2 diabetes (T2D), and (4) the incipient evidence for microRNAs as candidate mechanistic links between iAs exposure and T2D. Recent Findings Exposure to iAs in animal models has been associated with the d...

  11. Aminoglycoside Concentrations Required for Synergy with Carbapenems against Pseudomonas aeruginosa Determined via Mechanistic Studies and Modeling.

    Science.gov (United States)

    Yadav, Rajbharan; Bulitta, Jürgen B; Schneider, Elena K; Shin, Beom Soo; Velkov, Tony; Nation, Roger L; Landersdorfer, Cornelia B

    2017-12-01

    This study aimed to systematically identify the aminoglycoside concentrations required for synergy with a carbapenem and characterize the permeabilizing effect of aminoglycosides on the outer membrane of Pseudomonas aeruginosa. Monotherapies and combinations of four aminoglycosides and three carbapenems were studied for activity against P. aeruginosa strain AH298-GFP in 48-h static-concentration time-kill studies (SCTK) (inoculum: 10^7.6 CFU/ml). The outer membrane-permeabilizing effect of tobramycin alone and in combination with imipenem was characterized via electron microscopy, confocal imaging, and the nitrocefin assay. A mechanism-based model (MBM) was developed to simultaneously describe the time course of bacterial killing and prevention of regrowth by imipenem combined with each of the four aminoglycosides. Notably, 0.25 mg/liter of tobramycin, which was inactive in monotherapy, achieved synergy (i.e., ≥2-log10 more killing than the most active monotherapy at 24 h) combined with imipenem. Electron micrographs, confocal image analyses, and the nitrocefin uptake data showed distinct outer membrane damage by tobramycin, which was more extensive for the combination with imipenem. The MBM indicated that aminoglycosides enhanced the imipenem target site concentration up to 4.27-fold. Tobramycin was the most potent aminoglycoside to permeabilize the outer membrane; tobramycin (0.216 mg/liter), gentamicin (0.739 mg/liter), amikacin (1.70 mg/liter), or streptomycin (5.19 mg/liter) was required for half-maximal permeabilization. In summary, our SCTK, mechanistic studies and MBM indicated that tobramycin was highly synergistic and displayed the maximum outer membrane disruption potential among the tested aminoglycosides. These findings support the optimization of highly promising antibiotic combination dosage regimens for critically ill patients. Copyright © 2017 American Society for Microbiology.
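
    The reported half-maximal permeabilization concentrations imply a saturable exposure-response relationship. A minimal sketch, using the tobramycin values quoted above (4.27-fold maximal enhancement, 0.216 mg/liter half-maximal concentration) but assuming a simple Emax (hyperbolic) form rather than the full mechanism-based model:

```python
def uptake_multiplier(c_aminoglycoside, fmax=4.27, c50=0.216):
    """Fold-enhancement of the carbapenem target-site concentration as a
    function of aminoglycoside concentration (mg/liter). fmax and c50 are
    the tobramycin values reported in the abstract; the Emax functional
    form itself is an illustrative assumption."""
    c = c_aminoglycoside
    return 1.0 + (fmax - 1.0) * c / (c50 + c)
```

    At c = c50 the enhancement sits exactly halfway between no effect (1-fold) and the 4.27-fold maximum, which is what "half-maximal permeabilization" means operationally.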

  12. Physiologically induced color-pattern changes in butterfly wings: mechanistic and evolutionary implications.

    Science.gov (United States)

    Otaki, Joji M

    2008-07-01

    A mechanistic understanding of butterfly wing color-pattern determination can be facilitated by experimental pattern changes. Here I review physiologically induced color-pattern changes in nymphalid butterflies and their mechanistic and evolutionary implications. One type of color-pattern change can be elicited by changes in the size and position of elements throughout the wing, as suggested by the nymphalid groundplan. These changes of pattern elements are bi-directional (dislocation toward or away from eyespot foci) and bi-sided (occurring on both the proximal and distal sides of the foci). The peripheral elements are dislocated even in the eyespot-less compartments. Anterior spots are more severely modified, suggesting the existence of an anterior-posterior gradient. In one species, eyespots are transformed into white spots with remnant-like orange scales, and such patterns emerge even at the eyespot-less "imaginary" foci. A series of these color-pattern modifications probably reveals "snap-shots" of a dynamic morphogenic signal due to heterochronic uncoupling between the signaling and reception steps. The conventional gradient model can be revised to account for these observed color-pattern changes.

  13. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process
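
    The rate law referred to above has a simple affinity-based form. The sketch below is a generic transition-state-theory rate expression of the kind linked to reaction-path codes in such models; the parameter names and values are illustrative placeholders, not fitted glass parameters:

```python
def glass_dissolution_rate(k_forward, surface_area, q, k_eq, sigma=1.0):
    """TST-consistent rate law: rate = k+ * A * (1 - (Q/K)**sigma), where
    Q/K is the saturation state of the rate-controlling reaction
    (0 = far from equilibrium, 1 = saturated). k_forward (intrinsic rate
    constant), surface_area and sigma are illustrative inputs."""
    return k_forward * surface_area * (1.0 - (q / k_eq) ** sigma)

# far from equilibrium the glass dissolves at the full forward rate;
# as the solution saturates (Q -> K) the net rate falls to zero
rates = [glass_dissolution_rate(1e-8, 2.0, q, 1.0) for q in (0.0, 0.5, 1.0)]
```

    This affinity term is exactly why long-term rates are hard to validate: near saturation the predicted net rate is the small difference of large forward and reverse terms, so any unmodeled long-term mechanism dominates.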

  14. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  15. Rule-based model of vein graft remodeling.

    Directory of Open Access Journals (Sweden)

    Minki Hwang

    Full Text Available When vein segments are implanted into the arterial system for use in arterial bypass grafting, adaptation to the higher pressure and flow of the arterial system is accomplished through wall thickening and expansion. These early remodeling events have been found to be closely coupled to the local hemodynamic forces, such as shear stress and wall tension, and are believed to be the foundation for later vein graft failure. To further our mechanistic understanding of the cellular and extracellular interactions that lead to global changes in tissue architecture, a rule-based modeling method is developed through the application of basic rules of behavior for these molecular and cellular activities. In the current method, smooth muscle cells (SMC), extracellular matrix (ECM), and monocytes are selected as the three components that occupy the elements of a grid system comprising the developing vein graft intima. The probabilities of the cellular behaviors are developed based on data extracted from in vivo experiments. At each time step, the various probabilities are computed and applied to the SMC and ECM elements to determine their next physical state and behavior. One- and two-dimensional models are developed to test and validate the computational approach. The importance of monocyte infiltration, and its associated effect in augmenting extracellular matrix deposition, was evaluated and found to be an important component of model development. Final model validation is performed using an independent set of experiments, where model predictions of intimal growth are evaluated against experimental data obtained from the complex geometry and shear stress patterns offered by a mid-graft focal stenosis; simulation results show good agreement with the experimental data.
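
    The grid-update logic described above can be caricatured in a few lines. The rules and probabilities below are invented for illustration (the actual model derives its probabilities from in vivo data); the sketch only demonstrates the rule-based mechanism, including monocytes boosting ECM deposition:

```python
import random

SMC, ECM = "SMC", "ECM"

def grow_intima(steps, p_div=0.06, p_ecm=0.04, monocyte_boost=2.0,
                monocytes=False, seed=1):
    """Toy 1-D rule-based intima: a column of grid elements occupied by
    smooth muscle cells (SMC) or extracellular matrix (ECM). Each step,
    every SMC may divide (inserting a new SMC) or deposit ECM beside it;
    monocyte infiltration multiplies the ECM-deposition probability.
    All probabilities are illustrative, not the fitted in vivo values."""
    rng = random.Random(seed)
    column = [SMC] * 5                      # initial wall, 5 elements thick
    p_e = p_ecm * (monocyte_boost if monocytes else 1.0)
    for _ in range(steps):
        i = 0
        while i < len(column):
            if column[i] == SMC:
                r = rng.random()
                if r < p_div:               # proliferation: new SMC element
                    column.insert(i, SMC)
                    i += 1                  # skip the newly inserted element
                elif r < p_div + p_e:       # matrix deposition: new ECM element
                    column.insert(i, ECM)
                    i += 1
            i += 1
    return column

base = grow_intima(50)
inflamed = grow_intima(50, monocytes=True)
```

    The resulting column length plays the role of intimal thickness; comparing runs with and without monocytes reproduces, qualitatively, the augmented ECM deposition the model attributes to monocyte infiltration.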

  16. Mechanistic failure mode investigation and resolution of parvovirus retentive filters.

    Science.gov (United States)

    LaCasse, Daniel; Lute, Scott; Fiadeiro, Marcus; Basha, Jonida; Stork, Matthew; Brorson, Kurt; Godavarti, Ranga; Gallo, Chris

    2016-07-08

    Virus retentive filters are a key product safety measure for biopharmaceuticals. A simplistic perception is that they function solely based on a size-based particle removal mechanism of mechanical sieving and retention of particles based on their hydrodynamic size. Recent observations have revealed a more nuanced picture, indicating that changes in viral particle retention can result from process pressure and/or flow interruptions. In this study, a mechanistic investigation was performed to help identify a potential mechanism leading to the reported reduced particle retention in small virus filters. Permeate flow rate or permeate driving force were varied and analyzed for their impact on particle retention in three commercially available small virus retentive filters. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:959-970, 2016.

  17. Surface complexation modelling applied to the sorption of nickel on silica

    International Nuclear Information System (INIS)

    Olin, M.

    1995-10-01

    The mechanistic modelling of a sorption experiment is presented in this report. The system chosen for the experiments (nickel + silica) is modelled using literature values for some parameters, with the remainder fitted to existing experimental results. All calculations are performed with HYDRAQL, a model designed especially for surface complexation modelling. Almost all calculations use the Triple-Layer Model (TLM) approach, which proved sufficiently flexible for the silica system. The report includes a short description of mechanistic sorption models, input data, experimental results and modelling results (mostly graphical presentations). (13 refs., 40 figs., 4 tabs.)

  18. Towards a Mechanistic Understanding of Anaerobic Nitrate Dependent Iron Oxidation: Balancing Electron Uptake and Detoxification

    Directory of Open Access Journals (Sweden)

    Hans Karl Carlson

    2012-02-01

    Full Text Available The anaerobic oxidation of Fe(II) by subsurface microorganisms is an important part of biogeochemical cycling in the environment, but the biochemical mechanisms used to couple iron oxidation to nitrate respiration are not well understood. Based on our own work and the evidence available in the literature, we propose a mechanistic model for anaerobic nitrate dependent iron oxidation. We suggest that anaerobic iron oxidizing microorganisms likely exist along a continuum including: (1) bacteria that inadvertently oxidize Fe(II) by abiotic or biotic reactions with enzymes or chemical intermediates in their metabolic pathways (e.g., denitrification) and suffer from toxicity or an energetic penalty, (2) Fe(II)-tolerant bacteria that gain little or no growth benefit from iron oxidation but can manage the toxic reactions, and (3) bacteria that efficiently accept electrons from Fe(II) to gain a growth advantage while preventing or mitigating the toxic reactions. Predictions of the proposed model are highlighted and experimental approaches are discussed.

  19. Application of mechanistic empirical approach to predict rutting of superpave mixtures in Iraq

    Directory of Open Access Journals (Sweden)

    Qasim Zaynab

    2018-01-01

    Full Text Available In Iraq, rutting is considered a serious distress in flexible pavements, a result of high summer temperatures and increased axle loads. This distress strongly affects asphalt pavement performance, shortens the pavement's useful service life and creates serious hazards for highway users. The rutting performance of HMA mixtures is predicted with a mechanistic-empirical approach, using the wheel-tracking test and the Superpave mix design requirements. A roller wheel compactor was locally manufactured to prepare slab specimens. Based on laboratory outcomes judged to simulate field loading conditions, models are developed for predicting the permanent strain of compacted samples of local asphalt concrete mixtures, considering stress level, local material properties and environmental variables. The laboratory results were analyzed statistically with the aid of SPSS software. Permanent strain models for asphalt concrete mixtures were developed as a function of: number of passes, temperature, asphalt content, viscosity, air voids and additive content. The mechanistic-empirical design approach, through the MnPAVE software, was applied to characterize rutting in HMA and to predict the allowable number of loading repetitions of mixtures as a function of expected traffic loads, material properties, and environmental temperature.
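
    A common starting point for such permanent-strain models is the power-law form eps_p = a * N^b, fitted in log-log space to wheel-tracking data. The sketch below fits that generic form to synthetic data; it is not the multi-variable model (temperature, asphalt content, viscosity, etc.) developed in the study:

```python
import numpy as np

def fit_rutting_model(passes, strain):
    """Fit the power-law permanent-strain model eps_p = a * N**b by
    ordinary least squares on log-transformed data. This generic form is
    widely used in mechanistic-empirical rutting work; the coefficients
    a and b absorb mix and temperature effects in the simplest case."""
    b, log_a = np.polyfit(np.log(passes), np.log(strain), 1)
    return np.exp(log_a), b

N = np.array([100.0, 500.0, 1000.0, 5000.0, 10000.0])   # wheel passes
eps = 0.002 * N ** 0.35         # synthetic, noise-free wheel-track strains
a, b = fit_rutting_model(N, eps)
```

    With real wheel-tracking data the fit would be repeated per temperature and mix, and the fitted coefficients regressed against the mix variables listed above.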

  20. A preliminary study of mechanistic approach in pavement design to accommodate climate change effects

    Science.gov (United States)

    Harnaeni, S. R.; Pramesti, F. P.; Budiarto, A.; Setyawan, A.

    2018-03-01

    Road damage is caused by several factors, including climate change, overloading, and inappropriate material and construction procedures. Climate change, meanwhile, is a phenomenon that cannot be avoided; its observed effects include rising air temperature, sea level rise, changes in rainfall, and the increased intensity of extreme weather phenomena. Previous studies have shown the impacts of climate change on road damage. Therefore, several measures to anticipate the damage should be considered during planning and construction in order to reduce the cost of road maintenance. Three approaches are generally applied in the design of flexible pavement thickness: the mechanistic approach, the mechanistic-empirical (ME) approach and the empirical approach. The advantages of the mechanistic and mechanistic-empirical approaches are their efficiency and reliability in the design of flexible pavement thickness, as well as their capacity to accommodate climate change effects, compared with the empirical approach. However, the design of flexible pavement thickness in Indonesia still generally applies the empirical approach. This preliminary study aimed to emphasize the importance of shifting towards a mechanistic approach in the design of flexible pavement thickness.

  1. Dynamic and accurate assessment of acetaminophen-induced hepatotoxicity by integrated photoacoustic imaging and mechanistic biomarkers in vivo.

    Science.gov (United States)

    Brillant, Nathalie; Elmasry, Mohamed; Burton, Neal C; Rodriguez, Josep Monne; Sharkey, Jack W; Fenwick, Stephen; Poptani, Harish; Kitteringham, Neil R; Goldring, Christopher E; Kipar, Anja; Park, B Kevin; Antoine, Daniel J

    2017-10-01

    The prediction and understanding of acetaminophen (APAP)-induced liver injury (APAP-ILI) and the response to therapeutic interventions is complex. This is due in part to the sensitivity and specificity limitations of currently used assessment techniques. Here we sought to determine the utility of integrating translational non-invasive photoacoustic imaging of liver function with mechanistic circulating biomarkers of hepatotoxicity and histological assessment, to facilitate more accurate and precise characterization of APAP-ILI and of the efficacy of therapeutic intervention. Perturbation of liver function and cellular viability was assessed in C57BL/6J male mice by indocyanine green (ICG) clearance (multispectral optoacoustic tomography, MSOT) and by measurement of mechanistic (miR-122, HMGB1) and established (ALT, bilirubin) circulating biomarkers, in response to acetaminophen and to its treatment with acetylcysteine (NAC) in vivo. We utilised a 60% partial hepatectomy model, a situation of defined loss of hepatic functional mass, against which acetaminophen-induced changes could be compared. Integration of these mechanistic markers correlated with histological features of APAP hepatotoxicity in a time-dependent manner. They accurately reflected the onset of and recovery from hepatotoxicity compared with traditional biomarkers, and also reported the efficacy of NAC with high sensitivity. ICG clearance kinetics correlated with histological scores for acute liver damage for APAP (i.e. 3h timepoint; r=0.90, P<0.0001) and with elevations in both of the mechanistic biomarkers, miR-122 (e.g. 6h timepoint; r=0.70, P=0.005) and HMGB1 (e.g. 6h timepoint; r=0.56, P=0.04). For the first time we report the utility of this non-invasive longitudinal imaging approach to provide direct visualisation of liver function coupled with mechanistic biomarkers, in the same animal, allowing the investigation of the toxicological and pharmacological aspects of APAP-ILI and hepatic regeneration. Copyright © 2017

  2. Experimental and mathematical modeling methods for the investigation of toxicological interactions

    International Nuclear Information System (INIS)

    El-Masri, Hisham A.

    2007-01-01

    While procedures have been developed and used for many years to assess risk and determine acceptable exposure levels to individual chemicals, most cases of environmental contamination can involve concurrent or sequential exposure to more than one chemical. Toxicological predictions for such combinations must be based on an understanding of the mechanisms of action and interaction of the components of the mixtures. Statistical and experimental methods can test for the existence of toxicological interactions in a mixture. However, these methods are limited to the experimental data ranges for which they were derived, in addition to limitations caused by response differences between experimental animals and humans. Empirical methods such as isobolograms, the median-effect principle and response surface methodology (RSM) are based on statistical experimental design and regression of data. For that reason, the predicted response surfaces can only be used for extrapolation across dose regions where interaction mechanisms are not anticipated to change. In general, using these methods for prediction can be problematic without including biologically based mechanistic descriptions that can account for dose and species differences. Mechanistically based models, such as physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) models, include explicit descriptions of interaction mechanisms which are related to target tissue levels. These models include dose-dependent mechanistic hypotheses of toxicological interactions which can be tested by model-directed experimental design and used to identify dose regions where interactions are not significant.

  3. Inverse Analysis of Pavement Structural Properties Based on Dynamic Finite Element Modeling and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Xiaochao Tang

    2013-03-01

    Full Text Available With the movement towards the implementation of the mechanistic-empirical pavement design guide (MEPDG), an accurate determination of pavement layer moduli is vital for predicting pavement critical mechanistic responses. A backcalculation procedure is commonly used to estimate the pavement layer moduli based on non-destructive falling weight deflectometer (FWD) tests. Backcalculation of flexible pavement layer properties is an inverse problem with known input and output signals, based upon which unknown parameters of the pavement system are evaluated. In this study, an inverse analysis procedure that combines finite element analysis and a population-based optimization technique, the Genetic Algorithm (GA), has been developed to determine the pavement layer structural properties. A lightweight deflectometer (LWD) was used to infer the moduli of instrumented three-layer scaled flexible pavement models. While the common practice in backcalculating pavement layer properties still assumes a static FWD load and uses only peak values of the load and deflections, dynamic analysis was conducted here to simulate the impulse LWD load. The recorded time histories of the LWD load were used as the known inputs into the pavement system, while the measured time histories of surface central deflections and subgrade deflections, measured with linear variable differential transformers (LVDTs), were considered the outputs. As a result, consistent pavement layer moduli can be obtained through this inverse analysis procedure.
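
    The GA-based inversion loop described above can be sketched in a few lines. The study's forward model is a dynamic finite-element run; here a made-up smooth function standing in for it maps three layer moduli to three "deflections", so only the optimization logic is illustrated (all numbers are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in forward model: an invented smooth mapping from layer moduli (MPa)
# to three surface deflections, playing the role of the FE analysis.
def forward(E):
    E1, E2, E3 = E
    return np.array([700.0 / E1 + 300.0 / E2 + 100.0 / E3,
                     250.0 / E2 + 120.0 / E3,
                     90.0 / E3])

true_E = np.array([3000.0, 300.0, 80.0])
measured = forward(true_E)                    # synthetic "LWD" data

def fitness(E):
    return -np.sum((forward(E) - measured) ** 2)   # negative misfit

# Minimal genetic algorithm: truncation selection, blend crossover, mutation
lo, hi = np.array([500.0, 50.0, 20.0]), np.array([10000.0, 1000.0, 300.0])
pop = rng.uniform(lo, hi, size=(60, 3))
for gen in range(200):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(f)[::-1][:20]]   # keep the fittest 20
    idx = rng.integers(0, 20, size=(60, 2))
    w = rng.random((60, 1))
    children = w * parents[idx[:, 0]] + (1 - w) * parents[idx[:, 1]]
    children += rng.normal(0, 0.01, children.shape) * (hi - lo)  # mutation
    children = np.clip(children, lo, hi)
    children[0] = parents[0]                  # elitism: keep best unchanged
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)   # should approach true_E
```

    In the actual procedure the fitness evaluation calls the dynamic FE model on the recorded LWD load time history rather than this analytic surrogate.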

  4. Mechanistic aspects of ionic reactions in flames

    DEFF Research Database (Denmark)

    Egsgaard, H.; Carlsen, L.

    1993-01-01

    Some fundamentals of the ion chemistry of flames are summarized. Mechanistic aspects of ionic reactions in flames have been studied using a VG PlasmaQuad, the ICP-system being substituted by a simple quartz burner. Simple hydrocarbon flames as well as sulfur-containing flames have been investigated...

  5. The E. coli pET expression system revisited-mechanistic correlation between glucose and lactose uptake.

    Science.gov (United States)

    Wurm, David Johannes; Veiter, Lukas; Ulonska, Sophia; Eggenreich, Britta; Herwig, Christoph; Spadiut, Oliver

    2016-10-01

    Therapeutic monoclonal antibodies are mainly produced in mammalian cells to date. However, unglycosylated antibody fragments can also be produced in the bacterium Escherichia coli, which brings several advantages, such as growth on cheap media and high productivity. One of the most popular E. coli strains for recombinant protein production is E. coli BL21(DE3), which is usually used in combination with the pET expression system. However, it is well known that induction by isopropyl β-D-1-thiogalactopyranoside (IPTG) stresses the cells and can lead to the formation of insoluble inclusion bodies. In this study, we revisited the pET expression system for the production of a novel antibody single-chain variable fragment (scFv) with the goal of maximizing the amount of soluble product. Thus, we (1) investigated whether lactose favors the recombinant production of soluble scFv compared to IPTG, (2) investigated whether the formation of soluble product can be influenced by the specific glucose uptake rate (q_s,glu) during lactose induction, and (3) determined the mechanistic correlation between the specific lactose uptake rate (q_s,lac) and q_s,glu. We found that lactose induction gave a much greater amount of soluble scFv than IPTG, even when the growth rate was increased. Furthermore, we showed that the production of soluble protein could be tuned by varying q_s,glu during lactose induction. Finally, we established a simple model describing the mechanistic correlation between q_s,lac and q_s,glu, allowing tailored feeding and prevention of sugar accumulation. We believe that this mechanistic model might serve as platform knowledge for E. coli.
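
    The record reports that a q_s,lac(q_s,glu) correlation exists but does not give its equation. The following sketch uses an invented activation-inhibition form (some glucose flux is needed for growth, while high glucose flux represses lactose uptake) only to illustrate how such a correlation supports tailored feeding; the functional form and all parameter values are assumptions, not the published model:

```python
# Hypothetical q_s,lac(q_s,glu) correlation: activation at low glucose flux,
# repression at high flux. Units g/g/h; all numbers are illustrative.
def q_lac(q_glu, q_lac_max=0.25, K_A=0.05, K_I=0.15):
    return q_lac_max * (q_glu / (q_glu + K_A)) * (K_I / (K_I + q_glu))

# Tailored feeding: choose the glucose feed (via q_glu) that maximises
# lactose uptake, so the inducer does not accumulate in the broth.
qs = [i * 0.005 for i in range(101)]          # scan q_glu over 0 .. 0.5
best_q_glu = max(qs, key=q_lac)
print(best_q_glu, q_lac(best_q_glu))
```

    Under this assumed form the optimum sits at intermediate glucose uptake, which is the qualitative behaviour that makes sugar-accumulation-free feed design possible.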

  6. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching and spent fuel degradation, which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions, with supporting data. The model will be used to predict the future characteristics of the near-field environment. This involves several different submodels, such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, and the leach rate of the contents of the canister. These submodels are often tested in a laboratory and should be statistically validated (in this context, 'validate' means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs
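
    One standard way to "demonstrate that the model adequately describes the data" when the lab runs include replicates is a lack-of-fit F test. The sketch below applies it to a hypothetical corrosion-depth submodel with invented data; the report's actual submodels and methods may differ:

```python
import numpy as np

# Illustrative lack-of-fit check of a hypothetical corrosion submodel
# against replicated lab data (all numbers invented).
def model(t):
    return 0.8 * np.sqrt(t)                  # predicted corrosion depth, mm

times = np.array([1.0, 4.0, 9.0, 16.0])      # exposure times (years)
obs = np.array([[0.75, 0.85, 0.80],          # three replicates per time
                [1.55, 1.65, 1.62],
                [2.35, 2.45, 2.38],
                [3.15, 3.25, 3.22]])

cell_means = obs.mean(axis=1)
# Pure error: replicate scatter about each cell mean
ss_pe = ((obs - cell_means[:, None]) ** 2).sum()
df_pe = obs.size - len(times)
# Lack of fit: cell means versus model predictions
ss_lof = (obs.shape[1] * (cell_means - model(times)) ** 2).sum()
df_lof = len(times)                          # no parameters fitted from data

F = (ss_lof / df_lof) / (ss_pe / df_pe)
print(F)   # compare with an F(df_lof, df_pe) critical value
```

    A small F statistic (well below the critical value) indicates the model's systematic deviation from the data is no larger than the replicate noise, i.e. the submodel is not contradicted by the experiment.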

  7. Using a Mechanistic Reactive Transport Model to Represent Soil Organic Matter Dynamics and Climate Sensitivity

    Science.gov (United States)

    Guerry, N.; Riley, W. J.; Maggi, F.; Torn, M. S.; Kleber, M.

    2011-12-01

    The nature of long term Soil Organic Matter (SOM) dynamics is uncertain and the mechanisms involved are crudely represented in site, regional, and global models. Recent work challenging the paradigm that SOM is stabilized because of its sequential transformations to more intrinsically recalcitrant compounds motivated us to develop a mechanistic modeling framework that can be used to test hypotheses of SOM dynamics. We developed our C cycling model in TOUGHREACT, an established 3-dimensional reactive transport solver that accounts for multiple phases (aqueous, gaseous, sorbed), multiple species, advection and diffusion, and multiple microbial populations. Energy and mass exchange through the soil boundaries are accounted for via ground heat flux, rainfall, C sources (e.g., exudation, woody, leaf, root litter) and C losses (e.g., CO2 emissions and DOC deep percolation). SOM is categorized according to the various types of compounds commonly found in the above mentioned C sources and microbial byproducts, including poly- and monosaccharides, lignin, amino compounds, organic acids, nucleic acids, lipids, and phenols. Each of these compounds is accounted for by one or more representative species in the model. A reaction network was developed to describe the microbially-mediated processes and chemical interactions of these species, including depolymerization, microbial assimilation, respiration and deposition of byproducts, and incorporation of dead biomass into SOM stocks. Enzymatic reactions are characterized by Michaelis-Menten kinetics, with maximum reaction rates determined by the species' O/C ratio. Microbial activity is further regulated by soil moisture content, O2 availability, pH, and temperature. 
For the initial set of simulations, literature values were used to constrain microbial Monod parameters, Michaelis-Menten parameters, sorption parameters, physical protection, partitioning of microbial byproducts, and partitioning of litter inputs, although there is
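
    The enzymatic step described above, Michaelis-Menten kinetics with maximum rates modulated by temperature and moisture, can be sketched as follows. The modifier functions and all parameter values are illustrative assumptions, not the model's calibration:

```python
# Sketch of one depolymerization reaction: Michaelis-Menten kinetics scaled
# by microbial biomass and by environmental modifiers for temperature (Q10
# response) and soil moisture. All parameter values are illustrative.
def depoly_rate(C_polymer, B_microbe, T, theta,
                v_max=2.0, K_m=50.0, T_ref=293.15, Q10=2.0, theta_opt=0.6):
    f_T = Q10 ** ((T - T_ref) / 10.0)                       # temperature factor
    f_w = max(0.0, 1.0 - abs(theta - theta_opt) / theta_opt)  # moisture factor
    return v_max * f_T * f_w * B_microbe * C_polymer / (K_m + C_polymer)

# At the reference temperature and optimal moisture this reduces to plain
# Michaelis-Menten scaled by biomass:
r = depoly_rate(C_polymer=100.0, B_microbe=1.0, T=293.15, theta=0.6)
print(r)   # 2.0 * 100/150 = 1.333...
```

    A 10 K warming under this Q10 = 2 assumption exactly doubles the rate, which is the kind of climate sensitivity such a framework is built to explore.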

  8. Simulating the effects of climate change on the distribution of an invasive plant, using a high resolution, local scale, mechanistic approach: challenges and insights.

    Science.gov (United States)

    Fennell, Mark; Murphy, James E; Gallagher, Tommy; Osborne, Bruce

    2013-04-01

    The growing economic and ecological damage associated with biological invasions, which will likely be exacerbated by climate change, necessitates improved projections of invasive spread. Generally, potential changes in species distribution are investigated using climate envelope models; however, the reliability of such models has been questioned and they are not suitable for use at local scales. At this scale, mechanistic models are more appropriate. This paper discusses some key requirements for mechanistic models and utilises a newly developed model (PSS[gt]) that incorporates the influence of habitat type and related features (e.g., roads and rivers), as well as demographic processes and propagule dispersal dynamics, to model climate induced changes in the distribution of an invasive plant (Gunnera tinctoria) at a local scale. A new methodology is introduced, dynamic baseline benchmarking, which distinguishes climate-induced alterations in species distributions from other potential drivers of change. Using this approach, it was concluded that climate change, based on IPCC and C4i projections, has the potential to increase the spread-rate and intensity of G. tinctoria invasions. Increases in the number of individuals were primarily due to intensification of invasion in areas already invaded or in areas projected to be invaded in the dynamic baseline scenario. Temperature had the largest influence on changes in plant distributions. Water availability also had a large influence and introduced the most uncertainty in the projections. Additionally, due to the difficulties of parameterising models such as this, the process has been streamlined by utilising methods for estimating unknown variables and selecting only essential parameters. © 2012 Blackwell Publishing Ltd.
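
    The dynamic baseline benchmarking idea can be made concrete with a toy grid: the climate-attributed change is computed against a baseline run that also advances demography and dispersal, not against the present-day map. The grids and counts below are invented for illustration:

```python
import numpy as np

# Invented plant counts per grid cell, illustrating dynamic baseline
# benchmarking: subtracting the dynamic baseline isolates the climate signal
# from spread that would have happened anyway.
initial  = np.array([[5, 0], [1, 0]])    # year 0 distribution
baseline = np.array([[9, 2], [4, 1]])    # year N, no climate change
scenario = np.array([[14, 6], [7, 3]])   # year N, climate projection

naive_change   = scenario - initial      # mixes ongoing spread with climate
climate_effect = scenario - baseline     # climate-attributed change only

print(int(climate_effect.sum()), int(naive_change.sum()))
```

    In this toy case the naive comparison attributes 24 extra plants to the scenario, while benchmarking against the dynamic baseline attributes only 14 to climate change itself.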

  9. Integrated Experimental and Model-based Analysis Reveals the Spatial Aspects of EGFR Activation Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Shankaran, Harish; Zhang, Yi; Chrisler, William B.; Ewald, Jonathan A.; Wiley, H. S.; Resat, Haluk

    2012-10-02

    The epidermal growth factor receptor (EGFR) belongs to the ErbB family of receptor tyrosine kinases, and controls a diverse set of cellular responses relevant to development and tumorigenesis. ErbB activation is a complex process involving receptor-ligand binding, receptor dimerization, phosphorylation, and trafficking (internalization, recycling and degradation), which together dictate the spatio-temporal distribution of active receptors within the cell. The ability to predict this distribution, and elucidation of the factors regulating it, would help to establish a mechanistic link between ErbB expression levels and the cellular response. Towards this end, we constructed mathematical models for deconvolving the contributions of receptor dimerization and phosphorylation to EGFR activation, and to examine the dependence of these processes on sub-cellular location. We collected experimental datasets for EGFR activation dynamics in human mammary epithelial cells, with the specific goal of model parameterization, and used the data to estimate parameters for several alternate models. Model-based analysis indicated that: 1) signal termination via receptor dephosphorylation in late endosomes, prior to degradation, is an important component of the response, 2) less than 40% of the receptors in the cell are phosphorylated at any given time, even at saturating ligand doses, and 3) receptor dephosphorylation rates at the cell surface and early endosomes are comparable. We validated the last finding by measuring EGFR dephosphorylation rates at various times following ligand addition both in whole cells, and in endosomes using ELISAs and fluorescent imaging. Overall, our results provide important information on how EGFR phosphorylation levels are regulated within cells. Further, the mathematical model described here can be extended to determine receptor dimer abundances in cells co-expressing various levels of ErbB receptors. 
This study demonstrates that an iterative cycle of
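
    The compartmental reasoning above (surface phosphorylation, internalisation, recycling, and comparable dephosphorylation rates at the surface and in endosomes) can be illustrated with a minimal ODE sketch. All rate constants are invented, not the study's fitted values; the qualitative outcome, a phosphorylated fraction well below one even at saturating ligand, reflects dephosphorylation competing with phosphorylation:

```python
# Minimal four-pool sketch: surface/endosomal receptors, plain/phosphorylated.
# Rate constants (1/min) are illustrative assumptions.
k_phos, k_dephos, k_int, k_rec = 0.5, 1.0, 0.2, 0.1

R_s, P_s, P_e, R_e = 1.0, 0.0, 0.0, 0.0
dt = 0.01
for _ in range(int(120 / dt)):            # 120 min at saturating ligand
    dR_s = -k_phos * R_s + k_dephos * P_s + k_rec * R_e
    dP_s = k_phos * R_s - (k_dephos + k_int) * P_s
    dP_e = k_int * P_s - k_dephos * P_e
    dR_e = k_dephos * P_e - k_rec * R_e
    R_s += dt * dR_s; P_s += dt * dP_s
    P_e += dt * dP_e; R_e += dt * dR_e

phospho_fraction = P_s + P_e              # total receptor pool is conserved
print(round(phospho_fraction, 3))
```

    With these assumed rates the system settles near a phosphorylated fraction of about 0.21, illustrating how measured dephosphorylation rates bound the active-receptor pool.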

  10. Mechanistic rationalization of unusual sigmoidal kinetic profiles in the Machetti-De Sarlo cycloaddition reaction.

    Science.gov (United States)

    Mower, Matthew P; Blackmond, Donna G

    2015-02-18

    Unusual sigmoidal kinetic profiles in the Machetti-De Sarlo base-catalyzed 1,3-dipolar cycloaddition of acrylamide to N-methylnitroacetamide are rationalized by detailed in situ kinetic analysis. A dual role is uncovered in which a substrate acts as a precursor to catalyze its own reaction. Such kinetic studies provide a general protocol for distinguishing among different mechanistic origins of induction periods in complex organic reactions.
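
    The "substrate acts as a precursor to its own catalyst" motif generically produces sigmoidal profiles with an induction period. A hedged kinetic sketch, not the actual Machetti-De Sarlo network, shows the signature:

```python
# Toy autocatalytic scheme: substrate A slowly generates catalyst C, which
# then accelerates conversion of A to product P. Rate constants are invented.
dt, k_cat, k_gen = 0.01, 5.0, 0.05
A, C, P = 1.0, 0.0, 0.0
profile = []
for step in range(int(20 / dt)):
    r_cat = k_cat * A * C          # catalysed product formation
    r_gen = k_gen * A              # slow conversion of substrate to catalyst
    A += dt * (-r_cat - r_gen)
    C += dt * r_gen
    P += dt * r_cat
    profile.append(P)

# Sigmoid signature: the rate peaks mid-course rather than at t = 0.
rates = [b - a for a, b in zip(profile, profile[1:])]
peak = rates.index(max(rates))
print(peak, len(rates))
```

    The early flat stretch of the product profile is the induction period; in situ kinetic analysis distinguishes this mechanistic origin from alternatives such as catalyst activation or inhibitor burn-off.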

  11. Elements in nucleotide sensing and hydrolysis of the AAA+ disaggregation machine ClpB: a structure-based mechanistic dissection of a molecular motor

    Energy Technology Data Exchange (ETDEWEB)

    Zeymer, Cathleen, E-mail: cathleen.zeymer@mpimf-heidelberg.mpg.de; Barends, Thomas R. M.; Werbeck, Nicolas D.; Schlichting, Ilme; Reinstein, Jochen [Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg (Germany)]

    2014-02-01

    High-resolution crystal structures together with mutational analysis and transient kinetics experiments were utilized to understand nucleotide sensing and the regulation of the ATPase cycle in an AAA+ molecular motor. ATPases of the AAA+ superfamily are large oligomeric molecular machines that remodel their substrates by converting the energy from ATP hydrolysis into mechanical force. This study focuses on the molecular chaperone ClpB, the bacterial homologue of Hsp104, which reactivates aggregated proteins under cellular stress conditions. Based on high-resolution crystal structures in different nucleotide states, mutational analysis and nucleotide-binding kinetics experiments, the ATPase cycle of the C-terminal nucleotide-binding domain (NBD2), one of the motor subunits of this AAA+ disaggregation machine, is dissected mechanistically. The results provide insights into nucleotide sensing, explaining how the conserved sensor 2 motif contributes to the discrimination between ADP and ATP binding. Furthermore, the role of a conserved active-site arginine (Arg621), which controls binding of the essential Mg²⁺ ion, is described. Finally, a hypothesis is presented as to how the ATPase activity is regulated by a conformational switch that involves the essential Walker A lysine. In the proposed model, an unusual side-chain conformation of this highly conserved residue stabilizes a catalytically inactive state, thereby avoiding unnecessary ATP hydrolysis.
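
    Transient nucleotide-binding kinetics of the kind mentioned are typically analysed under pseudo-first-order conditions, where the observed relaxation rate is linear in ligand concentration, k_obs = k_on[L] + k_off. A generic sketch with invented numbers (not the ClpB NBD2 values):

```python
import numpy as np

# Pseudo-first-order binding analysis: a line fit over a nucleotide titration
# yields k_on (slope), k_off (intercept) and K_d = k_off / k_on.
# All values are illustrative, not measured ClpB parameters.
L = np.array([5.0, 10.0, 20.0, 40.0, 80.0])     # [nucleotide], micromolar
k_obs = 0.12 * L + 1.5                          # simulated observed rates, 1/s

slope, intercept = np.polyfit(L, k_obs, 1)
K_d = intercept / slope
print(slope, intercept, K_d)                    # 0.12, 1.5, 12.5 uM
```

    Comparing such fitted constants for ADP versus ATP is how discrimination by a sensor motif shows up quantitatively.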

  12. Elements in nucleotide sensing and hydrolysis of the AAA+ disaggregation machine ClpB: a structure-based mechanistic dissection of a molecular motor

    International Nuclear Information System (INIS)

    Zeymer, Cathleen; Barends, Thomas R. M.; Werbeck, Nicolas D.; Schlichting, Ilme; Reinstein, Jochen

    2014-01-01

    High-resolution crystal structures together with mutational analysis and transient kinetics experiments were utilized to understand nucleotide sensing and the regulation of the ATPase cycle in an AAA+ molecular motor. ATPases of the AAA+ superfamily are large oligomeric molecular machines that remodel their substrates by converting the energy from ATP hydrolysis into mechanical force. This study focuses on the molecular chaperone ClpB, the bacterial homologue of Hsp104, which reactivates aggregated proteins under cellular stress conditions. Based on high-resolution crystal structures in different nucleotide states, mutational analysis and nucleotide-binding kinetics experiments, the ATPase cycle of the C-terminal nucleotide-binding domain (NBD2), one of the motor subunits of this AAA+ disaggregation machine, is dissected mechanistically. The results provide insights into nucleotide sensing, explaining how the conserved sensor 2 motif contributes to the discrimination between ADP and ATP binding. Furthermore, the role of a conserved active-site arginine (Arg621), which controls binding of the essential Mg²⁺ ion, is described. Finally, a hypothesis is presented as to how the ATPase activity is regulated by a conformational switch that involves the essential Walker A lysine. In the proposed model, an unusual side-chain conformation of this highly conserved residue stabilizes a catalytically inactive state, thereby avoiding unnecessary ATP hydrolysis.

  13. Evaluation of long-term creep-fatigue life of stainless steel weldment based on a microstructure degradation model

    International Nuclear Information System (INIS)

    Asayama, Tai; Hasebe, Shinichi

    1997-01-01

    This paper describes a newly developed analytical method for evaluating the creep-fatigue strength of stainless steel weld metals. Based on the observation that creep-fatigue cracks initiate adjacent to the interface between sigma-phase/delta-ferrite and the matrix, a mechanistic model was developed which allows the evaluation of micro stress/strain concentration adjacent to the interface. Fatigue and creep damage were evaluated using the model, which describes the microstructure after long exposure to high temperatures. It is thus possible to predict analytically the long-term creep-fatigue life of stainless steel weld metals whose microstructure has degraded as a result of high-temperature service. (author)
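
    The damage-summation layer of such an evaluation can be sketched simply: fatigue damage accumulates per cycle from a design fatigue curve, creep damage accumulates as a time fraction at the hold, and life is reached when the combined damage hits the allowable limit. The numbers below are illustrative; the paper's method additionally derives the local stress/strain from its microstructure model:

```python
# Linear damage-summation sketch for creep-fatigue life (illustrative only).
def cycles_to_failure(N_f, t_hold, t_rupture, D_limit=1.0):
    d_fatigue = 1.0 / N_f                # fatigue damage per cycle
    d_creep = t_hold / t_rupture         # creep damage per cycle (time fraction)
    return D_limit / (d_fatigue + d_creep)

# Pure fatigue: life equals the fatigue-curve life.
assert abs(cycles_to_failure(10_000, 0.0, 1e6) - 10_000) < 1e-6

# Adding a 1 h hold per cycle with a 50,000 h rupture life shortens the life:
N = cycles_to_failure(N_f=10_000, t_hold=1.0, t_rupture=50_000.0)
print(round(N))   # 1/(1e-4 + 2e-5) = 8333 cycles
```

    In the degraded-microstructure case, the micro stress/strain concentration at the sigma-phase/delta-ferrite interface would enter through reduced N_f and t_rupture values.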

  14. Constraint-based modeling in microbial food biotechnology

    Science.gov (United States)

    Rau, Martin H.

    2018-01-01

    Genome-scale metabolic network reconstruction offers a means to leverage the value of the exponentially growing genomics data and integrate it with other biological knowledge in a structured format. Constraint-based modeling (CBM) enables both the qualitative and quantitative analyses of the reconstructed networks. The rapid advancements in these areas can benefit both the industrial production of microbial food cultures and their application in food processing. CBM provides several avenues for improving our mechanistic understanding of physiology and genotype–phenotype relationships. This is essential for the rational improvement of industrial strains, which can further be facilitated through various model-guided strain design approaches. CBM of microbial communities offers a valuable tool for the rational design of defined food cultures, where it can catalyze hypothesis generation and provide unintuitive rationales for the development of enhanced community phenotypes and, consequently, novel or improved food products. In the industrial-scale production of microorganisms for food cultures, CBM may enable a knowledge-driven bioprocess optimization by rationally identifying strategies for growth and stability improvement. Through these applications, we believe that CBM can become a powerful tool for guiding the areas of strain development, culture development and process optimization in the production of food cultures. Nevertheless, in order to make the correct choice of the modeling framework for a particular application and to interpret model predictions in a biologically meaningful manner, one should be aware of the current limitations of CBM. PMID:29588387
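
    At its core, the constraint-based modeling described above is a linear program: maximise an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy sketch with an invented two-metabolite network (real reconstructions have thousands of reactions):

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis. Columns: uptake, internal reaction r1, biomass.
S = np.array([[ 1.0, -1.0,  0.0],    # metabolite A: made by uptake, used by r1
              [ 0.0,  1.0, -1.0]])   # metabolite B: made by r1, used by biomass

c = np.array([0.0, 0.0, -1.0])       # linprog minimises, so negate biomass
bounds = [(0.0, 10.0), (0.0, 5.0), (0.0, None)]  # r1 capacity caps growth

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)   # optimal fluxes: [5, 5, 5], limited by the r1 bound
```

    Model-guided strain design then amounts to asking how changing a bound (a knockout or overexpression, in this framing) shifts the achievable objective.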

  15. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    International Nuclear Information System (INIS)

    Brown, L.; Jarvis, S.C.; Syed, B.; Goulding, K.W.T.; Li, C.

    2002-01-01

    A mechanistic model of N₂O emission from agricultural soil (DeNitrification-DeComposition - DNDC) was modified for application to the UK, and was used as the basis of an inventory of N₂O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N₂O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N₂O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N₂O-N from agricultural land in 1990 to 78.3 Gg. (Author)
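
    The "transparent spreadsheet format" amounts to summing activity × emission factor over source categories and then over sectors. A sketch with invented placeholder categories, activity data and emission factors (not the UK-DNDC values):

```python
# Spreadsheet-style inventory sketch: emission = activity * emission factor,
# summed over categories. All categories and numbers are invented placeholders.
sources = {
    # category: (activity, emission factor), arbitrary consistent units
    "fertilised cropland (kha)": (4000.0, 0.004),
    "grazed grassland (kha)":    (5000.0, 0.003),
    "animal waste (kt N)":       (450.0,  0.010),
}

by_source = {k: a * ef for k, (a, ef) in sources.items()}
total = sum(by_source.values())
print(by_source)
print(f"total: {total:.1f} Gg N2O-N")   # 16.0 + 15.0 + 4.5 = 35.5
```

    Keeping the emission factors explicit in this layout is what makes the UK-DNDC inventory directly comparable, category by category, with the IPCC default methodology.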

  16. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.; Jarvis, S.C. [Institute of Grassland and Environmental Research, Okehampton (United Kingdom); Syed, B. [Cranfield Univ., Silsoe (United Kingdom). Soil Survey and Land Research Centre; Sneath, R.W.; Phillips, V.R. [Silsoe Research Inst. (United Kingdom); Goulding, K.W.T. [Institute of Arable Crops Research, Rothamsted (United Kingdom); Li, C. [University of New Hampshire (United States). Inst. for the Study of Earth, Oceans and Space

    2002-07-01

    A mechanistic model of N₂O emission from agricultural soil (DeNitrification-DeComposition - DNDC) was modified for application to the UK, and was used as the basis of an inventory of N₂O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N₂O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N₂O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N₂O-N from agricultural land in 1990 to 78.3 Gg. (Author)

  17. Development and application of a mechanistic model to estimate emission of nitrous oxide from UK agriculture

    Science.gov (United States)

    Brown, L.; Syed, B.; Jarvis, S. C.; Sneath, R. W.; Phillips, V. R.; Goulding, K. W. T.; Li, C.

    A mechanistic model of N₂O emission from agricultural soil (DeNitrification-DeComposition, DNDC) was modified for application to the UK, and was used as the basis of an inventory of N₂O emission from UK agriculture in 1990. UK-specific input data were added to DNDC's database and the ability to simulate daily C and N inputs from grazing animals and applied animal waste was added to the model. The UK version of the model, UK-DNDC, simulated emissions from 18 different crop types on the 3 areally dominant soils in each county. Validation of the model at the field scale showed that predictions matched observations well. Emission factors for the inventory were calculated from estimates of N₂O emission from UK-DNDC, in order to maintain direct comparability with the IPCC approach. These, along with activity data, were included in a transparent spreadsheet format. Using UK-DNDC, the estimate of N₂O-N emission from UK current agricultural practice in 1990 was 50.9 Gg. This total comprised 31.7 Gg from the soil sector, 5.9 Gg from animals and 13.2 Gg from the indirect sector. The range of this estimate (using the range of soil organic C for each soil used) was 30.5-62.5 Gg N. Estimates of emissions in each sector were compared to those calculated using the IPCC default methodology. Emissions from the soil and indirect sectors were smaller with the UK-DNDC approach than with the IPCC methodology, while emissions from the animal sector were larger. The model runs suggested a relatively large emission from agricultural land that was not attributable to current agricultural practices (33.8 Gg in total, 27.4 Gg from the soil sector). This 'background' component is partly the result of historical agricultural land use. It is not normally included in inventories of emission, but would increase the total emission of N₂O-N from agricultural land in 1990 to 78.3 Gg.

  18. Models of iodine behavior in reactor containments

    Energy Technology Data Exchange (ETDEWEB)

    Weber, C.F.; Beahm, E.C.; Kress, T.S.

    1992-10-01

    Models are developed for many phenomena of interest concerning iodine behavior in reactor containments during severe accidents. Processes include speciation in both gas and liquid phases, reactions with surfaces, airborne aerosols, and other materials, and gas-liquid interface behavior. Although some models are largely empirical formulations, every effort has been made to construct mechanistic and rigorous descriptions of relevant chemical processes. All are based on actual experimental data generated at the Oak Ridge National Laboratory (ORNL) or elsewhere, and, hence, considerable data evaluation and parameter estimation are contained in this study. No application or encoding is attempted, but each model is stated in terms of rate processes, with the intention of allowing mechanistic simulation. Taken together, this collection of models represents a best-estimate description of iodine behavior and transport in reactor accidents.
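
    Stating a model "in terms of rate processes" means each mechanism contributes a first-order-style term to a set of ODEs. A hedged two-compartment sketch of one such process, gas-liquid interface transfer toward an equilibrium partition, with invented rate and partition values (not the report's models):

```python
# Illustrative rate-process sketch: iodine partitions between containment
# atmosphere (C_g) and sump liquid (C_l) via first-order interfacial transfer
# toward equilibrium C_l / C_g = H. All numbers are invented placeholders.
k_gl, H = 0.5, 80.0          # transfer rate (1/h), liquid/gas partition coeff.
C_g, C_l = 1.0, 0.0          # concentrations, arbitrary units
dt = 0.001
for _ in range(20_000):      # integrate 20 h with forward Euler
    flux = k_gl * (C_g - C_l / H)     # net gas-to-liquid transfer
    C_g += dt * (-flux)
    C_l += dt * flux

print(round(C_l / C_g))      # approaches the partition coefficient H
```

    A full implementation would couple many such terms (speciation reactions, surface deposition, aerosol interactions) into one rate network, which is exactly what stating each model as a rate process enables.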

  19. Models of iodine behavior in reactor containments

    International Nuclear Information System (INIS)

    Weber, C.F.; Beahm, E.C.; Kress, T.S.

    1992-10-01

    Models are developed for many phenomena of interest concerning iodine behavior in reactor containments during severe accidents. Processes include speciation in both gas and liquid phases, reactions with surfaces, airborne aerosols, and other materials, and gas-liquid interface behavior. Although some models are largely empirical formulations, every effort has been made to construct mechanistic and rigorous descriptions of relevant chemical processes. All are based on actual experimental data generated at the Oak Ridge National Laboratory (ORNL) or elsewhere, and, hence, considerable data evaluation and parameter estimation are contained in this study. No application or encoding is attempted, but each model is stated in terms of rate processes, with the intention of allowing mechanistic simulation. Taken together, this collection of models represents a best-estimate description of iodine behavior and transport in reactor accidents.

  20. Mechanistic explanation of time-dependent cross-phenomenon based on quorum sensing: A case study of the mixture of sulfonamide and quorum sensing inhibitor to bioluminescence of Aliivibrio fischeri.

    Science.gov (United States)

    Sun, Haoyu; Pan, Yongzheng; Gu, Yue; Lin, Zhifen

    2018-07-15

    Cross-phenomenon, in which the concentration-response curve (CRC) for a mixture crosses the CRC for the reference model, has been identified in many studies and expressed as a heterogeneous pattern of joint toxic action. However, a mechanistic explanation of the cross-phenomenon has thus far been lacking. In this study, a time-dependent cross-phenomenon was observed, in which the cross-concentration range between the CRC for the mixture of sulfamethoxypyridazine (SMP) and (Z-)-4-Bromo-5-(bromomethylene)-2(5H)-furanone (C30) to the bioluminescence of Aliivibrio fischeri (A. fischeri) and the CRC for the independent action model with 95% confidence bands shifted from the low-concentration to the higher-concentration region over time, reflecting a joint toxic action of the mixture that changed with increases in both concentration and time. Through investigating the time-dependent hormetic effects of SMP and C30 (by measuring the expression of protein mRNA, simulating the bioluminescent reaction and analyzing the toxic action), the underlying mechanism was found to be as follows: SMP and C30 acted on the quorum sensing (QS) system of A. fischeri, which induced low-concentration stimulatory effects and high-concentration inhibitory effects. In the low-concentration region, the stimulatory effects of SMP and C30 made the mixture produce a synergistic stimulation of the bioluminescence; thus, the joint toxic action exhibited antagonism. In the high-concentration region, the inhibitory effects of SMP and C30 in the mixture caused a double block in the loop circuit of the QS system; thus, the joint toxic action exhibited synergism. With the increase of time, these stimulatory and inhibitory effects of SMP and C30 were changed by the variation of the QS system at different growth phases, resulting in the time-dependent cross-phenomenon. This study proposes an induced mechanism for the time-dependent cross-phenomenon based on QS, which may provide new insight into the mechanistic
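
    The independent action (IA) reference model mentioned above combines single-compound effects multiplicatively. A minimal sketch follows; the Hill-curve parameters are hypothetical and are not fitted to the study's A. fischeri data:

```python
def hill_effect(c, ec50, n):
    """Fractional effect (0-1) from a hypothetical Hill concentration-response curve."""
    return c**n / (ec50**n + c**n)

def independent_action(effects):
    """IA reference prediction: E_mix = 1 - prod(1 - E_i), effects on a 0-1 scale."""
    remaining = 1.0
    for e in effects:
        remaining *= (1.0 - e)
    return 1.0 - remaining

# Illustrative single-compound effects (parameters are invented for this sketch)
e_smp = hill_effect(2.0, ec50=5.0, n=1.5)   # hypothetical SMP curve
e_c30 = hill_effect(1.0, ec50=4.0, n=2.0)   # hypothetical C30 curve
e_mix = independent_action([e_smp, e_c30])
```

    A measured mixture CRC lying below this IA prediction indicates antagonism and one lying above it indicates synergism, which is how the cross-phenomenon is diagnosed.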

  1. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
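
    The diffusion operator at the core of such models can be illustrated with a toy 1-D discretization. The grid, motility values, and time step below are invented; the study's actual implementation for large chronic-wasting-disease data sets uses hierarchical Bayesian machinery well beyond this sketch:

```python
def ecological_diffusion_step(u, mu, dx, dt):
    """One explicit Euler step of 1-D ecological diffusion, du/dt = d2(mu*u)/dx2.
    The motility mu(x) sits inside both derivatives, so spatially varying habitat
    quality reshapes the distribution (unlike plain Fickian diffusion).
    Boundary cells are held fixed for simplicity."""
    v = [m * ui for m, ui in zip(mu, u)]
    nxt = list(u)
    for i in range(1, len(u) - 1):
        nxt[i] = u[i] + dt * (v[i - 1] - 2.0 * v[i] + v[i + 1]) / dx**2
    return nxt

# A point release spreads outward over repeated steps (toy grid and parameters)
u = [0.0, 0.0, 1.0, 0.0, 0.0]
mu = [1.0] * 5
for _ in range(3):
    u = ecological_diffusion_step(u, mu, dx=1.0, dt=0.1)
```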

  2. This Mechanistic Step Is ''Productive'': Organic Chemistry Students' Backward-Oriented Reasoning

    Science.gov (United States)

    Caspari, I.; Weinrich, M. L.; Sevian, H.; Graulich, N.

    2018-01-01

    If an organic chemistry student explains that she represents a mechanistic step because ''it's a productive part of the mechanism,'' what meaning could the professor teaching the class attribute to this statement, what is actually communicated, and what does it mean for the student? The professor might think that the explanation is based on…

  3. A Mechanistic Model of Waterfall Plunge Pool Erosion into Bedrock

    Science.gov (United States)

    Scheingross, Joel S.; Lamb, Michael P.

    2017-11-01

    Landscapes often respond to changes in climate and tectonics through the formation and upstream propagation of knickzones composed of waterfalls. Little work has been done on the mechanics of waterfall erosion, and instead most landscape-scale models neglect waterfalls or use rules for river erosion, such as stream power, that may not be applicable to waterfalls. Here we develop a physically based model to predict waterfall plunge pool erosion into rock by abrasion from particle impacts and test the model against flume experiments. Both the model and experiments show that evolving plunge pools have initially high vertical erosion rates due to energetic particle impacts, and erosion slows and eventually ceases as pools deepen and deposition protects the pool floor from further erosion. Lateral erosion can continue after deposition on the pool floor, but it occurs at slow rates that become negligible as pools widen. Our work points to the importance of vertical drilling of successive plunge pools to drive upstream knickzone propagation in homogeneous rock, rather than the classic mechanism of headwall undercutting. For a series of vertically drilling waterfalls, we find that upstream knickzone propagation is faster under higher combined water and sediment fluxes and for knickzones composed of many waterfalls that are closely spaced. Our model differs significantly from stream-power-based erosion rules in that steeper knickzones can retreat faster or more slowly depending on the number and spacing of waterfalls within a knickzone, which has implications for interpreting climatic and tectonic history through analysis of river longitudinal profiles.

  4. Estimation of energy saving thanks to a reduced-model-based approach: Example of bread baking by jet impingement

    International Nuclear Information System (INIS)

    Alamir, M.; Witrant, E.; Della Valle, G.; Rouaud, O.; Josset, Ch.; Boillereaux, L.

    2013-01-01

    In this paper, a reduced order mechanistic model is proposed for the evolution of temperature and humidity during French bread baking. The model parameters are identified using experimental data. The resulting model is then used to estimate the potential energy saving that can be obtained using jet impingement technology to increase the heat transfer efficiency. Results show up to 16% potential energy saving under certain assumptions. - Highlights: ► We developed a mechanistic model of heat and mass transfer in bread including different and multiple energy sources. ► An optimal control system tracks reference trajectories while minimizing energy consumption. ► The methodology is evaluated with the jet impingement technique. ► Results show a significant energy saving of about 17% with reasonable actuator variations

  5. Mechanistic understanding of the link between Sodium Starch Glycolate properties and the performance of tablets made by wet granulation.

    Science.gov (United States)

    Wren, S A C; Alhusban, F; Barry, A R; Hughes, L P

    2017-08-30

    The impact of varying Sodium Starch Glycolate (SSG) grade and wet granulation intensity on the mechanism of disintegration and dissolution of mannitol-based Immediate Release (IR) placebo tablets was investigated. MRI and 1H NMR provided mechanistic insight, and revealed a four-fold range in both tablet disintegration and dissolution rates. MRI was used to quantify the rates of change in tablet volumes and the data fitted to a hydration/erosion model. Reduced levels of cross-linking change SSG from a swelling to a gelling matrix. The tablet hydration and dissolution rates are related to the viscosity at the tablet-solution interface, with high viscosities limiting mass transport. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Projected Climate Impacts to South African Maize and Wheat Production in 2055: A Comparison of Empirical and Mechanistic Modeling Approaches

    Science.gov (United States)

    Estes, Lyndon D.; Beukes, Hein; Bradley, Bethany A.; Debats, Stephanie R.; Oppenheimer, Michael; Ruane, Alex C.; Schulze, Roland; Tadross, Mark

    2013-01-01

    Crop model-specific biases are a key uncertainty affecting our understanding of climate change impacts to agriculture. There is increasing research focus on intermodel variation, but comparisons between mechanistic (MMs) and empirical models (EMs) are rare despite both being used widely in this field. We combined MMs and EMs to project future (2055) changes in the potential distribution (suitability) and productivity of maize and spring wheat in South Africa under 18 downscaled climate scenarios (9 models run under 2 emissions scenarios). EMs projected larger yield losses or smaller gains than MMs. The EMs' median-projected maize and wheat yield changes were 3.6% and 6.2%, respectively, compared to 6.5% and 15.2% for the MM. The EM projected a 10% reduction in the potential maize growing area, where the MM projected a 9% gain. Both models showed increases in the potential spring wheat production region (EM = 48%, MM = 20%), but these results were more equivocal because both models (particularly the EM) substantially overestimated the extent of current suitability. The substantial water-use efficiency gains simulated by the MMs under elevated CO2 accounted for much of the EM-MM difference, but EMs may have more accurately represented crop temperature sensitivities. Our results align with earlier studies showing that EMs may show larger climate change losses than MMs. Crop forecasting efforts should expand to include EM-MM comparisons to provide a fuller picture of crop-climate response uncertainties.

  7. Mechanistic understanding of monosaccharide-air flow battery electrochemistry

    Science.gov (United States)

    Scott, Daniel M.; Tsang, Tsz Ho; Chetty, Leticia; Aloi, Sekotilani; Liaw, Bor Yann

    Recently, an inexpensive monosaccharide-air flow battery configuration has been demonstrated to utilize a strong base and a mediator redox dye to harness electrical power from the partial oxidation of glucose. Here the mechanistic understanding of glucose oxidation in this unique glucose-air power source is further explored by acid-base titration experiments, 13C NMR, and comparison of results from chemically different redox mediators (indigo carmine vs. methyl viologen) and sugars (fructose vs. glucose) via studies using electrochemical techniques. Titration results indicate that gluconic acid is the main product of the cell reaction, as supported by evidence in the 13C NMR spectra. Using indigo carmine as the mediator dye and fructose as the energy source, an abiotic cell configuration generates a power density of 1.66 mW cm^-2, which is greater than that produced from glucose under similar conditions (ca. 1.28 mW cm^-2). A faster transition from fructose into the ene-diol intermediate than from glucose likely contributed to this difference in power density.

  8. INTEGRATION OF NPP SEMI-MECHANISTIC MODELLING, REMOTE SENSING AND GIS IN ESTIMATING CO2 ABSORPTION OF FOREST VEGETATION IN LORE LINDU NATIONAL PARK

    Directory of Open Access Journals (Sweden)

    GODE GRAVENHORST

    2006-01-01

    Full Text Available Net Primary Production, NPP, is one of the most important variables characterizing the performance of an ecosystem. It is the difference between the total carbon uptake from the air through photosynthesis and the carbon loss due to respiration by living plants. However, field measurements of NPP are time-consuming and expensive, so current techniques are not useful for obtaining NPP estimates over large areas. By combining remote sensing, GIS technology and modelling, we can estimate the NPP of a large ecosystem with relative ease. This paper discusses the use of a process-based physiological sun/shade canopy model in estimating the NPP of Lore Lindu National Park (LLNP). The discussion includes how to parameterize the model and how to scale up from leaf to canopy. The version documented in this manuscript is called the NetPro Model, a potential-NPP model in which water effects are not yet included. The model integrates GIS and the use of remote sensing, and is written in the Visual Basic 6.0 programming language with Map Objects 2.1. NetPro has the capability of estimating the NPP of C3 vegetation under present environmental conditions and under future scenarios (increasing [CO2], increasing temperature, and increasing or decreasing leaf nitrogen level). Based on site-measured parameterisation of photosynthetic capacity, respiration and leaf nitrogen, the model was run under increasing CO2 level and temperature and varied leaf nitrogen. The output of the semi-mechanistic modelling is radiation use efficiency. Analysis of remote sensing data gives the Normalized Difference Vegetation Index (NDVI) and the related Leaf Area Index (LAI) and fraction of absorbed Photosynthetically Active Radiation (fAPAR). Climate data are obtained from 12 meteorological stations around the park, including global radiation and minimum and maximum temperature. The CO2 absorbed by vegetation (Gross Primary Production, GPP) is then calculated using the above
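
    The remote-sensing side of such an approach is often reduced to a light-use-efficiency calculation. A minimal sketch follows; the NDVI-to-fAPAR coefficients, radiation value, and radiation use efficiency below are illustrative placeholders, not the NetPro parameterisation:

```python
def fapar_from_ndvi(ndvi, a=1.24, b=-0.168):
    """Linear NDVI-to-fAPAR approximation, clipped to [0, 1]; coefficients are illustrative."""
    return min(max(a * ndvi + b, 0.0), 1.0)

def gpp_light_use_efficiency(par, fapar, rue):
    """GPP = RUE * fAPAR * PAR: absorbed radiation scaled by a radiation use efficiency."""
    return rue * fapar * par

fapar = fapar_from_ndvi(0.7)                                    # hypothetical closed-canopy NDVI
gpp = gpp_light_use_efficiency(par=10.0, fapar=fapar, rue=1.8)  # toy units
```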

  9. How to use mechanistic effect models in environmental risk assessment of pesticides: Case studies and recommendations from the SETAC workshop MODELINK.

    Science.gov (United States)

    Hommen, Udo; Forbes, Valery; Grimm, Volker; Preuss, Thomas G; Thorbek, Pernille; Ducrot, Virginie

    2016-01-01

    Mechanistic effect models (MEMs) are useful tools for ecological risk assessment of chemicals to complement experimentation. However, currently no recommendations exist for how to use them in risk assessments. Therefore, the Society of Environmental Toxicology and Chemistry (SETAC) MODELINK workshop aimed at providing guidance for when and how to apply MEMs in regulatory risk assessments. The workshop focused on risk assessment of plant protection products under Regulation (EC) No 1107/2009 using MEMs at the organism and population levels. Realistic applications of MEMs were demonstrated in 6 case studies covering assessments for plants, invertebrates, and vertebrates in aquatic and terrestrial habitats. From the case studies and their evaluation, 12 recommendations on the future use of MEMs were formulated, addressing the issues of how to translate specific protection goals into workable questions, how to select species and scenarios to be modeled, and where and how to fit MEMs into current and future risk assessment schemes. The most important recommendations are that protection goals should be made more quantitative; the species to be modeled must be vulnerable not only regarding toxic effects but also regarding their life history and dispersal traits; the models should be as realistic as possible for a specific risk assessment question, and the level of conservatism required for a specific risk assessment should be reached by designing appropriately conservative environmental and exposure scenarios; scenarios should include different regions of the European Union (EU) and different crops; in the long run, generic MEMs covering relevant species based on representative scenarios should be developed, which will require EU-level joint initiatives of all stakeholders involved. 
The main conclusion from the MODELINK workshop is that the considerable effort required for making MEMs an integral part of environmental risk assessment of pesticides is worthwhile, because

  10. Predictive Modeling of Fast-Curing Thermosets in Nozzle-Based Extrusion

    Science.gov (United States)

    Xie, Jingjin; Randolph, Robert; Simmons, Gary; Hull, Patrick V.; Mazzeo, Aaron D.

    2017-01-01

    This work presents an approach to modeling the dynamic spreading and curing behavior of thermosets in nozzle-based extrusions. Thermosets cover a wide range of materials, some of which permit low-temperature processing with subsequent high-temperature and high-strength working properties. Extruding thermosets may overcome the limited working temperatures and strengths of conventional thermoplastic materials used in additive manufacturing. This project aims to produce technology for the fabrication of thermoset-based structures leveraging advances made in nozzle-based extrusion, such as fused deposition modeling (FDM), material jetting, and direct writing. Understanding the synergistic interactions between spreading and fast curing of extruded thermosetting materials will provide essential insights for applications that require accurate dimensional controls, such as additive manufacturing [1], [2] and centrifugal coating/forming [3]. Two types of thermally curing thermosets -- one being a soft silicone (Ecoflex 0050) and the other being a toughened epoxy (G/Flex) -- served as the test materials in this work to obtain models for cure kinetics and viscosity. The developed models align with extensive measurements made with differential scanning calorimetry (DSC) and rheology. DSC monitors the change in the heat of reaction, which reflects the rate and degree of cure at different crosslinking stages. Rheology measures the change in complex viscosity, shear moduli, yield stress, and other properties dictated by chemical composition. By combining DSC and rheological measurements, it is possible to establish a set of models profiling the cure kinetics and chemorheology without prior knowledge of chemical composition, which is usually necessary for sophisticated mechanistic modeling. In this work, we conducted both isothermal and dynamic measurements with both DSC and rheology. With the developed models, numerical simulations yielded predictions of diameter and height of
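
    The cure-kinetics part of such a model is commonly written as an nth-order rate law with an Arrhenius temperature dependence. The sketch below integrates that form under isothermal conditions; all parameter values are invented, whereas the paper fits its models to DSC and rheology measurements:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def isothermal_cure(A, Ea, n, T, dt, t_end):
    """Euler integration of nth-order cure kinetics, d(alpha)/dt = k(T)*(1-alpha)^n,
    with an Arrhenius rate constant k(T) = A*exp(-Ea/(R*T))."""
    k = A * math.exp(-Ea / (R * T))
    alpha, t, history = 0.0, 0.0, []
    while t <= t_end:
        history.append((t, alpha))
        alpha += dt * k * (1.0 - alpha)**n
        t += dt
    return history

# Illustrative parameters only (not fitted to Ecoflex 0050 or G/Flex data)
profile = isothermal_cure(A=1e5, Ea=60e3, n=1.2, T=350.0, dt=1.0, t_end=600.0)
```

    Coupling the degree of cure to a viscosity model is what then limits how long an extruded bead can spread before it gels.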

  11. A mechanistic, globally-applicable model of plant nitrogen uptake, retranslocation and fixation

    Science.gov (United States)

    Fisher, J. B.; Tan, S.; Malhi, Y.; Fisher, R. A.; Sitch, S.; Huntingford, C.

    2008-12-01

    Nitrogen is one of the nutrients that can most limit plant growth, and nitrogen availability may be a controlling factor on biosphere responses to climate change. We developed a plant nitrogen assimilation model based on a) advective transport through the transpiration stream, b) retranslocation whereby carbon is expended to resorb nitrogen from leaves, c) active uptake whereby carbon is expended to acquire soil nitrogen, and d) biological nitrogen fixation whereby carbon is expended for symbiotic nitrogen fixers. The model relies on 9 inputs: 1) net primary productivity (NPP), 2) plant C:N ratio, 3) available soil nitrogen, 4) root biomass, 5) transpiration rate, 6) saturated soil depth, 7) leaf nitrogen before senescence, 8) soil temperature, and 9) ability to fix nitrogen. A carbon cost of retranslocation is estimated based on leaf nitrogen and compared to an active uptake carbon cost based on root biomass and available soil nitrogen; for nitrogen fixers both costs are compared to a carbon cost of fixation dependent on soil temperature. The NPP is then allocated to optimize growth while maintaining the C:N ratio. The model outputs are total plant nitrogen uptake, remaining NPP available for growth, carbon respired to the soil and updated available soil nitrogen content. We test and validate the model (called FUN: Fixation and Uptake of Nitrogen) against data from the UK, Germany and Peru, and run the model under simplified scenarios of primary succession and climate change. FUN is suitable for incorporation into a land surface scheme of a General Circulation Model and will be coupled with a soil model and dynamic global vegetation model as part of a land surface model (JULES).
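
    The cost-comparison logic described above can be sketched as picking the cheapest nitrogen pathway per unit carbon. The cost expressions below are toy placeholders chosen only to show the structure, not the calibrated FUN functions:

```python
def cheapest_nitrogen_source(leaf_n, root_biomass, soil_n, soil_temp_c, can_fix):
    """Pick the lowest carbon cost per unit nitrogen among the pathways.
    Cost forms are illustrative stand-ins for the FUN parameterisation."""
    costs = {
        "retranslocation": 1.0 / max(leaf_n, 1e-9),               # cheap when leaves are N-rich
        "active_uptake": 1.0 / max(root_biomass * soil_n, 1e-9),  # cheap with roots + soil N
    }
    if can_fix:
        costs["fixation"] = 10.0 / max(soil_temp_c, 1.0)          # warmer soil -> cheaper fixation
    return min(costs, key=costs.get)
```

    For example, a plant with nitrogen-rich leaves but sparse roots in poor soil would resorb leaf nitrogen rather than pay the higher carbon cost of root uptake.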

  12. Trichloroethylene: Mechanistic, epidemiologic and other supporting evidence of carcinogenic hazard.

    Science.gov (United States)

    Rusyn, Ivan; Chiu, Weihsueh A; Lash, Lawrence H; Kromhout, Hans; Hansen, Johnni; Guyton, Kathryn Z

    2014-01-01

    The chlorinated solvent trichloroethylene (TCE) is a ubiquitous environmental pollutant. The carcinogenic hazard of TCE was the subject of a 2012 evaluation by a Working Group of the International Agency for Research on Cancer (IARC). Information on exposures, relevant data from epidemiologic studies, bioassays in experimental animals, and toxicity and mechanism of action studies was used to conclude that TCE is carcinogenic to humans (Group 1). This article summarizes the key evidence forming the scientific bases for the IARC classification. Exposure to TCE from environmental sources (including hazardous waste sites and contaminated water) is common throughout the world. While workplace use of TCE has been declining, occupational exposures remain of concern, especially in developing countries. The strongest human evidence is from studies of occupational TCE exposure and kidney cancer. Positive, although less consistent, associations were reported for liver cancer and non-Hodgkin lymphoma. TCE is carcinogenic at multiple sites in multiple species and strains of experimental animals. The mechanistic evidence includes extensive data on the toxicokinetics and genotoxicity of TCE and its metabolites. Together, available evidence provided a cohesive database supporting the human cancer hazard of TCE, particularly in the kidney. For other target sites of carcinogenicity, mechanistic and other data were found to be more limited. Important sources of susceptibility to TCE toxicity and carcinogenicity were also reviewed by the Working Group. In all, consideration of the multiple evidence streams presented herein informed the IARC conclusions regarding the carcinogenicity of TCE. © 2013.

  13. Fluid mechanics in dentinal microtubules provides mechanistic insights into the difference between hot and cold dental pain.

    Science.gov (United States)

    Lin, Min; Luo, Zheng Yuan; Bai, Bo Feng; Xu, Feng; Lu, Tian Jian

    2011-03-23

    Dental thermal pain is a significant health problem in daily life and dentistry. There is a long-standing question regarding the phenomenon that cold stimulation evokes sharper and more shooting pain sensations than hot stimulation. This phenomenon, however, lies beyond the well-known hydrodynamic theory used to explain the mechanism of dental thermal pain. Here, we present a mathematical model based on the hypothesis that the different directions of dentinal fluid flow induced by hot or cold stimulation, and the corresponding odontoblast movements in dentinal microtubules, contribute to different dental pain responses. We coupled a computational fluid dynamics model, describing the fluid mechanics in dentinal microtubules, with a modified Hodgkin-Huxley model, describing the discharge behavior of the intradental neuron. The simulated results agreed well with existing experimental measurements. We then demonstrated theoretically that intradental mechano-sensitive nociceptors are not "equally sensitive" to inward (into the pulp) and outward (away from the pulp) fluid flows, providing mechanistic insights into the difference between hot and cold dental pain. The model developed here could enable better diagnosis in endodontics, which requires an understanding of pulpal histology, neurology and physiology, as well as their dynamic response to the thermal stimulation used in dental practices.

  14. Mechanistic approach for the kinetics of the decomposition of nitrous oxide over calcined hydrotalcites

    Energy Technology Data Exchange (ETDEWEB)

    Dandl, H.; Emig, G. [Lehrstuhl fuer Technische Chemie I, Erlangen (Germany)

    1998-03-27

    A highly active catalyst for the decomposition of N{sub 2}O was prepared by the thermal treatment of CoLaAl-hydrotalcite. For this catalyst the reaction rate was determined at various partial pressures of N{sub 2}O, O{sub 2} and H{sub 2}O in a temperature range from 573K to 823K. The kinetic simulation resulted in a mechanistic model. The energies of activation and rate coefficients are estimated for the main steps of the reaction
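
    Estimating activation energies from rates at several temperatures, as done here, typically reduces to a linear fit of ln k against 1/T. A self-contained sketch with synthetic data over the study's 573-823 K range (the A and Ea values are made up, not the paper's estimates):

```python
import math

def arrhenius_fit(temps_K, rate_constants):
    """Least-squares fit of ln k = ln A - Ea/(R*T); returns (A, Ea)."""
    R = 8.314
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(k) for k in rate_constants]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope * R

# Synthetic rate constants from an assumed A = 1e6, Ea = 80 kJ/mol
temps = [573.0, 623.0, 673.0, 723.0, 773.0, 823.0]
ks = [1e6 * math.exp(-80e3 / (8.314 * T)) for T in temps]
A_fit, Ea_fit = arrhenius_fit(temps, ks)
```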

  15. A mechanistic approach to postirradiation spoilage kinetics of fish

    International Nuclear Information System (INIS)

    Tukenmez, I.

    2004-01-01

    Full text: In order to simulate the postirradiation spoilage of fish, the mechanistic aspects of the growth of surviving microorganisms during chill storage and their product formation in irradiated fish were analyzed. Anchovy (Engraulis encrasicholus) samples, both unirradiated and irradiated at doses of 1, 2 and 3 kGy of gamma radiation, were stored at +2 °C for 21 days. Total bacterial counts (TBC) and trimethylamine (TMA) analyses of the samples were performed periodically during storage. Kinetic model equations were derived based on the proposed spoilage mechanism. By using the experimental TBC and TMA data in the developed model, the postirradiation spoilage parameters, including growth rate constant, initial and maximum attainable TBC, lag time and TMA yield, were evaluated, and the microbial spoilage of fish was simulated for postirradiation storage. The shelf life of irradiated fish was estimated from the spoilage kinetics. Dose effects on the kinetic parameters were analyzed. It is suggested that the kinetic evaluation method developed in this study may be used for quality assessment, shelf life determination and dose optimization for radiation preservation of fish
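
    A shelf-life estimate of the kind described follows directly from a growth curve with a lag phase. The saturating-growth form and all parameter values below are toy stand-ins for the study's fitted kinetics:

```python
import math

def log_count(t, y0, ymax, mu, lag):
    """log10 bacterial count: flat during the lag phase, then saturating
    growth toward ymax (a toy stand-in for the fitted spoilage kinetics)."""
    if t <= lag:
        return y0
    return ymax - (ymax - y0) * math.exp(-mu * (t - lag))

def shelf_life(y0, ymax, mu, lag, y_spoil):
    """Storage time (days) at which the count reaches the spoilage threshold."""
    return lag + math.log((ymax - y0) / (ymax - y_spoil)) / mu

# Irradiation is represented here simply as a lower initial count and a longer lag
t_spoil = shelf_life(y0=3.0, ymax=9.0, mu=0.4, lag=2.0, y_spoil=7.0)
```

    In this toy form, a higher dose (lower y0, longer lag) directly lengthens the predicted shelf life.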

  16. Computer models versus reality: how well do in silico models currently predict the sensitization potential of a substance.

    Science.gov (United States)

    Teubner, Wera; Mehling, Anette; Schuster, Paul Xaver; Guth, Katharina; Worth, Andrew; Burton, Julien; van Ravenzwaay, Bennard; Landsiedel, Robert

    2013-12-01

    National legislations for the assessment of the skin sensitization potential of chemicals are increasingly based on the globally harmonized system (GHS). In this study, experimental data on 55 non-sensitizing and 45 sensitizing chemicals were evaluated according to GHS criteria and used to test the performance of computer (in silico) models for the prediction of skin sensitization. Statistical models (Vega, Case Ultra, TOPKAT), mechanistic models (Toxtree, OECD (Q)SAR toolbox, DEREK) and a hybrid model (TIMES-SS) were evaluated. Between three and nine of the substances evaluated were found in the individual training sets of various models. Mechanism-based models performed better than statistical models, with predictivity depending on the stringency of the domain definition. The best performance was achieved by TIMES-SS, with a perfect prediction, although only 16% of the substances were within its reliability domain. Some models offer modules for potency; however, predictions did not correlate well with the GHS sensitization subcategory derived from the experimental data. In conclusion, although mechanistic models can be used to a certain degree under well-defined conditions, at present the in silico models are not sufficiently accurate for broad application to predict skin sensitization potentials. Copyright © 2013 Elsevier Inc. All rights reserved.
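
    Performance comparisons of this kind rest on confusion-matrix statistics. A minimal sketch; the five-substance example is hypothetical, not the study's 100-chemical data set:

```python
def classifier_performance(predicted, actual):
    """Sensitivity, specificity and accuracy for binary sensitizer calls.
    Assumes both classes occur in `actual`, as in the 55/45 data set."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    tn = sum(1 for p, a in zip(predicted, actual) if not p and not a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / len(actual),
    }

# Hypothetical calls for five substances (True = sensitizer)
metrics = classifier_performance([True, True, False, False, False],
                                 [True, False, False, False, True])
```

    Restricting the evaluation to substances inside a model's reliability domain, as done for TIMES-SS, changes which predictions enter these counts and hence the reported performance.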

  17. Mechanistic Modelling of Biodiesel Production using a Liquid Lipase Formulation

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Hofmann, Björn; Silva, Vanessa T. L.

    2014-01-01

    , with respect to the industrial production of biodiesel. The developed kinetic model, coupled with a mass balance of the system, was fitted to and validated on experimental results for the fed-batch transesterification of rapeseed oil. The confidence intervals of the parameter estimates, along...... that constrains the amount of methanol in the reactor was computed and the predictions experimentally validated. Monte-Carlo simulations were then used to characterize the effect of the parameter uncertainty on the model outputs, giving a biodiesel yield, based on the mass of oil, of 90.8 ± 0.55 mass %. © 2014...

  18. Characterization of cementitiously stabilized subgrades for mechanistic-empirical pavement design

    Science.gov (United States)

    Solanki, Pranshoo

    ettringite, responsible for sulfate-induced heaving, is also discussed. For Level 2 design of pavements, a total of four stress-based statistical models and two feed-forward-type artificial neural network (ANN) models are evaluated for predicting the resilient modulus of 28-day cured stabilized specimens. Specifically, one semi-log stress-based model, three log-log stress-based models, one Multi-Layer Perceptron Network (MLPN), and one Radial Basis Function Network (RBFN) are developed. Overall, the semi-log stress-based model and the MLPN are found to show the best performance for the present evaluation and validation datasets. Further, correlations are presented for the stress-based models to correlate Mr with compacted specimen characteristics and soil/additive properties. Additionally, the effect of the type of additive on the indirect tensile and fatigue characteristics of selected stabilized P- and V-soils is evaluated. This study is based on the fact that the stabilized layer is subjected to tensile stresses under wheel loading. Thus, the resilient modulus in tension (Mrt), fatigue life and strength in tension (σt) or flexure (represented by the modulus of rupture, MOR) become important design parameters within the mechanistic framework. Cylindrical specimens are prepared, cured for 28 days and subjected to different stress sequences in indirect tension to study the Mrt. On the other hand, stabilized beam specimens are compacted using a Linear Kneading Compactor and subjected to repeated cycles of reloading-unloading after 28 days of curing in a four-point beam fatigue apparatus for evaluating fatigue life and flexural stiffness. It is found that all three additives improved the Mrt, σt and MOR values; however, the degree of improvement varied with the type of additive and soil. This study encompasses the differences in the design of semi-rigid pavements developed using the AASHTO 1993 and AASHTO 2002 MEPDG methodologies.
Further, the design curves for fatigue performance prediction of

  19. Heterogeneity of pulmonary perfusion as a mechanistic image-based phenotype in emphysema susceptible smokers.

    Science.gov (United States)

    Alford, Sara K; van Beek, Edwin J R; McLennan, Geoffrey; Hoffman, Eric A

    2010-04-20

    Recent evidence suggests that endothelial dysfunction and pathology of pulmonary vascular responses may serve as a precursor to smoking-associated emphysema. Although it is known that emphysematous destruction leads to vasculature changes, less is known about early regional vascular dysfunction which may contribute to and precede emphysematous changes. We sought to test the hypothesis, via multidetector row CT (MDCT) perfusion imaging, that smokers showing early signs of emphysema susceptibility have a greater heterogeneity in regional perfusion parameters than emphysema-free smokers and persons who had never smoked (NS). Assuming that all smokers have a consistent inflammatory response, increased perfusion heterogeneity in emphysema-susceptible smokers would be consistent with the notion that these subjects may have the inability to block hypoxic vasoconstriction in patchy, small regions of inflammation. Dynamic ECG-gated MDCT perfusion scans with a central bolus injection of contrast were acquired in 17 NS, 12 smokers with normal CT imaging studies (SNI), and 12 smokers with subtle CT findings of centrilobular emphysema (SCE). All subjects had normal spirometry. Quantitative image analysis determined regional perfusion parameters, pulmonary blood flow (PBF), and mean transit time (MTT). Mean and coefficient of variation were calculated, and statistical differences were assessed with one-way ANOVA. MDCT-based MTT and PBF measurements demonstrate globally increased heterogeneity in SCE subjects compared with NS and SNI subjects but demonstrate similarity between NS and SNI subjects. These findings demonstrate a functional lung-imaging measure that provides a more mechanistically oriented phenotype that differentiates smokers with and without evidence of emphysema susceptibility.

  20. Mechanistic studies of carbon monoxide reduction

    Energy Technology Data Exchange (ETDEWEB)

    Geoffroy, G.L.

    1990-06-12

    The progress made during the current grant period (1 January 1988--1 April 1990) in three different areas of research is summarized. The research areas are: (1) oxidatively-induced double carbonylation reactions to form {alpha}-ketoacyl complexes and studies of the reactivity of the resulting compounds, (2) mechanistic studies of the carbonylation of nitroaromatics to form isocyanates, carbamates, and ureas, and (3) studies of the formation and reactivity of unusual metallacycles and alkylidene ligands supported on binuclear iron carbonyl fragments. 18 refs., 5 figs., 1 tab.

  1. Fast integration-based prediction bands for ordinary differential equation models.

    Science.gov (United States)

    Hass, Helge; Kreutz, Clemens; Timmer, Jens; Kaschek, Daniel

    2016-04-15

    To gain a deeper understanding of biological processes and their relevance in disease, mathematical models are built upon experimental data. Uncertainty in the data leads to uncertainties in the model's parameters and, in turn, to uncertainties in predictions. Mechanistic dynamic models of biochemical networks are frequently based on nonlinear differential equation systems and feature a large number of parameters, sparse observations of the model components and lack of information in the available data. Due to the curse of dimensionality, classical and sampling approaches for propagating parameter uncertainties to predictions are hardly feasible and insufficient. However, for experimental design and to discriminate between competing models, prediction and confidence bands are essential. To circumvent the hurdles of the former methods, an approach to calculate a profile likelihood on arbitrary observations for a specific time point has been introduced, which provides accurate confidence and prediction intervals for nonlinear models and is computationally feasible for high-dimensional models. In this article, reliable and smooth point-wise prediction and confidence bands to assess the model's uncertainty on the whole time-course are achieved via explicit integration with elaborate correction mechanisms. The corresponding system of ordinary differential equations is derived and tested on three established models for cellular signalling. An efficiency analysis is performed to illustrate the computational benefit compared with repeated profile likelihood calculations at multiple time points. The integration framework and the examples used in this article are provided with the software package Data2Dynamics, which is based on MATLAB and freely available at http://www.data2dynamics.org. Contact: helge.hass@fdm.uni-freiburg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
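
    The sampling baseline that the paper's integration-based bands are designed to replace can be sketched on a toy one-parameter ODE; the model, the assumed parameter distribution, and the sample count below are all invented for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy one-state model dx/dt = -k*x; the paper's systems are nonlinear
# biochemical networks with many parameters.
def simulate(k, t_eval, x0=1.0):
    sol = solve_ivp(lambda t, x: -k * x, (t_eval[0], t_eval[-1]), [x0],
                    t_eval=t_eval)
    return sol.y[0]

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 50)

# Sampling-based uncertainty propagation: draw the parameter from an assumed
# posterior, simulate each draw, and take point-wise quantiles as the band.
k_samples = rng.normal(0.8, 0.1, 500)
trajectories = np.array([simulate(k, t) for k in k_samples])
lower, upper = np.quantile(trajectories, [0.025, 0.975], axis=0)
```

    This is feasible only because the toy model is cheap to solve; for high-dimensional networks the repeated sampling becomes exactly the bottleneck the paper addresses.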

  2. The loss of ecosystem services due to land degradation. Integration of mechanistic and probabilistic models in an Ethiopian case study

    Science.gov (United States)

    Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Peressotti, Alessandro; Black, Helaina

    2017-04-01

    Land and soil degradation are widespread, especially in dry and developing countries such as Ethiopia. Land degradation leads to the degradation of ecosystem services (ESS), because it causes the depletion and loss of several soil functions. Ethiopia's farmland faces intense degradation due to deforestation, agricultural land expansion, land overexploitation and overgrazing. In this study we modelled the impact of physical factors on ESS degradation, in particular soil erodibility, carbon storage and nutrient retention, in the Ethiopian Great Rift Valley, northwest of Hawassa. We used the Sediment Retention/Loss and Nutrient Retention/Loss models (from the InVEST software suite) and a Carbon Storage model. To run the models we coupled local soil data (such as soil organic carbon and soil texture) with remote sensing data as input in the parametrization phase, e.g. to derive a land use map and to calculate the aboveground and belowground carbon, the evapotranspiration coefficient and the capacity of vegetation to retain nutrients. We then used spatialised Bayesian Belief Networks (sBBNs) to predict ecosystem services degradation on the basis of the results of the three mechanistic models. The results show (i) the importance of mapping ESS degradation while taking into consideration the spatial heterogeneity and the cross-correlations between impacts, (ii) the fundamental role of remote sensing data in monitoring and modelling in remote, data-poor areas, and (iii) the important role of spatial BBNs in providing spatially explicit measures of risk and uncertainty. This approach could help decision makers identify priority areas for intervention in order to reduce land and ecosystem services degradation.
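
    The last step, turning mechanistic-model outputs into a probabilistic degradation estimate with a Bayesian Belief Network, can be illustrated with a deliberately tiny discrete network; the input probabilities and the conditional probability table below are invented, not taken from the study:

```python
from itertools import product

# Three binary conditions that the mechanistic models might flag for one map cell.
p_inputs = {"erosion": 0.3, "low_carbon": 0.4, "low_nutrient": 0.25}

def p_degradation_given(erosion, low_carbon, low_nutrient):
    """Hypothetical CPT: degradation risk grows with each adverse condition."""
    return min(1.0, 0.05 + 0.30 * (erosion + low_carbon + low_nutrient))

# Marginal P(degradation) for the cell, summing over all joint input states
# (inputs treated as independent in this sketch).
p_deg = 0.0
for e, c, n in product([0, 1], repeat=3):
    w = ((p_inputs["erosion"] if e else 1 - p_inputs["erosion"])
         * (p_inputs["low_carbon"] if c else 1 - p_inputs["low_carbon"])
         * (p_inputs["low_nutrient"] if n else 1 - p_inputs["low_nutrient"]))
    p_deg += w * p_degradation_given(e, c, n)
```

    Evaluating this per raster cell, with input probabilities derived from the InVEST-style model outputs, is what makes the result a spatially explicit measure of risk and uncertainty.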

  3. Proposed mechanistic description of dose-dependent BDE-47 urinary elimination in mice using a physiologically based pharmacokinetic model

    Energy Technology Data Exchange (ETDEWEB)

    Emond, Claude, E-mail: claude.emond@umontreal.ca [BioSimulation Consulting Inc., Newark, DE (United States); Departments of Environmental and Occupational Health, Medicine Faculty, University of Montreal, Montreal, Quebec (Canada); Sanders, J. Michael, E-mail: sander10@mail.nih.gov [National Cancer Institute, Research Triangle Park, NC (United States); Wikoff, Daniele, E-mail: dwikoff@toxstrategies.com [ToxStrategies, Austin, TX (United States); Birnbaum, Linda S., E-mail: birnbaumls@niehs.nih.gov [National Cancer Institute, Research Triangle Park, NC (United States)

    2013-12-01

    Polybrominated diphenyl ethers (PBDEs) have been used in a wide variety of consumer applications as additive flame retardants. In North America, scientists have noted continuing increases in the levels of PBDE congeners measured in human serum. Some recent studies have found that PBDEs are associated with adverse health effects in humans, experimental animals, and wildlife. This laboratory previously demonstrated that urinary elimination of 2,2′,4,4′-tetrabromodiphenyl ether (BDE-47) is saturable at high doses in mice; however, this dose-dependent urinary elimination has not been observed in adult rats or immature mice. Thus, the primary objective of this study was to examine the mechanism of urinary elimination of BDE-47 in adult mice using a physiologically based pharmacokinetic (PBPK) model. To support this objective, additional laboratory data were collected to evaluate the predictions of the PBPK model using novel information from adult multi-drug resistance 1a/b knockout mice. Using the PBPK model, the roles of mouse major urinary protein (m-MUP, a blood protein carrier) and P-glycoprotein (P-gp, an apical membrane transporter in proximal tubule cells in the kidneys, brain, intestines, and liver) in BDE-47 elimination were investigated. The resulting model and new data supported a major role of m-MUP in excretion of BDE-47 in the urine of adult mice, and a lesser role of P-gp as a transporter of BDE-47 in mice. This work expands the knowledge of BDE-47 kinetics between species and provides information for determining the relevancy of these data for human risk assessment purposes. - Highlights: • We report the first PBPK model for the flame retardant BDE-47 in mice. • We examine the mechanism of urinary elimination of BDE-47 in mice using a PBPK model. • We investigated the roles of m-MUP and P-gp as transporters in urinary elimination.
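
    The saturable urinary elimination at the heart of the study can be sketched with a one-compartment model whose clearance follows Michaelis-Menten (carrier-mediated) kinetics; the Vmax and Km values below are invented, not fitted BDE-47 parameters:

```python
from scipy.integrate import solve_ivp

# Invented transport parameters for a carrier-mediated (saturable) process.
VMAX, KM = 5.0, 2.0

def rhs(t, y):
    c = y[0]
    # Saturable clearance: ~first-order at low c, ~zero-order (capacity-limited)
    # at high c -- the signature of a protein-carrier/transporter mechanism.
    return [-VMAX * c / (KM + c)]

def fraction_remaining(c0, t_end=1.0):
    sol = solve_ivp(rhs, (0.0, t_end), [c0], rtol=1e-8, atol=1e-10)
    return sol.y[0, -1] / c0

low_dose, high_dose = fraction_remaining(0.1), fraction_remaining(100.0)
# At the high dose the carrier saturates, so a larger fraction of the dose
# remains -- i.e. dose-dependent elimination, as observed for BDE-47 in mice.
```

    A full PBPK model distributes this saturable term across tissue compartments; the one-compartment version only shows why elimination appears dose-dependent.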

  4. Simulation of Drought-induced Tree Mortality Using a New Individual and Hydraulic Trait-based Model (S-TEDy)

    Science.gov (United States)

    Sinha, T.; Gangodagamage, C.; Ale, S.; Frazier, A. G.; Giambelluca, T. W.; Kumagai, T.; Nakai, T.; Sato, H.

    2017-12-01

    Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change, we extended the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM) to simulate mechanistic tree mortality induced by climatic impacts via individual-tree-scale ecophysiology, such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its computational results with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing the future of the tropical rainforest were conducted using global climate model (GCM) simulation outputs.
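
    The two mechanistic death pathways named above can be caricatured as a per-individual rule; the thresholds and variable names are illustrative, not S-TEDy's actual formulation:

```python
# Illustrative thresholds (not S-TEDy parameters).
PLC_LETHAL = 0.88     # lethal fraction of xylem conductivity lost (hydraulic failure)
RESERVE_MIN = 0.0     # exhausted nonstructural carbon reserve (carbon starvation)

def tree_dies(plc, carbon_reserve):
    """An individual dies if embolism passes the lethal threshold (hydraulic
    failure) or its carbon reserve is exhausted (carbon starvation)."""
    return plc >= PLC_LETHAL or carbon_reserve <= RESERVE_MIN

cohort = [
    {"plc": 0.95, "reserve": 1.2},   # hydraulic failure
    {"plc": 0.40, "reserve": -0.1},  # carbon starvation
    {"plc": 0.40, "reserve": 1.2},   # survives the drought year
]
deaths = sum(tree_dies(t["plc"], t["reserve"]) for t in cohort)
```

    In an individual-based DGVM a rule of this kind is evaluated per tree per time step, with the loss of conductivity and the carbon reserves updated by plant-hydraulics and carbon-balance submodels.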

  5. Ionizing radiation induced cataracts: Recent biological and mechanistic developments and perspectives for future research.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Barnard, Stephen; Bright, Scott; Dalke, Claudia; Jarrin, Miguel; Kunze, Sarah; Tanner, Rick; Dynlacht, Joseph R; Quinlan, Roy A; Graw, Jochen; Kadhim, Munira; Hamada, Nobuyuki

    The lens of the eye has long been considered as a radiosensitive tissue, but recent research has suggested that the radiosensitivity is even greater than previously thought. The 2012 recommendation of the International Commission on Radiological Protection (ICRP) to substantially reduce the annual occupational equivalent dose limit for the ocular lens has now been adopted in the European Union and is under consideration around the rest of the world. However, ICRP clearly states that the recommendations are chiefly based on epidemiological evidence because there are a very small number of studies that provide explicit biological, mechanistic evidence at doses < 2 Gy. This paper aims to present a review of recently published information on the biological and mechanistic aspects of cataracts induced by exposure to ionizing radiation (IR). The data were compiled by assessing the pertinent literature in several distinct areas that contribute to the understanding of IR-induced cataracts, including information on lens biology and the general processes of cataractogenesis. Results from cellular and tissue level studies and animal models, and relevant human studies, were examined. The main focus was the biological effects of low linear energy transfer IR, but dosimetry issues and a number of other confounding factors were also considered. The results of this review clearly highlight a number of gaps in current knowledge. Overall, while there have been a number of recent advances in understanding, it remains unknown exactly how IR exposure contributes to opacification. A fuller understanding of how exposure to relatively low doses of IR promotes induction and/or progression of IR-induced cataracts will have important implications for prevention and treatment of this disease, as well as for the field of radiation protection. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  6. CO2 Mass transfer model for carbonic anhydrase-enhanced aqueous MDEA solutions

    DEFF Research Database (Denmark)

    Gladis, Arne Berthold; Deslauriers, Maria Gundersen; Neerup, Randi

    2018-01-01

    In this study a CO2 mass transfer model was developed for carbonic anhydrase-enhanced MDEA solutions based on a mechanistic kinetic enzyme model. Four different enzyme models were compared in their ability to predict the liquid side mass transfer coefficient at temperatures in the range of 298...
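
    One common mechanistic simplification in such mass transfer models is the pseudo-first-order film-theory regime, where the enzyme's effect enters through the apparent rate constant and hence the Hatta number; all numbers below are illustrative assumptions, not values from this study:

```python
import math

k_L   = 1.0e-4   # liquid-side mass transfer coefficient without reaction, m/s (assumed)
D_CO2 = 1.5e-9   # CO2 diffusivity in the solvent, m^2/s (assumed)

def hatta(k_obs):
    """Hatta number for a pseudo-first-order reaction in the liquid film;
    for Ha > 3 the enhancement factor E is approximately equal to Ha."""
    return math.sqrt(k_obs * D_CO2) / k_L

ha_uncatalyzed = hatta(10.0)    # slow MDEA-CO2 kinetics (hypothetical k_obs, 1/s)
ha_enzymatic   = hatta(1.0e5)   # carbonic-anhydrase-accelerated (hypothetical k_obs)
# The enzyme pushes Ha well past 3, moving absorption into the fast regime
# where the liquid-side coefficient is reaction-enhanced.
```

    A mechanistic enzyme model replaces the constant k_obs with a rate law in enzyme concentration and temperature, which is what lets the model predict the liquid-side coefficient across conditions.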

  7. Modeling the effects of binary mixtures on survival in time.

    NARCIS (Netherlands)

    Baas, J.; van Houte, B.P.P.; van Gestel, C.A.M.; Kooijman, S.A.L.M.

    2007-01-01

    In general, effects of mixtures are difficult to describe, and most of the models in use are descriptive in nature and lack a strong mechanistic basis. The aim of this experiment was to develop a process-based model for the interpretation of mixture toxicity measurements, with effects of binary

  8. Mechanistic Approach to Understanding the Toxicity of the Azole Fungicide Triadimefon to a Nontarget Aquatic Insect and Implications for Exposure Assessment

    Science.gov (United States)

    We utilized mechanistic and stereoselective in vitro metabolism assays and sublethal exposures of triadimefon to gain insight into the extent of carbonyl reduction and the toxic mode of action of triadimefon in black fly (Diptera: Simuliidae) larvae.

  9. Life at the Common Denominator: Mechanistic and Quantitative Biology for the Earth and Space Sciences

    Science.gov (United States)

    Hoehler, Tori M.

    2010-01-01

    The remarkable challenges and possibilities of the coming few decades will compel the biogeochemical and astrobiological sciences to characterize the interactions between biology and its environment in a fundamental, mechanistic, and quantitative fashion. The clear need for integrative and scalable biology-environment models is exemplified in the Earth sciences by the challenge of effectively addressing anthropogenic global change, and in the space sciences by the challenge of mounting a well-constrained yet sufficiently adaptive and inclusive search for life beyond Earth. Our understanding of the life-planet interaction is still, however, largely empirical. A variety of approaches seek to move from empirical to mechanistic descriptions. One approach focuses on the relationship between biology and energy, which is at once universal (all life requires energy), unique (life manages energy flow in a fashion not seen in abiotic systems), and amenable to characterization and quantification in thermodynamic terms. Simultaneously, a focus on energy flow addresses a critical point of interface between life and its geological, chemical, and physical environment. Characterizing and quantifying this relationship for life on Earth will support the development of integrative and predictive models for biology-environment dynamics. Understanding this relationship at its most fundamental level holds potential for developing concepts of habitability and biosignatures that can optimize astrobiological exploration strategies and are extensible to all life.

  10. Modeling food matrix effects on chemical reactivity: Challenges and perspectives.

    Science.gov (United States)

    Capuano, Edoardo; Oliviero, Teresa; van Boekel, Martinus A J S

    2017-06-29

    The same chemical reaction may differ in the position of its equilibrium (i.e., thermodynamics) and in its kinetics when studied in different foods. The diversity in the chemical composition of food and in its structural organization at macro-, meso-, and microscopic levels, that is, the food matrix, is responsible for this difference. In this viewpoint paper, the multiple, interconnected ways the food matrix can affect chemical reactivity are summarized. Moreover, mechanistic and empirical approaches to explain and predict the effect of the food matrix on chemical reactivity are described. Mechanistic models aim to quantify the effect of the food matrix based on a detailed understanding of the chemical and physical phenomena occurring in food. Their applicability is limited at the moment to very simple food systems. Empirical modeling based on machine learning combined with data-mining techniques may represent an alternative, useful option to predict the effect of the food matrix on chemical reactivity and to identify chemical and physical properties to be further tested. In this way the mechanistic understanding of the effect of the food matrix on chemical reactions can be improved.

  11. Secondary clarifier hybrid model calibration in full scale pulp and paper activated sludge wastewater treatment

    Energy Technology Data Exchange (ETDEWEB)

    Sreckovic, G.; Hall, E.R. [British Columbia Univ., Dept. of Civil Engineering, Vancouver, BC (Canada); Thibault, J. [Laval Univ., Dept. of Chemical Engineering, Ste-Foy, PQ (Canada); Savic, D. [Exeter Univ., School of Engineering, Exeter (United Kingdom)

    1999-05-01

    The issue of proper model calibration techniques applied to mechanistic mathematical models of activated sludge systems was discussed. Such calibrations are complex because of the non-linearity and multi-modal objective functions of the process. This paper presents a hybrid model which was developed using two techniques to model and calibrate the secondary clarifier part of an activated sludge system. Genetic algorithms were used to successfully calibrate the settler mechanistic model, and neural networks were used to reduce the error between the mechanistic model output and real-world data. Results of the modelling study show that the long-term response of a one-dimensional settler mechanistic model calibrated by genetic algorithms and compared to full-scale plant data can be improved by coupling the calibrated mechanistic model to a black-box model, such as a neural network. 11 refs., 2 figs.
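
    The serial hybrid structure described here (an evolutionary search calibrates the mechanistic model, then a black-box learner absorbs the remaining error) can be sketched as follows; the "settler" model, the data, and the polynomial stand-in for the neural network are all invented:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented plant data: effluent response vs. load, with a structured bias the
# one-parameter mechanistic model cannot represent.
load = np.linspace(0.5, 2.0, 30)
observed = 1.8 * load**1.5 + 0.3 * np.sin(4 * load) + rng.normal(0, 0.02, 30)

def mechanistic(theta, x):
    """Stand-in settler model with a single calibration parameter."""
    return theta * x**1.5

def mse(theta):
    return np.mean((observed - mechanistic(theta, load)) ** 2)

# Step 1: genetic-algorithm-style calibration, reduced here to selection plus
# mutation plus random immigrants, purely to illustrate the idea.
population = rng.uniform(0.5, 3.0, 40)
for _ in range(30):
    best = population[np.argsort([mse(th) for th in population])[:10]]
    population = np.concatenate([best,
                                 best + rng.normal(0, 0.05, 10),   # mutants
                                 best + rng.normal(0, 0.05, 10),
                                 rng.uniform(0.5, 3.0, 10)])       # immigrants
theta_star = min(population, key=mse)

# Step 2: black-box correction fitted to what the mechanistic model misses
# (a polynomial here; the paper uses a neural network).
residual = observed - mechanistic(theta_star, load)
correction = np.polyval(np.polyfit(load, residual, 5), load)
hybrid = mechanistic(theta_star, load) + correction

mse_mech, mse_hybrid = mse(theta_star), np.mean((observed - hybrid) ** 2)
```

    The design point is the division of labour: the calibrated mechanistic model carries the physics and extrapolates, while the black-box term only corrects the residual it leaves behind.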

  12. Mechanistic Features of Nanodiamonds in the Lapping of Magnetic Heads

    Directory of Open Access Journals (Sweden)

    Xionghua Jiang

    2014-01-01

    Full Text Available Nanodiamonds, which are the main components of slurry in the precision lapping process of magnetic heads, play an important role in surface quality. This paper studies the mechanistic features of nanodiamond embedment into a Sn plate in the lapping process. This is the first study to develop mathematical models for nanodiamond embedment. Such models can predict the optimum parameters for particle embedment. From the modeling calculations, the embedded pressure satisfies p0 = (3/2)·(W/πa²) and the indentation depth satisfies δ = k1·√(P/HV). Calculation results reveal that the largest embedded pressure is 731.48 GPa and the critical indentation depth δ is 7 nm. Atomic force microscopy (AFM), scanning electron microscopy (SEM), and Auger electron spectroscopy (AES) were used to carry out surface quality detection and analysis of the disk head. Both the formation of black spots on the surface and the removal rate have an important correlation with the size of nanodiamonds. The results demonstrate that an improved removal rate (21 nm·min⁻¹) can be obtained with 100 nm diamonds embedded in the plate.

  13. Mechanistic features of nanodiamonds in the lapping of magnetic heads.

    Science.gov (United States)

    Jiang, Xionghua; Chen, Zhenxing; Wolfram, Joy; Yang, Zhizhou

    2014-01-01

    Nanodiamonds, which are the main components of slurry in the precision lapping process of magnetic heads, play an important role in surface quality. This paper studies the mechanistic features of nanodiamond embedment into a Sn plate in the lapping process. This is the first study to develop mathematical models for nanodiamond embedment. Such models can predict the optimum parameters for particle embedment. From the modeling calculations, the embedded pressure satisfies p0 = (3/2)·(W/πa²) and the indentation depth satisfies δ = k1·√(P/HV). Calculation results reveal that the largest embedded pressure is 731.48 GPa and the critical indentation depth δ is 7 nm. Atomic force microscopy (AFM), scanning electron microscopy (SEM), and Auger electron spectroscopy (AES) were used to carry out surface quality detection and analysis of the disk head. Both the formation of black spots on the surface and the removal rate have an important correlation with the size of nanodiamonds. The results demonstrate that an improved removal rate (21 nm·min⁻¹) can be obtained with 100 nm diamonds embedded in the plate.

  14. Mechanistic insights on the cycloisomerization of polyunsaturated precursors catalyzed by platinum and gold complexes.

    Science.gov (United States)

    Soriano, Elena; Marco-Contelles, José

    2009-08-18

    Organometallic chemistry provides powerful tools for the stereocontrolled synthesis of heterocycles and carbocycles. The electrophilic transition metals Pt(II) and Au(I, III) are efficient catalysts in such reactions and promote a variety of organic transformations of unsaturated precursors. These reactions produce functionalized cyclic and acyclic scaffolds for the synthesis of natural and non-natural products efficiently, under mild conditions, and with excellent chemoselectivity. Because these transformations are strongly substrate-dependent, they are versatile and may yield diverse molecular scaffolds. Therefore, synthetic chemists need a mechanistic interpretation to optimize this reaction process and design a new generation of catalysts. However, so far, no intermediate species has been isolated or characterized, so the formulated mechanistic hypotheses have been primarily based on labeling studies or trapping reactions. Recently, theoretical DFT studies have become a useful tool in our research, giving us insights into the key intermediates and into a variety of plausible reaction pathways. In this Account, we present a comprehensive mechanistic overview of transformations promoted by Pt and Au in a non-nucleophilic medium based on quantum-mechanical studies. The calculations are consistent with the experimental observations and provide fundamental insights into the versatility of these reaction processes. The reactivity of these metals results from their peculiar Lewis acid properties: the alkynophilic character of these soft metals and the pi-acid activation of unsaturated groups promotes the intra- or intermolecular attack of a nucleophile. 1,n-Enynes (n = 3-8) are particularly important precursors, and their transformation may yield a variety of cycloadducts depending on the molecular structure. However, the calculations suggest that these different cyclizations would have closely related reaction mechanisms, and we propose a unified mechanistic

  15. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized, in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box", and it focuses on explaining model behavior by analyzing the relationship input-output. Since, these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis to be applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. 
Once the processes that exert the major influence in
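
    The proposed process-level sensitivity analysis can be illustrated on a toy process-based model: perturb each process's output as a whole (not its individual parameters) and compare the responses. All functions and numbers below are invented:

```python
def growth_process(biomass, light):
    return 0.1 * biomass * light                       # photosynthetic gain

def respiration_process(biomass, temp):
    return 0.02 * biomass * 1.5 ** ((temp - 20) / 10)  # Q10-style loss

def run_model(scale_growth=1.0, scale_resp=1.0, days=100):
    """Each scale factor perturbs one whole process, leaving its internals intact."""
    b = 1.0
    for _ in range(days):
        b += (scale_growth * growth_process(b, light=0.5)
              - scale_resp * respiration_process(b, temp=22.0))
    return b

base = run_model()
# Process-level one-at-a-time perturbation: +10% on each process output.
s_growth = (run_model(scale_growth=1.1) - base) / base
s_resp   = (run_model(scale_resp=1.1) - base) / base
```

    Ranking |s_growth| against |s_resp| identifies which process dominates the output without opening each process "black box" parameter by parameter, which is what keeps the approach tractable for models with large parameter spaces.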

  16. Mechanistic Studies at the Interface Between Organometallic Chemistry and Homogeneous Catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Charles P

    2012-11-14

    Mechanistic Studies at the Interface Between Organometallic Chemistry and Homogeneous Catalysis Charles P. Casey, Principal Investigator Department of Chemistry, University of Wisconsin - Madison, Madison, Wisconsin 53706 Phone 608-262-0584 FAX: 608-262-7144 Email: casey@chem.wisc.edu http://www.chem.wisc.edu/main/people/faculty/casey.html Executive Summary. Our goal was to learn the intimate mechanistic details of reactions involved in homogeneous catalysis and to use the insight we gain to develop new and improved catalysts. Our work centered on the hydrogenation of polar functional groups such as aldehydes and ketones and on hydroformylation. Specifically, we concentrated on catalysts capable of simultaneously transferring hydride from a metal center and a proton from an acidic oxygen or nitrogen center to an aldehyde or ketone. An economical iron based catalyst was developed and patented. Better understanding of fundamental organometallic reactions and catalytic processes enabled design of energy and material efficient chemical processes. Our work contributed to the development of catalysts for the selective and mild hydrogenation of ketones and aldehydes; this will provide a modern green alternative to reductions by LiAlH4 and NaBH4, which require extensive work-up procedures and produce waste streams. (C5R4OH)Ru(CO)2H Hydrogenation Catalysts. Youval Shvo described a remarkable catalytic system in which the key intermediate (C5R4OH)Ru(CO)2H (1) has an electronically coupled acidic OH unit and a hydridic RuH unit. Our efforts centered on understanding and improving upon this important catalyst for reduction of aldehydes and ketones. Our mechanistic studies established that the reduction of aldehydes by 1 to produce alcohols and a diruthenium bridging hydride species occurs much more rapidly than regeneration of the ruthenium hydride from the diruthenium bridging hydride species. 
Our mechanistic studies require simultaneous transfer of hydride from ruthenium to

  17. Electrochemical processes and mechanistic aspects of field-effect sensors for biomolecules

    Science.gov (United States)

    Huang, Weiguo; Diallo, Abdou Karim; Dailey, Jennifer L.; Besar, Kalpana

    2017-01-01

    Electronic biosensing is a leading technology for determining concentrations of biomolecules. In some cases, the presence of an analyte molecule induces a measured change in current flow, while in other cases, a new potential difference is established. In the particular case of a field effect biosensor, the potential difference is monitored as a change in conductance elsewhere in the device, such as across a film of an underlying semiconductor. Often, the mechanisms that lead to these responses are not specifically determined. Because improved understanding of these mechanisms will lead to improved performance, it is important to highlight those studies where various mechanistic possibilities are investigated. This review explores a range of possible mechanistic contributions to field-effect biosensor signals. First, we define the field-effect biosensor and the chemical interactions that lead to the field effect, followed by a section on theoretical and mechanistic background. We then discuss materials used in field-effect biosensors and approaches to improving signals from field-effect biosensors. We specifically cover the biomolecule interactions that produce local electric fields, structures and processes at interfaces between bioanalyte solutions and electronic materials, semiconductors used in biochemical sensors, dielectric layers used in top-gated sensors, and mechanisms for converting the surface voltage change to higher signal/noise outputs in circuits. PMID:29238595

  18. Xanthusbase: adapting wikipedia principles to a model organism database

    OpenAIRE

    Arshinoff, Bradley I.; Suen, Garret; Just, Eric M.; Merchant, Sohel M.; Kibbe, Warren A.; Chisholm, Rex L.; Welch, Roy D.

    2006-01-01

    xanthusBase () is the official model organism database (MOD) for the social bacterium Myxococcus xanthus. In many respects, M.xanthus represents the pioneer model organism (MO) for studying the genetic, biochemical, and mechanistic basis of prokaryotic multicellularity, a topic that has garnered considerable attention due to the significance of biofilms in both basic and applied microbiology research. To facilitate its utility, the design of xanthusBase incorporates open-source software, leve...

  19. Overview of the South African mechanistic pavement design analysis method

    CSIR Research Space (South Africa)

    Theyse, HL

    1996-01-01

    Full Text Available A historical overview of the South African mechanistic pavement design method, from its development in the early 1970s to the present, is presented. Material characterization, structural analysis, and pavement life prediction are discussed...

  20. Mechanistic Target of Rapamycin-Independent Antidepressant Effects of (R)-Ketamine in a Social Defeat Stress Model.

    Science.gov (United States)

    Yang, Chun; Ren, Qian; Qu, Youge; Zhang, Ji-Chun; Ma, Min; Dong, Chao; Hashimoto, Kenji

    2018-01-01

    The role of the mechanistic target of rapamycin (mTOR) signaling in the antidepressant effects of ketamine is controversial. In addition to mTOR, extracellular signal-regulated kinase (ERK) is a key signaling molecule in prominent pathways that regulate protein synthesis. (R)-Ketamine has a greater potency and longer-lasting antidepressant effects than (S)-ketamine. Here we investigated whether mTOR signaling and ERK signaling play a role in the antidepressant effects of the two enantiomers. The effects of mTOR inhibitors (rapamycin and AZD8055) and an ERK inhibitor (SL327) on the antidepressant effects of ketamine enantiomers in the chronic social defeat stress (CSDS) model (n = 7 or 8) were examined, as were the effects of ketamine enantiomers on these signaling pathways in mouse brain regions. The intracerebroventricular infusion of rapamycin or AZD8055 blocked the antidepressant effects of (S)-ketamine, but not (R)-ketamine, in the CSDS model. Furthermore, (S)-ketamine, but not (R)-ketamine, significantly attenuated the decreased phosphorylation of mTOR and its downstream effector, ribosomal protein S6 kinase, in the prefrontal cortex of susceptible mice after CSDS. Pretreatment with SL327 blocked the antidepressant effects of (R)-ketamine but not (S)-ketamine. Moreover, (R)-ketamine, but not (S)-ketamine, significantly attenuated the decreased phosphorylation of ERK and its upstream effector, mitogen-activated protein kinase/ERK kinase, in the prefrontal cortex and hippocampal dentate gyrus of susceptible mice after CSDS. This study suggests that mTOR plays a role in the antidepressant effects of (S)-ketamine, but not (R)-ketamine, and that ERK plays a role in (R)-ketamine's antidepressant effects. Thus, it is unlikely that the activation of mTOR signaling is necessary for the antidepressant actions of (R)-ketamine. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  1. Integrating Cellular Metabolism into a Multiscale Whole-Body Model

    Science.gov (United States)

    Krauss, Markus; Schaller, Stephan; Borchers, Steffen; Findeisen, Rolf; Lippert, Jörg; Kuepfer, Lars

    2012-01-01

    Cellular metabolism continuously processes an enormous range of external compounds into endogenous metabolites and is as such a key element in human physiology. The multifaceted physiological role of the metabolic network fulfilling the catalytic conversions can only be fully understood from a whole-body perspective where the causal interplay of the metabolic states of individual cells, the surrounding tissue and the whole organism are simultaneously considered. We here present an approach relying on dynamic flux balance analysis that allows the integration of metabolic networks at the cellular scale into standardized physiologically-based pharmacokinetic models at the whole-body level. To evaluate our approach we integrated a genome-scale network reconstruction of a human hepatocyte into the liver tissue of a physiologically-based pharmacokinetic model of a human adult. The resulting multiscale model was used to investigate hyperuricemia therapy, ammonia detoxification and paracetamol-induced toxication at a systems level. The specific models simultaneously integrate multiple layers of biological organization and offer mechanistic insights into pathology and medication. The approach presented may in future support a mechanistic understanding in diagnostics and drug development. PMID:23133351
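
    The core coupling step, embedding a flux-balance problem inside a dynamic whole-body simulation (dynamic flux balance analysis), can be sketched with a deliberately tiny "network"; the stoichiometry, kinetics, and yield below are invented, whereas the paper uses a genome-scale hepatocyte reconstruction:

```python
from scipy.optimize import linprog

# Invented uptake kinetics and biomass yield for the toy network.
VMAX, KM, YIELD = 10.0, 0.5, 0.1

def fba_step(glucose):
    """Maximize the growth flux subject to steady-state stoichiometry S.v = 0."""
    v_up_max = VMAX * glucose / (KM + glucose)   # Michaelis-Menten uptake bound
    # v = [v_uptake, v_growth]; internal metabolite balance: v_uptake - v_growth = 0
    res = linprog(c=[0.0, -1.0], A_eq=[[1.0, -1.0]], b_eq=[0.0],
                  bounds=[(0.0, v_up_max), (0.0, None)])
    return res.x

# Dynamic layer: Euler updates of the extracellular state (the "whole-body" side)
# driven by the optimal fluxes of the cellular layer.
glc, biomass, dt = 5.0, 0.01, 0.1
for _ in range(100):
    v_up, v_growth = fba_step(glc)
    glc = max(glc - v_up * biomass * dt, 0.0)    # substrate consumed by the tissue
    biomass += YIELD * v_growth * biomass * dt   # growth feeds back on consumption
```

    In the multiscale model this loop sits inside the liver compartment of the physiologically-based pharmacokinetic model, with the PBPK equations supplying and receiving the exchange fluxes at each step.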

  2. DOE/CEC [Department of Energy/Commission of the European Communities] workshop on critical evaluation of radiobiological data to biophysical modeling

    International Nuclear Information System (INIS)

    1988-01-01

    The Department of Energy's Office of Health and Environmental Research and the Commission of the European Communities (CEC) Radiation Protection Program support the majority of research in the field of radiobiological modeling. This field of science develops models based on scientifically sound principles to predict biological response (at the cellular, molecular, and animal level) to exposure to low-level ionizing radiation. Biophysical models are an important tool for estimating the response to ionizing radiation at low doses and dose rates. Generally speaking, biophysical models can be classified into two groups: (1) mechanistic models and (2) phenomenological models. Mechanistic models are based on assumptions about the physical, chemical, or biological mechanisms of action in association with radiobiological data, whereas phenomenological models are based solely on available experimental data on radiobiological effects, with less emphasis on mechanisms of action. A number of such models are under development. Since model builders rely on radiobiological data available in the literature to develop either mechanistic or phenomenological models, it is essential that existing radiobiological data be critically evaluated and that data generally considered sound and most appropriate for biophysical modeling be identified. A workshop jointly sponsored by the DOE and the CEC was held at Oak Ridge, Tennessee, from June 23-25, 1988, to review the data available from physical and chemical, cellular and molecular, and animal studies with ionizing radiation.

  3. Mechanistic pathways of recognition of a solvent-inaccessible cavity of protein by a ligand

    Science.gov (United States)

    Mondal, Jagannath; Pandit, Subhendu; Dandekar, Bhupendra; Vallurupalli, Pramodh

    One of the puzzling questions in the realm of protein-ligand recognition is how a solvent-inaccessible hydrophobic cavity of a protein gets recognized by a ligand. We address the topic by simulating, for the first time, the complete binding process of benzene from aqueous media to the well-known buried cavity of L99A T4 lysozyme at atomistic resolution. Our multiple unbiased microsecond-long trajectories, which were completely blind to the location of the target binding site, unequivocally identify the kinetic pathways along which the benzene molecule meanders across the solvent and protein and ultimately, with high precision, spontaneously recognizes the deeply buried cavity of L99A T4 lysozyme. Our simulations, combined with analysis based on a Markov state model and free energy calculations, reveal more than one distinct ligand-binding pathway. Intriguingly, each of the identified pathways involves the transient opening of a channel of the protein prior to ligand binding. The work also deciphers rich mechanistic details of the unbinding kinetics of the ligand as obtained from enhanced sampling techniques.
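The Markov state model analysis mentioned above can be sketched in miniature: discretize the trajectory into states, count transitions at a chosen lag time, row-normalize into a transition matrix, and extract the stationary distribution. The toy state sequence below is invented for illustration (a real analysis would cluster atomistic coordinates and validate the lag time via implied timescales):

```python
# Minimal Markov state model estimation from a discretized trajectory.

def estimate_msm(dtraj, n_states, lag=1):
    """Row-stochastic transition matrix from a state sequence at a given lag."""
    counts = [[0.0] * n_states for _ in range(n_states)]
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i][j] += 1.0
    T = []
    for row in counts:
        total = sum(row)
        T.append([c / total if total else 1.0 / n_states for c in row])
    return T

def stationary_distribution(T, n_iter=1000):
    """Left eigenvector for eigenvalue 1, found by power iteration."""
    n = len(T)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
        z = sum(pi)
        pi = [p / z for p in pi]
    return pi

# Toy trajectory: state 0 = solvent, 1 = surface-associated, 2 = buried cavity
dtraj = [0, 0, 1, 0, 1, 2, 2, 2, 1, 2, 2, 2, 0, 1, 2, 2, 2, 2, 1, 2]
T = estimate_msm(dtraj, 3)
pi = stationary_distribution(T)
```

With these made-up data the buried-cavity state dominates the stationary distribution, mirroring the thermodynamic preference for the bound state.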

  4. Organophotocatalysis: Insights into the Mechanistic Aspects of Thiourea-Mediated Intermolecular [2+2] Photocycloadditions.

    Science.gov (United States)

    Vallavoju, Nandini; Selvakumar, Sermadurai; Pemberton, Barry C; Jockusch, Steffen; Sibi, Mukund P; Sivaguru, Jayaraman

    2016-04-25

    Mechanistic investigations of the intermolecular [2+2] photocycloaddition of coumarin with tetramethylethylene mediated by thiourea catalysts reveal that the reaction is enabled by a combination of minimized aggregation, enhanced intersystem crossing, and altered excited-state lifetime(s). These results clarify how the excited-state reactivity can be manipulated through catalyst-substrate interactions and reveal a third mechanistic pathway for thiourea-mediated organo-photocatalysis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations.
(3) Model structural inadequacies, whereby model structure may inadequately represent
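The Nash-Sutcliffe efficiency the study found unable to discriminate between realistic and unrealistic simulations is defined as NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))², where NSE = 1 is a perfect fit and NSE ≤ 0 means the model is no better than predicting the observed mean. A small self-contained sketch (the observation/simulation values are made-up illustrative numbers):

```python
# Nash-Sutcliffe efficiency: 1 minus residual sum of squares over the
# variance-like sum of squares around the observed mean.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [0.10, 0.12, 0.35, 0.28, 0.15, 0.11]   # e.g. daily P concentrations
sim = [0.11, 0.10, 0.30, 0.31, 0.17, 0.10]
nse = nash_sutcliffe(obs, sim)
```

Because NSE is dominated by the largest residuals, a simulation that merely tracks the seasonal mean of a flashy record can still score well, which is one reason alternative statistics were needed in the study.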

  6. Analytical techniques for mechanistic characterization of EUV photoresists

    Science.gov (United States)

    Grzeskowiak, Steven; Narasimhan, Amrit; Murphy, Michael; Ackerman, Christian; Kaminsky, Jake; Brainard, Robert L.; Denbeaux, Greg

    2017-03-01

    Extreme ultraviolet (EUV, 13.5 nm) lithography is the prospective technology for high volume manufacturing by the microelectronics industry. Significant strides towards achieving adequate EUV source power and availability have been made recently, but a limited rate of improvement in photoresist performance still delays the implementation of EUV. Many fundamental questions remain to be answered about the exposure mechanisms of even the relatively well understood chemically amplified EUV photoresists. Moreover, several groups around the world are developing revolutionary metal-based resists whose EUV exposure mechanisms are even less understood. Here, we describe several evaluation techniques to help elucidate mechanistic details of EUV exposure mechanisms of chemically amplified and metal-based resists. EUV absorption coefficients are determined experimentally by measuring the transmission through a resist coated on a silicon nitride membrane. Photochemistry can be evaluated by monitoring small outgassing reaction products to provide insight into photoacid generator or metal-based resist reactivity. Spectroscopic techniques such as thin-film Fourier transform infrared (FTIR) spectroscopy can measure the chemical state of a photoresist system pre- and post-EUV exposure. Additionally, electrolysis can be used to study the interaction between photoresist components and low energy electrons. Collectively, these techniques improve our current understanding of photomechanisms for several EUV photoresist systems, which is needed to develop new, better performing materials needed for high volume manufacturing.

  7. DNA damage and radical reactions: Mechanistic aspects, formation in cells and repair studies

    International Nuclear Information System (INIS)

    Cadet, J.; Ravanat, J.L.; Carell, T.; Cellai, L.; Chatgilialoglu, Ch.; Gimisis, Th.; Miranda, M.; O'Neill, P.; Robert, M.

    2008-01-01

    Several examples of oxidative and reductive reactions of DNA components that lead to single and tandem modifications are discussed in this review. These include nucleophilic addition reactions of the one-electron oxidation-mediated guanine radical cation and the one-electron reduced intermediate of 8-bromopurine 2'-deoxyribonucleosides that give rise to either an oxidizing guanine radical or related 5',8-cyclopurine nucleosides. In addition, mechanistic insights into the reductive pathways involved in the photolyase-induced reversal of cyclobutadipyrimidine and pyrimidine (6-4) pyrimidone photoproducts are provided. Evidence for the occurrence and validation in cellular DNA of hydroxyl radical (•OH) degradation pathways of guanine that have been established in model systems has been gained from the accurate measurement of degradation products. Relevant information on biochemical aspects of the repair of single and clustered oxidatively generated damage to DNA has been gained from detailed investigations that rely on the synthesis of suitable modified probes. Thus the preparation of defined-sequence oligonucleotides containing stable carbocyclic derivatives of purine nucleosides has allowed detailed crystallographic studies of the recognition step of the base damage by enzymes implicated in the base excision repair (BER) pathway. Detailed insights are provided on the BER processing of non-double strand break bi-stranded clustered damage that may consist of base lesions, a single strand break or abasic sites and represent one of the main deleterious classes of radiation-induced DNA damage. (authors)

  8. Mechanistic study on spraying of blended biodiesel using phase Doppler anemometry

    International Nuclear Information System (INIS)

    Kamrak, Juthamas; Kongsombut, Benjapol; Grehan, Gerard; Saengkaew, Sawitree; Kim, Kyo-Seon; Charinpanitkul, Tawatchai

    2009-01-01

    Droplet size and dynamics of blended palm oil-based fatty acid methyl ester (FAME) and diesel oil spray were mechanistically investigated using phase Doppler anemometry. A two-fluid atomizer was applied for dispersing viscous blends of biodiesel oil at designated flow rates. It was experimentally found that the atomizer could generate a spray of large droplets with Sauter mean diameters of ca. 30 μm at low air injection pressure. Such large droplets traveled with a low velocity along their trajectory after emerging from the nozzle tip. The viscosity of blended biodiesel could significantly affect the atomizing process, resulting in a controlled droplet size distribution. Blended biodiesel with a certain fraction of palm oil-based FAME would be consistently atomized owing to its low viscosity. However, the viscosity exerted only a small effect on the droplet velocity profile at air injection pressures higher than 0.2 MPa.

  9. Evaluating the mechanistic evidence and key data gaps in assessing the potential carcinogenicity of carbon nanotubes and nanofibers in humans

    NARCIS (Netherlands)

    Kuempel, Eileen D; Jaurand, Marie-Claude; Møller, Peter; Morimoto, Yasuo; Kobayashi, Norihiro; Pinkerton, Kent E; Sargent, Linda M; Vermeulen, Roel C H; Fubini, Bice; Kane, Agnes B

    2017-01-01

    In an evaluation of carbon nanotubes (CNTs) for the IARC Monograph 111, the Mechanisms Subgroup was tasked with assessing the strength of evidence on the potential carcinogenicity of CNTs in humans. The mechanistic evidence was considered to be not strong enough to alter the evaluations based on the

  10. Does Mechanistic Thinking Improve Student Success in Organic Chemistry?

    Science.gov (United States)

    Grove, Nathaniel P.; Cooper, Melanie M.; Cox, Elizabeth L.

    2012-01-01

    The use of the curved-arrow notation to depict electron flow during mechanistic processes is one of the most important representational conventions in the organic chemistry curriculum. Our previous research documented a disturbing trend: when asked to predict the products of a series of reactions, many students do not spontaneously engage in…

  11. Toward a Rational and Mechanistic Account of Mental Effort.

    Science.gov (United States)

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  12. A mechanistic based approach for enhancing buccal mucoadhesion of chitosan

    DEFF Research Database (Denmark)

    Meng-Lund, Emil; Muff-Westergaard, Christian; Sander, Camilla

    2014-01-01

    Mucoadhesive buccal drug delivery systems can enhance rapid drug absorption by providing an increased retention time at the site of absorption and a steep concentration gradient. An understanding of the mechanisms behind mucoadhesion of polymers, e.g. chitosan, is necessary for improving the mucoadhesiveness of buccal formulations. The interaction between chitosan of different chain lengths and porcine gastric mucin (PGM) was studied using a complex coacervation model (CCM), isothermal titration calorimetry (ITC) and a tensile detachment model (TDM). The effect of pH was assessed in all three models, and adding a buffer to chitosan-based drug delivery systems is a means to optimize and enhance buccal drug absorption. The CCM demonstrated optimal interactions between chitosan and PGM at pH 5.2. The ITC experiments showed a significantly increased affinity between chitosan and PGM at pH 5

  13. Predicting the impact of long-term temperature changes on the epidemiology and control of schistosomiasis: a mechanistic model.

    Directory of Open Access Journals (Sweden)

    Tara D Mangal

    2008-01-01

    Many parasites of medical and veterinary importance are transmitted by cold-blooded intermediate hosts or vectors, the abundance of which varies with ambient temperature, potentially altering disease prevalence. In particular, if global climate change increases the mean ambient temperature in a region where a human pathogen is endemic, the incidence of disease may similarly increase. Here we examine this possibility by using a mathematical model to explore the effects of increasing long-term mean ambient temperature on the prevalence and abundance of the parasite Schistosoma mansoni, the causative agent of schistosomiasis in humans. The model showed that the impact of temperature on disease prevalence and abundance is not straightforward; the mean infection burden in humans increases up to 30 degrees C, but then crashes at 35 degrees C, primarily due to increased mortality of the snail intermediate host. In addition, increased temperatures changed the dynamics of disease from stable, endemic infection to unstable, epidemic cycles at 35 degrees C. However, the prevalence of infection was largely unchanged by increasing temperatures. Temperature increases also affected the response of the model to changes in each parameter, indicating that certain control strategies may become less effective with local temperature changes. At lower temperatures, the most effective single control strategy is to target the adult parasites through chemotherapy. However, as temperatures increase, targeting the snail intermediate hosts, for example through molluscicide use, becomes more effective. These results show that S. mansoni will not respond to increased temperatures in a linear fashion, and the optimal control strategy is likely to change as temperatures change.
It is only through a mechanistic approach, incorporating the combined effects of temperature on all stages of the life-cycle, that we can begin to predict the consequences of climate
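The hump-shaped response described (burden rising toward 30 °C, then crashing at 35 °C as snail mortality overtakes recruitment) can be reproduced qualitatively with a Macdonald-style equilibrium sketch. All functional forms and parameter values below are hypothetical illustrations chosen to show the qualitative pattern, not the study's fitted model:

```python
import math

# Toy equilibrium worm burden: transmission rises smoothly with temperature,
# while snail mortality rises faster; above the temperature where mortality
# exceeds the snail birth rate, the snail population (and burden) collapses.

def equilibrium_burden(temp_c, k_snail=1000.0, birth=0.5,
                       mu0=0.05, mu_slope=0.25,
                       beta0=1e-3, beta_slope=0.15, mu_worm=0.2):
    """Equilibrium mean worm burden for a Macdonald-style toy model."""
    mu_snail = mu0 * math.exp(mu_slope * (temp_c - 25.0))   # snail mortality
    beta = beta0 * math.exp(beta_slope * (temp_c - 25.0))   # transmission rate
    s_eq = max(k_snail * (1.0 - mu_snail / birth), 0.0)     # snail density
    return beta * s_eq / mu_worm                            # mean burden

burdens = {t: equilibrium_burden(t) for t in (20, 25, 30, 35)}
```

With these assumed parameters the burden increases from 20 °C through 30 °C and drops to zero at 35 °C, where snail mortality exceeds the birth rate.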

  14. Mechanistic Bases of Neurotoxicity Provoked by Fatty Acids Accumulating in MCAD and LCHAD Deficiencies

    Directory of Open Access Journals (Sweden)

    Alexandre U. Amaral PhD

    2017-03-01

    Fatty acid oxidation defects (FAODs) are inherited metabolic disorders caused by deficiency of specific enzyme activities or transport proteins involved in the mitochondrial catabolism of fatty acids. Medium-chain fatty acyl-CoA dehydrogenase (MCAD) and long-chain 3-hydroxyacyl-CoA dehydrogenase (LCHAD) deficiencies are relatively common FAODs, biochemically characterized by tissue accumulation of medium-chain fatty acids and long-chain 3-hydroxy fatty acids and their carnitine derivatives, respectively. Patients with MCAD deficiency usually have episodic encephalopathic crises and liver biochemical alterations, especially during crises of metabolic decompensation, whereas patients with LCHAD deficiency present severe hepatopathy, cardiomyopathy, and acute and/or progressive encephalopathy. Although neurological symptoms are common features, the underlying mechanisms responsible for the brain damage in these disorders are still under debate. In this context, energy deficiency due to defective fatty acid catabolism and hypoglycemia/hypoketonemia has been postulated to contribute to the pathophysiology of MCAD and LCHAD deficiencies. However, since energetic substrate supplementation is not able to reverse or prevent symptomatology in some patients, it is presumed that other pathogenetic mechanisms are implicated. Since worsening of clinical symptoms during crises is accompanied by significant increases in the concentrations of the accumulating fatty acids, it is conceivable that these compounds may be potentially neurotoxic. We will briefly summarize the current knowledge obtained from patients with these disorders, as well as from animal studies demonstrating deleterious effects of the major fatty acids accumulating in MCAD and LCHAD deficiencies, indicating that disruption of mitochondrial energy, redox, and calcium homeostasis is involved in the pathophysiology of the cerebral damage in these diseases. 
It is presumed that these findings based on the

  15. Existing pavement input information for the mechanistic-empirical pavement design guide.

    Science.gov (United States)

    2009-02-01

    The objective of this study is to systematically evaluate the Iowa Department of Transportation's (DOT's) existing Pavement Management Information System (PMIS) with respect to the input information required for Mechanistic-Empirical Pavement Des...

  16. Towards a resource-based habitat approach for spatial modelling of vector-borne disease risks.

    Science.gov (United States)

    Hartemink, Nienke; Vanwambeke, Sophie O; Purse, Bethan V; Gilbert, Marius; Van Dyck, Hans

    2015-11-01

    Given the veterinary and public health impact of vector-borne diseases, there is a clear need to assess the suitability of landscapes for the emergence and spread of these diseases. Current approaches for predicting disease risks neglect key features of the landscape as components of the functional habitat of vectors or hosts, and hence of the pathogen. Empirical-statistical methods do not explicitly incorporate biological mechanisms, whereas current mechanistic models are rarely spatially explicit; both methods ignore the way animals use the landscape (i.e. movement ecology). We argue that applying a functional concept for habitat, i.e. the resource-based habitat concept (RBHC), can solve these issues. The RBHC offers a framework to identify systematically the different ecological resources that are necessary for the completion of the transmission cycle and to relate these resources to (combinations of) landscape features and other environmental factors. The potential of the RBHC as a framework for identifying suitable habitats for vector-borne pathogens is explored and illustrated with the case of bluetongue virus, a midge-transmitted virus affecting ruminants. The concept facilitates the study of functional habitats of the interacting species (vectors as well as hosts) and provides new insight into spatial and temporal variation in transmission opportunities and exposure that ultimately determine disease risks. It may help to identify knowledge gaps and control options arising from changes in the spatial configuration of key resources across the landscape. The RBHC framework may act as a bridge between existing mechanistic and statistical modelling approaches. © 2014 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.

  17. Hidden Hydride Transfer as a Decisive Mechanistic Step in the Reactions of the Unligated Gold Carbide [AuC]+ with Methane under Ambient Conditions.

    Science.gov (United States)

    Li, Jilai; Zhou, Shaodong; Schlangen, Maria; Weiske, Thomas; Schwarz, Helmut

    2016-10-10

    The reactivity of the cationic gold carbide [AuC]+ (bearing an electrophilic carbon atom) towards methane has been studied using Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). The product pairs generated, that is, Au+/C2H4, [Au(C2H2)]+/H2, and [C2H3]+/AuH, point to the breaking and making of C-H, C-C, and H-H bonds under single-collision conditions. The mechanisms of these rather efficient reactions have been elucidated by high-level quantum-chemical calculations. As a major result, based on molecular orbital and NBO-based charge analysis, an unprecedented hydride transfer from methane to the carbon atom of [AuC]+ has been identified as a key step. Also, the origin of this novel mechanistic scenario has been addressed. The mechanistic insights derived from this study may provide guidance for the rational design of carbon-based catalysts. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Hydrogel-based 3D model of patient-derived prostate xenograft tumors suitable for drug screening.

    Science.gov (United States)

    Fong, Eliza L S; Martinez, Mariane; Yang, Jun; Mikos, Antonios G; Navone, Nora M; Harrington, Daniel A; Farach-Carson, Mary C

    2014-07-07

    The lack of effective therapies for bone metastatic prostate cancer (PCa) underscores the need for accurate models of the disease to enable the discovery of new therapeutic targets and to test drug sensitivities of individual tumors. To this end, the patient-derived xenograft (PDX) PCa model using immunocompromised mice was established to model the disease with greater fidelity than is possible with currently employed cell lines grown on tissue culture plastic. However, poorly adherent PDX tumor cells exhibit low viability in standard culture, making it difficult to manipulate these cells for subsequent controlled mechanistic studies. To overcome this challenge, we encapsulated PDX tumor cells within a three-dimensional hyaluronan-based hydrogel and demonstrated that the hydrogel maintains PDX cell viability with continued native androgen receptor expression. Furthermore, a differential sensitivity to docetaxel, a chemotherapeutic drug, was observed as compared to a traditional PCa cell line. These findings underscore the potential impact of this novel 3D PDX PCa model as a diagnostic platform for rapid drug evaluation and ultimately push personalized medicine toward clinical reality.

  19. Generative Mechanistic Explanation Building in Undergraduate Molecular and Cellular Biology

    Science.gov (United States)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-01-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among…

  20. A mechanistic model to study the thermal ecology of a southeastern pacific dominant intertidal mussel and implications for climate change.

    Science.gov (United States)

    Finke, G R; Bozinovic, F; Navarrete, S A

    2009-01-01

    Developing mechanistic models to predict an organism's body temperature facilitates the study of physiological stresses caused by extreme climatic conditions the species might have faced in the past or making predictions about changes to come in the near future. Because the models combine empirical observation of different climatic variables with essential morphological attributes of the species, it is possible to examine specific aspects of predicted climatic changes. Here, we develop a model for the competitively dominant intertidal mussel Perumytilus purpuratus that estimates body temperature on the basis of meteorological and tidal data with an average difference (±SE) of 0.410 ± 0.0315 °C in comparison with a field-deployed temperature logger. Modeled body temperatures of P. purpuratus in central Chile regularly exceeded 30 °C in summer months, and values as high as 38 °C were found. These results suggest that the temperatures reached by mussels in the intertidal zone in central Chile are not sufficiently high to induce significant mortality on adults of this species; however, because body temperatures >40 °C can be lethal for this species, sublethal effects on physiological performance warrant further investigation. Body temperatures of mussels increased sigmoidally with increasing tidal height. Body temperatures of individuals from approximately 70% of the tidal range leveled off and did not increase any further with increasing tidal height. Finally, body size played an important role in determining body temperature. A hypothetical 5-cm-long mussel (only 1 cm longer than mussels found in nature) did reach potentially lethal body temperatures, suggesting that the biophysical environment may play a role in limiting the size of this small species.
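A heat-budget model of this kind balances absorbed short-wave radiation against convective exchange with the air. The sketch below is a deliberately reduced version with assumed coefficient values (it omits conduction to rock, evaporative cooling and long-wave radiation, which a full model would include); it illustrates why body temperature rises with body length when the convection coefficient falls with size:

```python
# Steady-state balance: absorptivity * S = h * (T_body - T_air).
# The convection coefficient h is taken from a flat-plate-style correlation,
# growing with wind speed and shrinking with characteristic length
# (the prefactor 6.0 is an assumed illustrative value).

def steady_body_temp(t_air_c, solar_wm2, length_m, wind_ms,
                     absorptivity=0.75):
    """Steady-state body temperature (°C) from a solar/convection balance."""
    h = 6.0 * (wind_ms / length_m) ** 0.5      # W m^-2 K^-1 (assumed form)
    return t_air_c + absorptivity * solar_wm2 / h

# A longer animal has a lower h, so it equilibrates hotter under the same sky.
t_small = steady_body_temp(25.0, 800.0, length_m=0.04, wind_ms=2.0)
t_large = steady_body_temp(25.0, 800.0, length_m=0.05, wind_ms=2.0)
```

With these made-up inputs the 4-cm body stays just below 40 °C while the 5-cm body crosses it, echoing the paper's point that size alone can push an otherwise identical mussel past a lethal threshold.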

  1. Enantioselective Rhodium-Catalyzed [2+2+2] Cycloadditions of Terminal Alkynes and Alkenyl Isocyanates: Mechanistic Insights Lead to a Unified Model that Rationalizes Product Selectivity

    Science.gov (United States)

    Dalton, Derek M.; Oberg, Kevin M.; Yu, Robert T.; Lee, Ernest E.; Perreault, Stéphane; Oinen, Mark Emil; Pease, Melissa L.; Malik, Guillaume; Rovis, Tomislav

    2009-01-01

    This manuscript describes the development and scope of the asymmetric rhodium-catalyzed [2+2+2] cycloaddition of terminal alkynes and alkenyl isocyanates leading to the formation of indolizidine and quinolizidine scaffolds. The use of phosphoramidite ligands proved crucial for avoiding competitive terminal alkyne dimerization. Both aliphatic and aromatic terminal alkynes participate well, with product selectivity a function of both the steric and electronic character of the alkyne. Manipulation of the phosphoramidite ligand leads to tuning of enantio- and product selectivity, with a complete turnover in product selectivity seen with aliphatic alkynes when moving from Taddol-based to biphenol-based phosphoramidites. Terminal and 1,1-disubstituted olefins are tolerated with nearly equal efficacy. Examination of a series of competition experiments in combination with analysis of reaction outcome shed considerable light on the operative catalytic cycle. Through a detailed study of a series of X-ray structures of rhodium(cod)chloride/phosphoramidite complexes, we have formulated a mechanistic hypothesis that rationalizes the observed product selectivity. PMID:19817441

  2. Tribocorrosion in pressurized high temperature water: a mass flow model based on the third body approach

    Energy Technology Data Exchange (ETDEWEB)

    Guadalupe Maldonado, S.

    2014-07-01

    Pressurized water reactors (PWR) used for power generation are operated at elevated temperatures (280-300 °C) and at high pressure (120-150 bar). In addition to these harsh environmental conditions, some components of the PWR assemblies are subject to mechanical loading (sliding, vibration and impacts) leading to undesirable and hardly controllable material degradation phenomena. In such situations wear is determined by the complex interplay (tribocorrosion) between mechanical, material and physical-chemical phenomena. Tribocorrosion in PWR conditions is at present little understood, and models need to be developed in order to predict component lifetime over several decades. The goal of this project, carried out in collaboration with the French company AREVA NP, is to develop a predictive model based on the mechanistic understanding of tribocorrosion of specific PWR components (stainless steel control assemblies, stellite grippers). The approach taken here is to describe degradation in terms of electrochemical and mechanical material flows (third-body concept of tribology) from the metal into the friction film (i.e. the oxidized film forming during rubbing on the metal surface) and from the friction film into the environment, instead of simple mass-loss considerations. The project involves the establishment of mechanistic models describing the individual flows, based on ad hoc tribocorrosion measurements at low temperature. The overall behaviour at high temperature and pressure is investigated using a dedicated tribometer (Aurore) including electrochemical control of the contact during rubbing. Physical laws describing the individual flows according to defined mechanisms and as a function of defined physical parameters were identified based on the obtained experimental results and on literature data. The physical laws were converted into mass flow rates and solved as a system of differential equations by considering the mass balance in compartments
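The compartment mass-balance idea can be sketched with a single friction-film compartment fed by mechanical and electrochemical flows from the metal and drained by ejection into the environment. The rate constants below are arbitrary illustrative values, not PWR data:

```python
# Toy third-body mass balance: dm/dt = j_mech + j_chem - k_eject * m,
# where m is the friction-film mass, j_mech and j_chem are the mechanical
# and electrochemical source flows from the metal, and k_eject drains the
# film into the environment. Integrated with explicit Euler steps.

def film_mass_history(j_mech=2.0, j_chem=0.5, k_eject=0.8,
                      dt=0.01, t_end=10.0):
    """Friction-film mass over time for constant source flows."""
    m, t, hist = 0.0, 0.0, [0.0]
    while t < t_end:
        m += (j_mech + j_chem - k_eject * m) * dt
        t += dt
        hist.append(m)
    return hist

hist = film_mass_history()
steady = (2.0 + 0.5) / 0.8   # analytic steady state: sources / ejection rate
```

The film mass relaxes to the analytic steady state (j_mech + j_chem)/k_eject, and total metal loss is the time integral of the source flows rather than a single mass-loss number, which is the point of the flow-based description.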

  3. Disentangling the Role of Domain-Specific Knowledge in Student Modeling

    Science.gov (United States)

    Ruppert, John; Duncan, Ravit Golan; Chinn, Clark A.

    2017-08-01

    This study explores the role of domain-specific knowledge in students' modeling practice and how this knowledge interacts with two domain-general modeling strategies: use of evidence and developing a causal mechanism. We analyzed models made by middle school students who had a year of intensive model-based instruction. These models were made to explain a familiar but unstudied biological phenomenon: late onset muscle pain. Students were provided with three pieces of evidence related to this phenomenon and asked to construct a model to account for this evidence. Findings indicate that domain-specific resources play a significant role in the extent to which the models accounted for provided evidence. On the other hand, familiarity with the situation appeared to contribute to the mechanistic character of models. Our results indicate that modeling strategies alone are insufficient for the development of a mechanistic model that accounts for provided evidence and that, while learners can develop a tentative model with a basic familiarity of the situation, scaffolding certain domain-specific knowledge is necessary to assist students with incorporating evidence in modeling tasks.

  4. Sorption isotherms: A review on physical bases, modeling and measurement

    Energy Technology Data Exchange (ETDEWEB)

    Limousin, G. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France) and Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France)]. E-mail: guillaumelimousin@yahoo.fr; Gaudet, J.-P. [Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France); Charlet, L. [Laboratoire de Geophysique Interne et Techtonophysique - CNRS-IRD-LCPC-UJF-Universite de Savoie, BP 53, 38041 Grenoble Cedex (France); Szenknect, S. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Barthes, V. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Krimissa, M. [Electricite de France, Division Recherche et Developpement, Laboratoire National d' Hydraulique et d' Environnement - P78, 6 quai Watier, 78401 Chatou (France)

    2007-02-15

    The retention (or release) of a liquid compound on a solid controls the mobility of many substances in the environment and has been quantified in terms of the 'sorption isotherm'. This paper does not review the different sorption mechanisms. It presents the physical bases underlying the definition of a sorption isotherm, different empirical or mechanistic models, and details several experimental methods to acquire a sorption isotherm. For appropriate measurements and interpretations of isotherm data, this review emphasizes four main points: (i) the adsorption (or desorption) isotherm does not automatically provide any information about the reactions involved in the sorption phenomenon, so mechanistic interpretations must be carefully verified. (ii) Among studies, the range of reaction times is extremely wide, and this can lead to misinterpretations regarding the irreversibility of the reaction: a pseudo-hysteresis of the release compared with the retention is often observed. Comparing the mean characteristic time of the reaction with the mean residence time of the mobile phase in the natural system makes it possible to decide whether the studied retention/release phenomenon should be treated as instantaneous and reversible, as almost irreversible, or whether reaction kinetics must be taken into account. (iii) When the concentration of the retained substance is low enough, the composition of the bulk solution remains constant and a single-species isotherm is often sufficient, although it remains strongly dependent on the background medium. At higher concentrations, sorption may be driven by competition between several species that affect the composition of the bulk solution. (iv) The measurement method has a great influence. In particular, the background ionic medium, the solid/solution ratio and the use of a flow-through or closed reactor are of major importance. The chosen method should balance easy-to-use features and representativity of the studied
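
    As a concrete illustration of the empirical isotherm models the review discusses, the following sketch fits a Langmuir isotherm to batch-reactor data; the data points and initial guesses are invented for demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k_l):
    """Langmuir isotherm: sorbed amount q vs equilibrium concentration c."""
    return q_max * k_l * c / (1.0 + k_l * c)

# Synthetic batch-reactor data (invented, roughly Langmuir-shaped)
c = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])   # mg/L in solution
q = np.array([0.45, 1.7, 2.6, 3.5, 4.3, 4.6])   # mg/g sorbed

(q_max, k_l), _ = curve_fit(langmuir, c, q, p0=[5.0, 1.0])
```

    A Freundlich model (q = K c^n) could be fitted the same way; as the review stresses, a good fit of either model says nothing by itself about the underlying sorption reactions.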

  5. Redox-based epigenetic status in drug addiction: a potential contributor to gene priming and a mechanistic rationale for metabolic intervention.

    Science.gov (United States)

    Trivedi, Malav S; Deth, Richard

    2014-01-01

    Alcohol and other drugs of abuse, including psychostimulants and opioids, can induce epigenetic changes: a contributing factor in drug addiction, tolerance, and associated withdrawal symptoms. DNA methylation is a major epigenetic mechanism; it is one of more than 200 methylation reactions supported by the methyl donor S-adenosylmethionine (SAM). Levels of SAM are controlled by cellular redox status via the folate- and vitamin B12-dependent enzyme methionine synthase (MS). For example, under oxidative conditions MS is inhibited, diverting its substrate homocysteine (HCY) to the transsulfuration pathway. Alcohol, dopamine, and morphine can alter intracellular levels of glutathione (GSH)-based cellular redox status, subsequently affecting SAM levels and DNA methylation status. Here, existing evidence is presented in a coherent manner to propose a novel hypothesis implicating the involvement of redox-based epigenetic changes in drug addiction. Further, we discuss how a "gene priming" phenomenon can contribute to the maintenance of redox and methylation homeostasis under various stimuli, including drugs of abuse. Additionally, a new mechanistic rationale for the use of metabolic interventions/redox replenishers as symptomatic treatment of alcohol and other drug addiction and associated withdrawal symptoms is provided. Hence, the current review article strengthens the hypothesis that neuronal metabolism has a critical bidirectional coupling with epigenetic changes in drug addiction, exemplified by the link between redox-based metabolic changes and the resultant epigenetic consequences under the effect of drugs of abuse.

  6. Comparing statistical and process-based flow duration curve models in ungauged basins and changing rain regimes

    Science.gov (United States)

    Müller, M. F.; Thompson, S. E.

    2016-02-01

    The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
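
    The quantities compared in the study, the flow duration curve and the Nash-Sutcliffe efficiency, can be computed with a short sketch; the streamflow record below is synthetic, not the Nepal data.

```python
import numpy as np

def flow_duration_curve(q):
    """Return (exceedance probability, sorted flows) for a streamflow record."""
    q_sorted = np.sort(q)[::-1]                                       # descending flows
    p_exceed = np.arange(1, q_sorted.size + 1) / (q_sorted.size + 1)  # Weibull positions
    return p_exceed, q_sorted

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the mean of obs."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic daily streamflow record for illustration
rng = np.random.default_rng(0)
q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=365)
p, q_fdc = flow_duration_curve(q_obs)
nse_perfect = nash_sutcliffe(q_obs, q_obs)   # identical series give NSE = 1
```

    A statistical approach interpolates parameters of a distribution fitted to q_fdc across catchments, while a process-based approach derives the distribution from climate and landscape parameters; both are then scored with metrics like the NSE above.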

  7. Ensemble modeling of E. coli in the Charles River, Boston, Massachusetts, USA.

    Science.gov (United States)

    Hellweger, F L

    2007-01-01

    A case study of ensemble modeling of Escherichia coli (E. coli) densities in surface waters in the context of public health risk prediction is presented. The outputs of two different models, one mechanistic and one empirical, are combined and compared to data. The mechanistic model is a high-resolution, time-variable, three-dimensional coupled hydrodynamic and water quality model. It generally reproduces the mechanisms of E. coli fate and transport in the river, including the presence and absence of a plume in the study area under similar input but different hydrodynamic conditions, caused by the operation of a downstream dam and by wind. At the time series station, the model has a root mean square error (RMSE) of 370 CFU/100 mL, a total error rate (TER, with respect to the EPA-recommended single-sample criterion of 235 CFU/100 mL) of 15% and a negative error rate (NER) of 30%. The empirical model is based on multiple linear regression using the forcing functions of the mechanistic model as independent variables. It has better overall performance (at the time series station), due to a strong correlation of E. coli density with upstream inflow for this time period (RMSE = 200 CFU/100 mL, TER = 13%, NER = 1.6%). However, the model is mechanistically incorrect in that it predicts decreasing densities with increasing Combined Sewer Overflow (CSO) input. The two models are fundamentally different and their errors are uncorrelated (R² = 0.02), which motivates their combination in an ensemble. Two combination approaches, a geometric mean ensemble (GME) and an "either exceeds" ensemble (EEE), are explored. The GME model outperforms the mechanistic and empirical models in terms of RMSE (190 CFU/100 mL) and TER (11%), but has a higher NER (23%). The EEE has a relatively high TER (16%) but a low NER (0.8%) and may be the best method for a conservative prediction. The study demonstrates the potential utility of ensemble modeling for pathogen indicators, but significant further research is
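
    The two combination rules described above can be sketched directly. The densities below are invented, and the NER definition used here (fraction of observed exceedances that the prediction misses) is an assumption consistent with the abstract's usage.

```python
import numpy as np

CRITERION = 235.0   # EPA single-sample criterion, CFU/100 mL

def geometric_mean_ensemble(pred_a, pred_b):
    """GME: combine two positive density predictions via their geometric mean."""
    return np.sqrt(np.asarray(pred_a) * np.asarray(pred_b))

def either_exceeds_ensemble(pred_a, pred_b, threshold=CRITERION):
    """EEE: flag an exceedance whenever either model predicts one (conservative)."""
    return (np.asarray(pred_a) > threshold) | (np.asarray(pred_b) > threshold)

def error_rates(obs, pred, threshold=CRITERION):
    """Total error rate and negative error rate (missed observed exceedances)."""
    obs_ex = np.asarray(obs) > threshold
    pred_ex = np.asarray(pred) > threshold
    ter = np.mean(obs_ex != pred_ex)
    ner = np.mean(~pred_ex[obs_ex]) if obs_ex.any() else 0.0
    return ter, ner

# Invented observations and model predictions (CFU/100 mL)
obs = np.array([100.0, 300.0, 500.0, 150.0])
mech = np.array([120.0, 200.0, 600.0, 140.0])
emp = np.array([90.0, 400.0, 300.0, 260.0])
gme = geometric_mean_ensemble(mech, emp)
```

    The GME always lies between the two member predictions, while the EEE trades a higher total error rate for fewer missed exceedances, exactly the behaviour reported in the study.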

  8. Mechanistic study on spraying of blended biodiesel using phase Doppler anemometry

    Energy Technology Data Exchange (ETDEWEB)

    Kamrak, Juthamas; Kongsombut, Benjapol; Charinpanitkul, Tawatchai [Center of Excellence in Particle Technology, Department of Chemical Engineering, Faculty of Engineering, Chulalongkorn University, Payathai Road, Patumwan, Bangkok 10330 (Thailand); Grehan, Gerard; Saengkaew, Sawitree [LESP/UMR CNRS6614/INSA et Universite de Rouen, BP 12, avenue de l' universite, 76801, Saint Etienne du Rouvray (France); Kim, Kyo-Seon [Department of Chemical Engineering, Faculty of Engineering, Kangwon National University, Chuncheon (Korea)

    2009-10-15

    Droplet size and dynamics of sprays of blended palm oil-based fatty acid methyl ester (FAME) and diesel oil were mechanistically investigated using phase Doppler anemometry. A two-fluid atomizer was used to disperse viscous blends of biodiesel oil at designated flow rates. It was found experimentally that, at low air injection pressure, the atomizer generated a spray of large droplets with Sauter mean diameters of ca. 30 µm. Such large droplets traveled at low velocity along their trajectory after emerging from the nozzle tip. The viscosity of the blended biodiesel significantly affected the atomizing process and hence the resulting droplet size distribution. Blends containing a certain fraction of palm oil-based FAME were atomized consistently owing to their low viscosity. However, viscosity exerted only a small effect on the droplet velocity profile at air injection pressures above 0.2 MPa. (author)

  9. An idealized radiative transfer scheme for use in a mechanistic general circulation model from the surface up to the mesopause region

    International Nuclear Information System (INIS)

    Knoepfel, Rahel; Becker, Erich

    2011-01-01

    A new and numerically efficient method to compute radiative flux densities and heating rates in a general atmospheric circulation model is presented. Our method accommodates the fundamental differences between the troposphere and the middle atmosphere in the long-wave regime within a single parameterization that extends continuously from the surface up to the mesopause region and takes into account the deviations from the gray limit and from local thermodynamic equilibrium. For this purpose, frequency-averaged Eddington-type transfer equations are derived for four broad absorber bands. The frequency variation inside each band is parameterized by applying the Elsasser band model extended by a slowly varying envelope function. This yields additional transfer equations for the perturbation amplitudes, which are solved numerically along with the mean transfer equations. Deviations from local thermodynamic equilibrium are included in terms of isotropic scattering, with the single-scattering albedo calculated from the two-level model for each band. Solar radiative flux densities are computed for four energetically defined bands using the simple Beer-Bouguer-Lambert relation for absorption within the atmosphere. The new scheme is implemented in a mechanistic general circulation model extending from the surface up to the mesopause region. A test simulation with prescribed concentrations of the radiatively active constituents shows quite reasonable results. In particular, since we take the full surface energy budget into account by means of a swamp ocean, and since the internal dynamics and turbulent diffusion of the model are formulated in accordance with the conservation laws, an equilibrated climatological radiation budget is obtained both at the top of the atmosphere and at the surface.
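
    The solar part of the scheme uses simple exponential attenuation. A minimal Beer-Bouguer-Lambert flux profile for one band can be sketched as follows; the flux, absorption coefficient and absorber amounts are illustrative, not the parameterization's actual values.

```python
import numpy as np

def solar_flux_profile(flux_top, absorber_column, k_abs):
    """Beer-Bouguer-Lambert attenuation of a solar band.

    flux_top        : downward flux density at the top of the atmosphere (W m^-2)
    absorber_column : cumulative absorber amount measured from the top (kg m^-2)
    k_abs           : band-averaged mass absorption coefficient (m^2 kg^-1)
    """
    return flux_top * np.exp(-k_abs * np.asarray(absorber_column))

# Illustrative band (all values assumed, not the scheme's actual parameters)
column = np.linspace(0.0, 50.0, 6)   # top of atmosphere -> surface
flux = solar_flux_profile(340.0, column, 0.02)
```

    The heating rate in each layer then follows from the flux divergence between adjacent levels.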

  10. Dynamic mechanistic modeling of the multienzymatic one-pot reduction of dehydrocholic acid to 12-keto ursodeoxycholic acid with competing substrates and cofactors.

    Science.gov (United States)

    Sun, Boqiao; Hartl, Florian; Castiglione, Kathrin; Weuster-Botz, Dirk

    2015-01-01

    Ursodeoxycholic acid (UDCA) is a bile acid used as a pharmaceutical for the treatment of several diseases, such as cholesterol gallstones, primary sclerosing cholangitis or primary biliary cirrhosis. A potential chemoenzymatic synthesis route to UDCA comprises the two-step reduction of dehydrocholic acid to 12-keto-ursodeoxycholic acid (12-keto-UDCA), which can be conducted in a multienzymatic one-pot process using 3α-hydroxysteroid dehydrogenase (3α-HSDH), 7β-hydroxysteroid dehydrogenase (7β-HSDH), and glucose dehydrogenase (GDH), with glucose as cosubstrate for the regeneration of the cofactor. Here, we present a dynamic mechanistic model of this one-pot reduction, which involves three enzymes, four different bile acids, and two different cofactors, each with different oxidation states. In addition, every enzyme faces two competing substrates, whereas each bile acid and cofactor is formed or converted by two different enzymes. First, the kinetic mechanisms of both HSDHs were identified as following an ordered bi-bi mechanism with EBQ-type uncompetitive substrate inhibition. Rate equations were then derived for this mechanism and for mechanisms describing competing substrates. After the model parameters of each enzyme were estimated independently by progress curve analyses, the full process model of a simple batch process was established by coupling the rate equations with mass balances. Validation experiments on the one-pot multienzymatic batch process revealed the high prediction accuracy of the process model, and a model analysis offered important insights into the identification of optimum reaction conditions. © 2015 American Institute of Chemical Engineers.
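
    A heavily simplified sketch of such a batch process model, coupling rate equations to mass balances, is shown below. It replaces the fitted ordered bi-bi rate equations with plain Michaelis-Menten terms and uses invented constants, so it only illustrates the structure of the model, not the paper's kinetics.

```python
from scipy.integrate import solve_ivp

# Two-step reduction A -> B -> C with cofactor regeneration; plain
# Michaelis-Menten terms and invented constants stand in for the fitted
# ordered bi-bi rate equations of the paper.
VMAX1, KM1 = 0.5, 2.0   # step 1 (cf. 3α-HSDH), mM/min and mM
VMAX2, KM2 = 0.3, 1.5   # step 2 (cf. 7β-HSDH)
VMAXR, KMR = 1.0, 0.5   # cofactor regeneration (cf. GDH)
NADPH_TOT = 0.5         # total cofactor pool, mM

def batch(t, y):
    a, b, c, nadph = y
    f = nadph / (0.1 + nadph)          # cofactor availability factor
    r1 = VMAX1 * a / (KM1 + a) * f     # A -> B, consumes reduced cofactor
    r2 = VMAX2 * b / (KM2 + b) * f     # B -> C, consumes reduced cofactor
    nadp = NADPH_TOT - nadph           # oxidized cofactor by conservation
    rr = VMAXR * nadp / (KMR + nadp)   # regeneration of reduced cofactor
    return [-r1, r1 - r2, r2, rr - r1 - r2]

sol = solve_ivp(batch, (0.0, 300.0), [10.0, 0.0, 0.0, NADPH_TOT])
a_end, b_end, c_end, _ = sol.y[:, -1]  # total bile acid a+b+c is conserved
```

    The full model of the paper uses the same coupling pattern but with rate equations that account for the competing bile acid substrates at each enzyme.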

  11. Mathematical Description and Mechanistic Reasoning: A Pathway toward STEM Integration

    Science.gov (United States)

    Weinberg, Paul J.

    2017-01-01

    Because reasoning about mechanism is critical to disciplined inquiry in science, technology, engineering, and mathematics (STEM) domains, this study focuses on ways to support the development of this form of reasoning. This study attends to how mechanistic reasoning is constituted through mathematical description. This study draws upon Smith's…

  12. A multi-layered mechanistic modelling approach to understand how effector genes extend beyond phytoplasma to modulate plant hosts, insect vectors and the environment.

    Science.gov (United States)

    Tomkins, Melissa; Kliot, Adi; Marée, Athanasius Fm; Hogenhout, Saskia A

    2018-03-13

    Members of the genus 'Candidatus Phytoplasma' are small bacterial pathogens that hijack their plant hosts via the secretion of virulence proteins (effectors), leading to a fascinating array of plant phenotypes, such as witches' brooms (stem proliferations) and phyllody (retrograde development of flowers into vegetative tissues). Phytoplasmas depend on insect vectors for transmission, and interestingly, these insect vectors were found to be (in)directly attracted to plants with these phenotypes. Therefore, phytoplasma effectors appear to reprogram plant development and defence to lure insect vectors, similarly to social engineering malware, which employs tricks to lure people to infected computers and webpages. A multi-layered mechanistic modelling approach will enable a better understanding of how phytoplasma effector-mediated modulation of plant host development and insect vector behaviour contributes to phytoplasma spread, and ultimately allow prediction of the long reach of phytoplasma effector genes. Copyright © 2018. Published by Elsevier Ltd.

  13. Mourning dove hunting regulation strategy based on annual harvest statistics and banding data

    Science.gov (United States)

    Otis, D.L.

    2006-01-01

    Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data, and the population growth rates computed from them, can serve as the basis for hunting regulation decisions. I present a statistically rigorous approach for regulation decision-making using a hypothesis-testing framework and an assumed framework of three hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
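
    The estimation step described above, deriving population estimates from harvest statistics and banding-based harvest rates, can be sketched as follows; all numbers are illustrative, not mourning dove or mallard data.

```python
import numpy as np

# Population size estimated from total harvest H and harvest rate h: N = H / h.
# Harvest rates would come from banding data; all numbers here are invented.
harvest = np.array([2.0e7, 1.9e7, 2.1e7])        # total harvest, 3 years
harvest_rate = np.array([0.100, 0.098, 0.104])   # banding-based estimates

n_hat = harvest / harvest_rate   # annual population estimates
growth = n_hat[1:] / n_hat[:-1]  # year-to-year growth rates
lam = growth.mean()              # compared against 1.0 in the hypothesis test
```

    In the hypothesis-testing framework, a growth rate credibly below 1 would trigger a move to a more restrictive regulation alternative; the variance of lam, needed for such a test, would follow from the sampling variances of the harvest and harvest rate estimates.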

  14. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    Science.gov (United States)

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
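
    A toy version of Bayesian calibration can be sketched with a random-walk Metropolis sampler on a one-parameter model; the first-order disappearance model, data and prior below are invented for illustration and bear no relation to the ruminant models discussed.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "mechanistic" model: first-order disappearance, y = y0 * exp(-k * t).
# The rate k is calibrated from noisy observations (all values invented).
t_obs = np.array([0.0, 2.0, 4.0, 8.0, 12.0])
k_true, y0, sigma = 0.25, 100.0, 3.0
y_obs = y0 * np.exp(-k_true * t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_posterior(k):
    """Flat prior on k > 0 combined with a Gaussian likelihood."""
    if k <= 0.0:
        return -np.inf
    resid = y_obs - y0 * np.exp(-k * t_obs)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis sampler
samples, k, lp = [], 0.1, log_posterior(0.1)
for _ in range(5000):
    k_new = k + rng.normal(0.0, 0.02)
    lp_new = log_posterior(k_new)
    if np.log(rng.random()) < lp_new - lp:   # accept/reject step
        k, lp = k_new, lp_new
    samples.append(k)
posterior = np.array(samples[1000:])         # discard burn-in
k_mean = posterior.mean()                    # posterior estimate of k
```

    Simulating the model for each retained sample of k (plus residual noise) yields the posterior predictive distribution the abstract describes, a full distribution for the response rather than a point estimate.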

  15. Prediction of storm transfers and annual loads with data-based mechanistic models using high-frequency data

    Science.gov (United States)

    Ockenden, Mary C.; Tych, Wlodek; Beven, Keith J.; Collins, Adrian L.; Evans, Robert; Falloon, Peter D.; Forber, Kirsty J.; Hiscock, Kevin M.; Hollaway, Michael J.; Kahana, Ron; Macleod, Christopher J. A.; Villamizar, Martha L.; Wearing, Catherine; Withers, Paul J. A.; Zhou, Jian G.; Benskin, Clare McW. H.; Burke, Sean; Cooper, Richard J.; Freer, Jim E.; Haygarth, Philip M.

    2017-12-01

    Excess nutrients in surface waters, such as phosphorus (P) from agriculture, result in poor water quality, with adverse effects on ecological health and costs for remediation. However, understanding and prediction of P transfers in catchments have been limited by inadequate data and over-parameterised models with high uncertainty. We show that, with high temporal resolution data, we are able to identify simple dynamic models that capture the P load dynamics in three contrasting agricultural catchments in the UK. For a flashy catchment, a linear, second-order (two pathways) model for discharge gave high simulation efficiencies for short-term storm sequences and was useful in highlighting uncertainties in out-of-bank flows. A model with non-linear rainfall input was appropriate for predicting seasonal or annual cumulative P loads where antecedent conditions affected the catchment response. For second-order models, the time constant for the fast pathway varied between 2 and 15 h for all three catchments and for both discharge and P, confirming that high temporal resolution data are necessary to capture the dynamic responses in small catchments (10-50 km²). The models led to a better understanding of the dominant nutrient transfer modes, which will be helpful in determining phosphorus transfers following changes in precipitation patterns in the future.
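
    The identified models are linear transfer functions whose second-order form corresponds to two parallel first-order pathways (fast and slow). A discrete-time sketch with invented time constants and gains illustrates the structure:

```python
import numpy as np

def first_order_pathway(u, time_constant, gain, dt=1.0):
    """Discrete-time first-order store: x[k] = a*x[k-1] + b*u[k-1]."""
    a = np.exp(-dt / time_constant)
    b = gain * (1.0 - a)   # so the steady-state gain of the pathway equals `gain`
    x = np.zeros(len(u))
    for k in range(1, len(u)):
        x[k] = a * x[k - 1] + b * u[k - 1]
    return x

# One storm event driving two parallel pathways (hourly steps, invented values)
rain = np.zeros(200)
rain[5] = 10.0   # a single pulse of effective rainfall (mm)
fast = first_order_pathway(rain, time_constant=5.0, gain=0.3)
slow = first_order_pathway(rain, time_constant=80.0, gain=0.7)
load = fast + slow   # simulated P load response
```

    With an hourly time step, a fast-pathway time constant of a few hours is resolved; with daily data it would be aliased away, which is why the study argues that high temporal resolution data are essential for small catchments.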

  16. Physiologically Based Pharmacokinetic Modeling in Lead Optimization. 1. Evaluation and Adaptation of GastroPlus To Predict Bioavailability of Medchem Series.

    Science.gov (United States)

    Daga, Pankaj R; Bolger, Michael B; Haworth, Ian S; Clark, Robert D; Martin, Eric J

    2018-03-05

    When medicinal chemists need to improve bioavailability (%F) within a chemical series during lead optimization, they synthesize new series members with systematically modified properties, mainly by following experience and general rules of thumb. More quantitative models that predict the %F of proposed compounds from chemical structure alone have proven elusive. Global empirical %F quantitative structure-property relationship (QSPR) models perform poorly, and projects have too little data to train local %F QSPR models. Mechanistic oral absorption and physiologically based pharmacokinetic (PBPK) models simulate the dissolution, absorption, systemic distribution, and clearance of a drug in preclinical species and humans. Attempts to build global PBPK models based purely on calculated inputs have not achieved the accuracy needed for lead optimization. In this work, local GastroPlus PBPK models are instead customized for individual medchem series. The key innovation was building a local QSPR model for a numerically fitted effective intrinsic clearance (CL_loc). All inputs are subsequently computed from structure alone, so the models can be applied in advance of synthesis. Training CL_loc on the first 15-18 rat %F measurements gave adequate predictions, with clear improvements up to about 30 measurements, and incremental improvements beyond that.

  17. Integrating cellular metabolism into a multiscale whole-body model.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Cellular metabolism continuously processes an enormous range of external compounds into endogenous metabolites and is as such a key element in human physiology. The multifaceted physiological role of the metabolic network fulfilling the catalytic conversions can only be fully understood from a whole-body perspective where the causal interplay of the metabolic states of individual cells, the surrounding tissue and the whole organism are simultaneously considered. We here present an approach relying on dynamic flux balance analysis that allows the integration of metabolic networks at the cellular scale into standardized physiologically-based pharmacokinetic models at the whole-body level. To evaluate our approach we integrated a genome-scale network reconstruction of a human hepatocyte into the liver tissue of a physiologically-based pharmacokinetic model of a human adult. The resulting multiscale model was used to investigate hyperuricemia therapy, ammonia detoxification and paracetamol-induced toxication at a systems level. The specific models simultaneously integrate multiple layers of biological organization and offer mechanistic insights into pathology and medication. The approach presented may in future support a mechanistic understanding in diagnostics and drug development.
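
    The flux-balance step at the cellular scale can be illustrated with a toy linear program; the three-reaction network below is invented and far smaller than the genome-scale hepatocyte reconstruction used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy steady-state network:  v1: uptake -> A,  v2: A -> B,  v3: B -> biomass.
# Flux balance analysis requires S @ v = 0 for the internal metabolites.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by v1, consumed by v2
    [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
])
bounds = [(0, 10.0), (0, None), (0, None)]   # uptake flux capped at 10

# linprog minimizes, so maximize the biomass flux v3 by negating its coefficient
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
v_opt = res.x   # optimal flux distribution
```

    Roughly speaking, dynamic flux balance analysis re-solves such a program at each time step, with the uptake bounds set by the concentrations that the surrounding whole-body pharmacokinetic model supplies.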

  18. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    International Nuclear Information System (INIS)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-01-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  19. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States); Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  20. Ruthenium-Catalyzed Transformations of Alcohols: Mechanistic Investigations and Methodology Development

    DEFF Research Database (Denmark)

    Makarov, Ilya; Madsen, Robert; Fristrup, Peter

    with dimethoxyisopropylidene and pyridilidene ligands could be more active than RuCl2(IiPr)(p-cymene) used in the mechanistic investigation. Two analogs of the calculated complexes were synthesized but were not isolated in a pure form. The amidation reaction catalyzed by a mixture containing the N-ethyl pyridilidene...

  1. Mechanistic investigation on the oxidation of kinetin by Ag(III)

    Indian Academy of Sciences (India)

    Mechanistic investigation on the oxidation of kinetin by Ag(III) periodate complex in aqueous alkaline media: A kinetic approach. S D Lamani, A M Tatagar and S T Nandibewoor. Journal of Chemical Sciences, Volume 122, Issue 6, November 2010, pp 891-900.

  2. A mechanistic approach to the generation of sorption databases

    International Nuclear Information System (INIS)

    Bradbury, M.H.; Baeyens, B.

    1992-01-01

    Sorption of radionuclides in the near and far fields of an underground nuclear waste repository is one of the most important processes retarding their release to the environment. In the vast majority of cases sorption data have been presented in terms of empirical parameters such as distribution coefficients and isotherm equations. A consequence of this empirical methodology is that the sorption data are only strictly valid under the experimental conditions at which they were measured. Implicit in this approach is the need to generate large amounts of data and fitting parameters necessary for an empirical description of sorption under all realistically conceivable conditions which may arise in space and time along the migration pathway to Man. An alternative approach to the problem is to try to understand, and develop model descriptions of, underlying retention mechanisms and to identify those systems parameters which essentially determine the extent of sorption. The aim of this work is to see to what extent currently existing mechanistic models, together with their associated data, can be applied to predict sorption data from laboratory experiments on natural systems. This paper describes the current status of this work which is very much in an early stage of development. An example is given whereby model predictions are compared with laboratory results for the sorption of Np at trace concentrations under oxidizing conditions on a series of minerals relevant to granite formations. 31 refs., 11 figs., 5 tabs
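
    The empirical parameters mentioned above can be made concrete: a distribution coefficient Kd, and the retardation factor it implies for radionuclide transport, which is a standard relation in sorption modelling; the numbers below are illustrative.

```python
def distribution_coefficient(c_sorbed, c_solution):
    """Empirical Kd: sorbed concentration (mol/kg) over solution concentration (mol/L)."""
    return c_sorbed / c_solution

def retardation_factor(kd, bulk_density, porosity):
    """R = 1 + (rho_b / theta) * Kd: how much slower a sorbing tracer moves than water."""
    return 1.0 + (bulk_density / porosity) * kd

kd = distribution_coefficient(1.0e-4, 1.0e-5)                # -> Kd = 10 L/kg
r = retardation_factor(kd, bulk_density=1.6, porosity=0.3)   # rho_b in kg/L
```

    Because Kd is purely empirical, the retardation factor computed from it is strictly valid only under the chemical conditions at which Kd was measured, which is exactly the limitation that motivates the mechanistic approach described in this record.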

  3. The attention schema theory: a mechanistic account of subjective awareness.

    Science.gov (United States)

    Graziano, Michael S A; Webb, Taylor W

    2015-01-01

    We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain's limited computing resources. This internal signal competition is partly under a bottom-up influence and partly under top-down control. We propose that the top-down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the 'attention schema,' in much the same way that it constructs a schematic model of the body, the 'body schema.' The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain's internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence.

  4. The attention schema theory: a mechanistic account of subjective awareness

    Directory of Open Access Journals (Sweden)

    Taylor W. Webb

    2015-04-01

    Full Text Available We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain’s limited computing resources. This internal signal competition is partly under a bottom-up influence and partly under top-down control. We propose that the top-down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the ‘attention schema’, in much the same way that it constructs a schematic model of the body, the ‘body schema’. The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain’s internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence.

  5. A kinetic mechanistic study of acid-catalyzed alkylation of isobutane with C4-olefins at low temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Doshi, B.M.

    1978-01-01

    A kinetic and mechanistic study of sulfuric acid-catalyzed alkylation of isobutane with C₄-olefins at low temperatures (−20° to 0°C) was based on a new two-step reaction sequence in which the desired first-step reactions are between acid and olefin to form sulfates and the desired second-step reactions are between sulfates and isobutane to form mostly trimethylpentanes. Linear butenes formed stable sulfates that formed alkylates of exceptionally high quality, up to 100 Research octane, whereas isobutylene and trimethylpentene mainly polymerized during the first step, and the alkylate produced had only 90 Research octane. Trimethylpentanes and dimethylhexanes, when contacted with concentrated sulfuric acid at −10° to +25°C, degraded and isomerized to form C₄-C₉ and higher isoparaffins and acid-soluble hydrocarbons (conjunct polymers). For the two-step process and the degradation and isomerization reactions, kinetic models based on reaction at the interface were developed; but for isoolefins, a polymerization-cracking sequence (via C₁₂- and even C₁₆-olefins) is the preferred route. Commercial applications of the results are proposed.
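    The two-step sequence described above can be illustrated as a pair of consecutive reactions, olefin → sulfate → trimethylpentane. The sketch below is a deliberate simplification (the paper's models are interfacial, not homogeneous first-order), and the rate constants are illustrative, not fitted values.

```python
# Hedged sketch: the two-step alkylation sequence treated as two
# consecutive first-order reactions, integrated with a simple Euler loop.
# k1, k2 and the first-order assumption are illustrative only.
def simulate_two_step(k1=0.5, k2=0.2, olefin0=1.0, dt=0.01, t_end=20.0):
    olefin, sulfate, tmp = olefin0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        r1 = k1 * olefin    # step 1: acid + olefin -> sulfate
        r2 = k2 * sulfate   # step 2: sulfate + isobutane -> trimethylpentane
        olefin -= r1 * dt
        sulfate += (r1 - r2) * dt
        tmp += r2 * dt
        t += dt
    return olefin, sulfate, tmp
```

    Mass is conserved by construction, so the three pools always sum to the initial olefin charge; at long times nearly everything ends up as trimethylpentane.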

  6. Therapeutic Targeting of the IL-6 Trans-Signaling/Mechanistic Target of Rapamycin Complex 1 Axis in Pulmonary Emphysema.

    Science.gov (United States)

    Ruwanpura, Saleela M; McLeod, Louise; Dousha, Lovisa F; Seow, Huei J; Alhayyani, Sultan; Tate, Michelle D; Deswaerte, Virginie; Brooks, Gavin D; Bozinovski, Steven; MacDonald, Martin; Garbers, Christoph; King, Paul T; Bardin, Philip G; Vlahos, Ross; Rose-John, Stefan; Anderson, Gary P; Jenkins, Brendan J

    2016-12-15

    The potent immunomodulatory cytokine IL-6 is consistently up-regulated in human lungs with emphysema and in mouse emphysema models; however, the mechanisms by which IL-6 promotes emphysema remain obscure. IL-6 signals using two distinct modes: classical signaling via its membrane-bound IL-6 receptor (IL-6R), and trans-signaling via a naturally occurring soluble IL-6R. Our objective was to identify whether IL-6 trans-signaling and/or classical signaling contributes to the pathogenesis of emphysema. We used the gp130 F/F genetic mouse model for spontaneous emphysema and cigarette smoke-induced emphysema models. Emphysema in mice was quantified by various methods including in vivo lung function and stereology, and a terminal deoxynucleotidyl transferase dUTP nick end labeling assay was used to assess alveolar cell apoptosis. In mouse and human lung tissues, the expression level and location of IL-6 signaling-related genes and proteins were measured, and the levels of IL-6 and related proteins in sera from emphysematous mice and patients were also assessed. Lung tissues from patients with emphysema, and from spontaneous and cigarette smoke-induced emphysema mouse models, were characterized by excessive production of soluble IL-6R. Genetic blockade of IL-6 trans-signaling in emphysema mouse models and therapy with the IL-6 trans-signaling antagonist sgp130Fc ameliorated emphysema by suppressing augmented alveolar type II cell apoptosis. Furthermore, IL-6 trans-signaling-driven emphysematous changes in the lung correlated with mechanistic target of rapamycin complex 1 hyperactivation, and treatment of emphysema mouse models with the mechanistic target of rapamycin complex 1 inhibitor rapamycin attenuated emphysematous changes. Collectively, our data reveal that specific targeting of IL-6 trans-signaling may represent a novel treatment strategy for emphysema.

  7. Mechanistic issues for modeling radiation-induced segregation

    International Nuclear Information System (INIS)

    Simonen, E.P.; Bruemmer, S.M.

    1993-03-01

    Model calculations of radiation-induced chromium depletion and radiation-induced nickel enrichment at grain boundaries are compared to measured depletions and enrichments. The model is calibrated to fit chromium depletion in commercial purity 304 stainless steel irradiated in boiling water reactor (BWR) environments. Predicted chromium depletion profiles and the dose dependence of chromium concentration at grain boundaries are in accord with measured trends. Evaluation of chromium and nickel profiles in three neutron and two ion irradiation environments reveals significant inconsistencies between measurements and predictions.

  8. Ontology aided modeling of organic reaction mechanisms with flexible and fragment based XML markup procedures.

    Science.gov (United States)

    Sankar, Punnaivanam; Aghila, Gnanasekaran

    2007-01-01

    Mechanism models are developed for primary organic reactions, encoding the structural fragments undergoing substitution, addition, elimination, and rearrangement. In the proposed models, every structural component of a mechanistic pathway is represented with a flexible, fragment-based markup technique in XML syntax. A significant feature of the system is the encoding of electron movements along with the other components needed for a complete representation of a reaction mechanism, such as charges, partial charges, half-bonded species, lone-pair electrons, free radicals, and reaction arrows. Reaction schemes described with the proposed methodology are rendered with a concise XML extension language that interoperates with the structure markup. Reaction schemes are visualized as 2D graphics in a browser by converting them into SVG documents, yielding the layouts chemists conventionally expect. Complex reaction-mechanism patterns are represented automatically by reusing knowledge in chemical ontologies and developing artificial intelligence components in terms of axioms.
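    The idea of marking up a mechanistic step, including an electron-movement arrow, can be sketched programmatically. The element and attribute names below are invented for illustration; they are not the authors' actual schema, and the fragment is assembled with Python's standard XML library rather than the paper's toolchain.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: a fragment-based XML markup for one mechanistic step.
# Tag names (mechanismStep, electronArrow, ...) are hypothetical examples,
# NOT the schema proposed in the paper.
step = ET.Element("mechanismStep", {"type": "nucleophilic-substitution"})
ET.SubElement(step, "fragment", {"id": "f1", "smiles": "C[Br]"})
ET.SubElement(step, "lonePair", {"atomRef": "O1"})
# Curly-arrow encoding: two electrons move from the lone pair on O1 to C1.
ET.SubElement(step, "electronArrow",
              {"from": "O1:lonePair", "to": "C1", "electrons": "2"})

xml_text = ET.tostring(step, encoding="unicode")
print(xml_text)
```

    A renderer in the spirit of the paper would walk such a tree and emit SVG primitives (atoms, bonds, curved arrows) for display in a browser.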

  9. New mechanistically based model for predicting reduction of biosolids waste by ozonation of return activated sludge

    International Nuclear Information System (INIS)

    Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic

    2014-01-01

    Highlights:
    • Biomass inactivation followed an exponential decay with increasing ozone doses.
    • From pure cultures, inactivation did not result in significant COD solubilization.
    • Ozone dose inactivation thresholds resulted from floc structure modifications.
    • Modeling description of biomass inactivation during RAS-ozonation was improved.
    • Model best describing inactivation resulted in best performance predictions.

    Abstract: Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data.
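    The highlighted behavior, exponential decay of viable biomass with ozone dose, moderated by a floc-protection threshold, can be captured in a one-line relation. The functional form and the parameter values below are illustrative assumptions, not the fitted model from the study.

```python
import math

# Hedged sketch of the reported inactivation behavior: viable fraction
# decays exponentially once the ozone dose exceeds a threshold d0
# attributed to floc structure. k and d0 are illustrative, not fitted.
def viable_fraction(dose, k=0.15, d0=2.0):
    """Fraction of active biomass remaining after an ozone dose
    (e.g. mg O3 per g suspended solids); 1.0 below the threshold d0."""
    return math.exp(-k * max(0.0, dose - d0))
```

    Such a relation would slot into an activated-sludge model as the inactivated (and eventually hydrolyzed) fraction of the RAS stream per ozonation pass.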

  10. New mechanistically based model for predicting reduction of biosolids waste by ozonation of return activated sludge

    Energy Technology Data Exchange (ETDEWEB)

    Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic, E-mail: dominic.frigon@mcgill.ca

    2014-04-01

    Highlights:
    • Biomass inactivation followed an exponential decay with increasing ozone doses.
    • From pure cultures, inactivation did not result in significant COD solubilization.
    • Ozone dose inactivation thresholds resulted from floc structure modifications.
    • Modeling description of biomass inactivation during RAS-ozonation was improved.
    • Model best describing inactivation resulted in best performance predictions.

    Abstract: Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data.

  11. Food supply and demand, a simulation model of the functional response of grazing ruminants

    NARCIS (Netherlands)

    Smallegange, I.M.; Brunsting, A.M.H.

    2002-01-01

    A dynamic model of the functional response is a prerequisite for bridging the gap between local feeding ecology and grazing rules that pertain to larger scales. A mechanistic model is presented that simulates the functional response, growth and grazing time of ruminants. It is based on

  12. Crop growth and two dimensional modeling of soil water transport in drip irrigated potatoes

    DEFF Research Database (Denmark)

    Plauborg, Finn; Iversen, Bo Vangsø; Mollerup, Mikkel

    2009-01-01

    of abscisic acid (ABA). Model outputs from the mechanistic simulation model Daisy in SAFIR, developed to include 2D soil processes and gas exchange processes based on Ball et al. and Farquhar, were compared with measured crop dynamics, final DM yield and volumetric water content in the soil measured by TDR

  13. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    Science.gov (United States)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and its flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such extrapolation is to use a mechanistic approach based on the physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach uses experimental data collected under a number of treatments, from which a function is derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data for current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach lies in its estimate of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles, including the idea that metabolism is organised in the same way within all animals. The (standard) DEB model aims to describe
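    The ASM logic, aerobic scope as the gap between maximum and standard metabolic rate across temperature, is easy to sketch. The curve shapes (a Q10 rule for SMR, a dome for MMR) and all parameter values below are assumptions chosen for illustration, not values from the article.

```python
import math

# Hedged sketch of an Aerobic Scope Model (ASM). Functional forms and
# parameters are illustrative assumptions, not the authors' fits.
def smr(temp_c, q10=2.0, smr_ref=50.0, t_ref=10.0):
    """Standard (basal) metabolic rate, rising with temperature (Q10 rule)."""
    return smr_ref * q10 ** ((temp_c - t_ref) / 10.0)

def mmr(temp_c, peak=400.0, t_opt=18.0, width=8.0):
    """Maximum metabolic rate, dome-shaped around a thermal optimum."""
    return peak * math.exp(-((temp_c - t_opt) / width) ** 2)

def aerobic_scope(temp_c):
    """Aerobic scope = MMR - SMR, floored at zero (no scope = unsuitable)."""
    return max(0.0, mmr(temp_c) - smr(temp_c))
```

    Evaluating `aerobic_scope` over gridded temperature fields for present and projected scenarios is what would turn this relation into the habitat suitability maps described above.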

  14. Ibrutinib Dosing Strategies Based on Interaction Potential of CYP3A4 Perpetrators Using Physiologically Based Pharmacokinetic Modeling.

    Science.gov (United States)

    de Zwart, L; Snoeys, J; De Jong, J; Sukbuntherng, J; Mannaert, E; Monshouwer, M

    2016-11-01

    Based on ibrutinib pharmacokinetics and potential sensitivity towards CYP3A4-mediated drug-drug interactions (DDIs), a physiologically based pharmacokinetic approach was developed to mechanistically describe DDI with various CYP3A4 perpetrato